Mark Ainsworth

Reputation: 857

json_encode will not encode more than 6670 rows

Someone else had this issue back in '07, but it was never answered: in PHP, json_encode($someArray) silently fails (returns null) if the array has more than 6,670 rows. Does anyone have a workaround? Limiting the size of the array to 6,600, for example, produces the expected result. I could make multiple json_encode calls on partial arrays and concatenate the results, but that involves some funky string manipulation to stitch them together properly, and I would like to avoid that.

Upvotes: 3

Views: 797

Answers (2)

Geert van Dijk

Reputation: 136

This possibly depends on your PHP version and the memory PHP is allowed to use (and possibly on the actual size of all the data in the array, too). If all else fails, you could write a function that checks the size of a given array, splits off a smaller part that will encode, keeps doing this until all parts are encoded, and then joins them again. If this is the route you end up taking, post a comment on this answer and perhaps I can help you out with it. (This is based on the answer by Nytrix.) EDIT, example below:

function encodeArray($array, $threshold = 6670) {
    $json = array();
    while (count($array) > 0) {
        // Encode the next chunk and strip the surrounding [ and ]
        // so the pieces can be joined into a single JSON array later.
        $partial_array = array_slice($array, 0, $threshold);
        $json[] = ltrim(rtrim(json_encode($partial_array), "]"), "[");
        // Drop the chunk that was just encoded and continue with the rest.
        $array = array_slice($array, $threshold);
    }

    $json = '[' . implode(',', $json) . ']';
    return $json;
}
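Just as an illustration, a call on a large numeric array (the data here is only a placeholder) could look like this:

$rows = range(1, 20000);   // stand-in for the real data
echo encodeArray($rows);   // prints one valid JSON array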

Upvotes: 1

Nytrix

Reputation: 1139

You could always just slice the array into two parts first (assuming it's not bigger than two times those rows), encode each part, and then join them together again. If that isn't a solution, you need to increase your memory limit.

Here is an example, test it here. On @GeertvanDijk's suggestion, I made this a flexible function in order to increase functionality!

<?php
$bigarray = array(1,2,3,4,5,6,7,8,9);

function json_encode_big($bigarray, $split_over = 6670) {
    // Number of chunks needed to cover the whole array.
    $parts = ceil(count($bigarray) / $split_over);

    $start = 0;
    $jsonencoded = [];

    for ($part = 0; $part < $parts; $part++) {
        // Encode one chunk and strip the outer [ ] so the chunks
        // can be glued back together into a single JSON array.
        $chunk = array_slice($bigarray, $start, $split_over);
        $jsonencoded[] = ltrim(rtrim(json_encode($chunk), "]"), "[");
        $start += $split_over;
    }

    return "[" . implode(",", $jsonencoded) . "]";
}

print_r(json_encode_big($bigarray));
?>

I have now tested this with more than 6,670 rows. You can test it online here as well.

Now, I have to mention that I tested the normal json_encode() with a million rows without any problem. Yet, I still hope this solves your problem...
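If json_encode() still comes back empty for your data, it may also be worth asking PHP why before splitting anything. This is only a diagnostic idea (json_encode() returns false on failure, and json_last_error_msg() needs PHP 5.5+):

$result = json_encode($someArray);
if ($result === false) {
    // Reports e.g. malformed UTF-8 or depth problems instead of failing silently.
    echo json_last_error_msg();
}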


In case you run out of memory, you can set memory_limit to a higher value. I would advise against this; instead I would retrieve the data in parts and process those in parts. As I don't know how you retrieve this data, I can't give an exact example of how to regulate that, but a rough sketch follows the memory_limit note below. Here is how you change the memory in case you need to (sometimes it is the only option, and still "good").

ini_set("memory_limit","256M");

In this case, it is double the default; you can see that the default is 128M in this documentation.
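Purely as an illustration of the retrieve-and-process-in-parts idea, and assuming the rows come from a database via PDO (the database, table, and credentials below are made up), it could look roughly like this:

$pdo = new PDO("mysql:host=localhost;dbname=mydb", "user", "pass");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batch  = 1000;   // rows fetched and encoded per round trip
$offset = 0;
$first  = true;

echo "[";
while (true) {
    // LIMIT/OFFSET are cast to int here to keep the query simple.
    $stmt = $pdo->query("SELECT * FROM my_table LIMIT " . (int)$batch . " OFFSET " . (int)$offset);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more data
    }
    // Encode each batch on its own and strip the outer [ ],
    // so only one batch has to be held in memory at a time.
    echo ($first ? "" : ",") . trim(json_encode($rows), "[]");
    $first  = false;
    $offset += $batch;
}
echo "]";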

Upvotes: 3
