Sam Munroe

Reputation: 1418

Handling data that is too large for json_encode (PHP)

I am trying to grab some database data via AJAX in my Symfony project. I grabbed all estimates in our database and there are over 60k. When I try to json_encode the results, I get an out-of-memory error. I know it has to do with the size, because I can json_encode a single result and it returns fine.

$results = new \stdClass;
$results->jobs = $epms->getEstimates();

// JSON encode results to send to the view. The view just echoes them out for jQuery to process.
$json_encoded = json_encode($results->jobs[1]); //Works
$json_encoded = json_encode($results->jobs); //Doesn't work
$headers = array(
    'Content-Type' => 'application/json'
);
$response = new Response($json_encoded, 200, $headers);
return $response;

I'm a PHP noob, but my idea was to split up the results and encode the smaller pieces, which I could then join together before returning to my AJAX call. What is the best approach to this? I don't want to increase my PHP memory limit.
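One way to sketch the splitting idea described above is a generator that yields the JSON array piece by piece, encoding one row at a time instead of the whole 60k-row result set at once. The function and variable names here are illustrative, not from the project's actual code:

```php
<?php
// Sketch of the "encode in pieces" idea: emit the JSON array element by
// element instead of json_encode()-ing every row in one call.
function encodeJobsInChunks(array $jobs): \Generator
{
    yield '[';
    $first = true;
    foreach ($jobs as $job) {
        if (!$first) {
            yield ',';
        }
        yield json_encode($job); // each individual row encodes without issue
        $first = false;
    }
    yield ']';
}

// Usage with a tiny sample: echo each piece as it is produced, so the
// complete JSON string never has to exist in memory at once.
$jobs = [['id' => 1, 'name' => 'Estimate A'], ['id' => 2, 'name' => 'Estimate B']];
foreach (encodeJobsInChunks($jobs) as $piece) {
    echo $piece;
}
// prints [{"id":1,"name":"Estimate A"},{"id":2,"name":"Estimate B"}]
```

In a Symfony controller, a generator like this could feed a `StreamedResponse` so the pieces are flushed to the client as they are produced, rather than concatenated back into one large string first.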

Upvotes: 1

Views: 3166

Answers (1)

Death-is-the-real-truth

Reputation: 72269

You need to increase the memory limit dynamically:

ini_set('memory_limit', '16M'); // change 16M to your desired number

Note:

The above code increases the maximum amount of memory available to PHP to 16 MB, and the setting applies only to the currently running script.
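The per-script scope can be confirmed with `ini_get()`: reading the limit before and after the call shows the change took effect for this request, while the value in php.ini itself is untouched. A small illustrative check:

```php
<?php
// Read the limit, change it for this script only, and read it again.
$before = ini_get('memory_limit');
ini_set('memory_limit', '16M');
$after = ini_get('memory_limit');
echo "before={$before} after={$after}\n"; // php.ini on disk is unchanged
```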

Upvotes: 2

Related Questions