Robi Kumar Tomar

Reputation: 3404

Allowed memory size of 1073741824 bytes exhausted (tried to allocate 80 bytes) while fetching JSON data from server in PHP

I am trying to fetch a huge amount of data from the server, around 500,000 rows from the database, in JSON format, and I am getting a memory exhausted error. Everything works fine when I try with 100 rows, but I need to fetch the 500,000 rows in one go. I am using the Postman extension to fetch this data. Guys, I am not a database champ, but I tried for a whole day and found some useful hints (this, this, this and many more), but no luck.

I know this is not a good solution, but I also tried ini_set('memory_limit', '-1');

This is what I am doing:

 $response["data"] = array();

        $result = $db->getData();
        while ($data= $result->fetch_assoc()) {
         array_push($response["data"], $data);
             }

          file_put_contents('newData.json', json_encode($response,128));
        echoRespnse(200, $response);

I have given it a whole day and tried almost a hundred solutions, but no luck. I also know this has been posted many times, but I think my problem is a bit different. Please give your valuable suggestions; every suggestion will be appreciated.

Upvotes: 1

Views: 1610

Answers (4)

Robi Kumar Tomar

Reputation: 3404

Thanks to all who gave their time here. One part of my problem is now solved, so I think the solution must be here: with this approach you can fetch (or show in the browser) 100,000 to 1,500,000 rows as JSON in a single go.

$result = mysql_query($sql);
if (mysql_num_rows($result)) {
    echo '{"data":[';

    // Stream each row out as it is fetched, instead of building
    // the whole result in memory first.
    $rkt = true;
    while ($my_row = mysql_fetch_row($result)) {
        if ($rkt) {
            $rkt = false;
        } else {
            echo ',';
        }
        echo json_encode($my_row);
    }
    echo ']}';
} else {
    echo '[]';
}

Now I am trying to solve the second part, "How to copy all this JSON data into a .json file", so I can make it available for download.

Here file_put_contents('MyFileName', json_encode($my_row)); is not working with the above approach, since a single call only sees one row rather than the whole streamed output.
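One way around that (a minimal sketch, not the code from this answer: keep a file handle open and mirror each echoed fragment into the file with fwrite) could look like this:

// Sketch only: build the file piece by piece instead of in one giant string.
$fp = fopen('newData.json', 'w');
fwrite($fp, '{"data":[');

$rkt = true;
while ($my_row = mysql_fetch_row($result)) {
    if ($rkt) {
        $rkt = false;
    } else {
        fwrite($fp, ',');
    }
    fwrite($fp, json_encode($my_row));
}

fwrite($fp, ']}');
fclose($fp);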

Edited (Solved): Maybe it can be refined in a more robust and clean manner, but this one also works for me. Here is the complete code to fetch huge data (around 1,200,000 rows) and write it to a .json file without any failure.

$result = mysql_query($sql);
$arr = array();

if (mysql_num_rows($result)) {
    echo '{"data":[';
    $arr[] = '{"data":[';

    $rkt = true;

    while ($my_row = mysql_fetch_row($result)) {
        // cast results to specific data types here if needed

        if ($rkt) {
            $rkt = false;
        } else {
            echo ',';
            $arr[] = ',';
        }

        // 200 is a bitmask of JSON_* option flags
        $robi_json = json_encode($my_row, 200);
        $arr[] = $robi_json;
        echo $robi_json;
    }

    echo ']}';
    $arr[] = ']}';

    // file_put_contents() concatenates the array elements and writes them
    // in one call. Note that $arr still holds the whole output in memory.
    file_put_contents('data_file.json', $arr);
} else {
    echo '[]';
}

Thanks to all for giving the right direction; special thanks to Paul Dixon.

Upvotes: 0

Paul Dixon

Reputation: 301085

It would use far less memory if you output the JSON directly to a file as you read the rows from the database. For example, you could first output {"data":[ to begin the JSON, then output each row encoded with json_encode, separated by commas. Finally, close off the JSON with ]}.

Now that you have your JSON in a file, you can rewind to the start and use fpassthru to send it as the response.

This way, you can deal with arbitrarily large responses as you'll never need to hold the whole thing in memory.

Dealing with such a large JSON file is another matter entirely :)
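A minimal sketch of this approach, assuming the question's $db->getData() helper (tmpfile() gives a temporary file handle that is cleaned up automatically on fclose):

header('Content-Type: application/json');

// Stream the JSON into a temporary file instead of a PHP string.
$fp = tmpfile();
fwrite($fp, '{"data":[');

$first = true;
$result = $db->getData();
while ($row = $result->fetch_assoc()) {
    fwrite($fp, ($first ? '' : ',') . json_encode($row));
    $first = false;
}
fwrite($fp, ']}');

// Rewind to the start and send the file as the response without
// ever holding the whole payload in memory.
rewind($fp);
fpassthru($fp);
fclose($fp);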

Upvotes: 1

Amarnasan

Reputation: 15579

Here is some pseudo code for you:

while (there are rows to get) {
   rows = get the next 1000 rows
   write down the rows
}

It works in any language; just adapt it :D
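In PHP, that loop might be sketched like this (the table name, batch size, and output file are illustrative assumptions; each row is written as one JSON object per line for simplicity):

$batch  = 1000;
$offset = 0;
$fp = fopen('rows.json', 'w');

do {
    // Fetch the next 1000 rows; LIMIT/OFFSET keeps each batch small.
    $result = mysql_query("SELECT * FROM my_table LIMIT $batch OFFSET $offset");
    $count  = mysql_num_rows($result);

    // Write down this batch before fetching the next one.
    while ($row = mysql_fetch_assoc($result)) {
        fwrite($fp, json_encode($row) . "\n");
    }

    $offset += $batch;
} while ($count === $batch); // a short batch means we reached the end

fclose($fp);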

Upvotes: 0

vacsora

Reputation: 207

Whatever you are trying to do with half a million records on the client side, try to do it on the server instead. If you just need to display them, you might consider paging: get something like the first 1000 records (e.g. LIMIT 1000) and let the user navigate through the rest.
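For instance, a paged query could look like this (the page number, page size, and table name are made-up values for illustration):

$page     = 1;    // hypothetical page requested by the client
$pageSize = 1000;
$offset   = ($page - 1) * $pageSize;

// Only this page's rows are ever fetched and encoded.
$result = mysql_query("SELECT * FROM my_table LIMIT $pageSize OFFSET $offset");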

Upvotes: 1
