Willow

Reputation: 1040

How do you process large amounts of data without using a database?

I am loading data from 365 CSV files into the Highstock graphing API. I am using PHP to read through the CSV files and create arrays to hold the information. However, I've encountered this error:

Fatal error: Allowed memory size of 67108864 bytes exhausted

How do I work around this?


Hoping to create this graph:

http://www.highcharts.com/stock/demo/compare

Upvotes: 2

Views: 1351

Answers (1)

Orangepill

Reputation: 24655

Instead of representing everything in memory as arrays, it might be better to go straight to a JSON file. I'm going to make the assumption that the data you need is a two-dimensional array where each row contains a timestamp plus six floating point fields.

Without knowing much detail about how the information has to be served up to the charting API, here is a first stab.

$tmpFile = tempnam("tmp/", "highchart_");
$out = fopen($tmpFile, "w");
// We are not going to use json_encode() because it requires us to hold the whole array in memory.
fputs($out, "[");
$first = true;
for ($i = 0; $i < count($files); $i++) {
    if (($handle = fopen($files[$i], "r")) !== FALSE) {
        while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
            // You may be able to get around the timestamp calculation by just saying:
            $timestamp = strtotime($data[0]." ".$data[1]);
            // Note: Highstock expects millisecond timestamps, so you may need (int)$timestamp * 1000 here.
            // Separate the rows with commas so the output is valid JSON.
            fputs($out, ($first ? "" : ",")."[".(int)$timestamp.",".(float)$data[2].",".
                            (float)$data[3].",".(float)$data[4].",".(float)$data[5].",".(float)$data[13]."]");
            $first = false;
        }
        fclose($handle);
    }
}
fputs($out, "]");
fclose($out);

Now the file at $tmpFile contains a JSON-encoded array, which you can send to the browser with readfile or fpassthru. Also, I would urge you to use some sort of caching mechanism for this instead of just storing it in a temp file. 67MB+ of data is quite a beefy amount to chug through with each request.
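
For completeness, here is a minimal sketch of what that serving side could look like, combining readfile() with a simple modification-time cache. The cache path, the one-hour TTL, and the buildJsonFile() helper are made-up placeholders; buildJsonFile() stands in for the CSV-to-JSON loop above, just writing to a fixed path instead of a tempnam() file:

$cacheFile = "cache/highchart.json"; // hypothetical cache location
$ttl = 3600; // rebuild at most once per hour (made-up TTL)

if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
    // buildJsonFile() is a placeholder for the CSV-to-JSON loop above.
    buildJsonFile($cacheFile);
}

// Stream the cached file to the browser without loading it into memory.
header("Content-Type: application/json");
readfile($cacheFile);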

Upvotes: 2
