Yasiru G

Reputation: 7144

PHP script stops working when handling a large string

I'm testing a PHP script that creates a CSV file containing a large amount of data. This is how I do the task:

$csvdata = "ID,FN,LN,ADDR,TEL,PRO\n
            1,fn1,ln1,addr1,tel1,pro1\n...etc,";
$fname = "./temp/filename.csv";
$fp = fopen($fname,'w');
fwrite($fp,$csvdata);
fclose($fp);

I have noticed that when the string ($csvdata) contains around 100,000 data rows, the script works fine. But when it gets to more than 1,000,000 data rows, it stops in the middle of building the $csvdata string (I'm building $csvdata by concatenating data from the database in a for loop).

Could someone let me know what goes wrong when we use a large string value?

Thank you, kind regards

Upvotes: 2

Views: 1478

Answers (3)

Kishor

Reputation: 1513

It is probably at the $csvdata = part that your script gives out a memory-exhausted error.

When you save 10 chars to a variable, it takes about 10 bytes, and the string keeps getting bigger as you concatenate. The limit is reached when it hits the memory allocated to PHP.

So this is how you move on:

  1. Set memory limit - increase your PHP memory limit

     ini_set('memory_limit', '256M');

  2. Write line by line

Write each piece of data to the file right away instead of piling it all up. Also, if you write array[0] to the file, then store the next piece of data in array[1] and write again, and continue like this, it would have the same effect as what you do now.

So either

while (blah blah) {
    $var = "text";
    fwrite($file, $var);
}

or in a for loop

for ($i = 0; blahblah) {
    $var[$i] = "query";
    fwrite($file, $var[$i]);
    unset($var[$i]);
}

A for loop comes in handy when the database queries are conditional, with WHERE id='$i'.

Good luck.

Upvotes: 1

Nanne

Reputation: 64399

Check out your error log. It will probably say something about:

  1. You trying to allocate memory exceeding some maximum. This means you are using too much memory -> you can change the amount of memory PHP is allowed to use (memory_limit in php.ini).

  2. The execution time is longer than the allowed time. Again, you can increase this time (max_execution_time in php.ini).
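As a rough sketch of those two adjustments (the values here are illustrative, not recommendations), both limits can also be raised at runtime:

```php
<?php
// Illustrative values only - tune them to your environment.
ini_set('memory_limit', '512M'); // raise the per-script memory cap
set_time_limit(300);             // allow up to 300 seconds of execution
```

The same settings can be made permanent via memory_limit and max_execution_time in php.ini.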

Upvotes: 1

codaddict

Reputation: 454960

I'm building $csvdata string by concatenating data in a for loop, data from database.

From this I gather that you are preparing the entire file to be written as a string and finally writing it to the file.

If so, then this approach is not good. Remember that the string is held in main memory, and every time you append to it you consume more memory. The fix is to write the file line by line: read a record from the database, transform/format it if you want to, prepare the CSV row, and write it to the file.
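To make the row-by-row idea concrete, here is a minimal sketch using PHP's built-in fputcsv(). The rowSource() generator is a hypothetical stand-in for your database fetch loop, and php://temp stands in for the real output path:

```php
<?php
// Hypothetical stand-in for fetching rows from the database one at a time.
function rowSource(): Generator {
    for ($i = 1; $i <= 3; $i++) {
        yield [$i, "fn$i", "ln$i", "addr$i", "tel$i", "pro$i"];
    }
}

$fp = fopen('php://temp', 'w+'); // swap in './temp/filename.csv' for real use
fputcsv($fp, ['ID', 'FN', 'LN', 'ADDR', 'TEL', 'PRO']); // header row
foreach (rowSource() as $row) {
    fputcsv($fp, $row); // one row at a time - memory use stays flat
}
rewind($fp);
echo stream_get_contents($fp);
fclose($fp);
```

Only one row is ever held in memory, so the script's footprint no longer grows with the number of records.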

Upvotes: 5
