Reputation: 1379
I have a problem with exporting a large amount of data to a CSV file using PHP.
Information:
My solution (what I have tried):
Get the data part by part from the database, process it with fputcsv, write each part to a temporary file, and send progress information to the user via Ajax (show them the percentage processed). After the last part of the data has been processed, just give the user a link to download the file. This all works fine on my local environment, but
the problem is that the project I'm working on runs on multiple servers,
so I ran into the problem that the temporary file can end up stored on different servers.
For example:
I have 3 servers: Server1, Server2 and Server3.
On the first iteration I read data from the DB with LIMIT 0, 50000, process it and save it to File.csv on Server1; on the next iteration, LIMIT 50000, 50000 can be saved on another server, Server2 - this is the problem.
So my question is:
Where can I store my processed temporary CSV data? Or maybe I am missing something - I am stuck here and looking for advice. Every suggestion or solution will be appreciated! Thanks.
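For context, here is a minimal sketch of the chunked approach described above. The fetch_chunk() function, the row count, and the file path are placeholders standing in for the real database query and setup:

```php
<?php
// Placeholder for the real database query: returns up to $limit rows
// starting at $offset, or an empty array when the data is exhausted.
function fetch_chunk(int $offset, int $limit): array {
    static $rows = null;
    if ($rows === null) {
        $rows = [];
        for ($i = 1; $i <= 120000; $i++) {
            $rows[] = [$i, "user$i", "user$i@example.com"];
        }
    }
    return array_slice($rows, $offset, $limit);
}

$chunkSize = 50000;
$offset = 0;
$total = 120000; // known row count, used for the progress percentage
$fh = fopen('/tmp/File.csv', 'w');

while (($chunk = fetch_chunk($offset, $chunkSize)) !== []) {
    foreach ($chunk as $row) {
        fputcsv($fh, $row); // one row at a time - no big string in memory
    }
    $offset += count($chunk);
    // In the real app this percentage would be reported back via Ajax.
    $percent = min(100, (int) round($offset / $total * 100));
    echo "processed: $percent%\n";
}
fclose($fh);
```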
UPDATE
PROBLEM IS SOLVED
Later I will post my solution.
Upvotes: 1
Views: 3378
Reputation: 2683
You can increase the execution time of your PHP code using ini_set('max_execution_time', $seconds); where $seconds is the number of seconds you want to allow.
Upvotes: 0
Reputation: 271
You can use a MySQL query with INTO OUTFILE to export the records directly into a CSV file from the MySQL database:
SELECT id, name, email INTO OUTFILE '/tmp/result.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM users WHERE 1
Note that INTO OUTFILE writes the file on the database server's filesystem, and the target file must not already exist.
Upvotes: 1
Reputation: 775
It would really be helpful if you posted your code. The reason I'm saying that is because it doesn't sound like you're looping row after row, which would save you heaps of memory - no huge array to keep in RAM. If you're not looping row by row and committing to the CSV file as you go, then I suggest you modify your code to do just that; it might solve the issue altogether.
If committing to the CSV row by row is still not enough, then the issue you're running into is that your server setup relies on the code being stateless, but your code isn't.
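The row-by-row idea might look like the sketch below. It uses an in-memory SQLite database as a self-contained stand-in for the real MySQL connection; the table name, columns, and file path are illustrative:

```php
<?php
// In-memory SQLite stand-in for the real MySQL connection. With MySQL you
// would also disable result buffering so the driver streams rows instead
// of loading the full result set into memory:
//   $pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE users (id INTEGER, name TEXT, email TEXT)');
$ins = $pdo->prepare('INSERT INTO users VALUES (?, ?, ?)');
for ($i = 1; $i <= 1000; $i++) {
    $ins->execute([$i, "user$i", "user$i@example.com"]);
}

$fh = fopen('/tmp/export.csv', 'w');
$stmt = $pdo->query('SELECT id, name, email FROM users');
// Fetch and write one row at a time - no huge array kept in RAM.
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($fh, $row);
}
fclose($fh);
```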
You can solve this issue by, for example, writing the temporary file to storage that all servers share (a network filesystem or an object store), or by pinning all of a user's requests to the same server (session affinity).
Hope this helps.
Upvotes: 0