Reputation: 3410
I have to read CSV files line by line; they can be 10 to 20 MB. file()
is useless here because it loads the whole file into memory, and I have to find the quickest way.
I have tried fgets()
, which runs fine, but I don't know whether it reads a small block on each call, or whether it buffers a larger chunk and optimizes file I/O.
Do I have to try the fread()
way, parsing the EOLs myself?
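For reference, this is the fgets() approach I am using — a minimal sketch, assuming a file named `data.csv` and comma-delimited fields with no embedded newlines:

```php
<?php
// Read one line per fgets() call; PHP buffers the underlying
// reads internally, so each call does not hit the disk directly.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Cannot open data.csv');
}
while (($line = fgets($handle)) !== false) {
    // Strip the trailing EOL and split into fields.
    $fields = explode(',', rtrim($line, "\r\n"));
    // ... process $fields ...
}
fclose($handle);
```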
Upvotes: 2
Views: 863
Reputation: 20271
You should have a look at fgetcsv()
; it automatically parses each comma-separated line into an array.
As for runtime efficiency, I have no idea. You will have to run a quick test, preferably with a file of the size you expect to handle later on. But I would be surprised if the fget??? and fput??? functions were not I/O-optimised.
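A minimal sketch of the fgetcsv() approach, assuming a file named `data.csv`:

```php
<?php
// fgetcsv() reads one line per call and returns the fields
// of that line as an array, handling quoted values for you.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Cannot open data.csv');
}
while (($row = fgetcsv($handle)) !== false) {
    // $row is an array of the fields from one CSV record.
    // ... process $row ...
}
fclose($handle);
```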
Upvotes: 0
Reputation: 18848
stream_get_line() is apparently more efficient than fgets() for large files. If you specify a sensible maximum length for the read, I don't see any reason why PHP would have to 'read ahead' to fetch a line, as you seem to be worried about.
If you want to work with CSVs, then fgetcsv() will return results in a slightly more sensible format.
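A sketch of the stream_get_line() approach, assuming `data.csv` and an 8192-byte cap per line (the cap is an illustrative choice, not a required value):

```php
<?php
// stream_get_line() reads up to the given length or until the
// delimiter, whichever comes first, so the cap bounds each read.
$handle = fopen('data.csv', 'r');
if ($handle === false) {
    die('Cannot open data.csv');
}
while (($line = stream_get_line($handle, 8192, "\n")) !== false) {
    // Parse the CSV fields yourself, e.g. with str_getcsv().
    $fields = str_getcsv($line);
    // ... process $fields ...
}
fclose($handle);
```

Note that, unlike fgets(), stream_get_line() strips the delimiter from the returned string.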
Upvotes: 2