Cédric Girard

Reputation: 3410

The best way to read large files in PHP?

I have to read CSV files line by line; the files can be 10 to 20 MB. file() is useless here, and I have to find the quickest way.

I have tried fgets(), which runs fine, but I don't know whether it reads a small block each time I call it, or whether it caches a bigger one and optimizes file I/O. Do I have to try the fread() way, parsing the EOLs myself?
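Roughly, the kind of loop I mean is this (just a sketch; 'data.csv' is a placeholder name):

    <?php
    // Sketch of the fgets() loop described above; 'data.csv' is a placeholder.
    $fh = fopen('data.csv', 'r');
    if ($fh === false) {
        die('Could not open file');
    }
    while (($line = fgets($fh)) !== false) {
        // process one line of the CSV here
    }
    fclose($fh);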

Upvotes: 2

Views: 863

Answers (4)

Treb

Reputation: 20271

You should have a look at fgetcsv(); it automatically parses the comma-separated line into an array.
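For example, a sketch like this (assuming a comma-delimited file called 'data.csv') gives you each line as an array of fields:

    <?php
    // Sketch: read a CSV line by line with fgetcsv(); 'data.csv' is a placeholder.
    $fh = fopen('data.csv', 'r');
    while (($row = fgetcsv($fh)) !== false) {
        // $row is an array of the fields on this line,
        // e.g. $row[0] is the first column
    }
    fclose($fh);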

As for the runtime efficiency, I have no idea. You will have to run a quick test, preferably with a file of the size you are expecting to handle later on. But I would be surprised if the fget??? and fput??? functions were not I/O optimised.

Upvotes: 0

Ciaran McNulty

Reputation: 18848

stream_get_line() is apparently more efficient than fgets() for large files. If you specify a sensible maximum length for the read, I don't see any reason why PHP would have to 'read ahead' to fetch a line, which is what you seem to be worried about.
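Something along these lines (the 4096-byte cap and the filename are just illustrative values):

    <?php
    // Sketch: stream_get_line() with an explicit maximum read length.
    $fh = fopen('data.csv', 'r');
    while (!feof($fh)) {
        // read up to 4096 bytes, or up to (but not including) the next "\n"
        $line = stream_get_line($fh, 4096, "\n");
        if ($line === false) {
            break;
        }
        // process $line
    }
    fclose($fh);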

If you want to use CSVs then fgetcsv will return results in a slightly more sensible format.

Upvotes: 2

gnud

Reputation: 78518

You ought to be using fgetcsv() if possible.

Otherwise, there is always fgets().

Upvotes: 7

Greg

Reputation: 321588

fgets() should be perfectly fine for your needs. Even file() should be OK - 20 MB isn't very big unless you're doing this many times concurrently.

Don't forget you can tune fgets() with its second parameter.
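For example (the 4096-byte cap here is just an illustration):

    <?php
    // Sketch: cap each read at 4096 bytes; 'data.csv' is a placeholder.
    // fgets() stops after length - 1 bytes or at the end of the line,
    // whichever comes first.
    $fh = fopen('data.csv', 'r');
    while (($line = fgets($fh, 4096)) !== false) {
        // process $line
    }
    fclose($fh);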

Upvotes: 1
