Dython

Reputation: 134

php getting big data from url - optimise

I am using file_get_contents to fetch about 1 million records from a URL and output the results, which are in JSON format. Pagination is not an option, and my current workaround is to keep increasing PHP's memory limit. Is there any other solution for this?

Upvotes: 1

Views: 110

Answers (1)

Mahesh Gareja

Reputation: 1812

If you're processing large amounts of data, fscanf() will probably prove more valuable and efficient than, say, using file() followed by explode() and sprintf(). In contrast, if you're simply echoing a large amount of text with little modification, file(), file_get_contents(), or readfile() might make more sense. This would likely be the case if you're using PHP for caching, or even to create a makeshift proxy server.
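For your case specifically, since you only need to output the JSON as-is, a minimal sketch of the streaming approach might look like this (the URL is a placeholder, and it assumes allow_url_fopen is enabled):

```php
<?php
// A minimal sketch: stream the remote JSON response to the client in
// fixed-size chunks instead of buffering the whole body in memory
// with file_get_contents. Requires allow_url_fopen = On.
$url = 'https://example.com/records.json'; // hypothetical endpoint

$handle = fopen($url, 'rb');
if ($handle === false) {
    http_response_code(502);
    exit('Unable to open upstream URL');
}

header('Content-Type: application/json');

// Read and echo 8 KB at a time; memory use stays roughly constant
// no matter how large the response is.
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush(); // push each chunk to the client immediately
}

fclose($handle);
```

If you don't need to touch the data at all, readfile($url) does the same thing in one call and, per the PHP manual, won't cause memory issues even for large files.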

More: The right way to read files with PHP

Upvotes: 1
