Reputation: 35
I'm currently working on a fansite (just for my own purposes) which gets all its data from different APIs like this:
$newjson = file_get_contents(LINK_TO_JSON_HERE);
$newarr = json_decode($newjson);
My problem with this: the site currently loads 13 different (huge) JSON files, which slows it down a lot. Loading takes up to 30-45 seconds, which isn't really acceptable, but I wanted to get it working before optimizing it. Is it possible to speed this up by using other functions? Seeing other fansites load the same content within 1 second made me ask.
About the functionality: I load the whole JSON into an array and then take out the info I need, which is often less than 1% of the JSON file's text. Is there a way to filter out the things I need before loading the whole document into an array?
Thanks in advance
Upvotes: 0
Views: 3350
Reputation: 20737
Odds are very high that the JSON decoding is not slowing you down, but rather the file_get_contents() calls. Benchmark your issue properly so that you are not wasting your time optimizing the wrong thing.
<?php
$start = microtime(true);
$newjson = file_get_contents(LINK_TO_JSON_HERE);
echo 'file_get_contents('.htmlentities(LINK_TO_JSON_HERE).'): '.(microtime(true) - $start).' seconds<br>';
$start = microtime(true);
$newarr = json_decode($newjson);
echo 'json_decode(): '.(microtime(true) - $start).' seconds<br><br>';
Anyway, the best way to maintain real-time data and still get better speed is to switch to curl_multi_exec(), which runs all the requests in parallel instead of one after another.
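A minimal sketch of what that could look like; the URLs in the $urls array are placeholders for your 13 endpoints, so replace them with the real API links:
<?php
// Fetch several JSON endpoints in parallel with curl_multi instead of
// sequential file_get_contents() calls.
$urls = [
    'https://api.example.com/data1.json', // placeholder
    'https://api.example.com/data2.json', // placeholder
    // ... the rest of your endpoints
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all requests at the same time.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait until any handle has activity
    }
} while ($active && $status == CURLM_OK);

// Collect and decode the responses.
$results = [];
foreach ($handles as $i => $ch) {
    $results[$i] = json_decode(curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
This way the total wait time is roughly that of the slowest request rather than the sum of all 13.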
I'm not sure how often the data on these APIs is updated, but you could also develop some sort of caching mechanism which fetches the data a few times per day and saves it locally to a JSON file, along the lines of the sketch below.
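A rough caching sketch, assuming a writable cache directory and a made-up one-hour TTL; adjust the file path and max age to how often the APIs actually change:
<?php
// Return a decoded JSON object, served from a local cache file when it is
// fresh enough, otherwise re-fetched from the API and stored.
function fetch_json_cached($url, $cacheFile, $maxAge = 3600) {
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        return json_decode(file_get_contents($cacheFile));
    }

    $json = file_get_contents($url);
    if ($json === false) {
        // The request failed; fall back to a possibly stale cached copy.
        return file_exists($cacheFile) ? json_decode(file_get_contents($cacheFile)) : null;
    }

    file_put_contents($cacheFile, $json);
    return json_decode($json);
}

// Example path; make sure the cache directory exists and is writable.
$newarr = fetch_json_cached(LINK_TO_JSON_HERE, __DIR__.'/cache/data.json');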
Upvotes: 3
Reputation: 41
This is not a problem of function speed, but of waiting for a response from the remote server. If possible, you should cache the data.
Upvotes: 0