Reputation: 317
I'm trying to make a site for my Battlefield 1 clan, and on one of the pages I'd like to display our team and some of their stats.
The Battlefield Tracker API lets me request exactly what I need, so I decided to use PHP cURL requests to pull this data into my site. It all works fine, but it is extremely slow, sometimes it even hits PHP's 30-second max execution time.
Here is my code
<?php
$data = $connection->query("SELECT * FROM bfplayers");
while ($row = mysqli_fetch_assoc($data)) {
    $psnid = $row['psnid'];
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "https://battlefieldtracker.com/bf1/api/Stats/BasicStats?platform=2&displayName=" . $psnid);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($ch, CURLOPT_HEADER, FALSE);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    $headers = [
        'TRN-Api-Key: MYKEY',
    ];
    curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
    $response = curl_exec($ch);
    curl_close($ch);
    $result = json_decode($response, true);
    print($result['profile']['displayName']);
}
?>
I have no idea why it is this slow. Is it because I'm running XAMPP on localhost, or because the requests are made one by one in a loop?
Thanks in advance
Upvotes: 0
Views: 1975
Reputation: 21483
Your loop is not optimized in the slightest; I believe that if you restructure it, your code could run A LOT faster. You create and destroy a cURL handle on every iteration, when you could keep re-using the same handle for each player (that would use less CPU and be faster). You don't enable compressed transfer, and enabling compression would probably make the downloads faster. Most importantly, you run the API calls sequentially; I believe that doing the requests in parallel would make the page load much faster. Also, you don't urlencode psnid, which is probably a bug. Try this (a simpler single-handle, sequential variant is sketched after the code):
<?php
$cmh = curl_multi_init();
$curls = array();
$data = $connection->query("SELECT * FROM bfplayers");
while (($row = mysqli_fetch_assoc($data))) {
    $psnid = $row['psnid'];
    // one easy handle + one temp file per player; the temp file receives the response body
    $tmp = array();
    $tmp[0] = ($ch = curl_init());
    $tmp[1] = tmpfile();
    $curls[] = $tmp;
    curl_setopt_array($ch, array(
        CURLOPT_URL => "https://battlefieldtracker.com/bf1/api/Stats/BasicStats?platform=2&displayName=" . urlencode($psnid),
        CURLOPT_ENCODING => '', // accept any compression curl supports (gzip, deflate, ...)
        CURLOPT_SSL_VERIFYPEER => false,
        CURLOPT_SSL_VERIFYHOST => false,
        CURLOPT_HTTPHEADER => array(
            'TRN-Api-Key: MYKEY'
        ),
        CURLOPT_FILE => $tmp[1]
    ));
    curl_multi_add_handle($cmh, $ch);
    // kick the transfers off right away while we keep fetching rows
    curl_multi_exec($cmh, $active);
}
// drive all transfers in parallel until every request has finished
do {
    do {
        $ret = curl_multi_exec($cmh, $active);
    } while ($ret == CURLM_CALL_MULTI_PERFORM);
    curl_multi_select($cmh, 1);
} while ($active);
foreach ($curls as $curr) {
    fseek($curr[1], 0, SEEK_SET); // https://bugs.php.net/bug.php?id=76268
    $response = stream_get_contents($curr[1]);
    $result = json_decode($response, true);
    print($result['profile']['displayName']);
}
// the rest is just cleanup, the client shouldn't have to wait for this
// OPTIMIZEME: apache version of fastcgi_finish_request() ?
if (is_callable('fastcgi_finish_request')) {
    fastcgi_finish_request();
}
foreach ($curls as $curr) {
    curl_multi_remove_handle($cmh, $curr[0]);
    curl_close($curr[0]);
    fclose($curr[1]);
}
curl_multi_close($cmh);
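If you'd rather keep the requests sequential, just re-using one handle with compression enabled already helps. A minimal sketch of that variant, assuming the same bfplayers table and TRN-Api-Key placeholder as above:

<?php
$data = $connection->query("SELECT psnid FROM bfplayers");
// one handle for every request: curl keeps the connection to the API host alive and re-uses it
$ch = curl_init();
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_ENCODING => '', // ask for a compressed response
    CURLOPT_HTTPHEADER => array('TRN-Api-Key: MYKEY'),
));
while ($row = mysqli_fetch_assoc($data)) {
    curl_setopt($ch, CURLOPT_URL,
        "https://battlefieldtracker.com/bf1/api/Stats/BasicStats?platform=2&displayName=" . urlencode($row['psnid']));
    $result = json_decode(curl_exec($ch), true);
    print($result['profile']['displayName']);
}
curl_close($ch);

This still runs one request after another, so it won't beat the parallel version, but it avoids the per-iteration handle setup and teardown.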
Also, if mysqli_fetch_assoc() is causing slow round trips to your database, it would probably be even faster to replace it with a single mysqli_fetch_all() call.
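A minimal sketch of that, assuming the same $connection as above (note that mysqli_fetch_all() needs the mysqlnd driver on older PHP versions):

<?php
// one round trip: fetch every row at once instead of one mysqli_fetch_assoc() call per row
$data = $connection->query("SELECT psnid FROM bfplayers");
$rows = mysqli_fetch_all($data, MYSQLI_ASSOC);
foreach ($rows as $row) {
    // ... create the curl handle for $row['psnid'] exactly as in the loop above ...
}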
Also, something that would probably be much faster than all of this: have a cron job run every minute (or every 10 seconds?) that fetches and caches the results, and show the cached result to the client. Even if the API calls lag, the client's page load wouldn't be affected at all.
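A rough sketch of that idea; the script name cache_stats.php and the cache path /tmp/bfstats.json are just placeholders I made up, not anything the API requires:

<?php
// cache_stats.php - run from cron, e.g.:  * * * * *  php /path/to/cache_stats.php
$stats = array();
// ... fill $stats here with one of the fetch loops above, storing each decoded
//     response as $stats[$psnid] = $result instead of printing it ...
// write atomically so the page never reads a half-written cache file
file_put_contents('/tmp/bfstats.json.tmp', json_encode($stats));
rename('/tmp/bfstats.json.tmp', '/tmp/bfstats.json');

Then on the page itself:

<?php
// no API calls at all during the page request, just read the cache and print
$stats = json_decode(file_get_contents('/tmp/bfstats.json'), true);
foreach ($stats as $psnid => $result) {
    print($result['profile']['displayName']);
}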
Upvotes: 2