Reputation: 1
I am very new to coding in PHP/MySQL. For fun, I've been downloading some World of Warcraft data from Blizzard's API to test what I've learnt about PHP.
Sorry for the long post; I'm trying to give as much info as needed whilst being concise :)
The short version is: "My code takes 25 minutes to run. Can I do anything better to speed this up?"
LONG VERSION BELOW
Blizzard hold a lot of data about their servers in JSON format, with one JSON file per server (I have a list of 123 servers in total). I've written a basic JSON file that holds the name & URL of each of the 123 servers (serv_gb.json).
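For reference, serv_gb.json looks something like this (the realm names & URLs here are invented placeholders):

{
    "servers": [
        { "name": "Example Realm", "url": "https://example.com/auctions/example-realm.json" },
        { "name": "Another Realm", "url": "https://example.com/auctions/another-realm.json" }
    ]
}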
The code runs as follows: read serv_gb.json, download each server's auction JSON, then insert the rows I care about into MySQL.
There is more data in the JSON file than I need. I've tried two ways to filter this.
a) Just send all of the data from the JSON and, where an insert would normally error, hide it with "error_reporting(0);". (Lazy, yes; because this is a pet project just for me, I thought I'd try it!)
b) Add the following if statement around the SQL so that only the correct data is sent to the database.
foreach ($auctions as $val) {
    if ($val['item'] == 82800) {
        $sql = "INSERT INTO `petshop`.`ah` (datedatacollected, seller, petSpeciesId, server, buyout, region, realrealm) VALUES ('".$sTime."','".$val['owner']."','".$val['petSpeciesId']."','".$servername."','".$val['buyout']."','gb','".$val['ownerRealm']."')";
        $mysqli->query($sql);
    }
}
Idea A is about 15% slower than Idea B. Idea B takes the full 25 minutes to run through all 123 JSON files.
I run this on a local machine (my own desktop PC) running Apache 2.4, MySQL 5.7 & PHP 5.6.32.
Any advice is appreciated, along with pointers on what I should learn more about.
If you've got this far in the post, a huge THANK YOU for taking the time to read it. I hope it was clear and made sense!
MY PHP CODE
$servjson = '../servers/serv_gb.json';
$servcontents = file_get_contents($servjson);
$servdata = json_decode($servcontents, true);

$jrc = count($servdata['servers']);
echo "Json Row Count: ".$jrc."<p>";

// Timestamp used both for timing the run and as the "date data collected" value
$sTime = (new \DateTime())->format('Y-m-d H:i:s');

for ($i = 0; $i < $jrc; $i++) {
    // Download and decode this server's auction JSON
    $json = $servdata['servers'][$i]['url'];
    $contents = file_get_contents($json);
    $data = json_decode($contents, true);

    $servername = $data['realms'][0]['slug'];
    $auctions = $data['auctions'];

    // One INSERT per auction row
    foreach ($auctions as $val) {
        $sql = "INSERT INTO `petshop`.`ah` (datedatacollected, seller, petSpeciesId, server, buyout, region, realrealm) VALUES ('".$sTime."','".$val['owner']."','".$val['petSpeciesId']."','".$servername."','".$val['buyout']."','gb','".$val['ownerRealm']."')";
        $mysqli->query($sql);
    }
}

$eTime = (new \DateTime())->format('Y-m-d H:i:s');
$dTime = abs(strtotime($eTime) - strtotime($sTime)); // run time in seconds
Upvotes: 0
Views: 926
Reputation: 142528
Please echo a sample INSERT statement. There may be some issues with it.
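One common issue with string-built INSERTs is that a seller or realm name containing a quote will break the statement (and error_reporting(0) will hide the failure). Here is a minimal sketch of the same insert as a mysqli prepared statement, assuming the connection and columns from the question; the type string is a guess based on the data shown:

// Prepare once, before the loop; placeholders handle quoting safely
$stmt = $mysqli->prepare(
    "INSERT INTO `petshop`.`ah`
     (datedatacollected, seller, petSpeciesId, server, buyout, region, realrealm)
     VALUES (?, ?, ?, ?, ?, 'gb', ?)"
);

foreach ($auctions as $val) {
    if ($val['item'] == 82800) {
        // s = string, i = integer
        $stmt->bind_param('ssisis',
            $sTime, $val['owner'], $val['petSpeciesId'],
            $servername, $val['buyout'], $val['ownerRealm']);
        $stmt->execute();
    }
}
$stmt->close();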
Around the INSERT loop, add BEGIN and COMMIT. This will put all the inserts into a single transaction, thereby making it faster.
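A minimal sketch of that, using mysqli's transaction API (PHP 5.5+), which is equivalent to issuing BEGIN/COMMIT yourself; the table needs a transactional engine such as InnoDB for this to help:

// All inserts for one server are flushed together at COMMIT,
// instead of one disk flush per row
$mysqli->begin_transaction(); // BEGIN
foreach ($auctions as $val) {
    // ... build $sql exactly as in the question ...
    $mysqli->query($sql);
}
$mysqli->commit(); // COMMIT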
I prefer microtime(true) for measuring time.
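For example, to time the whole run in fractional seconds instead of comparing formatted DateTime strings:

$t0 = microtime(true); // float, sub-millisecond resolution
// ... download and insert ...
printf("Took %.2f seconds\n", microtime(true) - $t0);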
25 minutes? How long does it take if you comment out the INSERTs? This is to see whether it is MySQL or not.
Upvotes: 0
Reputation: 86
There are at least two topics here:
A) You are downloading feeds from remote servers to your local machine. This may take some time, of course. Compression by the server should be used if possible; this limits the amount of data which needs to be transferred to the client. However, this is not straightforward with built-in PHP functions. You need to take a look at PHP cURL: curl_setopt() with CURLOPT_ENCODING. If the remote server can compress data, request it to do so.
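A minimal sketch of such a request; passing an empty string to CURLOPT_ENCODING makes curl advertise every encoding it supports (gzip, deflate, ...) and decompress the response automatically. The URL is a placeholder for one from serv_gb.json:

$ch = curl_init('https://example.com/auctions/example-realm.json');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_ENCODING, '');         // accept any encoding curl supports
$contents = curl_exec($ch);
curl_close($ch);

$data = json_decode($contents, true);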
B) Database performance. As far as MySQL is concerned, you can choose the MyISAM or InnoDB engine. Inserting thousands of records could work better with InnoDB and a transaction, which you commit as soon as all queries are successful. Tuning your PHP and MySQL is a serious topic and cannot be fully covered here.
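If you are not sure which engine the ah table uses, you can check it and, if needed, convert it (rewriting a large table can take a while):

// Check the current engine of petshop.ah
$res = $mysqli->query("SHOW TABLE STATUS FROM `petshop` LIKE 'ah'");
$row = $res->fetch_assoc();
echo $row['Engine'];

// Convert to InnoDB so transactions take effect
$mysqli->query("ALTER TABLE `petshop`.`ah` ENGINE=InnoDB");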
Upvotes: 1