Reputation: 37
I have an array in PHP which is populated via XML. This array holds roughly 21,000 items.
I am currently looping through the array, checking whether the name node exists in the database (MySQL): if it does, I update it; otherwise I insert the new data and store the row ID of the inserted/updated row. Then, in the same loop, I insert more data into another table and link it to the first table: http://pastebin.com/iiyjkkuy
The array looks like this: http://pastebin.com/xcnHxeLk
Due to the large number of nodes in the array (21,000), this exceeds the 300-second (5-minute) max execution time on my dev system.
What is the best way to loop through an array of this size and insert the data?
Just some more information on this: I am using ExpressionEngine 1.8.6 (for work reasons) and I have to use its built-in database class.
The reason for the SELECT statements before each insert/update is to grab the row ID for later statements. The data has to be structured in the DB in a certain way. For example, each source node has a papergroup node; this needs inserting/updating first, then each papername node needs to be linked to its paper group in the same table. sourceid etc. are then inserted into a sources table with a link to their parent paper in the papers table. So the basic DB schema is this:
- papergroup is inserted into the papers table
- papername is inserted into the papers table, with papers.PID linking it to the paper group's papers.ID
- sources are inserted into the sources table and linked to the papers table on source.paperID
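The group → paper → source linking described above can be sketched roughly as follows. This is a minimal, self-contained illustration using PDO with an in-memory SQLite database as a stand-in for MySQL (the question's actual code uses ExpressionEngine's database class); the exact column names beyond papers.ID, papers.PID, and sources.paperID are assumptions.

```php
<?php
// Sketch of the two-level insert flow from the schema description above.
// SQLite in-memory is used here only so the example is self-contained.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE papers (ID INTEGER PRIMARY KEY AUTOINCREMENT, PID INTEGER, name TEXT)');
$db->exec('CREATE TABLE sources (ID INTEGER PRIMARY KEY AUTOINCREMENT, paperID INTEGER, sourceid INTEGER, name TEXT)');

// 1. Insert the paper group first; PID = 0 marks a top-level group (assumption).
$db->prepare('INSERT INTO papers (PID, name) VALUES (0, ?)')->execute(['test group']);
$groupId = $db->lastInsertId();

// 2. Insert the paper, linked to its group via papers.PID -> papers.ID.
$db->prepare('INSERT INTO papers (PID, name) VALUES (?, ?)')->execute([$groupId, 'test']);
$paperId = $db->lastInsertId();

// 3. Insert the source, linked to its parent paper via sources.paperID.
$db->prepare('INSERT INTO sources (paperID, sourceid, name) VALUES (?, ?, ?)')
   ->execute([$paperId, 1, 'test source']);
```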
the basic structure of the XML source that populates the array is as follows:
<sources>
    <source>
        <sourceid>1</sourceid>
        <papername>test</papername>
        <papergroup>test group</papergroup>
        <papertype>Standard</papertype>
        <sourcename>test source</sourcename>
        <sourcesize>page</sourcesize>
    </source>
</sources>
The above is not a full segment, but it shows that all the information is sent in one section. Hope this helps.
OK, I managed to get some timings. It takes 1:35:731 to get the XML; it then takes between 0:0:025 and 0:0:700 per array-loop iteration (select, insert/update).
Upvotes: 0
Views: 449
Reputation: 314
I guess the simplest way is to build a batch array of around 1,500 records and do a batch insert. I tried this with 2,000 records: single inserts in a while loop took 27 seconds, but a single batch insert took only 0.7 seconds.
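The batching idea above can be sketched like this: chunk the rows and build one multi-row INSERT per chunk instead of one INSERT per row. The table and column names are placeholders, and addslashes() is used only to keep the sketch self-contained; real code should use the driver's escaping or parameter binding.

```php
<?php
// Build one multi-row INSERT statement per chunk of $batchSize rows.
// Returns an array of SQL strings, one per batch.
function buildBatchInserts(array $rows, int $batchSize = 1500): array {
    $queries = [];
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $values = [];
        foreach ($chunk as $row) {
            // addslashes() for illustration only -- use real escaping in production.
            $values[] = "('" . implode("','", array_map('addslashes', $row)) . "')";
        }
        $queries[] = 'INSERT INTO papers (name, grp) VALUES ' . implode(',', $values);
    }
    return $queries;
}
```

For 21,000 rows at a batch size of 1,500 this produces 14 round trips instead of 21,000.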
Upvotes: 0
Reputation: 174967
I'll write it as an algorithm.
1. Store the first array in a new variable, $xmlArray.
2. SELECT the table to compare against from the database and store it in a variable, $tableArray.
3. foreach through $xmlArray and compare against $tableArray.
4. Save the needed updates into a new array, $diffArray.
5. Prepare a statement using PDO prepare() and bindParam().
6. foreach through $diffArray, change only the parameters, and execute().
This should be the most efficient way to do what you need.
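The prepare-once, execute-per-row pattern above looks roughly like this. An in-memory SQLite database stands in for MySQL so the sketch is self-contained, and the diff computation itself is elided; $diffArray's shape is an assumption.

```php
<?php
// Prepare the statement once, bind variables by reference once,
// then only reassign the variables and execute() inside the loop.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE papers (ID INTEGER PRIMARY KEY, name TEXT)');
$db->exec("INSERT INTO papers (ID, name) VALUES (1, 'old name')");

// Rows that differ between $xmlArray and $tableArray (placeholder data).
$diffArray = [['id' => 1, 'name' => 'new name']];

$stmt = $db->prepare('UPDATE papers SET name = :name WHERE ID = :id');
$stmt->bindParam(':name', $name); // bound by reference
$stmt->bindParam(':id', $id);

foreach ($diffArray as $row) {
    $name = $row['name']; // only the bound variables change per iteration
    $id = $row['id'];
    $stmt->execute();
}
```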
Upvotes: 2
Reputation: 308763
Every record you insert is another round trip to the database.
I wonder if your life would be better if you could batch those SQL commands into a single round trip and execute them all at once? You'd cut down on the network latency that way.
The best way to figure out how to optimize anything is to have some hard data as to where the time is being spent. Find out what's taking the most time, change it, and re-measure. Repeat the exercise until you get acceptable performance.
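A minimal way to gather that hard data is to bracket each phase with microtime(true) and log the deltas; the "phase under test" below is just a stand-in workload.

```php
<?php
// Time one phase of the import (e.g. the XML fetch, or one DB loop pass).
$start = microtime(true);
usleep(10000); // stand-in for the phase under test (10 ms)
$elapsed = microtime(true) - $start;
printf("phase took %.3f s\n", $elapsed);
```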
I don't see any data from you. You're just guessing, and so is everyone else who answers here (including me).
Upvotes: 3