davis

Reputation: 1951

PHP json_decode Big(O) and optimization

I am trying to write PHP code that takes a JSON-encoded NSArray of NSDictionaries and inserts the records into a database. Unfortunately, it takes around 20 seconds to process roughly 500 records with a total size of about 2 MB.

function uploadTracks($tracks, $partyID, $pName) {
    // Decode the JSON payload into an array of associative arrays.
    $tracks = json_decode($tracks, true);

    foreach ($tracks as $itemInfo) {
        $track   = $itemInfo['songTitle'];
        $artist  = $itemInfo['artistName'];
        $album   = $itemInfo['albumName']; // decoded but never inserted below
        $artwork = $itemInfo['artwork'];

        // One INSERT (and one database round trip) per record.
        $result = query("INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) VALUES ('%s','%s','%s','%s','%s')", $partyID, $track, $artist, $artwork, $pName);
    }
}

Is there any way to optimize the above code? Could json_decode be what is taking most of the time?
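For reference, the two phases can be timed separately with microtime(true); a minimal sketch, assuming the same query() helper and variables as above:

$start = microtime(true);
$decoded = json_decode($tracks, true);
$decodeTime = microtime(true) - $start;

$start = microtime(true);
foreach ($decoded as $itemInfo) {
    query("INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) VALUES ('%s','%s','%s','%s','%s')",
          $partyID, $itemInfo['songTitle'], $itemInfo['artistName'],
          $itemInfo['artwork'], $pName);
}
$insertTime = microtime(true) - $start;

// Log which phase dominates the 20 seconds.
error_log(sprintf('json_decode: %.3fs, inserts: %.3fs', $decodeTime, $insertTime));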

Upvotes: 0

Views: 406

Answers (2)

Alex Boyko

Reputation: 221

You can insert multiple records in a single INSERT query. That way your database indexes are updated once, which gives you a performance boost:

INSERT INTO partyRecords
  (Pkey, songName, songArtist, imageBinary, partyName)
VALUES
  (1, 'Name 1', 'Artist 1', 'Image 1', 'Party 1'),
  (2, 'Name 2', 'Artist 2', 'Image 2', 'Party 2'),
  (3, 'Name 3', 'Artist 3', 'Image 3', 'Party 3'),
  (4, 'Name 4', 'Artist 4', 'Image 4', 'Party 4');
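In PHP you could build that statement from the decoded array; a minimal sketch, assuming a mysqli connection $db in place of the question's query() helper, with real_escape_string() guarding the interpolated values:

function uploadTracksBatched(mysqli $db, $tracks, $partyID, $pName) {
    $tracks = json_decode($tracks, true);

    // Collect one "(...)" value tuple per track.
    $rows = array();
    foreach ($tracks as $itemInfo) {
        $rows[] = sprintf("('%s','%s','%s','%s','%s')",
            $db->real_escape_string($partyID),
            $db->real_escape_string($itemInfo['songTitle']),
            $db->real_escape_string($itemInfo['artistName']),
            $db->real_escape_string($itemInfo['artwork']),
            $db->real_escape_string($pName));
    }

    // One round trip and one index update instead of ~500.
    $db->query('INSERT INTO partyRecords (Pkey, songName, songArtist, imageBinary, partyName) VALUES '
        . implode(',', $rows));
}

Note that with ~2 MB of image data a single giant statement can exceed MySQL's max_allowed_packet, so splitting the rows into a few chunks may be safer.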

Upvotes: 1

user1122069

Reputation: 1817

I would say it is 95% the queries. You need to compile the queries and send them 100 or so at a time. It would be best to escape the query variables yourself (for MySQL: mysql_real_escape_string()).

INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) VALUES (2,'12312','12312321','12312332423','23423432');
INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) VALUES (2,'12312','12312321','12312332423','23423432');
...
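A sketch of that approach, assuming a mysqli connection $db (so mysqli's real_escape_string() stands in for the old mysql_real_escape_string()) and the same decoded $tracks array; mysqli::multi_query() sends the whole batch in one round trip:

// Send the inserts 100 statements at a time instead of one by one.
foreach (array_chunk($tracks, 100) as $chunk) {
    $sql = '';
    foreach ($chunk as $itemInfo) {
        $sql .= sprintf("INSERT INTO partyRecords(Pkey,songName,songArtist,imageBinary,partyName) "
              . "VALUES ('%s','%s','%s','%s','%s');",
            $db->real_escape_string($partyID),
            $db->real_escape_string($itemInfo['songTitle']),
            $db->real_escape_string($itemInfo['artistName']),
            $db->real_escape_string($itemInfo['artwork']),
            $db->real_escape_string($pName));
    }

    $db->multi_query($sql);
    // Drain all result packets before sending the next batch.
    while ($db->more_results() && $db->next_result()) {
        // no-op
    }
}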

Upvotes: 0
