Reputation: 2822
I have an array named $prices with 1000 elements. I used this code to add them to the database:
for ($i = 0; $i < 1000; $i++) {
    $query = "INSERT INTO data (amount) VALUES";
    $query .= " (" . $prices[$i] . ");";
    mysqli_query($link, $query);
}
Although it works, it takes more than 15 seconds to complete. What can I do to optimize it?
Upvotes: 1
Views: 919
Reputation: 76397
Technically speaking, you don't even need the loop. You can insert all the values with a single query, passing the whole array to execute():
$pdo = new PDO($dsn, $user, $pass, $attr);
$query = 'INSERT INTO data (amount) VALUES ';
$valueSets = array_fill(0, count($prices), '(?)'); // creates a (?) placeholder for each price
$query .= implode(', ', $valueSets); // concatenate the placeholders onto the query string
$stmt = $pdo->prepare($query);
$stmt->execute($prices);
That's all. Of course, that's not a very nice way to do things, so I'd probably use something like this:
$stmt = $pdo->prepare('INSERT INTO data (amount) VALUES (:price)');
foreach ($prices as $price) {
    $stmt->execute([':price' => $price]);
    $stmt->closeCursor(); // optional
}
Or, if you really want to use Mysqli:
$stmt = $link->prepare('INSERT INTO data (amount) VALUES (?)');
$price = null;
$stmt->bind_param('i', $price); // bind by reference
foreach ($prices as $price) {
    $stmt->execute(); // insert the current price
}
But honestly: this looks like DB fixture-related work to me. I'd simply put the prices in a CSV file or something, and then run a LOAD DATA LOCAL INFILE
query instead of writing a script to insert all the values.
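A minimal sketch of that approach, assuming $link is your mysqli connection, a writable /tmp/prices.csv path, and that local_infile is enabled on the server (the path and option call are illustrative, not part of the original answer):
$csvPath = '/tmp/prices.csv'; // hypothetical path
$fh = fopen($csvPath, 'w');
foreach ($prices as $price) {
    fputcsv($fh, [$price]); // one amount per line
}
fclose($fh);
mysqli_options($link, MYSQLI_OPT_LOCAL_INFILE, true); // allow LOCAL INFILE on the client side
mysqli_query($link, "LOAD DATA LOCAL INFILE '" . mysqli_real_escape_string($link, $csvPath) . "'
    INTO TABLE data
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    (amount)");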
Upvotes: 5
Reputation: 2154
Making a database request is a costly operation, so you should reduce the number of requests. You could, for instance, insert multiple values at once with a request like this:
INSERT INTO data (amount) VALUES ('v1'), ('v2'), ('v3')
Be aware that your database settings may limit the length of a query, so 10 requests of 100 insertions each, or 20 requests of 50 insertions (20 x 50 = 1000), may do the job. It will be a lot faster.
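A rough sketch of that chunking idea with mysqli (the chunk size of 100 is an illustrative assumption; since the values are inlined into the SQL string, each one is cast to float, which assumes amount is numeric):
$chunkSize = 100; // illustrative; tune to your max_allowed_packet setting
foreach (array_chunk($prices, $chunkSize) as $chunk) {
    // Cast each value to float so only plain numbers end up in the SQL string.
    $values = implode(', ', array_map(function ($p) { return '(' . (float) $p . ')'; }, $chunk));
    mysqli_query($link, "INSERT INTO data (amount) VALUES $values");
}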
Upvotes: 0