remyremy

Reputation: 3758

long UPDATE query in MySQL

I have to run some MySQL UPDATE queries and they take quite a long time. I use a PHP script, but each update takes between 40 and 600 ms, generally around 250 ms.

I tried to optimize as much as I could, but the script still takes more than 30 minutes to run (for 20,000 entries in both tables).

This is a part of my PHP script and the UPDATE query:

if (mysql_num_rows($Rmaster) == 1) {
//Start time
    $Qchange_name = 'UPDATE slave SET NewAreaName = "'.$Amaster['AreaName'].'" WHERE Dialcode = "'.$Amain['Dialcode'].'" ORDER BY Dialcode ASC LIMIT 1 ';
    $Rchange_name = mysql_query($Qchange_name, $link);
//end time = 40 to 600ms
}

My SELECT queries are fast, and I have an INDEX on Dialcode in both tables.

These are my tables (simplified):

CREATE TABLE IF NOT EXISTS `master` (
  `Dialcode` varchar(11) DEFAULT NULL,
  `AreaName` varchar(45) DEFAULT NULL,
  KEY `Dialcode` (`Dialcode`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

--
-- Table structure for table `slave`
--
CREATE TABLE IF NOT EXISTS `slave` (
  `AreaName` varchar(47) DEFAULT NULL,
  `NewAreaName` varchar(47) NOT NULL,
  `Dialcode` varchar(15) DEFAULT NULL,
  KEY `Dialcode` (`Dialcode`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

Anything else I could try to make it faster?

PHP: 5.3.4

MySQL: 5.1.53

(I will first try the PDO solution this weekend and update the post.) Thanks!

UPDATE: I tried these two different queries with PDO:

$Qchange_name = $pdo->prepare('UPDATE slave SET NewAreaName = ? WHERE Dialcode = ?');
$Qchange_name = $pdo->prepare('REPLACE INTO slave SET NewAreaName = ?, Dialcode = ?');

But it takes around 18 seconds just to do the foreach for 150 rows!

In my main loops I do:

$L4 = array(); // Before the loops
else if ($countQmaster1 == 1) {
   $L4[] = array($Amaster1['AreaName'], $Amain['Dialcode']);        
}

and then at the end of the loops:

foreach ($L4 as $b){
    $Qchange_name->execute($b); 
}

Also, I have Dialcode as the PRIMARY KEY.

Is there something I'm doing wrong?

UPDATE: I isolated the query.

Here are the files to create the tables and the code. I isolated the query and reduced the script to the minimum, but it still takes 17 seconds to run 150 rows.

Do you see anything wrong? Do you get the same results?

PHP code here or here

MySQL tables to download here or here

Could it be a buffer problem? I'm running on my local machine and I get a mix of fast and slow updates:

1 was executed in 38.87 ms 
2 was executed in 33.05 ms 
3 was executed in 33.17 ms 
4 was executed in 91.42 ms 
5 was executed in 36.17 ms 
6 was executed in 30.23 ms 
7 was executed in 33.15 ms 
8 was executed in 33.21 ms 
9 was executed in 33.22 ms 
10 was executed in 42.72 ms 
11 was executed in 32.12 ms 
12 was executed in 33.04 ms 
13 was executed in 33.14 ms 
14 was executed in 33.2 ms 
15 was executed in 33.2 ms 
16 was executed in 33.16 ms 
17 was executed in 33.25 ms 
18 was executed in 33.2 ms 
19 was executed in 33.12 ms 
20 was executed in 33.25 ms 
21 was executed in 33.15 ms 
22 was executed in 33.21 ms 
23 was executed in 33.16 ms 
24 was executed in 41.54 ms 
25 was executed in 74.79 ms 
26 was executed in 129.35 ms 
27 was executed in 77.65 ms 
28 was executed in 34.06 ms 
29 was executed in 33.21 ms 
30 was executed in 33.2 ms 
31 was executed in 33.17 ms 
32 was executed in 140.5 ms 
33 was executed in 34.04 ms 
34 was executed in 41.53 ms 
35 was executed in 33.23 ms 
36 was executed in 33.14 ms 
37 was executed in 158.06 ms 
38 was executed in 149.8 ms 
39 was executed in 278.14 ms 
40 was executed in 154.52 ms 
41 was executed in 149.71 ms 
42 was executed in 91.45 ms 
43 was executed in 274.55 ms 
44 was executed in 149.77 ms 
45 was executed in 278.46 ms 
46 was executed in 1055.73 ms 
47 was executed in 280.42 ms 
48 was executed in 149.85 ms 
49 was executed in 154.48 ms 
50 was executed in 178.14 ms 
51 was executed in 158.06 ms 
52 was executed in 433.23 ms 
53 was executed in 149.22 ms  
54 was executed in 166.41 ms 
55 was executed in 41.5 ms 
56 was executed in 33.21 ms 
57 was executed in 41.52 ms 
58 was executed in 33.16 ms 
59 was executed in 33.19 ms 
60 was executed in 49.8 ms 
61 was executed in 33.21 ms 
62 was executed in 33.16 ms 
63 was executed in 41.55 ms 
64 was executed in 33.13 ms 
65 was executed in 58.25 ms 
66 was executed in 33.12 ms 
67 was executed in 41.52 ms 
68 was executed in 33.15 ms 
69 was executed in 33.21 ms 
70 was executed in 41.51 ms 
71 was executed in 33.18 ms 
72 was executed in 33.19 ms 
73 was executed in 41.52 ms 
74 was executed in 33.25 ms 
75 was executed in 33.15 ms 
76 was executed in 41.58 ms 
77 was executed in 33.1 ms 
78 was executed in 33.17 ms 
79 was executed in 41.53 ms 
80 was executed in 33.18 ms 
81 was executed in 41.49 ms 
82 was executed in 33.15 ms 
83 was executed in 33.22 ms 
84 was executed in 41.52 ms 
85 was executed in 33.19 ms 
86 was executed in 33.19 ms 
87 was executed in 66.5 ms 
88 was executed in 33.16 ms 
89 was executed in 75.42 ms 
90 was executed in 57.55 ms 
91 was executed in 33.27 ms 
92 was executed in 41.43 ms 
93 was executed in 33.14 ms 
94 was executed in 33.19 ms 
95 was executed in 41.54 ms 
96 was executed in 33.15 ms 
97 was executed in 33.24 ms 
98 was executed in 41.45 ms 
99 was executed in 33.23 ms 
100 was executed in 33.15 ms 
101 was executed in 41.55 ms 
102 was executed in 33.22 ms 
103 was executed in 274.54 ms 
104 was executed in 291.28 ms 
105 was executed in 149.72 ms 
106 was executed in 91.42 ms 
107 was executed in 274.68 ms 
108 was executed in 278.38 ms 
109 was executed in 154.23 ms 
110 was executed in 432.75 ms 
111 was executed in 424.47 ms 
112 was executed in 309.66 ms 
113 was executed in 1101.32 ms 
114 was executed in 327.63 ms 
115 was executed in 116.23 ms 
116 was executed in 34.24 ms 
117 was executed in 33.18 ms 
118 was executed in 65.39 ms 
119 was executed in 34.36 ms 
120 was executed in 30.48 ms 
121 was executed in 35.85 ms 
122 was executed in 33.18 ms 
123 was executed in 41.54 ms 
124 was executed in 33.13 ms 
125 was executed in 33.23 ms 
126 was executed in 41.49 ms 
127 was executed in 83.13 ms 
128 was executed in 33.13 ms 
129 was executed in 41.58 ms 
130 was executed in 33.15 ms 
131 was executed in 33.17 ms 
132 was executed in 41.5 ms 
133 was executed in 33.18 ms 
134 was executed in 41.54 ms 
135 was executed in 33.13 ms 
136 was executed in 33.19 ms 
137 was executed in 41.55 ms 
138 was executed in 33.14 ms 
139 was executed in 33.21 ms 
140 was executed in 41.5 ms 
141 was executed in 33.22 ms 
142 was executed in 33.17 ms 
143 was executed in 41.51 ms 
144 was executed in 33.13 ms 
145 was executed in 33.22 ms 
146 was executed in 41.51 ms 
147 was executed in 75.34 ms 
148 was executed in 39.76 ms 
149 was executed in 34.38 ms  
150 was executed in 33.12 ms 

This script was executed in 13.45 seconds

This is the config:

# The MySQL server
[wampmysqld]
port        = 3306
socket      = /tmp/mysql.sock
query_cache_size=32M
# key_buffer = 16M
max_allowed_packet = 1M
table_cache = 64
sort_buffer_size = 512K
net_buffer_length = 8K
read_buffer_size = 256K
read_rnd_buffer_size = 512K
myisam_sort_buffer_size = 8M

Edit: Solution

Finally, after changing ENGINE=InnoDB to ENGINE=MyISAM, the script ran in less than 3 seconds for 20,000 rows!
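
For reference, the engine change itself is just one statement per table (a quick sketch; I'm assuming the same PDO connection as above and that both tables were converted):

// Rebuilds each table with the MyISAM engine; back up the data first.
// Assumes $pdo is the PDO connection used above.
$pdo->exec('ALTER TABLE slave ENGINE=MyISAM');
$pdo->exec('ALTER TABLE master ENGINE=MyISAM');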

Thanks @J0HN for your help! PDO helped a lot!

Upvotes: 0

Views: 2287

Answers (4)

Schiavini

Reputation: 2939

First of all, you should use primary keys in every single table you have.

If you change master.Dialcode to a NOT NULL primary key, that alone will already help a lot.
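
Roughly something like this (just a sketch; it assumes there are no duplicate or NULL Dialcode values yet, otherwise the ALTER will fail):

// Make Dialcode NOT NULL and promote it to the primary key.
// The old secondary index named `Dialcode` becomes redundant, so drop it.
$pdo = new PDO(/* your connection settings here */);
$pdo->exec('ALTER TABLE master
              MODIFY Dialcode VARCHAR(11) NOT NULL,
              DROP INDEX Dialcode,
              ADD PRIMARY KEY (Dialcode)');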

Is slave.Dialcode also unique? If not, why do you have ORDER BY and LIMIT in the query?

Once you have primary keys, you can try to bundle records (say, 100 at a time) and update them all at once with REPLACE INTO:

REPLACE INTO slave (Dialcode, NewAreaName) VALUES
('code1', 'name1'),
('code2', 'name2'),
('code3', 'name3');
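
A minimal sketch of how you could drive that from PHP with PDO, reusing the $L4 array from your question (the chunk size of 100 is arbitrary):

// Assumes $pdo is your PDO connection and $L4 holds array(AreaName, Dialcode)
// pairs as in the question. Note that REPLACE rewrites the whole row, so any
// column not listed here (e.g. slave.AreaName) falls back to its default.
foreach (array_chunk($L4, 100) as $chunk) {
    // Build one "(?, ?)" group per row in this chunk.
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?)'));
    $stmt = $pdo->prepare("REPLACE INTO slave (Dialcode, NewAreaName) VALUES $placeholders");

    $params = array();
    foreach ($chunk as $row) {
        $params[] = $row[1]; // Dialcode
        $params[] = $row[0]; // NewAreaName
    }
    $stmt->execute($params);
}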

Upvotes: 0

J0HN

Reputation: 26941

Always use prepared statements if possible (e.g. via the mysqli or PDO implementations); they are much more efficient, protect you from SQL injection, and so on. Also, consider switching to the mysqli extension, since the mysql extension is considered outdated.

One more thing to consider: how do you get those $Amaster and $Amain values? Are they stored in the script or retrieved from the database for each row individually?

In more detail:

Each time you execute a query with mysql_query, the database performs query planning. For queries that affect a small number of rows, planning can take a lot of time compared to the execution itself (I don't have statistics here, but I believe it can be orders of magnitude for complicated queries with small outputs). So if you execute a huge number of similar queries, it's quite inefficient, since planning is performed for each query.

To avoid this, you should use prepared statements. The idea is to create a named statement object that caches the query plan, so execution skips the planning step and jumps straight to running the query.

In practice, I've seen prepared statements increase execution speed by roughly a hundred times compared to raw SQL queries (for about 100,000 records).

The DBMS has to support prepared statements; as far as I know, they have been supported since MySQL 5.0.

Some samples:

$pdo = new PDO(/* your database connection properties here */);
$update_stmt = $pdo->prepare("UPDATE slave SET NewAreaName = ? WHERE Dialcode = ? ORDER BY Dialcode ASC LIMIT 1");

/* Iterate over $Rmaster as you do now { */
    $update_stmt->execute(array($Amaster['AreaName'], $Amain['Dialcode']));
/* } */
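
One more thing: with InnoDB, every execute() above also commits its own transaction by default, and each commit forces a log flush to disk. Wrapping the whole loop in a single transaction avoids most of that overhead (a sketch; $rows and its keys are placeholders for whatever you iterate over):

// Same prepared statement as above; adapt the array keys to your data.
$pdo->beginTransaction();
try {
    foreach ($rows as $row) {
        // Still one execute per row, but only one commit (and disk flush) at the end.
        $update_stmt->execute(array($row['AreaName'], $row['Dialcode']));
    }
    $pdo->commit();
} catch (Exception $e) {
    // Undo the whole batch if any single update fails.
    $pdo->rollBack();
    throw $e;
}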

Upvotes: 1

kkatusic

Reputation: 94

Please also note that running OPTIMIZE TABLE can help: http://dev.mysql.com/doc/refman/5.1/en/optimize-table.html
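
For example (a sketch using the table names from the question; whether this actually helps in your case is only a guess):

// OPTIMIZE TABLE rebuilds the tables and refreshes index statistics.
// query() is used because OPTIMIZE TABLE returns one result row per table.
$pdo = new PDO(/* your connection settings here */);
$result = $pdo->query('OPTIMIZE TABLE master, slave');
foreach ($result as $row) {
    echo $row['Table'] . ': ' . $row['Msg_text'] . "\n";
}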

Upvotes: 0

Sirko

Reputation: 74076

Your problem probably results from issuing more than 20,000 individual queries against your database.

If possible, try to generalize your UPDATE statement to cover multiple rows at once. That way you can reduce the overall number of queries.
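
For instance, if the goal is simply to copy AreaName from master into slave.NewAreaName for matching dial codes, a single joined UPDATE could replace the whole loop (a sketch based on the table definitions in the question; it assumes at most one master row per Dialcode):

// One statement instead of ~20000: MySQL joins the tables on Dialcode
// and copies the name across, using the existing indexes on Dialcode.
$pdo = new PDO(/* your connection settings here */);
$affected = $pdo->exec(
    'UPDATE slave
       JOIN master ON master.Dialcode = slave.Dialcode
        SET slave.NewAreaName = master.AreaName'
);
echo $affected . " rows updated\n";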

Upvotes: 0
