pangpang

Reputation: 8821

MySQL: import millions of rows from one table to another

I have a table with two million rows. Each row has a body column that stores data in JSON format. For example:

table_a:

id user_id  body
1  1        {"tel": "13678031283", "email": "[email protected]", "name": "test"....}
2  2        {"tel": "1567827126", "age": "16"....}
......

I have another table, named table_b:

table_b:

id     user_id    tel        email         name
1      1          13678019   [email protected]   test1
2      2          15627378   [email protected]  test2  
.....

table_a has 2 million rows, and I want to import all of its data into table_b, processing each row along the way.

I want to deal with it like this:

for row in table_a_rows:
    result = process(row)
    insert result into table_b
.....

But I don't think this is a good idea. Is there a better way to do it?

Upvotes: 0

Views: 204

Answers (1)

javier_domenech

Reputation: 6253

You can select the data you need from table_a directly with JSON_EXTRACT. For example, getting the email would be something like this:

mysql> SELECT JSON_EXTRACT(body, '$.email') FROM table_a;
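
Note that JSON_EXTRACT returns a JSON value, so strings come back wrapped in double quotes. If you want the bare string, wrap the call in JSON_UNQUOTE (or, on MySQL 5.7.13 and later, use the ->> shorthand):

mysql> SELECT JSON_UNQUOTE(JSON_EXTRACT(body, '$.email')) FROM table_a;
mysql> SELECT body->>'$.email' FROM table_a;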

So you could REPLACE all the data from table_a directly into table_b (listing the target columns explicitly, since table_b also has an id column):

mysql> REPLACE INTO table_b (user_id, tel, email, name)
       SELECT user_id,
              JSON_EXTRACT(body, '$.tel'),
              JSON_EXTRACT(body, '$.email'),
              JSON_EXTRACT(body, '$.name')
       FROM table_a;
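
Two caveats. REPLACE only differs from a plain INSERT when table_b has a unique key (presumably user_id here) on which duplicates can be detected; otherwise it simply inserts everything. And a single statement over two million rows can hold locks for a long time, so a common variant (a sketch, assuming table_a.id is an auto-increment primary key) is to copy in id ranges, unquoting the extracted values as shown above:

mysql> REPLACE INTO table_b (user_id, tel, email, name)
       SELECT user_id,
              JSON_UNQUOTE(JSON_EXTRACT(body, '$.tel')),
              JSON_UNQUOTE(JSON_EXTRACT(body, '$.email')),
              JSON_UNQUOTE(JSON_EXTRACT(body, '$.name'))
       FROM table_a
       WHERE id BETWEEN 1 AND 500000;

Repeat with the next id range (500001 to 1000000, and so on) until table_a is exhausted.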

Upvotes: 1
