Reputation: 3237
I have a 30-million-record MySQL table.
It has about 20 columns, of which I will use 15 to insert into another table.
I can't use PHP to load this large dataset (selecting 30 million rows and loading them into memory isn't feasible), so what would be the best way to copy all these records? MySQL 5.X.
I'm using EMS to connect to the database.
Upvotes: 2
Views: 3427
Reputation: 651
There are a few ways you can do this, including the one user M_M provided above. I have not used EMS and I'm not sure what it can and can't do, but I have used Workbench extensively.
A.
B.
Upvotes: 0
Reputation: 1507
You can do this:
INSERT INTO new_table (`col1`,`col2`,`col3`) SELECT `oldcol1`,`oldcol2`,`oldcol3`
FROM old_table ORDER BY `id` LIMIT 0, 100000
and repeat it in a PHP loop, increasing the LIMIT start value on each pass. The ORDER BY on a unique key (assumed here to be `id`) keeps successive chunks from overlapping.
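For illustration, here is a minimal sketch of that loop using PDO. The connection details, the chunk size, and the table, column, and `id` key names are all assumptions; adjust them to your schema.

&lt;?php
// Sketch only: placeholder credentials and schema names, assumed unique key `id`.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$chunk  = 100000; // rows per INSERT ... SELECT
$offset = 0;

do {
    // The integers are generated locally, so interpolating them is safe.
    // ORDER BY a unique key keeps successive chunks disjoint.
    $copied = $pdo->exec(sprintf(
        'INSERT INTO new_table (col1, col2, col3)
         SELECT oldcol1, oldcol2, oldcol3
         FROM old_table
         ORDER BY id
         LIMIT %d, %d',
        $offset, $chunk
    ));
    $offset += $chunk;
} while ($copied === $chunk); // a short chunk means the table is exhausted

One caveat: as the offset grows, MySQL still scans and discards all the skipped rows, so later chunks get progressively slower. Splitting on an indexed ID range, as in the answer below, avoids that.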
Upvotes: 0
Reputation: 23982
What about doing an INSERT INTO MySmallerTable SELECT col1, col2, col3... FROM MyBiggerTable?
It might be worth breaking it into multiple INSERTs, like:
INSERT INTO ... SELECT ... WHERE ID BETWEEN 1 AND 100000;
INSERT INTO ... SELECT ... WHERE ID BETWEEN 100001 AND 200000;
etc.
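For illustration, a minimal sketch of driving those ranges from a script, again using PDO; it assumes an auto-increment integer key `ID` plus placeholder connection and column names.

&lt;?php
// Sketch only: placeholder credentials and column names, assumed integer PK `ID`.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$maxId = (int) $pdo->query('SELECT MAX(ID) FROM MyBiggerTable')->fetchColumn();
$step  = 100000; // rows per range

for ($start = 1; $start <= $maxId; $start += $step) {
    $pdo->exec(sprintf(
        'INSERT INTO MySmallerTable (col1, col2, col3)
         SELECT col1, col2, col3
         FROM MyBiggerTable
         WHERE ID BETWEEN %d AND %d',
        $start, $start + $step - 1
    ));
}

Because each statement filters on the indexed ID, every chunk is a cheap range scan no matter how deep into the table it is, and gaps in the ID sequence are harmless (those chunks simply copy fewer rows).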
Upvotes: 4