Intellect

Reputation: 21

Large data import using PHP and mysql to multiple tables with foreign keys

What is the better way to insert large data into multiple tables in DB from CSV with foreign keys?

I tried to import a large CSV file into multiple tables. At first I looped over the CSV data and ran a normal MySQL INSERT per row, but it was very slow. Then I tried the LOAD DATA method, but it can only insert data into a single table, with the same column order as the file.

However, that will not work for me, because I need to insert data into multiple tables and use the last inserted ID of one table as a foreign key in another.

Can anyone suggest a better way to upload data into multiple tables?

Here I am using Zend Framework with Doctrine to handle database operations. However, in my experience Doctrine takes more time than a plain MySQL query, which is why I used direct MySQL queries.
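For reference, the slow pattern described above is one INSERT per CSV row; batching the rows into a single multi-row statement inside one transaction is usually much faster. A minimal sketch (sqlite3 stands in for MySQL here, and the table and CSV contents are made up; with PHP the same pattern works via PDO/mysqli prepared statements):

```python
import csv
import io
import sqlite3

# sqlite3 used as a stand-in for MySQL; the batching pattern is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product (id INTEGER PRIMARY KEY, name TEXT, price REAL)")

# Hypothetical CSV content.
csv_text = "name,price\nbook,1.00\ncomputer,2.00\nphone,3.00\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))

# Instead of one execute()/commit() per row, send the whole batch at once.
conn.executemany(
    "INSERT INTO product (name, price) VALUES (?, ?)",
    [(r["name"], float(r["price"])) for r in rows],
)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM product").fetchone()[0])  # 3
```

This alone often closes most of the gap with LOAD DATA, though it still does not solve the foreign-key problem by itself.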

Upvotes: 1

Views: 783

Answers (2)

baogege

Reputation: 1

-- staging table for the CSV import; its columns mirror the CSV header
create table if not exists my_db.csv_file (
    id int auto_increment primary key,
    vendor_name varchar(200),
    product_name varchar(200),
    product_price double
);

-- table one
create table if not exists my_db.product (
    id int auto_increment primary key,
    vendor_id int, -- foreign key to vendor.id
    name varchar(200),
    price double
);

-- table two
create table if not exists my_db.vendor (
    id int auto_increment primary key,
    name varchar(200)
);

-- import the CSV data into the staging table (you can use the LOAD DATA command here)
insert into my_db.csv_file(vendor_name,product_name,product_price)values
('A','book',1.00),
('B','computer',2.00),
('C','phone',3.00);


-- step 1: insert one vendor row per distinct vendor name

insert into my_db.vendor(name)
(select vendor_name from my_db.csv_file group by vendor_name);

-- step 2: resolve each staged row's vendor_id by joining on the vendor name
insert into my_db.product(vendor_id, name, price)
(select vendor.id, temp.product_name, temp.product_price
 from my_db.csv_file as temp
 left join my_db.vendor as vendor on vendor.name = temp.vendor_name);

Upvotes: 0

g8bhawani

Reputation: 674

You can import large files, around millions of records, in one shot with the method below: use MySQL's LOAD DATA.

For example:

LOAD DATA LOCAL INFILE  
'/PATH_TO_YOUR_CSV/YOUR_CSV.csv'
INTO TABLE MY_TABLE
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n';

Please remember that your table structure and your CSV file structure (number of columns) should be the same.
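If the file's columns do not match the table's order, MySQL's LOAD DATA also accepts an explicit column list after the table name; alternatively you can reorder the CSV before loading. A small sketch of the reordering option (hypothetical column names, in-memory strings standing in for files):

```python
import csv
import io

# Hypothetical: the file has columns price,name but MY_TABLE expects name,price.
src = io.StringIO("price,name\n1.00,book\n2.00,computer\n")
dst = io.StringIO()

reader = csv.DictReader(src)
writer = csv.writer(dst)
for row in reader:
    # Emit the fields in the order MY_TABLE's columns are defined.
    writer.writerow([row["name"], row["price"]])

print(dst.getvalue())
```

Writing the reordered file to disk and pointing LOAD DATA LOCAL INFILE at it keeps the fast single-shot import while satisfying the column-order requirement.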

Upvotes: 1
