Reputation:
I can import my CSV file via phpMyAdmin, but it takes a very long time. I can't load it via SSH with a MySQL command, because I don't have the required permission.
So is there some way to load the CSV data directly from a string, like this:
INSERT INTO table LOAD_CSV(MY_CSV_STRING)
If not, I will have to parse it into individual rows.
Upvotes: 1
Views: 2470
Reputation: 1
DELIMITER //
CREATE PROCEDURE load_csv(IN CSV_INPUT LONGTEXT)
BEGIN
    -- Build a single multi-row INSERT from the CSV string: each comma becomes
    -- a value separator ("," -> '","') and each newline closes one row and
    -- opens the next ('\n' -> '"),("').
    SET @sql = CONCAT('INSERT INTO mytable (field1, field2, field3)
VALUES ("', REPLACE(REPLACE(CSV_INPUT, ',', '","'), '\n', '"),("'), '");');
    PREPARE stmt1 FROM @sql;
    EXECUTE stmt1;
    DEALLOCATE PREPARE stmt1;
END //
DELIMITER ;
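For example, calling it with three rows of sample data (a mytable with three text columns is assumed) expands into one multi-row INSERT:
CALL load_csv('a1,b1,c1\na2,b2,c2\na3,b3,c3');
-- @sql becomes:
-- INSERT INTO mytable (field1, field2, field3)
-- VALUES ("a1","b1","c1"),("a2","b2","c2"),("a3","b3","c3");
Note that this performs no escaping: a comma, newline, or double quote inside a value will break the generated statement (or allow SQL injection), so only use it on data you trust.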
Upvotes: 0
Reputation: 361
Use the LOAD DATA LOCAL INFILE command to import, as aleroot suggests.
If you get a permission error, you need to allow it by connecting to the MySQL instance with the following command: mysql --local-infile -uroot -p
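If the server side also refuses, check whether the local_infile system variable is enabled; switching it on requires administrative privileges, which you may not have on shared hosting:
SHOW GLOBAL VARIABLES LIKE 'local_infile';
SET GLOBAL local_infile = 1;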
Upvotes: 0
Reputation: 26
It is possible to use a stored procedure for this, combined with SUBSTRING_INDEX. Keep in mind that if your CSV is too large, you may want to use LOCATE and SUBSTRING instead. Here is a simple one for reference:
DELIMITER //
CREATE PROCEDURE load_csv (IN `icsv` MEDIUMTEXT)
BEGIN
DECLARE csv MEDIUMBLOB;
DECLARE csv_begin INTEGER DEFAULT 1;
DECLARE csv_end INTEGER;
DECLARE csv_size INTEGER;
SET csv = COALESCE(`icsv`,'');
SET csv_size = LENGTH(csv);
WHILE csv_begin < csv_size DO
    -- Find the end of the current line; the last line may lack a trailing '\n'.
    SET csv_end = LOCATE('\n', csv, csv_begin);
    SET csv_end = IF(csv_end != 0, csv_end, csv_size + 1);
    -- SUBSTRING_INDEX(str, ',', n) keeps everything up to the nth comma;
    -- a second pass with count -1 then isolates field n itself.
    INSERT INTO mytable (field1, field2, field3, field4, field5) VALUES (
        SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin), ',', 1), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin), ',', 2), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin), ',', 3), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin), ',', 4), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin), ',', 5), ',', -1)
    );
    -- Advance past the '\n' to the start of the next line.
    SET csv_begin = csv_end + 1;
END WHILE;
END //
DELIMITER ;
You can make this a bit faster with some optimizations, in particular caching the current CSV line and using some properties of SUBSTRING_INDEX to extract each field from the smallest possible string.
DELIMITER //
CREATE PROCEDURE load_csv (IN `icsv` MEDIUMTEXT)
BEGIN
DECLARE csv MEDIUMBLOB;
DECLARE csv_line MEDIUMBLOB;
DECLARE csv_begin INTEGER DEFAULT 1;
DECLARE csv_end INTEGER;
DECLARE csv_size INTEGER;
SET csv = COALESCE(`icsv`,'');
SET csv_size = LENGTH(csv);
WHILE csv_begin < csv_size DO
    SET csv_end = LOCATE('\n', csv, csv_begin);
    SET csv_end = IF(csv_end != 0, csv_end, csv_size + 1);
    -- Cache the current line so each field extraction scans only one row's text.
    SET csv_line = SUBSTRING(csv FROM csv_begin FOR csv_end - csv_begin);
    SET csv_begin = csv_end + 1;
    -- Count from whichever end of the line is closer: fields 1-3 from the
    -- front, fields 4-5 from the back, keeping the intermediate strings small.
    INSERT INTO mytable (field1, field2, field3, field4, field5) VALUES (
        SUBSTRING_INDEX(csv_line, ',', 1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(csv_line, ',', 2), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(csv_line, ',', 3), ',', -1),
        SUBSTRING_INDEX(SUBSTRING_INDEX(csv_line, ',', -2), ',', 1),
        SUBSTRING_INDEX(csv_line, ',', -1)
    );
END WHILE;
END //
DELIMITER ;
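A quick smoke test (again assuming mytable has five text columns):
CALL load_csv('a1,b1,c1,d1,e1\na2,b2,c2,d2,e2');
SELECT * FROM mytable; -- two rows, one field per column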
Finally, if your lines are particularly long, you can even use a mix of LOCATE and SUBSTRING for the commas themselves, but I don't have time to write this example.
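For illustration, a minimal sketch of that idea: inside the WHILE loop above, after csv_line is set, walk the line comma by comma with LOCATE so no field extraction rescans the whole line (this assumes exactly five fields per line, and DECLAREs for f1 through f5, f_begin, and f_end added at the top of the procedure):
SET f_begin = 1;
SET f_end = LOCATE(',', csv_line, f_begin);
SET f1 = SUBSTRING(csv_line FROM f_begin FOR f_end - f_begin);
SET f_begin = f_end + 1;
SET f_end = LOCATE(',', csv_line, f_begin);
SET f2 = SUBSTRING(csv_line FROM f_begin FOR f_end - f_begin);
SET f_begin = f_end + 1;
SET f_end = LOCATE(',', csv_line, f_begin);
SET f3 = SUBSTRING(csv_line FROM f_begin FOR f_end - f_begin);
SET f_begin = f_end + 1;
SET f_end = LOCATE(',', csv_line, f_begin);
SET f4 = SUBSTRING(csv_line FROM f_begin FOR f_end - f_begin);
SET f5 = SUBSTRING(csv_line FROM f_end + 1); -- remainder after the last comma
INSERT INTO mytable (field1, field2, field3, field4, field5)
VALUES (f1, f2, f3, f4, f5);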
Upvotes: 0
Reputation:
There is actually no way to do what I wanted. It is only possible to load it from a file.
Upvotes: 3
Reputation: 72656
You can use LOAD DATA INFILE in this way:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(field1, field2, field3);
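If the CSV starts with a header row, the same statement can skip it with an IGNORE 1 LINES clause:
LOAD DATA LOCAL INFILE '/importfile.csv'
INTO TABLE test_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(field1, field2, field3);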
Upvotes: 1