Reputation: 55
I have to read a large (say, 20 MB) pipe-delimited file from an FTP location and store it in a DB. What would be the best way to do so?
Some approaches: 1. Copy the file locally, then read it and save it to the DB. 2. Read directly from the stream and save into the DB.
Please suggest an efficient way to do this.
Upvotes: 0
Views: 1064
Reputation: 1467
I think option 1 is better.
With option 2, if an exception occurs while reading the file from the stream or while writing it to the DB, how will you know exactly where it failed?
With option 1 you can store the file in a temp directory, write its contents to the DB, and then delete it or archive it from the temp directory.
Archiving the file on the file system is preferable; it will be useful for future reference.
There are many examples around for both approaches.
For downloading a file from FTP you can follow http://www.codejava.net/java-se/networking/ftp/java-ftp-file-download-tutorial-and-example
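The download-then-load flow above can be sketched roughly like this, using only the JDK (`java.net.URL` accepts `ftp://` URLs; Apache Commons Net, as in the linked tutorial, gives more control over the FTP session). The URL, table name, and column count are placeholders to adapt, not part of the original question:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class FtpToDbLoader {

    // Step 1: copy the remote file to a local temp file first (option 1).
    // The ftp:// URL is a placeholder -- substitute your host, credentials and path.
    static Path downloadToTemp(String ftpUrl) throws Exception {
        Path tmp = Files.createTempFile("ftp-import-", ".txt");
        try (InputStream in = new URL(ftpUrl).openStream()) {
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        }
        return tmp;
    }

    // Split one pipe-delimited record; the -1 limit keeps trailing empty fields.
    static String[] parseLine(String line) {
        return line.split("\\|", -1);
    }

    // Step 2: stream the local copy into the DB with batched inserts.
    // Table and column names here are assumptions for illustration.
    static void loadIntoDb(Path file, Connection conn) throws Exception {
        String sql = "INSERT INTO records (col1, col2, col3) VALUES (?, ?, ?)";
        try (BufferedReader reader = Files.newBufferedReader(file);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                String[] fields = parseLine(line);
                for (int i = 0; i < 3; i++) {
                    ps.setString(i + 1, fields[i]);
                }
                ps.addBatch();
                if (++count % 1000 == 0) {
                    ps.executeBatch(); // flush every 1000 rows
                }
            }
            ps.executeBatch(); // flush the remainder
        }
    }
}
```

If the temp file is kept and archived after a successful load, a failed run can be retried from the local copy without hitting the FTP server again.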
Upvotes: 1
Reputation: 186
When it's a SQL or CSV file, you can import it via SSH:
mysql -u root -p
set global net_buffer_length=1000000; -- Set network buffer length to a large byte number
set global max_allowed_packet=1000000000; -- Set maximum allowed packet size to a large byte number
SET foreign_key_checks = 0; -- Disable foreign key checks to avoid delays, errors and unwanted behaviour
source file.sql -- Import your SQL dump file
SET foreign_key_checks = 1; -- Remember to re-enable foreign key checks when the procedure is complete!
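Since the question's file is pipe-delimited rather than a SQL dump, a LOAD DATA statement in the same mysql session can import it directly; the file path and table name below are placeholders, not from the original post:

```sql
-- Import a pipe-delimited file directly (path and table name are placeholders)
LOAD DATA LOCAL INFILE '/tmp/file.txt'
INTO TABLE records
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n';
```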
Upvotes: 0