Reputation: 19723
I have a CSV file in the assets folder with more than 10,000 lines of data. I want to insert this data into a SQLite database when the database is created. I cannot ship a prebuilt database because the application is very generic and the model demands a CSV file instead. I don't want to read all 10,000 lines into memory and insert them in one stretch. How do I accomplish this task effectively and efficiently?
Upvotes: 2
Views: 2452
Reputation: 1108577
Just insert each line immediately after you've read it; don't accumulate the lines in some ArrayList in memory first.
E.g.
String line;
while ((line = reader.readLine()) != null) {
    insert(line); // insert the row right away instead of buffering it
}
You might want to do this in a single transaction so that you can roll back whenever it fails halfway. On the JDBC side, you may want to consider PreparedStatement#addBatch()/#executeBatch() to execute the inserts in batches, as in the sketch below. An example can be found in this answer.
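A minimal sketch of that approach, assuming the sqlite-jdbc driver and a hypothetical two-column table mytable(col1, col2); the file name, table name, and column layout are placeholders:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CsvImport {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:mydb.db");
             BufferedReader reader = new BufferedReader(new FileReader("data.csv"))) {
            conn.setAutoCommit(false); // one transaction for the whole import
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO mytable (col1, col2) VALUES (?, ?)")) {
                String line;
                int count = 0;
                while ((line = reader.readLine()) != null) {
                    String[] fields = line.split(","); // naive split; quoted fields need a real CSV parser
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.addBatch();
                    if (++count % 1000 == 0) {
                        ps.executeBatch(); // push a batch every 1000 rows to keep memory flat
                    }
                }
                ps.executeBatch(); // push the remaining rows
                conn.commit();
            } catch (SQLException e) {
                conn.rollback(); // undo the partial import on failure
                throw e;
            }
        }
    }
}

Flushing the batch every 1000 rows keeps memory bounded while still amortizing the round trips to the database.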
Update: as a completely different but more efficient alternative, you can also leave Java out of the picture entirely and use the CSV import facility provided by SQLite itself.
Upvotes: 6
Reputation: 3082
.mode csv
.import filename tablename
(from http://old.nabble.com/How-can-I-load-big-file-into-a-sqlite-database--td19202500.html; comparable to LOAD DATA in MySQL).
Another option: disable auto-commit and auto-flush; read 20 lines, then commit and flush (sketched below).
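A rough sketch of that pattern over JDBC; insertLine() is a hypothetical helper that parses one CSV line and executes a single INSERT:

import java.io.BufferedReader;
import java.sql.Connection;
import java.sql.SQLException;

public class ChunkedCommit {
    // Hypothetical helper: parse one CSV line and execute a single INSERT.
    static void insertLine(Connection conn, String line) throws SQLException {
        // ... bind the fields and execute the statement ...
    }

    static void importCsv(Connection conn, BufferedReader reader) throws Exception {
        conn.setAutoCommit(false); // take manual control of commits
        String line;
        int count = 0;
        while ((line = reader.readLine()) != null) {
            insertLine(conn, line);
            if (++count % 20 == 0) {
                conn.commit(); // commit every 20 rows, per the suggestion above
            }
        }
        conn.commit(); // commit whatever is left over
    }
}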
Upvotes: 1
Reputation:
The simplest and most efficient approach in this case is to read 20, 30, or 50 (or maybe 100?!) lines at a time from the file and insert them into the database. However, you also need to run a few tests to find the best number of lines you can read in one go, and then use that :-) A sketch of such a test follows.
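One way to run those tests is a small timing harness like the following; importWithBatchSize() is a hypothetical method that re-runs the chunked import, committing every batchSize rows:

public class BatchSizeTest {
    // Hypothetical method: clears the table and re-imports the CSV,
    // committing every batchSize rows.
    static void importWithBatchSize(int batchSize) {
        // ... run the chunked import ...
    }

    public static void main(String[] args) {
        for (int batchSize : new int[] {20, 30, 50, 100}) {
            long start = System.nanoTime();
            importWithBatchSize(batchSize);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println(batchSize + " rows per commit: " + elapsedMs + " ms");
        }
    }
}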
Upvotes: 3