Klausos Klausos

Reputation: 16060

How to import 3000 CSV files into SQLite DB?

I have around 3,000 CSV files, each with 200 entries, so 600,000 entries in total.

I want to import all these files into SQLite DB (I am using SQLite Manager in Firefox).

What is the best approach to do this? Later on I will need to read the SQLite tables into R code, so I should take that into account when importing the files into the DB. For instance, is it a good idea to import all files into the same table? Or is it better to import each file into a separate table (and in that case, how do I automate the process)?

Upvotes: 1

Views: 2035

Answers (1)

Dominic Comtois

Reputation: 10421

Probably easiest to put everything into a single table, but that depends on your needs down the road.

All in different tables, using the filenames (without extension) as table names:

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

for (filename in list.files(pattern = "\\.csv$")) {   # only pick up the CSV files
  tmp <- read.csv(filename)                            # adjust arguments (sep, header, ...) to match your files
  # table name = filename without its extension (assumes no extra dots in the name)
  dbWriteTable(con, name = strsplit(filename, "\\.")[[1]][1], value = tmp)
}
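
Since you mention reading the tables back into R later, here is a minimal sketch of how that looks with the one-table-per-file setup (the table name "somefile" is just a hypothetical example; use dbListTables to see what's actually there):

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

dbListTables(con)                    # one table per original CSV file
df <- dbReadTable(con, "somefile")   # pull a single file's data back into a data frame
dbDisconnect(con)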

All in one table (using RSQLite):

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

for (filename in list.files(pattern = "\\.csv$")) {   # only pick up the CSV files
  tmp <- read.csv(filename)                            # adjust arguments to match your files
  dbWriteTable(con, name = "mytable", value = tmp, append = TRUE)
}

All in one table (using sqldf):

library(sqldf)

for (filename in list.files(pattern = "\\.csv$")) {   # only pick up the CSV files
  tmp <- read.csv(filename)                            # adjust arguments to match your files
  # assumes a table named "mytable" already exists in mydb.sqlite
  sqldf("INSERT INTO mytable SELECT * FROM tmp;", dbname = "mydb.sqlite", drv = "SQLite")
}

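With the single-table approach, reading everything back into R later is just one query. A minimal sketch, assuming the table name "mytable" used above:

library(RSQLite)
con <- dbConnect(SQLite(), "mydb.sqlite")

alldata <- dbGetQuery(con, "SELECT * FROM mytable")   # all 600,000 rows as one data frame
dbDisconnect(con)
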
Side note: a possibly more convenient tool than the Firefox plugin is DB Browser for SQLite (http://sqlitebrowser.org/). It's a matter of preference, but it's worth a shot, and it's multiplatform as well.

Upvotes: 4
