Reputation: 2263
I have a 9GB .csv file and would like to convert it to an SQLite database.
I have followed https://datacarpentry.org/R-ecology-lesson/05-r-and-databases.html#Connecting_to_databases and it works on my local machine, but on a server it says the disk/database is full. In any case, on the local machine the DB file appears to be 0KB and contains no tables. Any thoughts why?
This is what I do:
library(dplyr)
library(dbplyr)
# Test data
df <- data.frame(x1 = runif(1e7), x2 = runif(1e7))
# Make DB
PassengerData <- src_sqlite("FINAL_data.sqlite", create = TRUE)
# Copy dataframe to DB
copy_to(PassengerData, df)
# get a lazy reference to the table
PassengerData <- tbl(PassengerData, "df")
Then I close R and open a new session. To look at the DB I do:
df<-DBI::dbConnect(RSQLite::SQLite(), "FINAL_data.sqlite")
src_dbi(df)
src: sqlite 3.34.1 [Data/FINAL_data.sqlite]
tbls:
There are no tables in it. Why?
Upvotes: 3
Views: 1352
Reputation: 160437
I suggest that loading an entire 9GB file into R for the sole purpose of ingesting it into a SQLite3 database is a little flawed (you may not have sufficient memory to load it into R at all). Instead, use sqlite3 by itself.
I have pre-made a file mt.csv from mtcars.
$ sqlite3 -csv mt.sqlite3 '.import mt.csv mtcars'
$ ls -l mt.sqlite3
-rw-r--r-- 1 r2 r2 8192 Feb 4 12:19 mt.sqlite3
$ sqlite3 -column -header mt.sqlite3 'select * from mtcars limit 3'
mpg cyl disp hp drat wt qsec vs am gear carb
---------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
21 6 160 110 3.9 2.62 16.46 0 1 4 4
21 6 160 110 3.9 2.875 17.02 0 1 4 4
22.8 4 108 93 3.85 2.32 18.61 1 1 4 1
The sqlite3 binary is not installed by default on many (any?) systems, but it is an easy and free download/installation (see https://www.sqlite.org/download.html).
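If you prefer to stay in R, here is a minimal sketch of an alternative (assuming the readr package is installed; the CSV path FINAL_data.csv and the table name passengers are hypothetical): stream the file in chunks and append each chunk to the SQLite table, so the whole 9GB never has to fit in memory.

library(DBI)
library(readr)

# connect (creates the file if it does not exist yet)
con <- DBI::dbConnect(RSQLite::SQLite(), "FINAL_data.sqlite")

# read the CSV in chunks; append each chunk to the 'passengers' table
readr::read_csv_chunked(
  "FINAL_data.csv",      # hypothetical path to the 9GB CSV
  callback = function(chunk, pos) {
    DBI::dbWriteTable(con, "passengers", chunk, append = TRUE)
  },
  chunk_size = 1e5       # tune to available memory
)

DBI::dbListTables(con)   # should now list "passengers"
DBI::dbDisconnect(con)

Because each chunk is written and then discarded, memory use is bounded by chunk_size rather than by the size of the file.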
Upvotes: 4