I am trying to insert data from an R data.frame into a SQLite database, but I failed. Here is the code I used in RStudio.
There are two loops with which I am trying to load all the DBF files listed in the specified folders (working directories). Then I am trying to insert the data from an R data.frame (df[[1]]) into the SQLite database (which I have already created) using the sqldf or dbExecute functions. In the case of dbExecute, the function cannot read the table from the R data.frame (here df[[1]]). In the case of sqldf, the function doesn't see the all_banks table created earlier in the database. Any ideas on how to handle this problem? Thanks to all.
library(sqldf)
library(DBI)
library(foreign)
library("RSQLite")

setwd("F:~/Data")
con <- dbConnect(RSQLite::SQLite(), dbname = "banks.db")

for (path in c("F:~/123-20190901",
               "F:~/123-20190801")) {
  setwd(path)
  ldf <- list()
  listdbf <- dir(pattern = "*.DBF")
  for (k in 1:length(listdbf)) {
    ldf[[k]] <- read.dbf(listdbf[k])
  }
  df1 <- ldf[[1]]
  df2 <- ldf[[2]]
  dbExecute(con, "insert into all_banks select DT, REGN, name_b from df1")
  sqldf("insert into all_banks select DT, REGN, name_b from df1")
}
dbDisconnect(con)
Error: no such table: df1
Error: no such table: all_banks
Fundamentally, sqldf and dbExecute are two different processes: the former works in the local environment and the latter works against an external database (albeit with noted exceptions).

Per the docs:

sqldf is an R package for running SQL statements on R data frames, optimized for convenience.
Therefore, sqldf runs on R data frames in the local environment. By default, sqldf is not used to make changes to a persistent, external database; technically, it runs an in-memory SQLite database that does not survive the R session. Consequently, if all_banks is not a data frame in the global environment, it will not be recognized by sqldf. That said, there is an advanced way of using sqldf with a permanent SQLite database, but the solution below is arguably easier.
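For completeness, a minimal sketch of that advanced usage (assuming the banks.db file and df1 from the question): sqldf accepts a dbname argument, so pointing it at the file database instead of the default in-memory one makes the existing all_banks table visible, and sqldf temporarily uploads df1 so the SQL can reference it.

library(sqldf)

# run against the persistent file database rather than the default :memory:
sqldf("insert into all_banks select DT, REGN, name_b from df1",
      dbname = "banks.db")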
Per the docs, DBI::dbExecute

Executes a statement and returns the number of rows affected.

with the first argument conn: a DBIConnection object, as returned by dbConnect().
Therefore, dbExecute runs commands on an external database and, within its scope, does not use local environment objects. Consequently, if df1 is not a table in the database, it will not be recognized by dbExecute.
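If you do want to stay with dbExecute, one alternative (a sketch, not part of the original answer) is to bind the data frame's columns as query parameters rather than naming df1 inside the SQL; DBI binds each vector in params positionally and inserts one row per element. Column names are taken from the question's query.

# parameterized insert: one ? placeholder per bound column
dbExecute(con,
          "insert into all_banks (DT, REGN, name_b) values (?, ?, ?)",
          params = unname(as.list(df1[c("DT", "REGN", "name_b")])))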
Solution
To resolve your database append needs, simply use DBI's dbWriteTable, which pushes a local R data frame into an external database table (assuming the same structure on both endpoints unless using overwrite=TRUE). Such changes to the table remain permanent even after you close the database connection or the R session.
dbWriteTable(con, name="all_banks", value=df1, append=TRUE)
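Applied to the loop from the question, a sketch might look like this (it appends every DBF file found rather than only ldf[[1]], which appears to be the intent; paths and column names are copied from the question, and the dir() pattern is tightened to a proper regex):

library(DBI)
library(foreign)

con <- dbConnect(RSQLite::SQLite(), dbname = "banks.db")

for (path in c("F:~/123-20190901", "F:~/123-20190801")) {
  setwd(path)
  for (f in dir(pattern = "\\.DBF$")) {
    df_k <- read.dbf(f)                    # one DBF file -> one data frame
    dbWriteTable(con, name = "all_banks",  # append only the needed columns
                 value = df_k[c("DT", "REGN", "name_b")],
                 append = TRUE)
  }
}

dbDisconnect(con)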