Reputation: 5907
I have a table ("my_table") located on a SQL database server.
This table has 1000 rows. I am trying to select 100 rows of this table at a time (making sure that no row is selected twice and that every row is selected), and then append all of these mini tables into a single table.
For example, I tried to do this with the following code:
library(dplyr)
library(DBI)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
sequence = seq(from = 1, to = 1000, by = 100)
the_list = list()
for (i in 1:10)
{
  for (j in 1:sequence)
  {
    result_i = DBI::dbGetQuery(con, "select * from my_table ORDER BY ID limit 100 OFFSET J;")
    the_list[[i]] = result_i
  }
}
final = do.call(rbind.data.frame, the_list)
I thought I could do this with a loop, but I don't think that SQL is recognizing my loop index.
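As far as I can tell, the query is sent to the database as a fixed string, so the value of the loop index never makes it into the SQL (a minimal illustration):
j <- 3
query <- "select * from my_table ORDER BY ID limit 100 OFFSET J;"
cat(query)   # the database receives the literal text "OFFSET J", not the value of j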
Can someone show me how to fix this?
Thank you!
Upvotes: 0
Views: 91
Reputation: 269644
Use dbFetch as shown.
library(RSQLite)
# Create test database in memory with one 6 row table.
m <- dbDriver("SQLite")
con <- dbConnect(m, dbname = ":memory:")
dbWriteTable(con, 'BOD', BOD)
res <- dbSendQuery(con, "select * from BOD")
# fetch 4 rows at a time
L <- NULL; i <- 0
while(!dbHasCompleted(res)) {
i <- i + 1
L[[i]] <- dbFetch(res, n = 4)
}
dbClearResult(res)
dbDisconnect(con)
library(dplyr)
# combine the fetched chunks into a single data frame
Data <- bind_rows(L)
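Applied to the table in the question (assuming con is a connection to the server holding my_table), the same chunked-fetch pattern would look roughly like this:
res <- dbSendQuery(con, "SELECT * FROM my_table ORDER BY ID")
L <- list(); i <- 0
# fetch 100 rows per chunk until the result set is exhausted
while(!dbHasCompleted(res)) {
  i <- i + 1
  L[[i]] <- dbFetch(res, n = 100)
}
dbClearResult(res)
final <- dplyr::bind_rows(L)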
Upvotes: 3
Reputation: 1614
This should work...
library(dplyr)
library(DBI)
library(data.table)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
# OFFSET is zero-based, so use 0, 100, ..., 900 to cover all 1000 rows exactly once
sequence <- seq(from = 0, to = 900, by = 100)
results_list <- lapply(sequence, function(x) {
  qstr <- sprintf('SELECT * FROM my_table ORDER BY ID LIMIT 100 OFFSET %s', x)
  DBI::dbGetQuery(con, qstr)
})
results_df <- rbindlist(results_list) |> as.data.frame()
print(results_df)
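If you would rather not paste the offset into the SQL text with sprintf, DBI can also bind it as a query parameter (a sketch of the same loop using a ? placeholder):
results_list <- lapply(sequence, function(x) {
  DBI::dbGetQuery(con,
                  "SELECT * FROM my_table ORDER BY ID LIMIT 100 OFFSET ?",
                  params = list(x))
})
results_df <- rbindlist(results_list) |> as.data.frame()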
Upvotes: 1