Reputation: 4481
library(dplyr, warn.conflicts = FALSE)
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
copy_to(con, mtcars)
mtcars2 <- tbl(con, "mtcars")
mtcars2
#> # Source: table<mtcars> [?? x 11]
#> # Database: sqlite 3.25.3 [:memory:]
#> mpg cyl disp hp drat wt qsec vs am gear carb
#> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
#> 1 21 6 160 110 3.9 2.62 16.5 0 1 4 4
#> 2 21 6 160 110 3.9 2.88 17.0 0 1 4 4
#> 3 22.8 4 108 93 3.85 2.32 18.6 1 1 4 1
#> 4 21.4 6 258 110 3.08 3.22 19.4 1 0 3 1
#> 5 18.7 8 360 175 3.15 3.44 17.0 0 0 3 2
#> 6 18.1 6 225 105 2.76 3.46 20.2 1 0 3 1
#> 7 14.3 8 360 245 3.21 3.57 15.8 0 0 3 4
#> 8 24.4 4 147. 62 3.69 3.19 20 1 0 4 2
#> 9 22.8 4 141. 95 3.92 3.15 22.9 1 0 4 2
#> 10 19.2 6 168. 123 3.92 3.44 18.3 1 0 4 4
#> # … with more rows
How do I view a table that is in a database from R? I'm using the DBI and dbplyr packages, and I'm looking for the spreadsheet view in RStudio that pops up when you issue a command such as View(mtcars). I really only want to view the first 10 or 20 rows of a table: many of the SQL tables I work with are millions of rows and hundreds of columns, and I don't want all of that data to display.
Things are just different enough going from dplyr to dbplyr to slightly confuse me. In case you're wondering why I'm not happy with the console view shown above: it cuts off at about a dozen variables, or however wide your console is, and I need to browse all variables (columns), even when my tables are hundreds of columns wide.
# Doesn't work: mtcars2 is a lazy query, not an in-memory data frame
mtcars2 %>% View()
Let me also mention that I'd like to avoid direct SQL queries in the answer, if possible. I'm somewhat open to using the collect() function, but it seems very slow when the tables are around 250K+ rows and 100+ columns.
Upvotes: 1
Views: 494
@DiceboyT pointed out the solution:
mtcars2 %>% head(10) %>% collect() %>% View()
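The order of operations is what makes this fast: because mtcars2 is a lazy table, head(10) is folded into the SQL query as a LIMIT clause and runs on the database side, so collect() only transfers those 10 rows before View() opens them. A minimal sketch, using show_query() to inspect the generated SQL (the exact SQL text may vary by dbplyr version):

```r
library(dplyr, warn.conflicts = FALSE)

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
copy_to(con, mtcars)
mtcars2 <- tbl(con, "mtcars")

# head() on a lazy tbl adds a LIMIT to the query rather than
# fetching the whole table into R:
mtcars2 %>% head(10) %>% show_query()
# Roughly: SELECT * FROM `mtcars` LIMIT 10

# Only then pull those rows into R and open the spreadsheet viewer:
mtcars2 %>% head(10) %>% collect() %>% View()

DBI::dbDisconnect(con)
```

Putting collect() first (mtcars2 %>% collect() %>% head(10)) would instead download the entire table and then discard most of it, which is why collect() alone felt slow on 250K+ row tables.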
Upvotes: 2