Reputation: 41
I'm having trouble reading big CSV files in R, so I'm trying to use the sqldf package to read just some columns or rows from the CSV.
I tried this:
test <- read.csv.sql("D:\\X17065382\\Documents\\cad\\2016_mar\\2016_domicilio_mar.csv", sql = "select * from file limit 5", header = TRUE, sep = ",", eol = "\n")
but I got this error:
Error in connection_import_file(conn@ptr, name, value, sep, eol, skip) : RS_sqlite_import: D:\X17065382\Documents\cad\2016_mar\2016_domicilio_mar.csv line 198361 expected 1 columns of data but found 2
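To see what is actually on that line, here is a rough sketch of how I could peek at it with base R's scan(), without loading the whole file into memory (same path as above):
# Look at line 198361 from the error message, plus its neighbours
bad <- scan("D:\\X17065382\\Documents\\cad\\2016_mar\\2016_domicilio_mar.csv",
            what = character(), sep = "\n",
            skip = 198359, nlines = 3, quote = "")
bad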
Upvotes: 0
Views: 2068
Reputation: 20302
This works for me.
require(sqldf)
df <- read.csv.sql("C:\\your_path\\CSV1.csv", "select * from file where Name='Asher'")
df
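The same idea extends to pulling only a few columns or rows; a sketch, where col1 and col2 are hypothetical column names you would replace with ones from your own header:
# col1 and col2 are placeholders; swap in real column names
df5 <- read.csv.sql("C:\\your_path\\CSV1.csv",
                    "select Name, col1, col2 from file limit 5")
df5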
Upvotes: 0
Reputation: 1771
As Shinobi_Atobe said, the fread() function from data.table works really well. If you prefer base R, you can also use read.csv() or read.csv2(), e.g.:
read.csv2(file_path, nrows = 5)
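Base R can also drop columns you don't need via colClasses; a sketch assuming a three-column file, where "NULL" skips the second column:
# "NULL" entries in colClasses skip those columns entirely
read.csv2(file_path, nrows = 5, colClasses = c(NA, "NULL", NA))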
Also, what do you mean by "big files"? 1 GB, 10 GB, 100 GB?
Upvotes: 0
Reputation: 1973
If you're not too fussy about which package you use, data.table has a great function for doing just what you need:
library(data.table)
file <- "D:\\X17065382\\Documents\\cad\\2016_mar\\2016_domicilio_mar.csv"
fread(file, nrows = 5)
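fread() can also read just a subset of columns via its select argument; col1 and col2 below are hypothetical names to swap for your own:
# Read only the first 5 rows of the named columns
fread(file, nrows = 5, select = c("col1", "col2"))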
Upvotes: 2