Reputation: 1779
I have a very long comma-separated string that I want to read as a column vector in R. I tried reading it as a table, thinking I could just transpose that, but reading it into a table takes forever.
col = read.table("filename.csv", sep = ",", header = FALSE)  # reads everything as one very wide row
colT = t(col)                                                 # transpose to get a column
The string is huge, with 2.7M entries separated by commas, so it cannot be handled in a text editor, and trying to replace ',' with '\n' there was futile. Is there a way to do this in R?
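For illustration, here is a tiny made-up stand-in for the real file (the file name and values below are just placeholders) showing the layout and the kind of result I am after:
# hypothetical miniature of filename.csv: a single row of comma-separated values
writeLines("a1,b2,c3,d4", "mini.csv")
col = read.table("mini.csv", sep = ",", header = FALSE)  # 1 row, 4 columns
colT = t(col)                                            # 4 x 1 character matrix
# what I actually want is the plain vector c("a1", "b2", "c3", "d4")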
Upvotes: 2
Views: 1107
Reputation: 887531
We can use fread
library(data.table)
fread("filename.csv", header=FALSE)
# simulate 2.7M values, newline-separated, to time fread on literal text
tmp <- paste(paste0(letters, 1:2.7e6), collapse = "\n")
system.time(fread(tmp, header = FALSE))
#   user  system elapsed
#   0.87    0.00    0.88
If the OP's data are separated by "," and not "\n", we can convert the commas to newlines with gsub before handing the text to fread, as @thelatemail mentioned:
fread(gsub(",", "\n", tmp), header = FALSE)
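For the actual file, a minimal sketch along the same lines (assuming "filename.csv" holds the OP's single comma-separated line, read into R with readLines first):
library(data.table)
txt <- readLines("filename.csv")                    # the one long comma-separated line
dt  <- fread(gsub(",", "\n", txt), header = FALSE)  # one value per row
vec <- dt[[1]]                                      # plain vector of ~2.7M entries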
Upvotes: 3
Reputation: 93908
?scan will be quite quick for this sort of thing.
# same 2.7M-value benchmark, but comma-separated and read with scan()
tmp <- paste(paste0(letters, 1:2.7e6), collapse = ",")
system.time(scan(text = tmp, what = character(1), sep = ","))
#Read 2700000 items
#   user  system elapsed
#   1.15    0.00    1.16
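Applied to the file itself rather than a text= string, a minimal sketch (assuming the filename.csv from the question):
vec <- scan("filename.csv", what = character(), sep = ",")  # already a plain vector, no transpose needed
length(vec)   # ~2.7M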
Upvotes: 3