Reputation: 1204
I've been searching for an answer to my problem, but haven't found a suitable one.
I have a large character vector (about 4 million elements, well over 3GB in size). I want to export it as a CSV file in which each element becomes a row.
Exporting this file with write.table, write.csv or write.csv2 results in memory allocation issues.
For now, I have tried the RSQLite package and the fwrite function from the data.table package. Neither worked, for different reasons: the RSQLite approach took over 4 hours to process and I eventually had to kill it, while the fwrite function asks for a data.frame as input, and trying to coerce the large character vector into a data.frame ran into memory issues again.
Does anyone know a good approach to this problem?
(I am on a 64-bit Windows machine with 16GB RAM and a 500GB SSD, running R version 3.2.5.)
Upvotes: 0
Views: 1929
Reputation: 70336
As commented, you might be able to convert your character vector x into a list and then use data.table::setDT to convert it to a data.table by reference, i.e. without a copy. So it would be:
x <- list(x)          # wrap the vector in a one-element list, i.e. one column
library(data.table)
setDT(x)              # convert the list to a data.table in place, no copy made
Now you can use, for example, data.table's new fwrite function to create the CSV file.
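A minimal sketch of that last step (the file name out.csv is just a placeholder; since the table has a single column, each element ends up on its own row):

# col.names = FALSE suppresses the header row so only the data is written
fwrite(x, "out.csv", col.names = FALSE)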
Upvotes: 1