Reputation: 41
I have data whose size is similar to "a" below:
library(openxlsx)
a <- list()
for (i in 1:172) {
  a[[i]] <- matrix(i, 30, 60)
}
names(a) <- paste("sheet", seq_along(a), sep = "_") ### name for each sheet
write.xlsx(a, "a.xlsx")
If I run the code above, R closes automatically after a few seconds.
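For reference, the same export could also be built sheet by sheet with openxlsx's workbook functions instead of one write.xlsx() call on the whole list (a rough sketch only; the output file name and sheet labels here are placeholders):
library(openxlsx)
# Sketch: add the matrices to the workbook one at a time, then save once
wb <- createWorkbook()
for (i in seq_along(a)) {
  addWorksheet(wb, sheetName = paste("sheet", i, sep = "_"))
  writeData(wb, sheet = i, x = a[[i]])
}
saveWorkbook(wb, "a_by_sheet.xlsx", overwrite = TRUE)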
library(xlsx)
options(java.parameters = "-Xmx4000m")
a <- list()
for (i in 1:172) {
  a[[i]] <- matrix(i, 30, 60)
}
n <- paste("sheet", seq_along(a), sep = "_") ### name for each sheet
for (i in 1:172) {
  write.xlsx(a[[i]], "c.xlsx", sheetName = n[[i]], append = TRUE)
}
If I run the code above, it returns an out-of-memory error after about 10 minutes. I used
options(java.parameters = "-Xmx4000m")
to increase the memory available to Java, but it still reports a lack of memory.
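From what I have read, this option only takes effect if it is set before the JVM is started, i.e. before library(xlsx) loads rJava, so in a fresh session the ordering would be (a minimal sketch):
# Set the JVM heap size before xlsx/rJava starts the JVM; setting it
# afterwards is silently ignored (run this in a fresh R session)
options(java.parameters = "-Xmx4000m")
library(xlsx)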
Both approaches work fine with small data, but neither works when I try to export all 172 sheets at once. I need all 172 sheets to be included in a single Excel file.
Upvotes: 0
Views: 932
Reputation: 3369
Creating the sheets using lapply may help alleviate the memory issue.
library(xlsx)
# Create the list of matrices
a <- list()
for (i in 1:172) {
  a[[i]] <- matrix(i, 30, 60)
}
# Set names for the matrices
names(a) <- seq_along(a)
# Create a workbook object
wb <- createWorkbook()
# Add each matrix to its own worksheet inside the workbook
# (the unnamed index supplied by lapply is matched to the 'i' argument,
#  since 'matrices' and 'matrix.names' are passed by name below)
lapply(seq_along(a), function(matrices, matrix.names, i) {
  ws <- createSheet(wb, matrix.names[[i]])
  addDataFrame(matrices[[i]], ws)
}, matrices = a, matrix.names = names(a))
# Set the file path to save the workbook to
(output.path <- file.path(tempdir(), "output.xlsx"))
# Save the workbook
saveWorkbook(wb, output.path)
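To confirm that all 172 sheets made it into the file, you could reopen it afterwards (a quick check, assuming the save above succeeded):
# Optional sanity check: reopen the saved file and count its sheets
wb.check <- loadWorkbook(output.path)
length(getSheets(wb.check))  # expected: 172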
Upvotes: 1