Reputation: 157
I'm having a bit of a weird issue.
x <- Sys.time()
x <- character(x)
For some reason, this throws the error "Error: cannot allocate vector of size 11.9 Gb".
I faced a similar issue recently: I'm running code that accumulates information into a data frame. Usually this data frame is about 6 MB; today, the same code produced one over 44 MB.
I'm new to parallel computing in R. Both issues arose after I used doParallel and foreach, so I'm guessing it has to do with one of them.
The parallelized segment looks like this:
doParallel::registerDoParallel(cores = parallel::detectCores() - 1)
<foreach code>
doParallel::stopImplicitCluster()
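In case it's relevant, the overall shape is roughly as follows (a stripped-down sketch; the loop body and data frame here are placeholders, not my actual code):

library(foreach)
library(doParallel)

registerDoParallel(cores = parallel::detectCores() - 1)

# placeholder body: build one small data frame per iteration and rbind them
results <- foreach(i = 1:100, .combine = rbind) %dopar% {
  data.frame(iteration = i, value = sqrt(i))
}

stopImplicitCluster()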
However, only the data frame issue is related to this segment.
x <- Sys.time()
x <- character(x)
This still throws the allocation error even when I'm not running the parallel section at all, and even in a fresh R session.
Could my parallel code have changed something about how R behaves (permanently)?
Upvotes: 1
Views: 104
Reputation: 887048
If the intention is to convert to character, it should be
as.character(x)
Or wrap with strftime, which can also take format and tz arguments:
strftime(Sys.time())
But character(n) returns a blank vector of length 'n'. Since Sys.time() is stored as a double, it gets coerced to an integer, and "" is repeated that many times, i.e.
as.integer(Sys.time())
#[1] 1593382112
On my system, character(Sys.time()) prints something like
# ....
#[99937] "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" "" ""
#[99985] "" "" "" "" "" "" "" "" "" "" "" "" "" "" ""
# [ reached getOption("max.print") -- omitted 1593281954 entries ]
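This also explains the size in the error message: on a 64-bit build, each element of a character vector costs an 8-byte pointer, so a vector of ~1.59 billion elements needs about

1593382112 * 8 / 1024^3
#[1] 11.87162

gigabytes, i.e. the "11.9 Gb" that R refuses to allocate.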
character(5)
#[1] "" "" "" "" ""
According to ?character, the usage is
character(length = 0)
where
length - A non-negative integer specifying the desired length. Double values will be coerced to integer: supplying an argument of length other than one is an error
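So a double length is silently truncated rather than rejected, which is why character(Sys.time()) fails only at the allocation step:

character(2.9)  # double length coerced (truncated) to integer 2
#[1] "" ""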
Upvotes: 2