Reputation: 263
I have run a rather large bootstrap in R with the boot package.
When I first ran boot() I got this:
Error: cannot allocate vector of size 2.8 Gb
So, to get the boot object I had to use 'simple=TRUE', which tells boot() not to allocate all the memory up front (according to ?boot). This worked fine, though it took a few minutes.
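For reference, a minimal sketch of the kind of call that worked (the data and statistic here are made up; the point is the simple=TRUE argument, which per ?boot is only valid with sim="ordinary" and stype="i"):

```r
library(boot)
set.seed(1)
x <- rnorm(1000)

# Statistic: mean of the resampled observations
mean.fun <- function(d, i) mean(d[i])

# simple=TRUE avoids pre-allocating the full matrix of resampling
# indices, trading speed for a much smaller memory footprint
bt <- boot(x, mean.fun, R = 500, simple = TRUE)
```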
Now I need to get confidence intervals:
> boot.ci(vpe.bt, type="bca", simple=TRUE)
Error: cannot allocate vector of size 2.8 Gb
Same problem! But according to ?boot.ci, there is no 'simple=TRUE' flag for this function (I've tried it anyway).
So, is there any way around this using boot.ci()?
And, if not, what can I do to increase the amount of memory it can use?
Upvotes: 1
Views: 2541
Reputation: 31
Calculating bca (adjusted bootstrap percentile, BCa) confidence intervals in R requires the creation of an 'importance array' with dimensions (number of observations) x (number of replicates). If you don't have enough memory to hold at least two copies of such a matrix, the function will not work.
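To see why this blows up, here is a back-of-the-envelope estimate; the n and R values below are hypothetical, chosen only because they reproduce an allocation of roughly the 2.8 Gb reported in the question:

```r
# Rough size of an n-by-R array of doubles (8 bytes per element),
# like the importance array boot.ci builds for type="bca"
n <- 50000   # hypothetical number of observations
R <- 7500    # hypothetical number of bootstrap replicates
bytes <- n * R * 8
bytes / 2^30   # size in GiB -- about 2.8 for these values
```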
However, normal-approximation intervals (type="norm") and percentile intervals (type="perc") should work, since they only need the vector of replicate statistics.
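A small sketch on toy data (the data and statistic are made up for illustration):

```r
library(boot)
set.seed(1)
x <- rnorm(200)
bt <- boot(x, function(d, i) mean(d[i]), R = 2000)

# "norm" and "perc" use only the R replicate values in bt$t,
# so they never build the n-by-R importance array that "bca" needs
boot.ci(bt, type = c("norm", "perc"))
```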
Upvotes: 3
Reputation: 5467
I don't know about boot.ci, but I've had similar problems with large vectors on my 32-bit Ubuntu system. 32-bit systems have a limited address space (at most 4 GB per process, and in practice R gets less); moving to a 64-bit system resolves this.
There are some downsides to 64-bit, the main one being that it still isn't standard and not every software provider ships a 64-bit build; the Flash player, last I heard, had only a beta version for 64-bit. This is usually workable, though, by installing compatibility libraries that let you run 32-bit software on a 64-bit system (with a performance penalty).
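As a quick sanity check, you can ask R itself whether it is running as a 64-bit build before assuming the address space is the problem:

```r
# An 8-byte pointer means a 64-bit address space; 4 means 32-bit
.Machine$sizeof.pointer

# Architecture string of this R build, e.g. "x86_64"
R.version$arch
```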
These resources might shed some more light on the issue:
Upvotes: 1