r2evans

Reputation: 160687

when is R's `ByteCompile` counter-productive?

The R docs describe the ByteCompile field in the "DESCRIPTION file" section as:

The ‘ByteCompile’ logical field controls if the package code is to be byte-compiled on installation: the default is currently not to, so this may be useful for a package known to benefit particularly from byte-compilation (which can take quite a long time and increases the installed size of the package)
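For concreteness, opting in is a one-line addition to the package's DESCRIPTION file (a minimal sketch; `mypackage` is a placeholder name, and `yes`/`true` are both accepted values for a logical field):

```
Package: mypackage
Version: 0.1.0
ByteCompile: true
```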

I infer that the only detrimental side-effects of byte-compiling are (a) increased time-to-install and (b) increased installed size. I haven't found a package that takes too long to install/byte-compile, and the general consensus is that storage is cheap.

Q: When should I choose to not byte-compile packages I write? (Does anybody have anecdotal or empirical limits beyond which they choose against it?)

Edit: As noted in the comments of an older question, the rationale that debugging is not possible with byte-compiled code has been debunked. Other related questions on SO have discussed how to do it (either manually with R CMD INSTALL --byte-compile ... or with install.packages(..., type="source", INSTALL_opts="--byte-compile")), but have not discussed the ramifications of or arguments against doing so.
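For reference, the two approaches mentioned above look like this (`mypkg` is a placeholder package name; the tarball filename is illustrative):

```
# From the shell:
R CMD INSTALL --byte-compile mypkg_0.1.0.tar.gz

# Or from within an R session:
install.packages("mypkg", type = "source", INSTALL_opts = "--byte-compile")
```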

Upvotes: 7

Views: 746

Answers (1)

csgillespie

Reputation: 60492

I have yet to find a downside for byte-compiling, other than the ones you mention: slightly increased file size and installation time.

In the past, compiling certain code could cause a slow-down, but in recent versions of R (> 3.3.0) this no longer seems to be a problem.

Upvotes: 3
