Eldar

Reputation: 5237

Performance: global and local declarations with multiple calls

Why does it not matter where the function is declared in R (both versions have almost the same performance):

library(microbenchmark)

f1 <- function() {
    lapply(1:100000, function(x) {
        fun <- function() 1:10000
        fun()
    })
}

f2 <- function() {
    fun <- function() 1:10000
    lapply(1:100000, function(x) {
        fun()
    })
}

microbenchmark(f1(), f2(), times = 10)

# Unit: milliseconds
# expr      min       lq     mean   median       uq       max neval
# f1() 456.6720 459.2856 563.0407 507.1933 629.0231  922.8278    10
# f2() 438.5753 445.2491 616.4615 548.6700 615.3313 1048.7325    10

Why does it matter where the variable is declared in R (declaring it outside the inner function is much faster):

library(microbenchmark)

f1 <- function() {
    lapply(1:100000, function(x) {
        var <- 1:10000
        var
    })
}

f2 <- function() {
    var <- 1:10000
    lapply(1:100000, function(x) {
        var
    })
}

microbenchmark(f1(), f2(), times = 10)

# Unit: milliseconds
# expr       min        lq      mean    median        uq      max neval
# f1() 516.07492 567.71822 611.44760 630.57550 642.47586 701.3975    10
# f2()  49.30975  50.12807  72.44492  52.53448  58.85256 159.2140    10

Why am I getting these results? So is it best practice to avoid declaring variables inside a function if the function will be called multiple times?

Upvotes: 3

Views: 139

Answers (1)

Roland

Reputation: 132854

Defining a function has negligible performance cost. The function body is only evaluated if the function is called.

microbenchmark(fun <- function() 1:10000, 
               fun <- function() 1:100000, times = 1000)

#Unit: nanoseconds
#                      expr min  lq    mean median  uq   max neval cld
# fun <- function() 1:10000 198 506 568.462  511.5 548 54620  1000   a
# fun <- function() 1:1e+05 199 504 570.826  511.0 551 18620  1000   a

If you repeat this definition 1e5 times, you need about 50 ms, which is about the difference your benchmarks show.
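
A quick back-of-the-envelope check, using the mean timing from the table above:

1e5 * 570e-9    # ~0.057 s, i.e. roughly 50 ms for 1e5 definitions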

Creating and filling a big variable has a much higher performance cost:

microbenchmark(var <- 1:10000, times = 100)
#Unit: microseconds
#           expr   min     lq    mean median    uq    max neval
# var <- 1:10000 4.183 4.3305 4.92081 4.4135 4.538 15.283   100

Doing that 1e5 times amounts to about 0.5 s, which is about the difference you have benchmarked.
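
Again as a back-of-the-envelope check:

1e5 * 4.9e-6    # ~0.49 s, i.e. roughly 0.5 s for 1e5 allocations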

Regarding your last question: Yes, at least if the variables are big.
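
For illustration, here is a hypothetical variant of your second benchmark (g1/g2 are made-up names) that uses a tiny vector instead of 1:10000; since the repeated cost is the allocation itself, the gap between the two versions should shrink accordingly:

library(microbenchmark)

g1 <- function() {
    lapply(1:100000, function(x) {
        var <- 1:10    # tiny allocation inside the inner function
        var
    })
}

g2 <- function() {
    var <- 1:10        # tiny allocation hoisted out of the loop
    lapply(1:100000, function(x) {
        var
    })
}

microbenchmark(g1(), g2(), times = 10)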

Upvotes: 3
