Reputation: 509
I have a function in R that I want to optimize. It is part of the innermost loop, so it runs millions of times, and I know from profiling that it accounts for about 80% of the total computation time. I have been using the profvis package to understand where my code is slow, and by making incremental improvements, a single call of the function now takes less than 10 ms. But at this point profvis stops working and no longer gives a useful breakdown of which lines of code are using the most time. For example:
func <- function(x) {
  x1 <- x ** 2
  x2 <- x * 3
  x3 <- sum(1:x)
  x4 <- x1 + x2 + x3
}
profvis::profvis(func(10))
Error in parse_rprof(prof_output, expr_source) :
No parsing data available. Maybe your function was too fast?
Are there alternative packages or methods that work well to profile functions that take less than 10 ms to run?
Upvotes: 2
Views: 497
Reputation: 1493
profvis uses Rprof, so you may want to read about its limits in ?Rprof. In particular, Rprof is a sampling profiler: it interrupts execution at a fixed interval (0.02 s by default) and records the call stack, so a function that returns faster than the sampling interval may produce no samples at all.
If all else fails, you can run small-scale experiments: change something in your function (e.g. reduce the number of assignments), then measure the total running time of a loop in which the function is called. This is somewhat complicated by your compiler settings (see ?compiler::compile), but you have the ultimate way to see if some change helped: run your code many times and see if it has become faster.
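Before giving up on profvis entirely, it may be worth working around the sampling limit: repeat the fast call inside the profiled expression so the sampler observes many executions. This is a sketch, not guaranteed to suit every case; the loop count and interval value are illustrative choices, not from ?Rprof:

```r
# Fast toy function from the question
func <- function(x) {
  x1 <- x ** 2
  x2 <- x * 3
  x3 <- sum(1:x)
  x4 <- x1 + x2 + x3
}

if (requireNamespace("profvis", quietly = TRUE)) {
  # Repeat the call so the sampler observes many executions; profvis
  # then attributes time to the lines inside func. A smaller sampling
  # interval (the default is 0.01 s) gives finer resolution, but ?Rprof
  # warns that very small values are unreliable on some platforms.
  profvis::profvis(for (i in 1:100000) func(10), interval = 0.005)
}
```

The per-line times are then aggregates over all iterations, which is usually what you want for a function called millions of times anyway.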
func <- function(x) {
  x1 <- x ** 2
  x2 <- x * 3
  x3 <- sum(1:x)
  x4 <- x1 + x2 + x3
}

# Candidate rewrite: fewer assignments, seq_len() instead of 1:x
func2 <- function(x)
  x * x + x * 3 + sum(seq_len(x))

library("compiler")
ii <- 1:1000000

enableJIT(0)  # interpreted
system.time(for (i in ii) func (100))
system.time(for (i in ii) func2(100))

enableJIT(3)  # byte-compiled
system.time(for (i in ii) func (100))
system.time(for (i in ii) func2(100))
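As for alternative packages: timing packages such as microbenchmark (or bench) measure each call individually with a high-resolution timer, so no sampling interval is involved. They compare whole expressions rather than lines, which pairs well with the experiment approach above. A sketch, assuming microbenchmark is installed:

```r
func <- function(x) {
  x1 <- x ** 2
  x2 <- x * 3
  x3 <- sum(1:x)
  x4 <- x1 + x2 + x3
}
func2 <- function(x)
  x * x + x * 3 + sum(seq_len(x))

if (requireNamespace("microbenchmark", quietly = TRUE)) {
  # Each expression is run 1000 times; the summary reports min, median,
  # and max time per call, typically in nanoseconds or microseconds
  print(microbenchmark::microbenchmark(func(100), func2(100),
                                       times = 1000L))
}
```

This resolves timings well below 10 ms, at the cost of telling you only which variant is faster, not which line inside it is slow.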
Upvotes: 1