Reputation: 7989
I have a list of filtering functions f1, f2, f3, f4, ... which take a matrix m and a number of options as input and return a subset of the rows of the matrix as output. Now I would like to define, in an orderly way, some meta-filtering settings metaf1, metaf2, metaf3, ... which would specify the sequential application of a given number of filtering functions, e.g. first f2 and then f3, using given options for each. I would like to store these filtering settings in a list of, say, class "metafiltering", and then have another function apply the filtering steps specified in a given metafiltering object. The idea is to allow filtering settings to be stored and applied in an orderly way. How would I achieve this most elegantly in R? Or are there other convenient methods to achieve something like this?
EDIT: to give an example, say I have matrix
m=replicate(10, rnorm(20))
and filtering functions (these are just examples, obviously mine are more complicated :-) )
f1 = function(m, opt1, opt2) {
  # keep rows where column 2 > opt1 and column 1 > opt2
  return(m[(m[, 2] > opt1) & (m[, 1] > opt2), ])
}
f2 = function(m, opt1) {
  # keep rows where column 3 > opt1
  return(m[(m[, 3] > opt1), ])
}
And suppose I have defined the following metafiltering settings of a specific class, which would specify two functions that have to be applied sequentially to matrix m (this is pseudocode to sketch the intent):
metafilterfuncs = list(fun1 = f1(opt1 = 0.1, opt2 = 0.2), fun2 = f2(opt1 = 0.5))
class(metafilterfuncs) = "metafiltering"
The question then is: how could I apply the filtering steps of an arbitrary metafiltering object to a given matrix m, using the specified functions and settings?
Upvotes: 1
Views: 603
Reputation: 6659
pryr has a function, compose, that is close to what you need, but it doesn't quite cut it. compose requires the functions to be given one by one rather than in a list, and it cannot pass arguments to them. It's also oddly placed in that package. A similar function can be found in plyr, namely each, but that one does not apply the functions sequentially; it applies them individually and outputs a named vector (list?).
agstudy provided a solution (see the answer below), but it suffers from a problem: it can only take scalar arguments, because it passes the arguments in a named vector. The solution is to use a named list instead. So, here's an improved function to replace the one in pryr.
compose2 = function(x, funcs, args, msg_intermediate = FALSE) {
  if (length(funcs) != length(args)) stop("length of functions and arguments must match")
  for (i in seq_along(funcs)) {
    # wrap x in list() so that a matrix (or any non-scalar) is passed as a single argument
    x = do.call(what = funcs[[i]], args = c(list(x), args[[i]]))
    if ((i != length(funcs)) && msg_intermediate) message(x)
  }
  x
}
msg_intermediate is a handy debugging argument that messages the intermediate results, so one can more easily follow what happens.
Test it:
adder = function(x, n) x + n
compose2(0,
         funcs = list(adder, adder, adder),
         args = list(list(n = 1), list(n = 2), list(n = 3)),
         msg_intermediate = TRUE
)
Outputs:
1
3
[1] 6
This is what you get when you take 0, then add 1 (=1), then add 2 (=3), then add 3 (=6).
The args argument for compose2 takes a list of lists, so that one can supply non-scalar function arguments. Here's an example:
add_div = function(x, n, d) (x + n) / d
compose2(0,
         funcs = list(add_div, add_div, add_div),
         args = list(list(n = 1, d = 1), list(n = 2, d = 2), list(n = 3, d = 3)),
         msg_intermediate = TRUE
)
Output:
1
1.5
[1] 1.5
That's what you get when you take 0, add 1, divide by 1 (=1); then take 1, add 2, divide by 2 (=1.5); then take 1.5, add 3, divide by 3 (=1.5).
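To connect this back to the question's matrix-filtering setup, here is a minimal sketch of how the settings could be stored in a list of class "metafiltering" and run through compose2. It assumes the question's f1 and f2 (restated here with drop = FALSE so a single surviving row stays a matrix), and the metafilter/applyfilters names are purely illustrative; it also relies on compose2 wrapping x in list() so the matrix is passed as one argument.
# Sketch only: metafilter and applyfilters are illustrative names, not an established API
f1 <- function(m, opt1, opt2) m[(m[, 2] > opt1) & (m[, 1] > opt2), , drop = FALSE]
f2 <- function(m, opt1) m[m[, 3] > opt1, , drop = FALSE]

m <- replicate(10, rnorm(20))

metafilter <- list(
  funcs = list(f1, f2),
  args  = list(list(opt1 = 0.1, opt2 = 0.2), list(opt1 = 0.5))
)
class(metafilter) <- "metafiltering"

applyfilters <- function(m, mf) compose2(m, funcs = mf$funcs, args = mf$args)

filtered <- applyfilters(m, metafilter)  # rows passing f1, then f2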
Upvotes: 1
Reputation: 121578
You can do something like this:
You define a sort of function pipeline where you give a priority to each function:
pipelines <- c(f1=100,f2=300,f3=200)
I define 3 dummy functions here for testing:
f1 <- function(m,a) m + a
f2 <- function(m,b) m + b
f3 <- function(m,c) m + c
For each function, you store its arguments in another list:
args <- list(f1=c(a=1),f2=c(b=2),f3=c(c=3))
Then you apply your functions:
m <- matrix(1:2, ncol = 2)
# run the functions in increasing order of their priority
for (func in names(pipelines[order(pipelines)])) {
  m <- do.call(func, list(m, args[[func]]))
}
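If you want this in the reusable, classed form asked about in the question, a minimal sketch is to bundle the priorities and arguments into one object of class "metafiltering" and wrap the loop above in an apply function (the metafiltering and apply_metafiltering names are purely illustrative):
# Sketch only: bundle pipeline priorities and arguments into a classed object
metafiltering <- function(pipelines, args) {
  structure(list(pipelines = pipelines, args = args), class = "metafiltering")
}

apply_metafiltering <- function(m, mf) {
  for (func in names(mf$pipelines[order(mf$pipelines)])) {
    m <- do.call(func, list(m, mf$args[[func]]))
  }
  m
}

mf <- metafiltering(pipelines = c(f1 = 100, f2 = 300, f3 = 200),
                    args = list(f1 = c(a = 1), f2 = c(b = 2), f3 = c(c = 3)))
apply_metafiltering(matrix(1:2, ncol = 2), mf)  # applies f1, then f3, then f2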
Upvotes: 2