aneccodeal

Reputation: 8913

How to get a function from a symbol without using eval?

I've got a symbol that represents the name of a function to be called:

julia> func_sym = :tanh

I can use that symbol to get the tanh function and call it using:

julia> eval(func_sym)(2)
0.9640275800758169

But I'd rather avoid the 'eval' there as it will be called many times and it's expensive (and func_sym can have several different values depending on context).

IIRC in Ruby you can say something like:

obj.send(func_sym, args)

Is there something similar in Julia?

EDIT: some more details on why I have functions represented by symbols:

I have a type (from a neural network) that includes the activation function; originally I included it as a function:

type NeuralLayer
  weights::Matrix{Float32}
  biases::Vector{Float32}
  a_func::Function
end

However, I needed to serialize these things to files using JLD, but it's not possible to serialize a Function, so I went with a symbol:

type NeuralLayer
  weights::Matrix{Float32}
  biases::Vector{Float32}
  a_func::Symbol
end

And currently I use the eval approach above to call the activation function. There are collections of NeuralLayers and each can have its own activation function.
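For reference, a minimal sketch of how such a layer might be written out and read back with JLD, assuming the Symbol-based definition above (file and variable names are just placeholders):

using JLD

layer = NeuralLayer(rand(Float32, 4, 3), zeros(Float32, 4), :tanh)

# JLD can serialize composite types whose fields are plain data,
# which is what motivated storing the activation as a Symbol.
save("layer.jld", "layer", layer)
restored = load("layer.jld", "layer")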

Upvotes: 5

Views: 278

Answers (2)

mbauman

Reputation: 31342

@Isaiah's answer is spot-on; perhaps even more so after the edit to the original question. To elaborate and make this more specific to your case: I'd change your NeuralLayer type to be parametric:

type NeuralLayer{func_type}
  weights::Matrix{Float32}
  biases::Vector{Float32}
end

Since func_type doesn't appear in the types of the fields, the constructor will require you to explicitly specify it: layer = NeuralLayer{:excitatory}(w, b). One restriction here is that you cannot modify a type parameter.

Now, func_type could be a symbol (like you're doing now) or it could be a more functionally relevant parameter (or parameters) that tunes your activation function. Then you define your activation functions like this:

# If you define your NeuralLayer with just one parameter:
activation(layer::NeuralLayer{:inhibitory}) = …
activation(layer::NeuralLayer{:excitatory}) = …
# Or if you want to use several physiological parameters instead:
activation{g_K,g_Na,g_l}(layer::NeuralLayer{g_K,g_Na,g_l}) = f(g_K, g_Na, g_l)

The key point is that functions and behavior are external to the data. Use type definitions and abstract type hierarchies to define behavior, which is encoded in the external functions… but only store the data itself in the types. This is dramatically different from Python or other strongly object-oriented paradigms, and it takes some getting used to.
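As a hedged illustration of how the single-parameter version might be used (the :tanh/:sigmoid parameters, method bodies, and data here are made up):

activation(layer::NeuralLayer{:tanh}, z) = tanh(z)
activation(layer::NeuralLayer{:sigmoid}, z) = 1 ./ (1 .+ exp(-z))

# The activation is now part of the layer's type, so dispatch picks the method.
layer = NeuralLayer{:tanh}(rand(Float32, 4, 3), zeros(Float32, 4))
activation(layer, 2.0)  # 0.9640275800758169 — same result as eval(func_sym)(2), no eval involved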

Upvotes: 7

Isaiah Norton

Reputation: 4366

But I'd rather avoid the 'eval' there as it will be called many times and it's expensive (and func_sym can have several different values depending on context).

This sort of dynamic dispatch is possible in Julia, but not recommended. Changing the value of 'func_sym' based on context defeats type inference as well as method specialization and inlining. Instead, the recommended approach is to use multiple dispatch, as detailed in the Methods section of the manual.
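To illustrate what that looks like (a sketch with hypothetical names, not tied to the question's types): instead of switching on a Symbol at runtime, define one method per case and let dispatch choose:

immutable Tanh end
immutable Sigmoid end

activate(::Tanh, x) = tanh(x)
activate(::Sigmoid, x) = 1 / (1 + exp(-x))

activate(Tanh(), 2)  # 0.9640275800758169, resolved by dispatch rather than eval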

Upvotes: 4
