Reputation: 9536
For the sake of organization, I outline an ML optimizer with the rest of my config constants at the top of my file:
optimizer = torch.optim.SGD()
To use the optimizer, I have to pass in the model parameters, which are generated later on in the code:
optimizer = torch.optim.SGD(model.parameters(), lr=LEARNING_RATE)
Is there any way for me to pass arguments into the variable optimizer later?
edit: I think my question is unclear, here's a simpler example of what I was asking:
# take the square of some arbitrary number
fn = math.prod()
x = 5
# how do I feed x into the variable fn?
Upvotes: 1
Views: 48
Reputation: 9536
I figured it out; it seems really simple, but I'll leave this up because I couldn't find it online anywhere.
Assign the function itself (no parentheses) to the variable. The variable is then the same object as the function, so you can pass the arguments when you call it:
optimizer = torch.optim.SGD
optimizer(model.parameters(), lr=LEARNING_RATE)
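Applied to the simpler example from the question, a minimal sketch (using math.pow in place of math.prod, since pow takes plain numbers rather than an iterable):

```python
import math

# Assigning a function WITHOUT parentheses stores a reference to the
# function object itself; calling the variable later is identical to
# calling the original name.
fn = math.pow          # no parentheses: fn is now the math.pow function
result = fn(5, 2)      # same as math.pow(5, 2)
print(result)          # 25.0
```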
Upvotes: 0
Reputation: 903
I'm not sure if this is exactly what you are after, but I often use functools.partial to bind values to parameters.
import functools
optimizer = functools.partial(torch.optim.SGD, model.parameters(), lr=LEARNING_RATE)
Then you can call optimizer as a function, and it will have the first positional parameter and the lr keyword parameter already set; you can pass whatever other parameters you need when you call optimizer.
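A self-contained sketch of the same idea, using a hypothetical train_step function as a stand-in for torch.optim.SGD so the binding behavior is visible without PyTorch installed:

```python
import functools

def train_step(params, lr=0.1, momentum=0.0):
    # Hypothetical stand-in for torch.optim.SGD, just to show the binding.
    return (params, lr, momentum)

# Bind the first positional argument and lr up front.
step = functools.partial(train_step, "weights", lr=0.01)

# The bound values are already set; extra parameters can still be
# supplied at call time.
print(step(momentum=0.9))   # ('weights', 0.01, 0.9)
```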
Upvotes: 0
Reputation: 3542
Sure!
>>> def add(x, y):
...     return x + y
...
>>> z = add
>>> z(1,1)
2
Upvotes: 2
Reputation: 530920
It sounds like you are asking about partial application. This is a way of "wrapping" a function and some arguments into a single, new callable, which takes fewer (or even no) arguments.
>>> from functools import partial
>>> f = lambda x, y, z: x + y + z
>>> g = partial(f, 2, 3)
>>> g(4)
9
>>> h = partial(f, 1, 2, 3)
>>> h()
6
Upvotes: 0