Different arguments for objective function and jacobian using sp.optimize.minimize Python

I want to minimize a function f(x,y,z) over x with sp.optimize.minimize. The Jacobian depends only on x and y; call it J(x,y). (This is just a toy example.)

If I try:

import numpy as np
import scipy as sp

def f(x, y, z):
    return x**2 + x*y**3 + z

def J(x, y):
    return 2*x + y**3

x0, y, z = 0, 1, 4
sp.optimize.minimize(f, x0, args=(y, z), jac=J)

I get the error "J() takes 2 positional arguments but 3 were given", because minimize passes both y and z on to J as well.

Is there any way to specify which arguments I want to pass to f, and which ones to J?

(One option is to give f and J the same signature and have each ignore the arguments it does not need, but I hope there is a more elegant way.)

Upvotes: 0

Views: 274

Answers (1)

kabanus

Reputation: 25895

As per the manual, the Jacobian is a callable with signature

J(x, *args)

where args are exactly the fixed parameters (args=(y,z) in your example). So in general, no: minimize passes the same args tuple to both the objective and the Jacobian. On the other hand, nothing prevents you from writing:

def J(x, y, z):
  return 2*x + y**3

and I do not see anything "inelegant" here. In general we write

df(x, y, z)/dx = f'(x, y, z)

anyway, even when f' turns out to be independent of one of the variables (a priori we do not know), and no one frowns on this sort of notation.
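Putting that together, here is a minimal runnable version of the fix, using the same toy functions as in the question:

```python
import scipy as sp
import scipy.optimize  # make sure the optimize submodule is loaded

def f(x, y, z):
    return x**2 + x * y**3 + z

def J(x, y, z):
    # z is accepted but unused, matching f's signature
    return 2 * x + y**3

x0, y, z = 0, 1, 4
res = sp.optimize.minimize(f, x0, args=(y, z), jac=J)
print(res.x)  # close to [-0.5], the minimizer of x**2 + x + 4
```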

If you really want you could have:

def J(x, *args):
    return 2*x + args[0]**3

to hide the unused variables. I would not call this more elegant, though.
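For completeness, this *args variant drops straight into the call from the question (a sketch with the same toy functions):

```python
import scipy as sp
import scipy.optimize  # make sure the optimize submodule is loaded

def f(x, y, z):
    return x**2 + x * y**3 + z

def J(x, *args):
    # args receives the fixed parameters (y, z); only y = args[0] is needed
    return 2 * x + args[0]**3

# same fixed parameters y=1, z=4 as before
res = sp.optimize.minimize(f, 0, args=(1, 4), jac=J)
```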

Upvotes: 2
