Reputation: 21
I am trying to perform minimization of the following function:
def mvqr(P, y, x, c):
    s = 0
    for i in xrange(1, len(y)):
        summation = numpy.linalg.norm(numpy.dot(numpy.linalg.inv(P), (y[i,:] - numpy.dot(beta, x[i,:])))) \
                    + numpy.dot(numpy.dot(c.T, numpy.linalg.inv(P)), (y[i,:] - numpy.dot(beta, x[i,:])))
        s = s + summation
    return s
These are the relevant lines of the main file:
fun = lambda beta: mvqr(E, Y_x, X_x, v)
result = minimize(fun, beta0, method = 'BFGS')
beta is the unknown variable of the function mvqr(), and beta0 is the initial guess, a (2, 2) array I have previously calculated.
I get the error:
NameError: global name 'beta' is not defined
For anyone wondering whether the file containing mvqr() is already in the Python packages directory: yes, it is.
I think the problem lies with beta in the mvqr() function and the use of the lambda function.
Any help?
EDIT
Thanks to pv. the code now runs with no error, but the minimization does not iterate: minimize reports 'Optimization terminated successfully.' yet simply returns the initial guess.
  status: 0
 success: True
    njev: 1
    nfev: 6
hess_inv: array([[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 1]])
     fun: 1.2471261924040662e+31
       x: array([  3.44860608e+13,  -4.10768809e-02,  -1.42222910e+15,
        -1.22803296e+00])
 message: 'Optimization terminated successfully.'
     jac: array([ 0.,  0.,  0.,  0.])
I have also tried scipy.optimize.fmin_bfgs, but the result is much the same:
Optimization terminated successfully.
Current function value: 937385449919245008057547138533569682802290504082509386481664.000000
Iterations: 0
Function evaluations: 6
Gradient evaluations: 1
It could be that beta0 unfortunately happens to be a local minimum, or at least a stationary point, since jac == [0, 0, 0, 0] holds and the algorithm therefore terminates; but it looks strange to me that the initial guess would be the minimum of the function (even a local one). Does anyone have an idea of how to avoid this?
Any help would be appreciated.
Upvotes: 1
Views: 3069
Reputation: 35125
Change the definition to def mvqr(beta, P, y, x, c):
and use fun = lambda beta: mvqr(beta.reshape(2,2), E, Y_x, X_x, v)
with minimize(fun, beta0.ravel())
if you wish to optimize a value of beta
that is a 2x2 matrix. (The optimizer works on a flat 1-D parameter vector, hence the ravel()/reshape pair.)
After that, consider reading a Python tutorial, esp. on global and local variables.
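A minimal runnable sketch of this fix, using toy stand-ins for the poster's actual E, Y_x, X_x, v, and beta0 (and range / @ in place of xrange / numpy.dot, for Python 3):

```python
import numpy as np
from scipy.optimize import minimize

# beta is now the first argument, as suggested above
def mvqr(beta, P, y, x, c):
    P_inv = np.linalg.inv(P)
    s = 0.0
    for i in range(len(y)):
        r = y[i, :] - beta @ x[i, :]   # residual of sample i
        s += np.linalg.norm(P_inv @ r) + c.T @ P_inv @ r
    return s

# Toy data -- placeholders for the poster's actual inputs
rng = np.random.default_rng(0)
X_x = rng.normal(size=(50, 2))
B_true = np.array([[1.0, 2.0], [-0.5, 0.3]])
Y_x = X_x @ B_true.T + 0.01 * rng.normal(size=(50, 2))
E = np.eye(2)        # stand-in for the matrix P
v = np.zeros(2)      # stand-in for the vector c

# Optimize over a flat 1-D vector; reshape to 2x2 inside the objective
beta0 = np.zeros((2, 2))
fun = lambda b: mvqr(b.reshape(2, 2), E, Y_x, X_x, v)
result = minimize(fun, beta0.ravel(), method='BFGS')
beta_hat = result.x.reshape(2, 2)
```

With these toy data the recovered beta_hat should land close to B_true, since the objective is a convex sum of residual norms.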
Upvotes: 2