I know this is not a practical thing to do (this question is just so I can understand what is going on), but I am wondering why SciPy can't minimize the following linear combination (it returns the initial weights and only performs 1 iteration):
from scipy.optimize import minimize
import numpy as np

mean = np.array([[0.00149066, 0.00076633]])

def constrain1(w):
    return w[0] + w[1] - 1

def minimize_func(w):
    return (w[0]*mean[0, 0] + w[1]*mean[0, 1])*(-1)

initial_guess = [0.5, 0.5]
bound = (0, 1)
bounds = [bound for i in range(2)]
con1 = {"type": "eq", "fun": constrain1}
cons = [con1]
sol = minimize(minimize_func, initial_guess,
               method="SLSQP", bounds=bounds, constraints=cons)
There's nothing too mysterious here. The objective function just doesn't improve much by changing the function's parameters.
minimize estimates the Jacobian, the first derivative of the objective, numerically. It's easy to see that the Jacobian here is just -mean. The values in mean are small, however, and since the constraint forces the parameters to sum to 1 (with bounds of [0, 1]), the output of minimize_func changes very little as minimize searches the parameter space. In other words, the objective is nearly flat over the feasible region.
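We can confirm that the numerical Jacobian is just -mean with a quick finite-difference sketch (using the same objective as the question, and a step size chosen here for illustration):

```python
import numpy as np

mean = np.array([[0.00149066, 0.00076633]])

def minimize_func(w):
    return (w[0]*mean[0, 0] + w[1]*mean[0, 1])*(-1)

# Forward-difference estimate of the Jacobian at the initial guess.
eps = 1e-8
w0 = np.array([0.5, 0.5])
jac = np.array([
    (minimize_func(w0 + eps*np.eye(2)[i]) - minimize_func(w0)) / eps
    for i in range(2)
])
print(jac)  # ≈ [-0.00149066, -0.00076633], i.e. -mean
```

Because the objective is linear, the finite-difference estimate matches -mean to within floating-point error, independent of where we evaluate it.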
Let's make this concrete. Consider your initial guess of (0.5, 0.5). The value of the objective there is:
>>> minimize_func((0.5, 0.5))
-0.001128495
minimize will make small perturbations to those parameters and recompute the objective, to determine whether those changes improve it. Because you haven't specified any curvature information (the Hessian, or second derivative), the function chooses a heuristic step size, in this case 1e-8. (You can see this by printing w inside your objective function.)
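A minimal sketch of that suggestion, using the same setup as the question, is to print every point the solver evaluates:

```python
import numpy as np
from scipy.optimize import minimize

mean = np.array([[0.00149066, 0.00076633]])

def minimize_func(w):
    print(w)  # shows each point the solver evaluates
    return (w[0]*mean[0, 0] + w[1]*mean[0, 1])*(-1)

bounds = [(0, 1), (0, 1)]
cons = [{"type": "eq", "fun": lambda w: w[0] + w[1] - 1}]
minimize(minimize_func, [0.5, 0.5], method="SLSQP",
         bounds=bounds, constraints=cons)
# The printed gradient-estimation points differ from the current
# iterate by steps on the order of 1e-8.
```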
So how much does that change the objective?
>>> minimize_func((0.5, 0.5)) - minimize_func((0.5 + 1e-8, 0.5 - 1e-8))
7.243299926865121e-12
Not much, unfortunately. This is well below the solver's default tolerance of 1e-6, so it stops immediately.
We can see that minimize would actually perform further iterations by specifying a tighter tolerance:
>>> minimize(minimize_func, (0.5, 0.5), method="SLSQP", bounds=bounds, constraints=cons, options={'disp': True}, tol=1e-8)
Optimization terminated successfully (Exit mode 0)
Current function value: -0.0014906599999999996
Iterations: 7
Function evaluations: 21
Gradient evaluations: 7
fun: -0.0014906599999999996
jac: array([-0.00149066, -0.00076633])
message: 'Optimization terminated successfully'
nfev: 21
nit: 7
njev: 7
status: 0
success: True
x: array([1.00000000e+00, 5.55111512e-16])
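That solution makes sense analytically: the objective is linear with both coefficients of -mean negative, so under w[0] + w[1] == 1 the minimum puts all weight on the larger mean. A quick check, reusing the question's objective:

```python
import numpy as np

mean = np.array([[0.00149066, 0.00076633]])

def minimize_func(w):
    return (w[0]*mean[0, 0] + w[1]*mean[0, 1])*(-1)

# All weight on the larger mean is the best feasible point for this
# linear objective under w[0] + w[1] == 1 with 0 <= w <= 1.
print(minimize_func((1.0, 0.0)))  # -0.00149066, matching the solver's result
```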