I'm trying to minimize a constrained function of several variables using scipy.optimize.minimize. The function depends on 3*N parameters, where N is an input. More specifically, my minimization parameters are given in three arrays H = H[0],H[1],...,H[N-1], a = a[0],a[1],...,a[N-1] and b = b[0],b[1],...,b[N-1], which I concatenated into a single array named mins, with len(mins) = 3*N.
These parameters are also subject to the following constraints:
0 <= H and sum(H) = 0.5
0 <= a <= Pi/2
0 <= b <= Pi/2
So my code for the constraints reads:
import numpy as np

# constraints on H:
def Hlhs(mins):  # left-hand side
    return np.diag(np.ones(N)) @ mins.reshape(3, N)[0]

def Hrhs(mins):  # right-hand side
    return np.sum(mins.reshape(3, N)[0]) - 0.5

con1H = {'type': 'ineq', 'fun': lambda H: Hlhs(H)}
con2H = {'type': 'eq',   'fun': lambda H: Hrhs(H)}

# constraints on a:
def alhs(mins):
    return np.diag(np.ones(N)) @ mins.reshape(3, N)[1]

def arhs(mins):
    return -np.diag(np.ones(N)) @ mins.reshape(3, N)[1] + np.ones(N)*np.pi/2

con1a = {'type': 'ineq', 'fun': lambda a: alhs(a)}
con2a = {'type': 'ineq', 'fun': lambda a: arhs(a)}

# constraints on b:
def blhs(mins):
    return np.diag(np.ones(N)) @ mins.reshape(3, N)[2]

def brhs(mins):
    return -np.diag(np.ones(N)) @ mins.reshape(3, N)[2] + np.ones(N)*np.pi/2

con1b = {'type': 'ineq', 'fun': lambda b: blhs(b)}
con2b = {'type': 'ineq', 'fun': lambda b: brhs(b)}
My function to be minimized, with the other parameters (and adopting N = 3), is given by (I'm sorry if it is too long):
gamma = 17
C = 85
T = 0
Hf = 0.5
Li = 2
Bi = 1
N = 3

def FUN(mins):
    H, a, b = mins.reshape(3, N)
    S1 = 0; S2 = 0
    B = np.zeros(N); L = np.zeros(N)
    for i in range(N):
        sbi = Bi; sli = Li
        for j in range(i+1):
            sbi += 2*H[j]*np.tan(b[j])
            sli += 2*H[j]*np.tan(a[j])
        B[i] = sbi
        L[i] = sli
    for i in range(N):
        S1 += (C*(1-np.sin(a[i])) + T*np.sin(a[i])) * (Bi*H[i] + H[i]**2*np.tan(b[i]))/np.cos(a[i]) + \
              (C*(1-np.sin(b[i])) + T*np.sin(b[i])) * (Li*H[i] + H[i]**2*np.tan(a[i]))/np.cos(b[i])
    S2 += (gamma*H[0]/12)*(Bi*Li + 4*(B[0]-H[0]*np.tan(b[0]))*(L[0]-H[0]*np.tan(a[0])) + B[0]*L[0])
    j = 1
    while j < N:
        S2 += (gamma*H[j]/12)*(B[j-1]*L[j-1] + 4*(B[j]-H[j]*np.tan(b[j]))*(L[j]-H[j]*np.tan(a[j])) + B[j]*L[j])
        j += 1
    F = 2*(S1 + S2)
    return F
Finally, adopting an initial guess of 0 for all the values, the minimization is given by:
import scipy.optimize

x0 = np.zeros(3*N)
res = scipy.optimize.minimize(FUN, x0, constraints=(con1H, con2H, con1a, con2a, con1b, con2b), tol=1e-25)
My problems are:
a) Observing the result res, some values became negative even though I have constraints forcing them to be positive. The success of the minimization was False, and the message was "Positive directional derivative for linesearch". Also, the result is very far from the expected minimum.
b) Adopting method='trust-constr', I got a value closer to what I was expecting, but the success was again False, with the message "The maximum number of function evaluations is exceeded". Is there any way to improve this?
I know that there is a minimum very close to these values:
H = [0.2, 0.15, 0.15]
a = [1.0053, 1.0053, 1.2566]
b = [1.0681, 1.1310, 1.3195]
where the value of the function is 123.45. I've checked the function several times and it seems to be working properly. Can anyone help me find where my problem is? I've tried changing xtol and maxiter, but with no success.
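For reference, my sanity check looks roughly like this: it just evaluates FUN at the values listed above (the H_ref/a_ref/b_ref names are only for this snippet):
H_ref = np.array([0.2, 0.15, 0.15])
a_ref = np.array([1.0053, 1.0053, 1.2566])
b_ref = np.array([1.0681, 1.1310, 1.3195])
# evaluate the objective at the known near-minimum; should print a value close to 123.45
print(FUN(np.concatenate([H_ref, a_ref, b_ref])))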
Here are a few hints:
Your initial point x0 is not feasible, since it doesn't satisfy the constraint sum(H) = 0.5. Providing a feasible initial point should fix your first problem.
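For example (this particular choice is only an illustration), you could spread the 0.5 uniformly over H and start a and b inside their bounds:
N = 3
x0 = np.concatenate([np.full(N, 0.5/N),     # H: sums to 0.5
                     np.full(N, np.pi/4),   # a: inside [0, pi/2]
                     np.full(N, np.pi/4)])  # b: inside [0, pi/2]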
Except for the constraint sum(H) = 0.5, all constraints are simple bounds on the variables. In general, it's recommended to pass variable bounds via the bounds parameter of minimize. You can simply define and pass all the bounds like this:
from scipy.optimize import minimize
import numpy as np
# ..your variables and functions ..
bounds = [(0, None)]*N + [(0, np.pi/2)]*2*N
x0 = np.zeros(3*N)
x0[0] = 0.5
res = minimize(FUN, x0, constraints=(con2H,), bounds=bounds,
               method="trust-constr", options={'maxiter': 20000})
where each tuple contains the lower and upper bound for each variable.
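As a side note (not part of your original setup), with method="trust-constr" the remaining equality constraint on sum(H) can also be written as a scipy.optimize.LinearConstraint instead of a dict; a minimal sketch:
from scipy.optimize import LinearConstraint

A = np.zeros((1, 3*N))
A[0, :N] = 1.0                             # picks out the H block of the variable vector
sumH_con = LinearConstraint(A, 0.5, 0.5)   # enforces sum(H) == 0.5

res = minimize(FUN, x0, constraints=[sumH_con], bounds=bounds,
               method="trust-constr", options={'maxiter': 20000})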
Since this is a nonlinear problem with quite a few variables and constraints, it can also be worth trying a different solver, e.g. Ipopt via the cyipopt package:
from cyipopt import minimize_ipopt

# rest as above
res = minimize_ipopt(FUN, x0, constraints=(con2H,), bounds=bounds)
This gives me a solution with objective value 122.9.
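Whichever solver you use, you can reshape the solution vector back into H, a and b to compare it with the values you expect (a small sketch, assuming res holds the result of one of the calls above; the *_opt names are only for this snippet):
H_opt, a_opt, b_opt = res.x.reshape(3, N)
print("H =", H_opt, "sum(H) =", H_opt.sum())  # should be 0.5
print("a =", a_opt)                           # should lie in [0, pi/2]
print("b =", b_opt)                           # should lie in [0, pi/2]
print("objective =", FUN(res.x))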