Reputation: 19
I'm trying to optimize a constrained, nonlinear model with SciPy.
import numpy as np
from scipy.optimize import minimize
import math
# initial guesses
n = 2
x0 = np.zeros(n)
T = 0.1
L = 0.1
def objective(T, L):
    try:
        return (350 / T) + (35 * ((312.5 * (T / 2)) + (11.69 * (math.sqrt(T + L))) + (6.6256 * math.sqrt(T + L))))
    except ValueError:
        return None

def constraint1(T, L):
    (6.67 * math.sqrt(T + L) / (1250 * (T + L))) - 0.02
# show initial objective
print('Initial Objective: ' + str(objective(T, L)))
# optimize
con1 = {'type': 'ineq', 'fun': constraint1}
cons = ([con1])
solution = minimize(objective, x0, args=cons)
x = solution.x
# show final objective
print('Final Objective: ' + str(objective(T, L)))
# print solution
print('Solution')
print('x1 = ' + str(x[0]))
print('x2 = ' + str(x[1]))
When I run the code, I get this error:

line 24, in objective
    return (350 / T) + (35 * ((312.5 * (T / 2)) + (11.69 * (math.sqrt(T + L))) + (6.6256 * math.sqrt(T + L))))
TypeError: unsupported operand type(s) for +: 'float' and 'dict'

In objective(T, L), my IDE describes T as:

Parameter T of scipyProject.constraint1 T: {truediv, add} pythonProject

but in constraint1(T, L):

Parameter T of scipyProject.constraint1 T: {add} pythonProject

Can you help me understand why I'm getting this error?
Upvotes: 0
Views: 904
Reputation: 61
I ran your code to debug it. I noticed the following and made changes accordingly:

1. constraint1(T, L) does not return anything.
2. args=cons passes the dictionary you made for the constraints as extra arguments to your objective function. The SciPy documentation defines the objective function as taking a single NumPy array argument; constraints should be passed to minimize through its constraints keyword instead.
3. math.sqrt(T + L) is called, but there is no constraint ensuring T + L > 0, so I got a domain error from math.sqrt once I had fixed the previous problem. I checked out this answer by @Peter and did something similar with a safeSqrt function.

Here is your code after some changes. (By the way, I don't know much about this equation, but it looks like setting L = -T is always good, so you could eliminate some terms and optimize over a single variable.)
import numpy as np
from scipy.optimize import minimize
import math
T = 0.1
L = 0.1
x0 = np.array([T, L])
def safeSqrt(s):
    # extend math.sqrt with a safe function
    # ref: https://stackoverflow.com/questions/40372471/math-domain-error-due-to-disrespected-constraint-in-scipy-slsqp-minimize
    return math.sqrt(s) if s > 0 else 0

def objective(x):
    T, L = x[0], x[1]
    linearPart = (312.5 * T / 2)
    sqrtPart = (11.69 + 6.6256) * safeSqrt(T + L)
    return 10 / T + (linearPart + sqrtPart)  # your original objective was 35 times this value

def constraint(x):
    T, L = x[0], x[1]
    result = 6.67 / (1250 * .02) - safeSqrt(T + L)  # equivalent to your original constraint
    return result
# show initial objective
print('Initial Objective: ' + str(objective(x0)))
print("constraint: ", constraint(x0))
# optimize
solution = minimize(objective, x0, constraints={"fun": constraint, "type": "ineq"})
x = solution.x
# show final objective
print('Final Objective: ' + str(objective(x)))
# print solution
print('Solution')
print('x1 = ' + str(x[0]))
print('x2 = ' + str(x[1]))
The code output is:
Initial Objective: 115.625
constraint: -0.18041359549995795
Final Objective: 79.05694150421618
Solution
x1 = 0.25298231697349643
x2 = -1.5173265889148493
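Note that the solver drove T + L below zero (x1 + x2 ≈ -1.26), so the safeSqrt term vanishes at the solution. As a quick sanity check of the "optimize a single variable" remark above (my own sketch, not part of the run shown), minimizing the remaining one-variable function f(T) = 10/T + 156.25*T reproduces the final objective:

```python
from scipy.optimize import minimize_scalar

# Once T + L <= 0, safeSqrt returns 0 and the objective reduces to
# f(T) = 10/T + (312.5/2)*T; minimize over T alone (bounds are arbitrary
# but keep T positive):
res = minimize_scalar(lambda T: 10 / T + 156.25 * T,
                      bounds=(1e-6, 10), method="bounded")
print(res.x, res.fun)  # roughly 0.25298 and 79.05694, matching the solution above
```

This agrees with the closed form T* = sqrt(10 / 156.25) ≈ 0.25298 and f(T*) = 2*sqrt(10 * 156.25) ≈ 79.05694.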
Please let me know if this is what you wanted :)
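As an aside, the original TypeError is easy to reproduce in isolation: because of args=cons, the parameter L inside objective ends up bound to a constraint dict, and adding a NumPy array to a dict fails. A minimal repro (the lambda is just a stand-in for your constraint function):

```python
import numpy as np

# Stand-in for the constraint dict that minimize(..., args=cons)
# forwards into objective, binding it to the parameter L:
con1 = {'type': 'ineq', 'fun': lambda x: x[0]}

T = np.zeros(2)
try:
    T + con1  # what T + L evaluates inside math.sqrt(T + L)
except TypeError as e:
    print(e)  # unsupported operand type(s) for +: 'float' and 'dict'
```

That is the same message as in your traceback, which is why the fix is to drop args entirely and pass the constraints through the constraints keyword.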
Upvotes: 2