Reputation: 11
I am trying to optimize the volume of a shape, and my function returns the side length and height of that shape. However, the function is returning negative values, which do not make sense.
These negative values do indeed maximize the volume, but is there a way to find the maximum volume subject to the side length and height being positive?
import math
from autograd import grad
from scipy.optimize import fsolve

z = 3

def objective(X):
    # Volume of the shape (the quantity to maximize).
    x, y = X
    return (x * (z**2) * y) / (4 * math.tan(math.pi / z))

def eq(X):
    # Equality constraint: must equal zero at the solution.
    x, y = X
    return ((x * (z**2)) / (2 * math.tan(math.pi / z))) + (x * y * z) - 100

def F(L):
    'Augmented Lagrange function'
    x, y, _lambda = L
    return -objective([x, y]) - _lambda * eq([x, y])

dfdL = grad(F, 0)

def obj(L):
    # Stationarity conditions of F plus the constraint, for fsolve.
    x, y, _lambda = L
    dFdx, dFdy, dFdlam = dfdL(L)
    return [dFdx, dFdy, eq([x, y])]

x, y, _lam = fsolve(obj, [0.0, 0.0, 1.0])
print(f'The answer is at {x, y}')
Upvotes: 1
Views: 108
Reputation: 958
scipy.optimize has methods that minimize a function of several variables subject to bounds, for example TNC (truncated Newton algorithm), which seems directly relevant to your case. Look at the following example:
from scipy.optimize import minimize

def volume(X):
    # Example objective: a simple quadratic with its minimum at (3, 4).
    x, y = X
    f = (x - 3)**2 + (y - 4)**2
    return f

def grad_volume(X):
    # Analytical gradient of the objective above.
    x, y = X
    gx = 2 * (x - 3)
    gy = 2 * (y - 4)
    return [gx, gy]

res = minimize(volume, x0=[1, 1], method='TNC', jac=grad_volume,
               bounds=[(0, float("inf")), (0, float("inf"))])
print('The answer is at {0}'.format(res['x']))
If you cannot derive the gradient yourself, you can ask scipy to approximate it numerically:
res = minimize(volume, x0=[1, 1], method='TNC', jac=None, bounds=[(0,float("inf")),(0,float("inf"))])
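To connect this back to your problem, here is a minimal sketch (not a drop-in fix) that keeps the non-negativity bounds but also enforces your equality constraint. It reuses the objective and eq functions exactly as posted in the question and swaps TNC for SLSQP, since SLSQP accepts both bounds and equality constraints; the starting point [1.0, 1.0] is an arbitrary choice.

import math
from scipy.optimize import minimize

z = 3

def objective(X):
    # Volume, as defined in the question.
    x, y = X
    return (x * (z**2) * y) / (4 * math.tan(math.pi / z))

def eq(X):
    # Equality constraint from the question (must equal zero).
    x, y = X
    return (x * (z**2)) / (2 * math.tan(math.pi / z)) + (x * y * z) - 100

# Maximize the volume by minimizing its negative, with x, y >= 0
# and eq(X) == 0 enforced by the solver.
res = minimize(lambda X: -objective(X),
               x0=[1.0, 1.0],
               method='SLSQP',
               jac=None,  # let scipy approximate the gradient numerically
               bounds=[(0, None), (0, None)],
               constraints=[{'type': 'eq', 'fun': eq}])
print('The answer is at {0}'.format(res['x']))

Whatever solution SLSQP reports should respect the bounds, so the returned side length and height stay non-negative.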
Upvotes: 1
Reputation: 595
The built-in abs() (or math.fabs()) is what you want to use; there is no math.abs() in Python:
x, y, _lam = fsolve(obj, [0.0, 0.0, 1.0])
x, y = abs(x), abs(y)
Upvotes: 0