Reputation: 157
I would like to optimize the schedule of a pumped-storage plant. There are 96 known prices (one for each quarter-hour of the day), and the model should decide, for each quarter, whether to (1) pump, (2) turbine, or (3) do nothing. There are also some bounds for X: -100
To start I tried the following:
from scipy.optimize import minimize
import numpy as np
prices=np.array([[1.5,50,30]])
xp =np.array([[1.5,50,30]])
fun = lambda x: xp* prices #here xp and prices should be matrices
cons = ({'type': 'ineq', 'fun': lambda x: (xp*0.25)<=500},
{'type': 'ineq', 'fun': lambda x: (xp*0.25)>=0})
bnds = ((0, None), (0, None), (0, None))
res = minimize(fun, (2, 0,0), method='SLSQP', bounds=bnds, constraints=cons)
However, this throws an error:
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-17-15c05e084977> in <module>()
10 bnds = ((0, None), (0, None), (0, None))
11
---> 12 res = minimize(fun, (2, 0,0), method='SLSQP', bounds=bnds, constraints=cons)
/Users/ch/miniconda/envs/sci34/lib/python3.4/site-packages/scipy/optimize/_minimize.py in minimize(fun, x0, args, method, jac, hess, hessp, bounds, constraints, tol, callback, options)
450 elif meth == 'slsqp':
451 return _minimize_slsqp(fun, x0, args, jac, bounds,
--> 452 constraints, callback=callback, **options)
453 elif meth == 'dogleg':
454 return _minimize_dogleg(fun, x0, args, jac, hess,
/Users/ch/miniconda/envs/sci34/lib/python3.4/site-packages/scipy/optimize/slsqp.py in _minimize_slsqp(func, x0, args, jac, bounds, constraints, maxiter, ftol, iprint, disp, eps, callback, **unknown_options)
375
376 # Now combine c_eq and c_ieq into a single matrix
--> 377 c = concatenate((c_eq, c_ieq))
378
379 if mode == 0 or mode == -1: # gradient evaluation required
ValueError: all the input arrays must have same number of dimensions
I have no clue why this error appears. Can somebody give me a hint?
Upvotes: 0
Views: 6301
Reputation: 31399
I'll go through your code line by line and highlight some of the problems:
from scipy.optimize import minimize
import numpy as np
prices=np.array([[1.5,50,30]])
xp =np.array([[1.5,50,30]])
prices and xp are vectors, not matrices; use np.array([1.5, 50, 30]) (a single pair of brackets) to declare a vector.
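The difference shows up in the array's shape; a quick sketch:

```python
import numpy as np

# Double brackets create a 2-D array (a 1x3 matrix)
matrix = np.array([[1.5, 50, 30]])
print(matrix.shape)  # (1, 3)

# Single brackets create a 1-D array (a vector)
vector = np.array([1.5, 50, 30])
print(vector.shape)  # (3,)
```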
fun = lambda x: xp* prices #here xp and prices should be matrices
The right-hand side of the function does not depend on x, so your function is simply constant. Also, * is element-wise in Python; you can use np.dot to calculate the scalar product.
fun = lambda x: np.dot(x, prices)
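With np.dot the objective actually depends on the decision vector x. For example, with the question's starting point (2, 0, 0):

```python
import numpy as np

prices = np.array([1.5, 50, 30])

# Objective: scalar product of the decision vector x with the prices
fun = lambda x: np.dot(x, prices)

print(fun(np.array([2, 0, 0])))  # 2*1.5 + 0*50 + 0*30 = 3.0
```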
cons = ({'type': 'ineq', 'fun': lambda x: (xp*0.25)<=500},
{'type': 'ineq', 'fun': lambda x: (xp*0.25)>=0})
This is not how constraints are defined. You may want to check the docs. Inequalities are expressed as a set of functions g_i(x), where g_i(x) >= 0 for all i. Also, same problem as above: x is not used on the right-hand side of your function declarations.
cons = ({'type': 'ineq', 'fun': lambda x: -x*0.25 + 500},
{'type': 'ineq', 'fun': lambda x: x*0.25})
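Putting the corrected pieces together, a minimal runnable version of the whole call could look like this:

```python
import numpy as np
from scipy.optimize import minimize

prices = np.array([1.5, 50, 30])
fun = lambda x: np.dot(x, prices)

# 0 <= 0.25*x <= 500 rewritten in the g_i(x) >= 0 form SLSQP expects
cons = ({'type': 'ineq', 'fun': lambda x: -x * 0.25 + 500},
        {'type': 'ineq', 'fun': lambda x: x * 0.25})

bnds = [(0, None)] * 3
res = minimize(fun, (2, 0, 0), method='SLSQP', bounds=bnds, constraints=cons)
print(res.x)  # the minimizer drives all components to 0
```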
bnds = ((0, None), (0, None), (0, None))
This is fine; however, bnds = [(0, None)] * 3 will come in handy when the vectors grow longer.
res = minimize(fun, (2,0,0), method='SLSQP', bounds=bnds, constraints=cons)
Both the objective and all constraints are linear in x. This is therefore a linear program, and SLSQP may not be the best way to tackle it. For this example, you may instead want to have a look at scipy.optimize.linprog.
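A minimal linprog sketch of the same toy problem, assuming the 0 <= 0.25*x <= 500 constraint from the question (the lower bound is already covered by bounds=(0, None)):

```python
import numpy as np
from scipy.optimize import linprog

prices = [1.5, 50, 30]

# 0.25 * x_i <= 500 for each i, expressed as A_ub @ x <= b_ub
A_ub = 0.25 * np.eye(3)
b_ub = [500, 500, 500]

# x_i >= 0 via bounds, which also makes 0.25*x >= 0 hold automatically
res = linprog(c=prices, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x)  # again the zero vector
```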
As a side note: I guess this is only a toy example; obviously the result of this optimization is the zero vector.
This is the result:
njev: 3
x: array([ 0., 0., 0.])
nit: 3
status: 0
message: 'Optimization terminated successfully.'
jac: array([ 1.5, 50. , 30. , 0. ])
success: True
fun: 0.0
nfev: 15
Upvotes: 7