M. Fire

Reputation: 127

Optimization in Python with Multiple Functions and a Linear Constraint (Linear Programming Problem Query)

I'm a little new to optimization in Python, particularly with SciPy. If I only have one function, such as this:

def prod(x):
    return -x[0] * x[1]

Then I can easily maximize it like so:

import numpy as np
from scipy.optimize import LinearConstraint, minimize

linear_constraint = LinearConstraint([[1, 1]], [20], [20])
x0 = np.array([15, 5])
result = minimize(prod, x0, constraints=linear_constraint)
print(result['x'])

This maximizes x[0] * x[1] (by minimizing its negative) subject to the constraint that x[0] + x[1] = 20. The answer here is array([10., 10.]).

Now I wish to do something more sophisticated than optimizing a single function. Suppose I would like to optimize multiple functions. Say I have this matrix P of size n x 3:

A     B     C
3     2     7
6     3     4
8     1     5
...

Is there a way to minimize P[i] * x, where * denotes a dot product, for all i = 1..n? In short, I wish to optimize:

3*x[0] + 2*x[1] + 7*x[2]
6*x[0] + 3*x[1] + 4*x[2]
8*x[0] + 1*x[1] + 5*x[2]
...

under the constraint that x[0] + x[1] + x[2] = 1. Does anyone know how to implement this correctly in Python (I'm using SciPy, by the way)? I'm still not sure where to begin. Thanks in advance!

Upvotes: 0

Views: 328

Answers (1)

Erwin Kalvelagen

Reputation: 16714

For standard optimization tools, the objective function must return a scalar; this is called a "single objective". There is a branch of optimization that deals with multiple objectives, under names like vector optimization, multi-objective optimization, and multiple-criteria optimization, and there is a wide literature on it. I suggest doing some reading to get familiar with the concepts, ideas, and algorithms in this area.
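One common entry point into that literature (not the only approach) is weighted-sum scalarization: collapse the n linear objectives into a single one by minimizing a weighted combination of them, which here yields an ordinary linear program. A minimal sketch with scipy.optimize.linprog, assuming equal weights, a nonnegativity bound on x, and the first three rows of P from the question:

```python
import numpy as np
from scipy.optimize import linprog

# Each row of P defines one linear objective P[i] @ x (data from the question).
P = np.array([[3, 2, 7],
              [6, 3, 4],
              [8, 1, 5]], dtype=float)

# Weighted-sum scalarization: minimize w[0]*(P[0] @ x) + w[1]*(P[1] @ x) + ...
# which equals (w @ P) @ x -- a single LP. Equal weights are an assumption;
# different weight choices trace out different Pareto-optimal solutions.
w = np.ones(P.shape[0]) / P.shape[0]
c = w @ P  # combined objective coefficients

# Constraint from the question: x[0] + x[1] + x[2] = 1 (x >= 0 is linprog's default).
A_eq = np.ones((1, P.shape[1]))
b_eq = [1.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * P.shape[1])
print(res.x)
```

With all weight on the combined coefficients, the LP simply puts the whole budget on the cheapest column, so solutions are sensitive to the weights; for a genuine trade-off analysis you would vary w or use a dedicated multi-objective method.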

Upvotes: 1

Related Questions