Reputation: 51
I have a system of 21 polynomial equations in a total of 12 unknowns a, ..., l. Each equation has the general form V1*abc + V2*abd + ... + V64*jkl = x, where V1, ..., V64 are each either 0 or 1, i.e., the left-hand side of each equation is a sum of some products of three different unknowns.
There is a set of constraints: a + b + c + d = 1, e + f + g + h = 1, i + j + k + l = 1. The sum of all xs (the right-hand sides) is equal to 1.
I have as input a vector of xs. Is there a solver which could provide me with the values of a, ..., l that yield a vector of x's as close as possible to the original xs while adhering to the constraints? I'm looking for a Python implementation.
I looked into scipy.optimize but I was not able to establish which method is preferable for my problem.
Upvotes: 0
Views: 447
Reputation: 416
You might want to try this Python binding for IPOPT. IPOPT is an optimization library that uses an interior-point solver to find (local) optima of functions under generalized constraints, both equality and inequality constraints. As you've described your problem, you won't need the inequality constraints.
A candidate function for your optimization objective would be the sum of the squared differences for your 21 polynomial equations. Let's say you start with your initial x, which is a 21-element vector, then your objective would be:
(V1_1*abc + V2_1*abd + ... + V64_1*jkl - x_1)^2 + (V1_2*abc + V2_2*abd + ... + V64_2*jkl - x_2)^2 + ... + (V1_{21}*abc + V2_{21}*abd + ... + V64_{21}*jkl - x_{21})^2
To use IPOPT, you will need to compute the partial derivatives of your constraints and objective with respect to all of your variables a, ..., l.
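Those partial derivatives follow from the chain rule. As a sketch (not your actual data): assume the 0/1 coefficients are stored as a hypothetical `(21, 4, 4, 4)` tensor `V`, where `V[n, i, j, k] = 1` if the product of the i-th unknown from (a..d), the j-th from (e..h), and the k-th from (i..l) appears in equation n, and `x` holds the 21 right-hand sides. Then the gradient of the sum-of-squares objective can be computed analytically:

```python
import numpy as np

# Hypothetical placeholder data; replace V and x with your own.
rng = np.random.default_rng(0)
V = rng.integers(0, 2, size=(21, 4, 4, 4)).astype(float)
x = rng.random(21)

def objective_grad(u):
    """Analytic gradient of sum_n (model_n - x_n)^2 over u = (a..d, e..h, i..l)."""
    p, q, r = u[:4], u[4:8], u[8:]
    # model_n = sum_{i,j,k} V[n,i,j,k] * p_i * q_j * r_k
    model = np.einsum('nijk,i,j,k->n', V, p, q, r)
    resid = model - x
    # chain rule: d(model_n)/d(p_i) = sum_{j,k} V[n,i,j,k] * q_j * r_k, etc.
    gp = 2 * np.einsum('n,nijk,j,k->i', resid, V, q, r)
    gq = 2 * np.einsum('n,nijk,i,k->j', resid, V, p, r)
    gr = 2 * np.einsum('n,nijk,i,j->k', resid, V, p, q)
    return np.concatenate([gp, gq, gr])
```

The constraint Jacobians are even simpler: each equality constraint is linear, so its gradient is a constant 0/1 vector selecting one group of four variables.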
If IPOPT won't work for you, you might even be able to use scipy.optimize with this objective function. From the docs, it looks like scipy.optimize will try to pick a method appropriate for your problem based on how you define it; if you define your constraints and objective correctly, scipy.optimize should pick a suitable method.
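A minimal self-contained sketch with scipy.optimize.minimize, using the same hypothetical `(21, 4, 4, 4)` 0/1 coefficient tensor `V` described above (replace `V` and `x` with your real data). The three sum-to-one equalities are passed as `'eq'` constraints, which makes scipy use a constrained method such as SLSQP:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical placeholder data; V[n, i, j, k] = 1 if the product of the
# i-th unknown from (a..d), the j-th from (e..h), and the k-th from (i..l)
# appears in equation n. x holds the 21 observed right-hand sides.
rng = np.random.default_rng(0)
V = rng.integers(0, 2, size=(21, 4, 4, 4)).astype(float)
x = rng.random(21)
x /= x.sum()  # right-hand sides sum to 1, as in the question

def objective(u):
    """Sum of squared residuals over the 21 equations."""
    p, q, r = u[:4], u[4:8], u[8:]
    model = np.einsum('nijk,i,j,k->n', V, p, q, r)
    return np.sum((model - x) ** 2)

constraints = [
    {'type': 'eq', 'fun': lambda u: u[:4].sum() - 1},   # a+b+c+d = 1
    {'type': 'eq', 'fun': lambda u: u[4:8].sum() - 1},  # e+f+g+h = 1
    {'type': 'eq', 'fun': lambda u: u[8:].sum() - 1},   # i+j+k+l = 1
]

u0 = np.full(12, 0.25)  # start at the uniform feasible point
res = minimize(objective, u0, method='SLSQP', constraints=constraints)
print(res.x, res.fun)
```

Note that this finds a local minimum from the chosen starting point; since the objective is a polynomial in 12 variables, you may want to restart from several initial guesses and keep the best result.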
Upvotes: 1