Eric

Reputation: 2128

Using MS Solver Foundation with Matrix Multiplication

I am trying to use the Microsoft Solver Foundation to optimize a problem involving matrix multiplication. I can solve it with Excel's Solver, but I am having trouble integrating it into C#. Here is a description with an example:

Say you have a (3x3) matrix y, defined as:

Double[][] y = 
{
    new Double[] { 5, 1, 0 },
    new Double[] { 1, 9, 1 },
    new Double[] { 0, 1, 9 },
};

I want to find the (1x3) matrix x such that x * y * x' is minimized, where x' is the transpose of x. Further, the x values must sum to 1 and none of them may be less than 0.
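In matrix notation, with x' written as $x^{\mathsf T}$, this is the quadratic program

$$
\min_{x}\; x\,y\,x^{\mathsf T}
\quad\text{subject to}\quad \sum_{i=1}^{3} x_i = 1,\qquad x_i \ge 0.
$$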

Here is the code I have so far:

 SolverContext context = SolverContext.GetContext();             // Get context environment
 Model model = context.CreateModel();                            // Create a new model

 Decision d1 = new Decision(Domain.RealNonnegative, "d1");       // First item in "x" vector (must be >= 0)
 Decision d2 = new Decision(Domain.RealNonnegative, "d2");       // Second item in "x" vector (must be >= 0)
 Decision d3 = new Decision(Domain.RealNonnegative, "d3");       // Third item in "x" vector (must be >= 0)
 model.AddDecisions(d1, d2, d3);                                 // Add these to the model (this is where the outputs will be stored)

 model.AddConstraints("limits",                                  // Add constraints
     0 <= d1 <= 1,                                               // Each item must be between 0 and 1
     0 <= d2 <= 1,
     0 <= d3 <= 1,
     d1 + d2 + d3 == 1);                                         // All items must add up to 1

The part that I'm stuck on is when you tell it what you want to minimize:

 model.AddGoal("min", GoalKind.Minimize, /* What goes here? */);

This part would normally contain an equation (e.g. d1 * d2 + d3), but the matrix multiplication is not that simple.

I can create a function that performs the multiplication and returns a double, but AddGoal() expects a Term object, and the function would have to do arithmetic on Decision objects rather than plain numbers.

Alternatively, I can expand the multiplication into a huge string expression (which I have already done), but I would prefer not to do it this way. (The string looks like: "d1 * 5 + d2 * 1 + d3 * 0 ...")
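(As it turns out, Decision derives from Term and the arithmetic operators on Terms return new Terms, so the expanded quadratic form can be handed to AddGoal() directly. A minimal sketch for this specific 3x3 case, expanding x * y * x' term by term; since y is symmetric the cross terms double up, and the y[0,2] = 0 term drops out:

    // x * y * x' expanded for the 3x3 matrix above:
    // 5*d1^2 + 9*d2^2 + 9*d3^2 + 2*d1*d2 + 2*d2*d3
    model.AddGoal("min", GoalKind.Minimize,
        5 * d1 * d1 + 9 * d2 * d2 + 9 * d3 * d3
        + 2 * d1 * d2 + 2 * d2 * d3);

This still hard-codes the expansion, though; the answer below shows a matrix-helper approach that scales.)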

Any ideas? Thanks.

PS: The correct answer (according to Excel) is:

d1 = 0.503497
d2 = 0.216783
d3 = 0.27972
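(A quick way to sanity-check that result is to evaluate x * y * x' at those values with plain doubles — a throwaway snippet, run inside any Main with using System:

    // Sanity check: evaluate x * y * x' at the Excel solution.
    double[] x = { 0.503497, 0.216783, 0.27972 };
    double[,] y = { { 5, 1, 0 }, { 1, 9, 1 }, { 0, 1, 9 } };
    double value = 0;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            value += x[i] * y[i, j] * x[j];
    Console.WriteLine(value);   // objective value at the reported optimum
)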

Note: The solution must scale to n "Decisions".

Upvotes: 2

Views: 1412

Answers (1)

Axel Kemper

Reputation: 11322

The following yields the expected solution:

using System;
using Microsoft.SolverFoundation.Services;

namespace akMSFStackOverflow
{
    class Program
    {
        static void Main(string[] args)
        {
            Double[,] y = 
            {
                { 5, 1, 0 },
                { 1, 9, 1 },
                { 0, 1, 9 },
            };

            Term goal;
            Term[,] tx;
            Term[,] ty;

            SolverContext context = SolverContext.GetContext();             // Get context environment
            Model model = context.CreateModel();                            // Create a new model

            Decision d1 = new Decision(Domain.RealNonnegative, "d1");       // First item in "x" vector (must be >= 0)
            Decision d2 = new Decision(Domain.RealNonnegative, "d2");       // Second item in "x" vector (must be >= 0)
            Decision d3 = new Decision(Domain.RealNonnegative, "d3");       // Third item in "x" vector (must be >= 0)
            model.AddDecisions(d1, d2, d3);                                 // Add these to the model (this is where the outputs will be stored)

            model.AddConstraints("limits",                                  // Add constraints
                0 <= d1 <= 1,                                               // Each item must be between 0 and 1
                0 <= d2 <= 1,
                0 <= d3 <= 1,
                d1 + d2 + d3 == 1);                                         // All items must add up to 1

            ty = matrix(y);                                             // Lift the numeric matrix into Terms
            tx = new Term[,] { { d1, d2, d3 } };                        // The decision row vector x

            goal = matMult(matMult(tx, ty), transpose(tx))[0, 0];       // x * y * x' is 1x1; its entry is the goal

            model.AddGoal("goal", GoalKind.Minimize, goal);

            // Specifying the IPM solver, as we have a quadratic goal 
            Solution solution = context.Solve(new InteriorPointMethodDirective());

            Report report = solution.GetReport();
            Console.WriteLine("x {{{0}, {1}, {2}}}", d1, d2, d3);
            Console.Write("{0}", report); 

        }


        // Wraps a numeric matrix as a matrix of constant Terms.
        static Term[,] matrix(Double[,] m)
        {
            int rows = m.GetLength(0);
            int cols = m.GetLength(1);
            Term[,] r = new Term[rows, cols];

            for (int row = 0; row < rows; row++)
                for (int col = 0; col < cols; col++)
                    r[row, col] = m[row, col];

            return r;
        }

        // Standard matrix multiplication; result is a.rows x b.cols.
        static Term[,] matMult(Term[,] a, Term[,] b)
        {
            int rows = a.GetLength(0);
            int cols = b.GetLength(1);
            Term[,] r = new Term[rows, cols];

            for (int row = 0; row < rows; row++)
                for (int col = 0; col < cols; col++)
                {
                    r[row, col] = 0;                    // start from the zero Term
                    for (int k = 0; k < a.GetLength(1); k++)
                    {
                        r[row, col] += a[row, k] * b[k, col];
                    }
                }

            return r;
        }

        // Returns the transpose of m (cols x rows).
        static Term[,] transpose(Term[,] m)
        {
            int rows = m.GetLength(0);
            int cols = m.GetLength(1);
            Term[,] r = new Term[cols, rows];

            for (int row = 0; row < rows; row++)
                for (int col = 0; col < cols; col++)
                {
                    r[col, row] = m[row, col];
                }

            return r;
        }
    }
}
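
The same helpers scale to the n decisions the question asks for. A rough sketch, assuming y is an n x n Double[,], the model setup above, and MSF's Model.Sum for the sum-to-one constraint:

    // Sketch: generalizing to n decisions, reusing matrix(), matMult() and transpose().
    int n = y.GetLength(0);
    Decision[] d = new Decision[n];
    for (int i = 0; i < n; i++)
        d[i] = new Decision(Domain.RealNonnegative, "d" + i);
    model.AddDecisions(d);

    Term[] sumTerms = new Term[n];
    for (int i = 0; i < n; i++)
    {
        model.AddConstraint("limit" + i, 0 <= d[i] <= 1);   // each weight in [0, 1]
        sumTerms[i] = d[i];
    }
    model.AddConstraint("sum", Model.Sum(sumTerms) == 1);   // weights add up to 1

    Term[,] tx = new Term[1, n];                            // the 1 x n decision row vector
    for (int i = 0; i < n; i++)
        tx[0, i] = d[i];

    model.AddGoal("goal", GoalKind.Minimize,
        matMult(matMult(tx, matrix(y)), transpose(tx))[0, 0]);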

Upvotes: 1
