user1054524

Reputation: 61

How to: Solver Foundation quadratic least squares

I have two independent variables, GSH and Gls. Using these two variables, I'm trying to predict an outcome, prob, using a function of the form:

prob = a*Gls^2 + b*GSH^2 + c*Gls + d*GSH + e // (where a, b, c, d, e are coefficients)

Sample of data:

Gls( 2.3 2.3 2.5 2.5 2.5 2.5 2.7 2.7 2.7 2.7 2.7 2.9 2.9 2.9 2.9 2.9 3.1 3.1 3.1 3.1 3.1 3.1 3.3 3.3 3.3 3.3 3.3 3.3 3.5 3.5 3.5 3.5 3.5)

GSH( 0.475 0.525 0.425 0.475 0.525 0.575 0.425 0.475 0.525 0.575 0.625 0.425 0.475 0.525 0.575 0.625 0.375 0.425 0.475 0.525 0.575 0.625 0.375 0.425 0.475 0.525 0.575 0.625 0.425 0.475 0.525 0.575 0.625)

prob( 0.263636 0.324159 0.319328 0.291295 0.286086 0.253994 0.233766 0.284644 0.273818 0.263743 0.175182 0.243986 0.284848 0.28066 0.247863 0.183468 0.181818 0.237288 0.269266 0.2555 0.240924 0.206081 0.209677 0.216949 0.263261 0.25966 0.23588 0.203252 0.239316 0.209184 0.234818 0.242424 0.192118)

I would like to find the values of the coefficients that minimize the sum of squared errors.

I have read a lot about Solver Foundation, but I have been unable to work out how to set this problem up in C# with Solver Foundation. All code suggestions are greatly appreciated.

Thanks

Upvotes: 6

Views: 1429

Answers (3)

Gebb

Reputation: 6556

I guess you don't need Solver Foundation for that. There is no need for numerical optimization, because the solution (the vector of polynomial coefficients that minimizes the sum of squared vertical distances between the observed responses in the dataset and the predicted responses) exists in closed form.

See wikipedia for details.
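
Since the model is linear in the coefficients, the closed form is just the normal equations (XᵀX)β = Xᵀy, where each row of X is [Gls², GSH², Gls, GSH, 1]. Here is a minimal sketch of what that could look like in plain C#, assuming the Gls/GSH/prob arrays from the question are passed in; the names ClosedFormFit and Solve are only illustrative, and nothing beyond Gaussian elimination is required:

    using System;

    static class ClosedFormFit
    {
        // Returns the least-squares coefficients [a, b, c, d, e] for
        // prob ≈ a*Gls^2 + b*GSH^2 + c*Gls + d*GSH + e
        public static double[] Fit(double[] gls, double[] gsh, double[] prob)
        {
            const int k = 5;
            var xtx = new double[k, k];   // X'X
            var xty = new double[k];      // X'y

            for (var i = 0; i < prob.Length; i++)
            {
                // one row of the design matrix X
                var row = new[] { gls[i] * gls[i], gsh[i] * gsh[i], gls[i], gsh[i], 1.0 };
                for (var r = 0; r < k; r++)
                {
                    xty[r] += row[r] * prob[i];
                    for (var c = 0; c < k; c++)
                        xtx[r, c] += row[r] * row[c];
                }
            }
            return Solve(xtx, xty);
        }

        // Gaussian elimination with partial pivoting for a small dense system.
        private static double[] Solve(double[,] m, double[] rhs)
        {
            var n = rhs.Length;
            for (var col = 0; col < n; col++)
            {
                // pivot on the row with the largest entry in this column
                var pivot = col;
                for (var r = col + 1; r < n; r++)
                    if (Math.Abs(m[r, col]) > Math.Abs(m[pivot, col]))
                        pivot = r;
                for (var c = 0; c < n; c++)
                {
                    var tmp = m[col, c]; m[col, c] = m[pivot, c]; m[pivot, c] = tmp;
                }
                var t = rhs[col]; rhs[col] = rhs[pivot]; rhs[pivot] = t;

                // eliminate below the pivot
                for (var r = col + 1; r < n; r++)
                {
                    var factor = m[r, col] / m[col, col];
                    for (var c = col; c < n; c++)
                        m[r, c] -= factor * m[col, c];
                    rhs[r] -= factor * rhs[col];
                }
            }

            // back substitution
            var x = new double[n];
            for (var r = n - 1; r >= 0; r--)
            {
                var sum = rhs[r];
                for (var c = r + 1; c < n; c++)
                    sum -= m[r, c] * x[c];
                x[r] = sum / m[r, r];
            }
            return x;
        }
    }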

Upvotes: 2

Christian Genne

Reputation: 220

The following solution is very straightforward: it just searches for a local minimum of the sum of squared errors for the function you describe, one coefficient at a time. Using it I get the following values

a=0.02527237, b=0.04768372, c=-0.001549721, d=0.01382828, e=0.002026558

with a total sum of squares of 0.2139592.

    static void Main(string[] args)
    {
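        // Fit one coefficient at a time, holding the previously fitted coefficients fixed and the remaining ones at zero.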
        var a = FindLocalMinimum(x => SumSq(x, 0, 0, 0, 0));
        var b = FindLocalMinimum(x => SumSq(a, x, 0, 0, 0));
        var c = FindLocalMinimum(x => SumSq(a, b, x, 0, 0));
        var d = FindLocalMinimum(x => SumSq(a, b, c, x, 0));
        var e = FindLocalMinimum(x => SumSq(a, b, c, d, x));
    }

    private static float SumSq(float a, float b, float c, float d, float e)
    {
        var gls = new[]
                      {
                          2.3, 2.3, 2.5, 2.5, 2.5, 2.5, 2.7, 2.7, 2.7, 2.7, 2.7, 2.9, 2.9, 2.9, 2.9, 2.9, 3.1, 3.1, 3.1
                          , 3.1, 3.1, 3.1, 3.3, 3.3, 3.3, 3.3, 3.3, 3.3, 3.5, 3.5, 3.5, 3.5, 3.5
                      };

        var gsh = new[]
                      {
                          0.475, 0.525, 0.425, 0.475, 0.525, 0.575, 0.425, 0.475, 0.525, 0.575, 0.625, 0.425, 0.475,
                          0.525, 0.575, 0.625, 0.375, 0.425, 0.475, 0.525, 0.575, 0.625, 0.375, 0.425, 0.475, 0.525,
                          0.575, 0.625, 0.425, 0.475, 0.525, 0.575, 0.625
                      };

        var prob = new[]
                       {
                           0.263636, 0.324159, 0.319328, 0.291295, 0.286086, 0.253994, 0.233766, 0.284644, 0.273818,
                           0.263743, 0.175182, 0.243986, 0.284848, 0.28066, 0.247863, 0.183468, 0.181818, 0.237288,
                           0.269266, 0.2555, 0.240924, 0.206081, 0.209677, 0.216949, 0.263261, 0.25966, 0.23588,
                           0.203252, 0.239316, 0.209184, 0.234818, 0.242424, 0.192118
                       };

        var res = 0.0;
        for (var i = 0; i < prob.Length; i++)
        {
            // predicted prob for sample i: a*Gls^2 + b*GSH^2 + c*Gls + d*GSH + e
            var p = a*Math.Pow(gls[i], 2) + b*Math.Pow(gsh[i], 2) + c*gls[i] + d*gsh[i] + e;
            res += Math.Pow(p - prob[i], 2);
        }
        return (float)res;
    }

    private static float FindLocalMinimum(Func<float, float> f)
    {
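        // Walk x in steps of `diff`; whenever the value starts increasing, reverse the direction and halve the step, stopping once the step is tiny.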
        float bestV = float.MaxValue;
        float bestX = 0;
        float x = 0;
        float lastV = bestV;
        float diff = 1000.0f;
        while (Math.Abs(diff) > 0.0001f)
        {
            float v = f(x);
            if (v < bestV)
            {
                bestV = v;
                bestX = x;
            }
            else if (v > lastV)
            {
                diff *= -0.5f;
            }
            lastV = v;
            x += diff;
        }
        return bestX;
    }

Upvotes: 1

Yin Zhu

Reputation: 17119

You can use Solver Foundation. Your regression is nonlinear in the original variables, and is essentially a generalized linear regression. In R, you could use a function like glm to do the regression.

In C#, I am not sure whether any open-source code for this exists. But in any case, you can solve the optimization yourself, and MSF has a nonlinear solver built in. So just write two functions (a rough sketch of both for your model follows the list):

  1. the objective function and

  2. its gradient
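
For your particular model, those two functions could be sketched in plain C# roughly as below; the names LeastSquaresModel, Objective, Gradient and Residual are only illustrative, the gls/gsh/prob arrays are assumed to be the data from the question, and wiring the delegates into MSF's nonlinear solver (e.g. CompactQuasiNewtonSolver) follows the pattern shown in the article linked below.

    using System;

    static class LeastSquaresModel
    {
        // x = { a, b, c, d, e }: sum of squared residuals over the dataset.
        public static double Objective(double[] x, double[] gls, double[] gsh, double[] prob)
        {
            var sum = 0.0;
            for (var i = 0; i < prob.Length; i++)
            {
                var r = Residual(x, gls[i], gsh[i], prob[i]);
                sum += r * r;
            }
            return sum;
        }

        // Gradient of the objective with respect to (a, b, c, d, e).
        public static double[] Gradient(double[] x, double[] gls, double[] gsh, double[] prob)
        {
            var grad = new double[5];
            for (var i = 0; i < prob.Length; i++)
            {
                var r = Residual(x, gls[i], gsh[i], prob[i]);
                // d/dcoef of r^2 is 2 * r * (partial of the prediction w.r.t. that coefficient)
                grad[0] += 2 * r * gls[i] * gls[i]; // d/da
                grad[1] += 2 * r * gsh[i] * gsh[i]; // d/db
                grad[2] += 2 * r * gls[i];          // d/dc
                grad[3] += 2 * r * gsh[i];          // d/dd
                grad[4] += 2 * r;                   // d/de
            }
            return grad;
        }

        private static double Residual(double[] x, double gls, double gsh, double prob)
        {
            var predicted = x[0] * gls * gls + x[1] * gsh * gsh + x[2] * gls + x[3] * gsh + x[4];
            return predicted - prob;
        }
    }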

As a quick example, you can see my article:

Logistic Regression in F# using Microsoft Solver Foundation

But you don't need to know about logistic regression; the article also includes a simpler example showing how to optimize a two-variable Rosenbrock function.

MSF also has an embedded domain-specific language for C#, built on C#'s implicit conversion feature. You can find an example in MSF's documentation.

Upvotes: 0
