Sergio

Reputation: 85

How to create an objective function for scipy minimize

Right, apologies for the abundance of questions, but I'm new to Python and I have been struggling.

I believe I've finally created somewhat functional code. However, I can't seem to define the objective function properly; the rest seems to be calculating correctly (based on the values the objective gives me). This is my objective function right now:

def objective(x):
    global sumIp, sumIm
    if (It[i-1] - d[i] + Qt[i-LT]) >= 0:
        sumIp = sumIp + x[2]
        sumIm = sumIm + 0
    else:
        sumIp = sumIp + 0
        sumIm = sumIm - x[2]
    return h*sumIp + b*sumIm

x[2] is meant to be my It[i]. sumIp and sumIm should both be >= 0.

Here is the full code if someone wants to take a look: https://pastebin.com/AxC7fTVv - I believe this is the only part I'm missing to achieve what I want, but I can't figure out how to do it for the life of me; I've been stuck on this for days! Any help appreciated.

Upvotes: 0

Views: 288

Answers (2)

CrepeGoat

Reputation: 2515

Summary

Okay, so. I'm going to summarize your problem (kind of for you, but mostly to help me :p).

You have a sequence of values you want to calculate, which all revolve around figuring out Qt[i]. Those are:

  • d[i] - some list of values that would be provided externally in a "real-world" scenario, but for your purposes is emulated with random values; most importantly, it isn't something that has to be calculated. (Another note: I'm assuming we can't "see into the future" and use d[i+1], or anything like that.)
  • It[i] - given by It[i] = It[i-1] - d[i] + Qt[i-LT] (with the Qt term omitted for i < LT); since it depends only on prior-cycle values and the d values, it can be computed directly (see the sketch at the end of this summary)
  • Ip[i], Im[i] - these are both calculated directly from It[i], so again, easy to calculate
  • NIt[i] - given by NIt[j] = NIt[j-1] - d[j] + Qt[j-1], and can easily be calculated similarly to It[i]
  • Qt[i] - ...?

In short: the only thing that needs to be figured out is Qt[i]. So if you do decide to use an optimizer like scipy.optimize.minimize, the only variable you need is x[0]. But if you only have one variable, chances are you don't even need an optimizer; more likely you can come up with some function/expression that gives you an optimized result directly.
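For illustration, here's a minimal sketch of those recurrences (assuming It[0] = 0, random d values, and that Ip/Im are just the positive/negative parts of It; the exact setup in your pastebin may differ):

import numpy as np

rng = np.random.default_rng(0)
T, LT = 20, 3
d = rng.integers(0, 10, size=T)   # externally-provided values, emulated with random numbers
Qt = np.zeros(T)                  # the order quantities, i.e. the only real unknowns
It = np.zeros(T)

for i in range(1, T):
    # It[i] = It[i-1] - d[i] + Qt[i-LT], with the Qt term omitted while i < LT
    It[i] = It[i-1] - d[i] + (Qt[i-LT] if i >= LT else 0)

Ip = np.maximum(It, 0)    # positive part of It
Im = np.maximum(-It, 0)   # negative part of It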

Solution

...not quite sure yet :\ sorry

Upvotes: 1

CrepeGoat

Reputation: 2515

Comment 1

Note - I'm making changes off of the first pastebin copy, as linked in the question description.

Try this: remove the global statement (and rename the now-local sums), so the objective function looks like this:

def objective(x):
    # [`global` removed here]
    # use renamed locals so the module-level sumIp/sumIm are only read, never rebound
    if (It[i-1] - d[i] + Qt[i-LT]) >= 0:
        local_sumIp = sumIp + x[2]
        local_sumIm = sumIm + 0
    else:
        local_sumIp = sumIp + 0
        local_sumIm = sumIm - x[2]
    return h*local_sumIp + b*local_sumIm

This way, the sumIp and sumIm values for the x[2] at that moment are computed locally on every objective call, instead of edited globally. (The locals are renamed here on purpose: if they kept the names sumIp/sumIm while being assigned inside the function, Python would treat them as uninitialized locals and raise an UnboundLocalError.)

Then, after minimize is finished, you push the changes for the final, optimal x[2] value, like so:

def test(T):
    global i, sumIp, sumIm   # sumIp/sumIm are now only updated here, once per cycle
    while i < T:
        # [...]
        sol = minimize(objective, x0, constraints=cons)

        # push the accumulator updates using the optimal x[2] found by minimize
        if (It[i-1] - d[i] + Qt[i-LT]) >= 0:
            sumIp = sumIp + sol.x[2]
            sumIm = sumIm + 0
        else:
            sumIp = sumIp + 0
            sumIm = sumIm - sol.x[2]
        # [...]
        i += 1
    return Qt, NIt, It
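As a small aside on why accumulating into the globals inside objective is fragile: minimize evaluates the objective many times during a single solve, so any side effect in it also runs many times. A tiny, self-contained sketch (the quadratic and the exact call count are purely illustrative, not part of your code):

from scipy.optimize import minimize

calls = 0

def counting_objective(x):
    # count how many times the optimizer evaluates this function
    global calls
    calls += 1
    return (x[0] - 3.0)**2

minimize(counting_objective, [0.0])
print(calls)  # typically far more than 1; each evaluation would have re-run any side effects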

Comment 2

Okay so: x[0] == Qt[i], not Qt[i-1], right? If so, then you can't swap Qt[i-1] and x[0] trivially. Also, the fact that the optimizer stops doing things when you remove x[0] makes sense; the only things it's allowed to change when trying to minimize your expression are the x values, and if you remove them then the minimizer has less that it's allowed to do.

Regarding the general "strange[ness]", it might have something to do with the fact that the constraints use if-statements, which basically makes them piecewise functions. While there are minimization methods that work on non-linear constraints, I'm not sure if there are methods that work on non-differentiable constraints.

To fix that, see the changes I made in this paste. I replaced x[2] with two non-negative variables, x[2] and x[3], where the old value is now x[2] - x[3]. This eliminates the need for if-statements in the objective. To keep the variables non-negative, I added bounds to the problem via x_bounds (a minimal sketch of this split follows below). (Note that this obviates the need for a constraint1 function, so I removed it. There's much more room for other simplifications in the code, but as it's not necessary I left everything else alone.)
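For reference, here's a minimal, self-contained sketch of that split, with made-up stand-ins for h, b, sumIp, and sumIm and no other constraints (the real setup lives in the paste):

from scipy.optimize import minimize

h, b = 1.0, 9.0            # illustrative stand-ins for your h and b coefficients
sumIp, sumIm = 0.0, 0.0    # illustrative accumulators

def objective(x):
    # the old signed quantity is now x[2] - x[3]: x[2] acts as the positive part,
    # x[3] as the negative part, so no if-statement is needed
    return h*(sumIp + x[2]) + b*(sumIm + x[3])

x0 = [0.0, 0.0, 0.0, 0.0]
x_bounds = [(None, None), (None, None), (0, None), (0, None)]  # keep x[2], x[3] >= 0

sol = minimize(objective, x0, bounds=x_bounds)
print(sol.x)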

So the only part that's left that I don't quite get is constraint2: can you explain again what it's supposed to be doing?

Upvotes: 0
