romsearcher

Reputation: 350

Scipy fmin optimize function with conditions

Let's say I have a function that behaves differently for different values of its input (i.e., it uses conditions). Is it possible to optimize such a function with fmin so as to find the input that maximizes the output?

This is what I currently have:

from scipy.optimize import fmin

def bogus_function(x):
    if x[0] > 5 or x[1] > 5:
        return 20
    elif x[1] > 2:
        return x[0]*x[1]
    else:
        return -1

test = lambda x: -bogus_function(x)
results = fmin(test, [0, 0], retall=True)

This is where it gets "stuck": printing the results and looking at the optimization process shows that the algorithm keeps evaluating the same initial guess without ever changing the values.

Optimization terminated successfully.
     Current function value: 1.000000
     Iterations: 16
     Function evaluations: 63

>>> results
(array([ 0.,  0.]),
 [array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.]),
  array([ 0.,  0.])])

In this case, the optimal values for bogus_function are [5, 5], which makes the maximum value equal to 25. Any idea if that kind of optimization is possible? Thanks!

Upvotes: 0

Views: 493

Answers (1)

ev-br

Reputation: 26030

As written, x = [0, 0] qualifies as a local minimum: move a bit away from it and the value of the function does not change, so fmin has nothing to improve on. You probably want to start from a point where the function is not constant.

In general, what matters for many optimization algorithms is whether your function is continuous/differentiable. Some algorithms can deal with a lack of smoothness, some cannot.
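For a function with jumps and flat plateaus like this one, a global, derivative-free method is usually a safer bet than a local simplex search. A sketch using scipy.optimize.differential_evolution; the bounds (0, 6) on each variable are my own assumption about the search region:

```python
from scipy.optimize import differential_evolution

def bogus_function(x):
    if x[0] > 5 or x[1] > 5:
        return 20
    elif x[1] > 2:
        return x[0] * x[1]
    else:
        return -1

# differential_evolution samples candidates across the whole box, so a
# flat region around any one point cannot trap it the way it traps
# fmin's simplex.
result = differential_evolution(lambda x: -bogus_function(x),
                                bounds=[(0, 6), (0, 6)], seed=0)
# result.x holds the best point found; for these bounds the true
# maximum of bogus_function is 25, at [5, 5].
```

The trade-off is cost: differential evolution uses many more function evaluations than a local method, which matters if the objective is expensive.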

Upvotes: 1
