Reputation: 1960
I have this formula:
1 - e^(log(0.5) * (x / beta) ^ alpha )
where alpha and beta are the variables I have to find. x is my data (a set of values, one per image), and I can compare the formula's output with the ground truth that comes from a user test, so I can build a loss function that I would like to minimize. To find the best alpha and beta I tried TensorFlow, but gradient descent and the other optimizers appear to fail because the function is not convex (I tried different initial conditions). Is there a global optimization tool in Python that I can use to solve this problem?
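For reference, here is a minimal sketch of the model and loss in NumPy (assuming x is a 1-D array of per-image values and y_true is the user-test ground truth; both names are placeholders):

```python
import numpy as np

def model(params, x):
    """1 - exp(log(0.5) * (x / beta) ** alpha)"""
    alpha, beta = params
    return 1.0 - np.exp(np.log(0.5) * (x / beta) ** alpha)

def loss(params, x, y_true):
    """Mean squared error between the model output and the user-test ground truth."""
    return np.mean((model(params, x) - y_true) ** 2)
```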
Upvotes: 3
Views: 1286
Reputation: 7131
You could use NLopt, which includes several global optimizers, e.g. DIRECT (Windows builds can be downloaded at gohlke). Or there is scipy's basinhopping. Another nice option is NOMAD, a very good black-box optimizer. It also has a Python interface, but it is not that user friendly and intuitive.
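As a minimal sketch of the basinhopping route, applied to a loss built from the formula in the question (the data arrays below are made up only to keep the example self-contained):

```python
import numpy as np
from scipy.optimize import basinhopping

# Made-up data: per-image values x and a synthetic "ground truth"
# generated with alpha=1.5, beta=3.0, just for demonstration.
x = np.linspace(0.1, 10.0, 50)
y_true = 1.0 - np.exp(np.log(0.5) * (x / 3.0) ** 1.5)

def loss(params):
    """Mean squared error between the model and the ground truth."""
    alpha, beta = params
    y_pred = 1.0 - np.exp(np.log(0.5) * (x / beta) ** alpha)
    return np.mean((y_pred - y_true) ** 2)

# basinhopping repeats a bounded local minimization (L-BFGS-B here)
# from randomly perturbed starting points, which helps escape the
# local minima that plain gradient descent gets stuck in.
result = basinhopping(
    loss,
    x0=[1.0, 1.0],
    minimizer_kwargs={"method": "L-BFGS-B",
                      "bounds": [(0.1, 10.0), (0.1, 10.0)]},
    niter=200,
)
print(result.x)  # recovered (alpha, beta)
```

The bounds and the starting point are arbitrary here; in practice you would pick them from what is plausible for your images.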
You can find other hints on local and global optimization in this answer or this answer.
Upvotes: 2