ThierryONERA

Reputation: 71

openmdao v1.4 optimization with metamodel

I wish to perform an optimization with OpenMDAO 1.4 on a metamodel. Using the tutorials I have built up a problem that I do not manage to solve. I think the problem comes from a misuse of setup() and run(): I do not manage to train my metamodel and to optimize on it at the same time (perhaps I should use two different "groups" to do this). Here is my code:

from __future__ import print_function


from openmdao.api import Component, Group, MetaModel ,IndepVarComp, ExecComp, NLGaussSeidel, KrigingSurrogate, FloatKrigingSurrogate

import numpy as np


class KrigMM(Group):
    ''' FloatKriging gives responses as floats '''

    def __init__(self):
        super(KrigMM, self).__init__()

        # Create meta_model for f_x as the response

        pmm = self.add("pmm", MetaModel())
        pmm.add_param('x', val=0.)

        pmm.add_output('f_x:float', val=0., surrogate=FloatKrigingSurrogate())
        self.add('p1', IndepVarComp('x', 0.0))

        self.connect('p1.x','pmm.x')

       # mm.add_output('f_xy:norm_dist', val=(0.,0.), surrogate=KrigingSurrogate())


if __name__ == '__main__':
    # Setup and run the model.

    from openmdao.core.problem import Problem
    from openmdao.drivers.scipy_optimizer import ScipyOptimizer
    from openmdao.core.driver import Driver

    import numpy as np
    # import doe_lhs  # unused in this script

    #prob = Problem(root=ParaboloidProblem())
###########################################################    

    prob = Problem(root=Group())
    prob.root.add('meta',KrigMM(), promotes=['*'])

    prob.driver = ScipyOptimizer()
    prob.driver.options['optimizer'] = 'SLSQP'

    prob.driver.add_desvar('p1.x', lower=0, upper=10)

    prob.driver.add_objective('pmm.f_x:float')
    prob.setup()
    prob['pmm.train:x'] = np.linspace(0,10,20)
    prob['pmm.train:f_x:float'] = np.sin(prob['pmm.train:x'])      
    prob.run()

    print('\n')
    print('Minimum of %f found for meta at %f' % (prob['pmm.f_x:float'],prob['pmm.x'])) #predicted value 

Upvotes: 1

Views: 165

Answers (1)

Justin Gray

Reputation: 5710

I believe your problem is actually working fine. It's just that the sinusoid you've picked has a local optimum at 0.0, which happens to be your initial condition.

If I change the initial condition as follows:

prob.setup()
prob['p1.x'] = 5
prob['pmm.train:x'] = np.linspace(0,10,20)
prob['pmm.train:f_x:float'] = np.sin(prob['pmm.train:x'])      
prob.run()

I get:

Optimization terminated successfully.    (Exit mode 0)
        Current function value: [-1.00004544]
        Iterations: 3
        Function evaluations: 3
        Gradient evaluations: 3
Optimization Complete
-----------------------------------


Minimum of -1.000045 found for meta at 4.710483
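As a side note, the same behavior can be reproduced on the underlying sinusoid directly, without OpenMDAO or a surrogate. This is just a minimal sketch using SciPy's SLSQP to illustrate the point: started at x = 0.0, the gradient of sin(x) points into the feasible region at the lower bound, so the optimizer stops immediately at the boundary local optimum; started at x = 5.0, it converges to the true minimum near 3π/2 ≈ 4.712.

```python
import numpy as np
from scipy.optimize import minimize

# Scalar objective: sin(x) on the bounded interval [0, 10].
f = lambda x: float(np.sin(x[0]))

# Started at the lower bound, where d/dx sin(x) = cos(0) = 1 > 0,
# SLSQP satisfies the KKT conditions at x = 0 and terminates there.
res0 = minimize(f, x0=[0.0], method='SLSQP', bounds=[(0, 10)])

# Started at x = 5.0, it descends to the interior minimum at 3*pi/2.
res5 = minimize(f, x0=[5.0], method='SLSQP', bounds=[(0, 10)])

print(res0.x[0], res0.fun)  # stays at x = 0.0, f = 0.0
print(res5.x[0], res5.fun)  # x ~ 4.712, f ~ -1.0
```

The Kriging surrogate reproduces sin(x) closely over the trained range, which is why the metamodel run reports essentially the same minimum (-1.000045 at 4.710483).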

Upvotes: 1
