ThierryONERA

Reputation: 71

OpenMDAO v1.7 Sellar MDF

I found something strange with the MDA of the Sellar problem on the OpenMDAO doc page (http://openmdao.readthedocs.io/en/1.7.3/usr-guide/tutorials/sellar.html).

If I extract the code and only run the MDA (adding counters in the disciplines), I observe that the number of calls differs between the disciplines (d1 is called twice as many times as d2), which is not expected. Does anyone have an explanation?

Here are the results:

Coupling vars: 25.588303, 12.058488 Number of discipline 1 and 2 calls (10,5)

And here is the code:

# For printing, use this import if you are running Python 2.x
from __future__ import print_function


import numpy as np

from openmdao.api import Component
from openmdao.api import ExecComp, IndepVarComp, Group, NLGaussSeidel, \
                         ScipyGMRES

class SellarDis1(Component):
    """Component containing Discipline 1."""

    def __init__(self):
        super(SellarDis1, self).__init__()

        # Global Design Variable
        self.add_param('z', val=np.zeros(2))

        # Local Design Variable
        self.add_param('x', val=0.)

        # Coupling parameter
        self.add_param('y2', val=1.0)

        # Coupling output
        self.add_output('y1', val=1.0)
        self.execution_count = 0

    def solve_nonlinear(self, params, unknowns, resids):
        """Evaluates the equation
        y1 = z1**2 + z2 + x1 - 0.2*y2"""

        z1 = params['z'][0]
        z2 = params['z'][1]
        x1 = params['x']
        y2 = params['y2']

        unknowns['y1'] = z1**2 + z2 + x1 - 0.2*y2
        self.execution_count += 1

    def linearize(self, params, unknowns, resids):
        """ Jacobian for Sellar discipline 1."""
        J = {}

        J['y1','y2'] = -0.2
        J['y1','z'] = np.array([[2*params['z'][0], 1.0]])
        J['y1','x'] = 1.0

        return J


class SellarDis2(Component):
    """Component containing Discipline 2."""

    def __init__(self):
        super(SellarDis2, self).__init__()

        # Global Design Variable
        self.add_param('z', val=np.zeros(2))

        # Coupling parameter
        self.add_param('y1', val=1.0)

        # Coupling output
        self.add_output('y2', val=1.0)
        self.execution_count = 0

    def solve_nonlinear(self, params, unknowns, resids):
        """Evaluates the equation
        y2 = y1**(.5) + z1 + z2"""

        z1 = params['z'][0]
        z2 = params['z'][1]
        y1 = params['y1']

        # Note: this may cause some issues. However, y1 is constrained to be
        # above 3.16, so lets just let it converge, and the optimizer will
        # throw it out
        y1 = abs(y1)

        unknowns['y2'] = y1**.5 + z1 + z2
        self.execution_count += 1

    def linearize(self, params, unknowns, resids):
        """ Jacobian for Sellar discipline 2."""
        J = {}

        J['y2', 'y1'] = .5*params['y1']**-.5

        #Extra set of brackets below ensure we have a 2D array instead of a 1D array
        # for the Jacobian;  Note that Jacobian is 2D (num outputs x num inputs).
        J['y2', 'z'] = np.array([[1.0, 1.0]])

        return J



class SellarDerivatives(Group):
    """ Group containing the Sellar MDA. This version uses the disciplines
    with derivatives."""

    def __init__(self):
        super(SellarDerivatives, self).__init__()

        self.add('px', IndepVarComp('x', 1.0), promotes=['x'])
        self.add('pz', IndepVarComp('z', np.array([5.0, 2.0])), promotes=['z'])

        self.add('d1', SellarDis1(), promotes=['z', 'x', 'y1', 'y2'])
        self.add('d2', SellarDis2(), promotes=['z', 'y1', 'y2'])

        self.add('obj_cmp', ExecComp('obj = x**2 + z[1] + y1 + exp(-y2)',
                                     z=np.array([0.0, 0.0]), x=0.0, y1=0.0, y2=0.0),
                 promotes=['obj', 'z', 'x', 'y1', 'y2'])

        self.add('con_cmp1', ExecComp('con1 = 3.16 - y1'), promotes=['y1', 'con1'])
        self.add('con_cmp2', ExecComp('con2 = y2 - 24.0'), promotes=['con2', 'y2'])

        self.nl_solver = NLGaussSeidel()
        self.nl_solver.options['atol'] = 1.0e-12

        self.ln_solver = ScipyGMRES()

from openmdao.api import Problem, ScipyOptimizer

top = Problem()
top.root = SellarDerivatives()

#top.driver = ScipyOptimizer()
#top.driver.options['optimizer'] = 'SLSQP'
#top.driver.options['tol'] = 1.0e-8
#
#top.driver.add_desvar('z', lower=np.array([-10.0, 0.0]),
#                     upper=np.array([10.0, 10.0]))
#top.driver.add_desvar('x', lower=0.0, upper=10.0)
#
#top.driver.add_objective('obj')
#top.driver.add_constraint('con1', upper=0.0)
#top.driver.add_constraint('con2', upper=0.0)

top.setup()

# Setting initial values for design variables
top['x'] = 1.0
top['z'] = np.array([5.0, 2.0])

top.run()

print("\n")

print("Coupling vars: %f, %f" % (top['y1'], top['y2']))


count1 = top.root.d1.execution_count 
count2 = top.root.d2.execution_count 
print("Number of discipline 1 and 2 calls (%i,%i)"% (count1,count2))

Upvotes: 1

Views: 152

Answers (1)

Kenneth Moore

Reputation: 2202

This is a good observation. Whenever you have a cycle, the "head" component runs a second time. The reason is as follows:

If you have a model with components that contain implicit states, a single execution looks like this:

  1. Call solve_nonlinear to execute components
  2. Call apply_nonlinear to calculate the residuals.
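
For concreteness, here is a rough sketch (mine, not from the tutorial) of that two-step pattern for a single component with an implicit state, assuming the add_state / apply_nonlinear API of OpenMDAO 1.x: solve_nonlinear converges the component's own state, while apply_nonlinear only evaluates the residual at the current state. The class name and the x = cos(x) equation are made up for illustration.

import numpy as np
from openmdao.api import Component

class ImplicitSketch(Component):
    """Implicitly defines the state x through the residual r = x - cos(x)."""

    def __init__(self):
        super(ImplicitSketch, self).__init__()
        self.add_state('x', val=0.5)

    def solve_nonlinear(self, params, unknowns, resids):
        # Step 1: "execute" the component, i.e. converge its own state.
        for _ in range(50):
            unknowns['x'] = np.cos(unknowns['x'])

    def apply_nonlinear(self, params, unknowns, resids):
        # Step 2: evaluate the residual at the current value of the state.
        resids['x'] = unknowns['x'] - np.cos(unknowns['x'])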

We don't have any components with implicit states in this model, but we indirectly created the need for one by having a cycle. Our execution looks like this:

  1. Call solve_nonlinear to execute all components.
  2. Call apply_nonlinear (which caches the unknowns, calls solve_nonlinear, and saves the difference in unknowns) on just the "head" component to generate a residual that we can converge.
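
In pseudocode, the extra evaluation on the head component amounts to something like the following (this is only an illustration of the bookkeeping described above, not OpenMDAO's actual source; head_output_names is a stand-in for whatever outputs the head component owns, e.g. 'y1' for d1):

def head_residual(head, params, unknowns, resids):
    # Cache the head component's current outputs.
    cached = {name: unknowns[name] for name in head_output_names}
    # Re-run the head component one extra time ...
    head.solve_nonlinear(params, unknowns, resids)
    # ... and use the change in its outputs as the residual to converge.
    for name in cached:
        resids[name] = unknowns[name] - cached[name]

Each solver iteration therefore calls the head component's solve_nonlinear twice (once in step 1 and once inside step 2), which is consistent with the (10, 5) counts reported in the question.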

Here, the head component is just the first component that gets executed, based on however OpenMDAO decides the execution order within the cycle. You can verify that only a single head component gets extra runs by building a cycle with more than two components, as in the sketch below.
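
A minimal sketch of that check, reusing the OpenMDAO 1.x API already shown in the question's code (the PassThrough class, the variable names ya/yb/yc, and the toy equation y_out = 0.5*y_in + 1 are made up for the test):

from openmdao.api import Component, Group, Problem, NLGaussSeidel

class PassThrough(Component):
    """Toy component y_out = 0.5*y_in + 1.0 with an execution counter."""

    def __init__(self, in_name, out_name):
        super(PassThrough, self).__init__()
        self.add_param(in_name, val=1.0)
        self.add_output(out_name, val=1.0)
        self.in_name = in_name
        self.out_name = out_name
        self.execution_count = 0

    def solve_nonlinear(self, params, unknowns, resids):
        unknowns[self.out_name] = 0.5*params[self.in_name] + 1.0
        self.execution_count += 1

root = Group()
# a -> b -> c -> a forms a single 3-component cycle.
root.add('a', PassThrough('yc', 'ya'), promotes=['yc', 'ya'])
root.add('b', PassThrough('ya', 'yb'), promotes=['ya', 'yb'])
root.add('c', PassThrough('yb', 'yc'), promotes=['yb', 'yc'])
root.nl_solver = NLGaussSeidel()
root.nl_solver.options['atol'] = 1.0e-12

top = Problem()
top.root = root
top.setup()
top.run()

# Only the head of the cycle should show the extra calls; the other two
# components should share the same, lower count.
print(top.root.a.execution_count,
      top.root.b.execution_count,
      top.root.c.execution_count)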

Upvotes: 1
