Reputation: 3
I'm trying to implement the ASO (Asymmetric Subspace Optimization) architecture for the Sellar problem, but the subproblem is not working as I expected, and I have no idea how to fix it.
Although the Sellar problem is not a good example for understanding the ASO architecture (both disciplines have pretty much the same computational cost), it's a simple formulation and I'm using it as a benchmark before applying the architecture to the problem I'm actually studying. My code is (OpenMDAO 1.7):
import numpy as np

from openmdao.api import Group, IndepVarComp, ExecComp, NLGaussSeidel, ScipyGMRES, SubProblem
# SellarDis1 and SellarDis2 are the standard Sellar discipline components, defined elsewhere.


class low_order(Group):

    def __init__(self):
        super(low_order, self).__init__()

        self.add('plx', IndepVarComp('x', 1.0), promotes=['x'])
        self.add('plz', IndepVarComp('z', np.array([5.0, 2.0])), promotes=['z'])

        self.add('d1', SellarDis1(), promotes=['x', 'z', 'y1', 'y2'])
        #self.add('d2', SellarDis2(), promotes=['z', 'y1', 'y2'])

        self.add('obj_cmp', ExecComp('obj = x**2 + z[1] + y1 + exp(-y2)',
                                     z=np.array([0.0, 0.0])),
                 promotes=['obj', 'x', 'z', 'y1', 'y2'])

        self.add('con_cmp1', ExecComp('con1 = 3.16 - y1'), promotes=['con1', 'y1'])

        self.deriv_options['type'] = 'fd'
        self.deriv_options['form'] = 'central'

class SellarDerivatives(Group):

    def __init__(self):
        super(SellarDerivatives, self).__init__()

        self.add('ppx', IndepVarComp('x', 1.0), promotes=['x'])
        self.add('pz', IndepVarComp('z', np.array([5.0, 2.0])), promotes=['z'])

        self.add('d2', SellarDis2(), promotes=['z', 'y1', 'y2'])

        self.add('obj_cmp1', ExecComp('obj1 = x**2 + z[1] + y1 + exp(-y2)',
                                      z=np.array([0.0, 0.0])),
                 promotes=['obj1', 'x', 'z', 'y1', 'y2'])

        #self.add('con_cmp1', ExecComp('con1 = 3.16 - y1'), promotes=['con1', 'y1'])
        self.add('con_cmp2', ExecComp('con2 = y2 - 24.0'), promotes=['con2', 'y2'])

        self.nl_solver = NLGaussSeidel()
        self.nl_solver.options['atol'] = 1.0e-12
        self.ln_solver = ScipyGMRES()

        self.deriv_options['type'] = 'fd'
        self.deriv_options['form'] = 'central'

if __name__ == '__main__':

    from openmdao.api import Problem, ScipyOptimizer, SqliteRecorder

    sub = Problem()
    sub.root = low_order()

    sub.driver = ScipyOptimizer()  # pyOptSparseDriver()
    sub.driver.options['optimizer'] = 'SLSQP'
    sub.driver.options['disp'] = False

    sub.driver.add_desvar('x', lower=0.0, upper=10.0)
    sub.driver.add_objective('obj')
    sub.driver.add_constraint('con1', upper=0.0)
    #sub.driver.add_constraint('con2', upper=0.0)

    top = Problem()
    top.root = SellarDerivatives()

    top.driver = ScipyOptimizer()  # pyOptSparseDriver()
    top.driver.options['optimizer'] = 'SLSQP'

    top.driver.add_desvar('z', lower=np.array([-10.0, 0.0]),
                               upper=np.array([10.0, 10.0]))

    top.root.add('subprob1', SubProblem(sub, params=['z'], unknowns=['y1', 'x']))
    top.root.connect('z', 'subprob1.z')

    top.driver.add_objective('obj1')
    #top.driver.add_constraint('con1', upper=0.0)
    top.driver.add_constraint('con2', upper=0.0)

    top.setup()
    top.run()
When I look at the iterations, I see that only Z1 and Z2 vary as I expected; I have no idea what is happening with X. The lines below are X | Z1 | Z2:
(8.881784197001252e-16, 5.000916977558285, 1.000912901563544)
(1.0000000008881784e-06, 5.000916977558285, 1.000912901563544)
(-9.999999991118215e-07, 5.000916977558285, 1.000912901563544)
(8.881784197001252e-16, 5.000912478379066, 1.0009120015635442)
(8.881784197001252e-16, 5.000912478379066, 1.0009120015635442)
(1.0000000008881784e-06, 5.000912478379066, 1.0009120015635442)
(-9.999999991118215e-07, 5.000912478379066, 1.0009120015635442)
And the final answer is ( Z1 , Z2 , X ):
Minimum found at (5.000912, 1.000912, 1.000000)
Coupling vars: 0.000000, 6.001824
('Minimum objective: ', 2.0033861370124493)
P.S. Since the Sellar problem is not my main objective, I'm using FD across the whole model just to keep things simple.
Thanks!
Upvotes: 0
Views: 121
Reputation: 2202
I got this to work with a few changes. (It didn't work at all with the code you sent, but we may have tightened some error checking between 1.7 and 1.7.3.)
You need to pass y2 into the subproblem to complete the cycle:
top.root.add('subprob1', SubProblem(sub, params=['z', 'y2'], unknowns=['y1', 'x']), promotes=['*'])
And you need to remove the extra IndepVarComp for 'x' from the top SellarDerivatives group, because you are letting the subproblem be the ultimate source of that variable:
#self.add('ppx', IndepVarComp('x', 1.0), promotes=['x'])
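Putting both changes together, a minimal sketch of the corrected top-level script might look like the following. It reuses the sub Problem, imports, and driver setup from your question, assumes the ppx IndepVarComp has been deleted from SellarDerivatives, and keeps ScipyOptimizer (swap in pyOptSparseDriver if you prefer):

top = Problem()
top.root = SellarDerivatives()

top.driver = ScipyOptimizer()  # or pyOptSparseDriver()
top.driver.options['optimizer'] = 'SLSQP'

top.driver.add_desvar('z', lower=np.array([-10.0, 0.0]),
                           upper=np.array([10.0, 10.0]))
top.driver.add_objective('obj1')
top.driver.add_constraint('con2', upper=0.0)

# 'y2' as a param and 'y1' as an unknown close the y1 <-> y2 cycle between
# SellarDis2 (in the top group) and SellarDis1 (inside the subproblem), and
# promoting '*' removes the need for an explicit connect('z', 'subprob1.z');
# the group's NLGaussSeidel solver then converges the coupling.
top.root.add('subprob1', SubProblem(sub, params=['z', 'y2'],
                                    unknowns=['y1', 'x']),
             promotes=['*'])

top.setup()
top.run()

With the cycle closed, the top-level optimizer only drives z, while x is left to the subproblem's own SLSQP, which matches the ASO split you set up.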
With those changes and using pyoptsparse, I get for the sub:
Objectives:
    Name    Value      Optimum
    obj     3.18339    0

Variables (c - continuous, i - integer, d - discrete):
    Name    Type    Value       Lower Bound    Upper Bound
    x_0     c       0.000004    0.00e+00       1.00e+01
and for the main problem:
Objectives:
    Name    Value      Optimum
    obj1    3.18339    0

Variables (c - continuous, i - integer, d - discrete):
    Name    Type    Value       Lower Bound    Upper Bound
    z_0     c       1.977638    -1.00e+01      1.00e+01
    z_1     c       0.000000    0.00e+00       1.00e+01
Upvotes: 1