havakok

Reputation: 1247

example code giving AttributeError: 'AdamOptimizer' object has no attribute '_beta1_power'

I am trying to run the example ./quick_scripts/celebA_superres.sh "./images/182659.jpg" from the Demo section. I am new to Python and am getting the following error:

AttributeError: 'AdamOptimizer' object has no attribute '_beta1_power'

As best I understand, an instance of a class called 'AdamOptimizer' does not have an attribute named '_beta1_power'. The code is the following:

def get_opt_reinit_op(opt, var_list, global_step):
    opt_slots = [opt.get_slot(var, name) for name in opt.get_slot_names() for var in var_list]
    if isinstance(opt, tf.train.AdamOptimizer):
        opt_slots.extend([opt._beta1_power, opt._beta2_power])  #pylint: disable = W0212
    all_opt_variables = opt_slots + var_list + [global_step]
    opt_reinit_op = tf.variables_initializer(all_opt_variables)
    return opt_reinit_op

The line opt_slots.extend([opt._beta1_power, opt._beta2_power]) is the one that produces the error.

I do not see an 'AdamOptimizer' anywhere in this function; I am guessing it is hiding in opt? How do I debug something like this? Is there a good practice for this kind of debugging?

I should mention that I used 2to3 to convert the code to Python 3. Is this of any importance?

Attaching full traceback:

Traceback (most recent call last):
  File "./src/compressed_sensing.py", line 177, in <module>
    main(HPARAMS)
  File "./src/compressed_sensing.py", line 21, in main
    estimators = utils.get_estimators(hparams)
  File "/home/erezsh/Projects/CSGM/csgm3/src/utils.py", line 98, in get_estimators
    estimators = {model_type: get_estimator(hparams, model_type) for model_type in hparams.model_types}
  File "/home/erezsh/Projects/CSGM/csgm3/src/utils.py", line 98, in <dictcomp>
    estimators = {model_type: get_estimator(hparams, model_type) for model_type in hparams.model_types}
  File "/home/erezsh/Projects/CSGM/csgm3/src/utils.py", line 91, in get_estimator
    estimator = celebA_estimators.dcgan_estimator(hparams)
  File "/home/erezsh/Projects/CSGM/csgm3/src/celebA_estimators.py", line 185, in dcgan_estimator
    opt_reinit_op = utils.get_opt_reinit_op(opt, var_list, global_step)
  File "/home/erezsh/Projects/CSGM/csgm3/src/utils.py", line 545, in get_opt_reinit_op
    opt_slots.extend([opt._beta1_power, opt._beta2_power])  #pylint: disable = W0212
AttributeError: 'AdamOptimizer' object has no attribute '_beta1_power'

Upvotes: 0

Views: 2916

Answers (1)

shmee

Reputation: 5101

The AdamOptimizer is certainly hiding in opt: the line raising the error is only executed once isinstance has determined that opt is an instance of tf.train.AdamOptimizer:

if isinstance(opt, tf.train.AdamOptimizer):
    opt_slots.extend([opt._beta1_power, opt._beta2_power])
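If you want to see for yourself what is hiding in opt and which attributes your TensorFlow version actually gives it, you could temporarily add something like this just above the failing line:

print(type(opt))                                # e.g. tensorflow.python.training.adam.AdamOptimizer
print([a for a in dir(opt) if 'beta' in a])     # lists the beta-related members that actually exist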

I don't think that 2to3 is to blame here; I rather expect your installed version of TensorFlow to be too new. The requirements for csgm list TensorFlow 1.0.1. In that version, _beta1_power and _beta2_power were still attributes of the AdamOptimizer. This changed in version 1.6: beta1_power and beta2_power are now assigned locally in the functions that require them. You could probably get the values by calling _get_beta_accumulators() on the optimizer, but then I guess it will just break on the next occasion where newer versions behave differently.
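If you prefer adapting the code to your current TensorFlow 1.x install over downgrading, a minimal, untested sketch of that workaround could look like this (note that _get_beta_accumulators() is a private API, so it may break again in a later release):

# Possible drop-in replacement for get_opt_reinit_op in utils.py,
# assuming TensorFlow >= 1.6 where the beta accumulators are no longer
# plain _beta1_power/_beta2_power attributes.
import tensorflow as tf

def get_opt_reinit_op(opt, var_list, global_step):
    opt_slots = [opt.get_slot(var, name) for name in opt.get_slot_names() for var in var_list]
    if isinstance(opt, tf.train.AdamOptimizer):
        # _get_beta_accumulators() returns a (beta1_power, beta2_power) tuple of variables
        opt_slots.extend(opt._get_beta_accumulators())  # pylint: disable=protected-access
    all_opt_variables = opt_slots + var_list + [global_step]
    opt_reinit_op = tf.variables_initializer(all_opt_variables)
    return opt_reinit_op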

Upvotes: 3
