seni

Reputation: 711

WARNING:tensorflow:AutoGraph could not transform <function <lambda> at 0x7fca141a6d08> and will run it as-is

I am implementing the TFF image classification code (TFF version 0.18.0). I wrote this:

iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    server_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=1.0),
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.001))

state = iterative_process.initialize()

But I get this warning:

WARNING:tensorflow:AutoGraph could not transform <function <lambda> at 0x7fca141a6d08> and will run it as-is.
Cause: could not parse the source code of <function <lambda> at 0x7fca141a6d08>: found multiple definitions with identical signatures at the location. This error may be avoided by defining each lambda on a single line and with unique argument names.
Match 0:
(lambda : tf.keras.optimizers.SGD(learning_rate=1.0))

Match 1:
(lambda : tf.keras.optimizers.SGD(learning_rate=0.001))

To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert

How can I avoid this warning? Thanks.

Upvotes: 2

Views: 1711

Answers (1)

Keith Rush

Reputation: 1405

First, this warning does not seem problematic to me. TFF takes functions that construct optimizers, rather than optimizer instances, because some optimizers rely on internal variables: Adagrad and Adam, for example, are stateful and use variables to track preconditioning and momentum terms. TFF needs to capture the construction of these variables so the correct code can run on device. AutoGraph conversion is not needed for this purpose; plain Python functions are sufficient.
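For instance (a hypothetical sketch, not from the original answer), a stateful optimizer such as Adam is passed the same way, and TFF invokes the constructor itself so the optimizer's slot variables are created in the right context:

# Hypothetical sketch: Adam is stateful (it keeps momentum and variance
# slot variables per model weight), so TFF must call this constructor
# itself to capture variable creation and place those variables correctly.
def stateful_client_optimizer_fn():
  return tf.keras.optimizers.Adam(learning_rate=0.001)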

Second, one simple option that I believe would silence the warning is to use named functions for your optimizer fns. That is, if you used something like


# The server applies aggregated updates with a plain SGD step.
def server_optimizer_fn():
  return tf.keras.optimizers.SGD(learning_rate=1.0)

# Each client trains locally with SGD at a smaller learning rate.
def client_optimizer_fn():
  return tf.keras.optimizers.SGD(learning_rate=0.001)

iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    server_optimizer_fn=server_optimizer_fn,
    client_optimizer_fn=client_optimizer_fn)

Autograph should no longer complain.
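
Alternatively, the warning itself points to another route: decorating the functions with @tf.autograph.experimental.do_not_convert. A sketch of that approach, keeping the lambdas (this tells AutoGraph to skip them rather than restructuring the code):

# Sketch following the warning's own suggestion: wrap the lambdas so
# AutoGraph does not attempt to convert them.
server_optimizer_fn = tf.autograph.experimental.do_not_convert(
    lambda: tf.keras.optimizers.SGD(learning_rate=1.0))
client_optimizer_fn = tf.autograph.experimental.do_not_convert(
    lambda: tf.keras.optimizers.SGD(learning_rate=0.001))

iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    server_optimizer_fn=server_optimizer_fn,
    client_optimizer_fn=client_optimizer_fn)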

Upvotes: 1
