pte

Reputation: 61

Custom layers in TensorFlow

I am trying to make some changes to the built-in dropout function in TensorFlow. What is the best procedure to do so?

I'd like to make some changes to both the forward and backpropagation steps. In the TensorFlow implementation I can only find the forward pass, not the backward pass. I'd like to modify both.

Upvotes: 0

Views: 499

Answers (1)

William

Reputation: 181

You can use tf.custom_gradient to define your own forward and backward passes in a single function. Here is a simple example:

import tensorflow as tf

# TF1-style graph mode; an interactive session lets .eval() run below
tf.InteractiveSession()

@tf.custom_gradient
def custom_multiply(a, x):
  # Define your own forward step
  y = a * x
  # Define your own backward step: return one gradient per input (a, x);
  # the "+ 100" makes the customization easy to spot
  def grads(dy):
    return dy * x, dy * a + 100
  # Return the forward result and the backward function
  return y, grads

a, x = tf.constant(2.), tf.constant(3.)
y = custom_multiply(a, x)
dy_dx = tf.gradients(y, x)[0]
# Prints `dy/dx = 102.0`; with the standard gradient it would be 2.0
print('dy/dx =', dy_dx.eval())

If you want to customize your own layer, you can replace the core function used in tf.layers.Dropout.call with your own.
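
For instance, here is a rough sketch of a standalone dropout with a customizable backward pass. The name make_dropout is made up for illustration, and the forward pass mirrors standard inverted dropout rather than the exact internals of tf.layers.Dropout:

import tensorflow as tf

def make_dropout(rate):
  @tf.custom_gradient
  def dropout_fn(x):
    keep_prob = 1.0 - rate
    # Forward pass: random binary mask, scaled by 1/keep_prob so the
    # expected activation is unchanged (standard inverted dropout)
    mask = tf.cast(tf.random_uniform(tf.shape(x)) < keep_prob, x.dtype) / keep_prob
    def grads(dy):
      # Backward pass: by default, gradients flow through the same mask;
      # edit this line to experiment with a different backward rule
      return dy * mask
    return x * mask, grads
  return dropout_fn

x = tf.constant([1., 2., 3., 4.])
y = make_dropout(rate=0.5)(x)
dy_dx = tf.gradients(y, x)[0]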

Upvotes: 1
