Reputation: 524
For learning purposes I have a task to implement linear and sigmoid operations in TensorFlow. I managed to do the linear op:
import numpy as np
import tensorflow as tf

def linear_op_forward(X, W):
    ''' Linear operation '''
    return np.dot(X, W.T)

def linear_op_backward(op, grads):
    ''' Linear gradient realization '''
    X = op.inputs[0]
    W = op.inputs[1]
    dX = tf.multiply(grads, W)
    dW = tf.reduce_sum(tf.multiply(X, grads),
                       axis=0,
                       keep_dims=True)
    return dX, dW
But I'm stuck on the sigmoid operation. Is this forward pass correct?

def sigmoid_op_forward(X):
    return 1 / (1 + np.exp(-X))
And I'm having a hard time understanding the sigmoid gradient:

def sigmoid_op_backward(op, grads):
    ???
Can someone please help with this?
Upvotes: 2
Views: 551
Reputation: 53758
Try this:
def sigmoid_op_backward(op, grads):
    sigmoid = op.outputs[0]
    return sigmoid * (1 - sigmoid) * grads
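This works because the sigmoid derivative can be written in terms of the sigmoid's own output: if s(x) = 1 / (1 + exp(-x)), then s'(x) = s(x) * (1 - s(x)), so the backward op just reuses op.outputs[0] and multiplies by the incoming gradient (chain rule). A quick standalone NumPy sanity check (not TensorFlow-specific) comparing the analytic formula against a finite-difference approximation:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

x = np.linspace(-3, 3, 7)

# analytic derivative: s'(x) = s(x) * (1 - s(x))
analytic = sigmoid(x) * (1 - sigmoid(x))

# central finite-difference approximation of the derivative
eps = 1e-6
numeric = (sigmoid(x + eps) - sigmoid(x - eps)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-6))  # prints True
```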
Upvotes: 2