Reputation: 71
import tensorflow as tf
import numpy as np
tf.enable_eager_execution()
x_data = [[1,2,1,1],[2,1,3,2],[3,1,3,4],[4,1,5,5],[1,7,5,5],[1,2,5,6],[1,6,6,6],[1,7,7,7]]
y_data = [[0,0,1],[0,0,1],[0,0,1],[0,1,0],[0,1,0],[0,1,0],[1,0,0],[1,0,0]]
x_data = np.asarray(x_data, dtype=np.float32)
y_data = np.asarray(y_data, dtype=np.float32)
nb_classes = 3
W = tf.Variable(tf.random_normal([4, nb_classes]), name='weight')
b = tf.Variable(tf.random_normal([nb_classes]), name='bias')
variables = [W,b]
def hypothesis(X):
    hypo = tf.nn.softmax(tf.matmul(X, W) + b)
    return hypo

def cost_fn(X, Y):
    logits = hypothesis(X)
    cost = -tf.reduce_sum(Y * tf.log(logits), axis=1)
    cost_mean = tf.reduce_mean(cost)
    return cost_mean

def grad_fn(X, Y):
    with tf.GradientTape as tape:
        cost = cost_fn(X, Y)
    grads = tape.gradient(cost, variables)
    return grads
So I was experimenting with classification and writing a gradient function for a gradient descent optimizer, and the error occurred at the last part of the code:
with tf.GradientTape as tape:
An AttributeError: __enter__ occurred and I don't understand why. Can I get an explanation of the error or a way to solve it?
Upvotes: 3
Views: 1634
Reputation: 11333
You are missing the parentheses on your GradientTape. Without them, the with statement receives the GradientTape class itself rather than an instance of it, so Python cannot use it as a context manager and raises AttributeError: __enter__. It should be as follows.
def grad_fn(X, Y):
    with tf.GradientTape() as tape:
        cost = cost_fn(X, Y)
    grads = tape.gradient(cost, variables)
    return grads
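For completeness, here is a minimal sketch of how grad_fn could feed a gradient descent optimizer in your TF 1.x eager setup. The learning rate and step count are arbitrary choices for illustration, not from your code:

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)

for step in range(1000):
    grads = grad_fn(x_data, y_data)
    # Pair each gradient with the variable it belongs to and update in place.
    optimizer.apply_gradients(zip(grads, variables))
    if step % 100 == 0:
        # cost_fn returns a scalar tensor; .numpy() extracts its value for printing.
        print('step: {}, cost: {}'.format(step, cost_fn(x_data, y_data).numpy()))

Note that grads comes back in the same order as variables, so zip(grads, variables) gives apply_gradients the (gradient, variable) pairs it expects.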
Upvotes: 8