
Reputation: 97

Tensorflow No gradients provided for any variable

I'm new to TensorFlow and machine learning in general. I'm trying to create a model to detect brain tumors from MRI scans.

I'm splitting the data using validation_split. After compiling the model, when I try to fit it using the .fit function, I get the error below. After googling, I found it might be because I'm not passing the y parameter when calling the fit function.

Code:

datagen = ImageDataGenerator(validation_split=0.2, rescale=1. / 255)

train_generator = datagen.flow_from_directory(
    TRAIN_DIR,
    target_size=(150, 150),
    batch_size=32,
    class_mode='binary',
    subset='training'
)

val_generator = datagen.flow_from_directory(
    TRAIN_DIR,
    target_size=(150, 150),
    batch_size=32,
    class_mode='binary',
    subset='validation'
)

model = tf.keras.models.Sequential()
model.add(
    tf.keras.layers.Conv2D(
        16,
        (3, 3),
        activation='relu',
        input_shape=(150, 150, 3)
    )
)
model.add(
    tf.keras.layers.MaxPool2D(2, 2)
)

...
# some more layers
...

model.compile(
    optimizer='adam',
    loss=None,
    metrics=['accuracy'],
)

print(model.summary())

Test = model.fit(
    train_generator,
    epochs=2,
    verbose=1,
    validation_data=val_generator
)

What am I doing wrong?

Folder structure for the images:

images
|
├── training
│   ├── no
│   ├── yes
├── testing
│   ├── no
│   ├── yes

Exact Error Message:

ValueError: No gradients provided for any variable: ['conv2d/kernel:0', 'conv2d/bias:0', 'conv2d_1/kernel:0', 'conv2d_1/bias:0', 'conv2d_2/kernel:0', 'conv2d_2/bias:0', 'dense/kernel:0', 'dense/bias:0', 'dense_1/kernel:0', 'dense_1/bias:0'].

Output of model.summary():

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
conv2d (Conv2D)              (None, 148, 148, 16)      448
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 74, 74, 16)        0
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 72, 72, 32)        4640
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 36, 36, 32)        0
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 34, 34, 64)        18496
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 17, 17, 64)        0
_________________________________________________________________
flatten (Flatten)            (None, 18496)             0
_________________________________________________________________
dense (Dense)                (None, 512)               9470464
_________________________________________________________________
dense_1 (Dense)              (None, 2)                 1026
=================================================================
Total params: 9,495,074
Trainable params: 9,495,074
Non-trainable params: 0

Upvotes: 0

Views: 181

Answers (1)

krenerd

Reputation: 791

This is because you set the loss to None, so no gradient is propagated from the loss function back to your model's weights. Change

model.compile(
    optimizer='adam',
    loss=None,
    metrics=['accuracy'],
)

to

model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',  # matches the 2-unit output with integer labels
    metrics=['accuracy'],
)
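As a quick sanity check, here is a minimal sketch with random arrays standing in for the MRI generators (the tiny model and shapes are just placeholders): once a real loss is set, gradients flow and fit() runs.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for the question's model: same kind of setup
# (conv -> flatten -> 2-unit output), but small enough to run instantly.
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(4, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation='softmax'),
])

# With a real loss (instead of loss=None), Keras can compute gradients.
model.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy'],
)

x = np.random.rand(16, 32, 32, 3).astype('float32')  # fake "images"
y = np.random.randint(0, 2, size=(16,))              # fake 0/1 labels
history = model.fit(x, y, epochs=1, verbose=0)
print('training loss:', history.history['loss'][0])
```

With loss=None this same script reproduces the "No gradients provided for any variable" error; with any valid loss it trains normally.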

Upvotes: 1
