Lleims

Reputation: 1353

When fitting my model I obtain ValueError: Input 0 of layer sequential is incompatible with the layer

I'm trying to emulate LeNet using the MNIST dataset.

I'm doing the following:

import tensorflow as tf
import tensorflow_datasets as tfds
from tensorflow.keras import layers, models
from tensorflow.keras.utils import plot_model
from tensorflow.keras.optimizers import SGD

import matplotlib.pyplot as plt
import numpy as np

# Import dataset
(train_ds, test_ds), info_ds = tfds.load('mnist', split=['train','test'], 
                               as_supervised = True,
                               with_info=True,
                               batch_size = -1)

train_images, train_labels = tfds.as_numpy(train_ds)
test_images, test_labels = tfds.as_numpy(test_ds)

# Split test to obtain validation dataset
val_size = int(len(test_images) * 0.8)
val_images, test_images = test_images[:val_size], test_images[val_size:]
val_labels, test_labels = test_labels[:val_size], test_labels[val_size:]

# Normalizing images between 0 to 1
train_images, test_images = train_images / 255.0, test_images / 255.0

# Create the model
model = models.Sequential()
model.add(layers.Conv2D(filters=6, kernel_size=(5,5), activation='relu', input_shape=(32,32,3)))
model.add(layers.MaxPooling2D(pool_size=(2,2)))
model.add(layers.Conv2D(filters=16, kernel_size=(5,5), activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(2,2)))
model.add(layers.Flatten())
model.add(layers.Dense(120,activation='relu'))
model.add(layers.Dense(84,activation='relu'))
model.add(layers.Dense(10,activation='softmax'))

# Compile
opt = SGD(learning_rate=0.1)
model.compile(optimizer=opt,
               loss='categorical_crossentropy',
               metrics=['accuracy'])

# Fit
history = model.fit(train_images, train_labels,
                    epochs=10, batch_size=128,
                    validation_data=(val_images, val_labels),
                    verbose=2)

When fitting, I obtain this error:

ValueError: Input 0 of layer sequential is incompatible with the layer: expected axis -1 of input shape to have value 3 but received input with shape (None, 28, 28, 1)

Does this mean I have to reshape my images?

I thought maybe I had to convert my labels to categorical like this:

from tensorflow.keras.utils import to_categorical

train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)

But then the same error appears again:

ValueError: Input 0 of layer sequential_1 is incompatible with the layer: expected axis -1 of input shape to have value 3 but received input with shape (None, 28, 28, 1)

Can somebody help me understand it?

Thank you very much!

Upvotes: 1

Views: 121

Answers (1)

Innat

Reputation: 17219

There are two issues in your code:

  • (Issue 1) You set input_shape=(32,32,3) in your model, whereas the MNIST samples are (28, 28, 1). If you check your sample shapes, you will see:

train_images.shape, train_labels.shape
((60000, 28, 28, 1), (60000,))

The input shape you define in the model does not match these shapes.

# current: not OK, according to the sample shapes
...kernel_size=(5,5), activation='relu', input_shape=(32,32,3)))

# should be (otherwise resize your input; see the sketch after this list)
...kernel_size=(5,5), activation='relu', input_shape=(28,28,1)))
  • (Issue 2) The labels are integers, not one-hot encoded (check train_labels[:5]), but you set categorical_crossentropy, whereas it should be sparse_categorical_crossentropy for integer targets.

# current
model.compile(optimizer=opt,
              loss='categorical_crossentropy',  # when labels are one-hot encoded
              metrics=['accuracy'])

# should be
model.compile(optimizer=opt,
              loss='sparse_categorical_crossentropy',  # when labels are integers
              metrics=['accuracy'])
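
If you instead want to keep the classic LeNet input shape of (32, 32, 3), a minimal sketch (the helper name to_lenet_input is just illustrative) is to resize the images and repeat the grayscale channel before training:

# Sketch only: resize 28x28x1 MNIST images to 32x32 and expand to 3 channels
def to_lenet_input(images):
    images = tf.image.resize(images, [32, 32])   # (N, 32, 32, 1), float32
    return tf.image.grayscale_to_rgb(images)     # (N, 32, 32, 3)

train_images = to_lenet_input(train_images)
val_images = to_lenet_input(val_images)
test_images = to_lenet_input(test_images)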

Now, since you mentioned later that you tried to_categorical to one-hot encode the target labels, in that case you can keep categorical_crossentropy as the loss function.
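
Putting both fixes together, a minimal sketch of the corrected model and training step (assuming you keep the integer labels, i.e. without to_categorical) could look like this:

# Sketch: the input shape matches the (28, 28, 1) MNIST samples,
# and the loss matches the integer labels; the rest of your code stays the same.
model = models.Sequential()
model.add(layers.Conv2D(filters=6, kernel_size=(5,5), activation='relu', input_shape=(28,28,1)))
model.add(layers.MaxPooling2D(pool_size=(2,2)))
model.add(layers.Conv2D(filters=16, kernel_size=(5,5), activation='relu'))
model.add(layers.MaxPooling2D(pool_size=(2,2)))
model.add(layers.Flatten())
model.add(layers.Dense(120, activation='relu'))
model.add(layers.Dense(84, activation='relu'))
model.add(layers.Dense(10, activation='softmax'))

model.compile(optimizer=SGD(learning_rate=0.1),
              loss='sparse_categorical_crossentropy',  # integer labels
              metrics=['accuracy'])

history = model.fit(train_images, train_labels,
                    epochs=10, batch_size=128,
                    validation_data=(val_images, val_labels),
                    verbose=2)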

Upvotes: 1
