Milind

Reputation: 493

What do these numbers mean when training in TensorFlow?

Taking the following example:

import tensorflow as tf
data = tf.keras.datasets.mnist

(training_images, training_labels), (val_images, val_labels) = data.load_data()

training_images  = training_images / 255.0
val_images = val_images / 255.0
model = tf.keras.models.Sequential([tf.keras.layers.Flatten(input_shape=(28,28)),
                                    tf.keras.layers.Dense(20, activation=tf.nn.relu),
                                    tf.keras.layers.Dense(10, activation=tf.nn.softmax)])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(training_images, training_labels, epochs=20, validation_data=(val_images, val_labels))

The result is something like this:

Epoch 1/20
1875/1875 [==============================] - 4s 2ms/step - loss: 0.4104 - accuracy: 0.8838 - 
val_loss: 0.2347 - val_accuracy: 0.9304

Where does 1875 come from? What does that number represent? I am unable to see where it is coming from. The training_images has a shape of 60000x28x28 when I look at it.

Upvotes: 0

Views: 357

Answers (2)

teddcp

Reputation: 1624

1875 is the number of iterations the training needs to cover the entire dataset with the default batch size of 32:

1875 * 32 = 60,000

Epoch: An epoch is one complete pass of the algorithm over the entire data set. So, each time the algorithm has seen all samples in the dataset, an epoch has completed.

Iteration: An iteration is one batch of data passing through the algorithm. In the case of neural networks, that means one forward pass and one backward pass. So, every time you pass a batch of data through the NN, you complete an iteration.
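Putting the two definitions together, the step count Keras prints per epoch can be reproduced with a quick calculation (a sketch, assuming Keras's default batch size of 32 and the 60,000-sample MNIST training set from the question):

```python
import math

num_samples = 60_000  # shape of training_images is (60000, 28, 28)
batch_size = 32       # Keras default when batch_size is not passed to model.fit

# One iteration = one batch; steps per epoch = batches needed to cover the data.
# The ceiling accounts for a possible partial final batch.
steps_per_epoch = math.ceil(num_samples / batch_size)
print(steps_per_epoch)  # 1875
```

That 1875 is exactly the number shown in the `1875/1875` progress bar.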

For more, you can refer to link-1 and link-2.

Upvotes: 1

Stanley

Reputation: 839

1875 is the number of steps/batches trained on. For example, with the default batch size of 32, this tells us that you have 60,000 images (give or take 31, since the last batch may or may not be full).
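You can check this yourself: passing an explicit `batch_size` to `model.fit` changes the step count accordingly. A quick sketch of the arithmetic (the batch sizes other than 32 are arbitrary choices, not from the original code):

```python
import math

num_samples = 60_000  # MNIST training set size

for batch_size in (32, 64, 128):
    # Keras prints ceil(num_samples / batch_size) steps per epoch;
    # the ceiling covers a possible partial final batch.
    steps = math.ceil(num_samples / batch_size)
    print(f"batch_size={batch_size}: {steps} steps")
# batch_size=32: 1875 steps
# batch_size=64: 938 steps
# batch_size=128: 469 steps
```

So `model.fit(..., batch_size=128)` would show a `469/469` progress bar instead of `1875/1875`.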

Upvotes: 1
