adrian

Reputation: 1

Any difference?

I have 2 pieces of code written using tensorflow. One is this:

import tensorflow as tf

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
      if(logs.get('accuracy')>0.99):
        print("\nReached 99% accuracy so cancelling training!")
      self.model.stop_training = True

mnist = tf.keras.datasets.mnist

(x_train, y_train),(x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

callbacks = myCallback()

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(512, activation=tf.nn.relu),
    tf.keras.layers.Dense(10, activation=tf.nn.softmax)])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=10, callbacks=[callbacks])

The other one is this:

import tensorflow as tf

def train_mnist():

    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            if(logs.get('accuracy')>99):
                print("\nEnding training")  # translated from Romanian: "Se incheie antrenamentul"
                self.model.stop_training = True

    mnist = tf.keras.datasets.mnist

    (x_train, y_train),(x_test, y_test) = mnist.load_data()

    x_train, x_test = x_train / 255.0, x_test / 255.0

    callbacks = myCallback()

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    
    # model fitting
    history = model.fit(x_train, y_train, epochs = 10, callbacks=[callbacks])
    
    # model fitting
    return history.epoch, history.history['acc'][-1]

train_mnist()

The first one gives an accuracy of 0.99 after 3 or 4 epochs. The second one gives an accuracy of 0.91 after 10 epochs. Why? They both look the same to me. Any ideas?

Upvotes: 0

Views: 84

Answers (1)

Prakash Dahal

Reputation: 4875

The two pieces of code are almost identical; I checked the accuracy of both. The only reason your accuracy shows up differently is that in the 2nd code you returned

history.history['acc'][-1]

instead of

history.history['accuracy'][-1]
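The metric name is just a dictionary key. As a toy illustration (plain dict, no TensorFlow needed): in TF 2.x Keras the logged key is 'accuracy', so indexing with the old TF 1.x name 'acc' raises a KeyError:

```python
# history.history is a plain dict keyed by metric name; the values are
# per-epoch lists. In TF 2.x the key is 'accuracy', not the old 'acc'.
history_history = {"loss": [0.3, 0.1], "accuracy": [0.91, 0.99]}

print(history_history["accuracy"][-1])  # 0.99 (last epoch's accuracy)

try:
    history_history["acc"][-1]
except KeyError:
    print("KeyError: 'acc' is the old TF 1.x key name")
```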

Also, to compare the two, you need to save the training history in the 1st code like this:

history = model.fit(x_train, y_train, epochs = 10, callbacks=[callbacks])

I also noticed that in the 1st code you stop model training outside of the if condition:

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
      if(logs.get('accuracy')>0.99):
        print("\nReached 99% accuracy so cancelling training!")
      self.model.stop_training = True

It should be like this:

class myCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs={}):
      if(logs.get('accuracy')>0.99):
        print("\nReached 99% accuracy so cancelling training!")
        self.model.stop_training = True
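To see why that one level of indentation matters, here is a toy simulation in plain Python (no TensorFlow needed): with the stop flag set outside the if, training always stops after the first epoch; inside the if, it runs until the accuracy threshold is actually reached.

```python
def run_training(accuracies, stop_inside_if):
    """Simulate epochs with a per-epoch accuracy list; return how many
    epochs actually ran before the stop flag ended training."""
    stop = False
    epochs_run = 0
    for acc in accuracies:
        if stop:          # Keras checks stop_training between epochs
            break
        epochs_run += 1
        if acc > 0.99:
            if stop_inside_if:
                stop = True   # fixed version: stop only once acc > 0.99
        if not stop_inside_if:
            stop = True       # buggy version: flag set unconditionally

    return epochs_run

accs = [0.90, 0.95, 0.992, 0.995]
print(run_training(accs, stop_inside_if=False))  # 1 -> stops after first epoch
print(run_training(accs, stop_inside_if=True))   # 3 -> stops once acc > 0.99
```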

With these fixes, both codes should give around 0.99 accuracy.

Since your 2nd code is not showing the correct accuracy, here is the whole modified version of it:

import tensorflow as tf
from os import path, getcwd, chdir

path = f"{getcwd()}/../tmp2/mnist.npz"

def train_mnist():

    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            # compare against the fraction 0.99, not 99 (accuracy is in [0, 1])
            if(logs.get('accuracy') > 0.99):
                print("\nEnding training")  # translated from Romanian
                self.model.stop_training = True

    mnist = tf.keras.datasets.mnist

    (x_train, y_train), (x_test, y_test) = mnist.load_data()

    x_train, x_test = x_train / 255.0, x_test / 255.0

    callbacks = myCallback()

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # model fitting
    history = model.fit(x_train, y_train, epochs=10, callbacks=[callbacks])

    return history.epoch, history.history['accuracy'][-1]

train_mnist()

Upvotes: 1
