Ananya S Kaligal

Reputation: 11

Low Accuracy on MNIST Dataset by CNN model built using GNN Bundles in ECL

I am working on a project that uses ECL and the GNN bundle to train a convolutional neural network (CNN) on the MNIST dataset. The architecture has three convolutional layers, each with batch normalization and ReLU activation followed by max pooling, and ends with three dense layers. The data is normalized and reshaped, the loss function is categorical cross-entropy, and the optimizer is Adam. Despite this, I am observing very low accuracy on the test set, and training accuracy stops improving beyond a certain point. Any suggestions on what might be causing this issue or how to improve the model's accuracy would be greatly appreciated.

I tried different learning rates, batch sizes, and numbers of epochs, but none of these changes significantly improved accuracy. I expected the model to achieve accuracy comparable to other standard CNN models trained on MNIST. I also verified that the model architecture and data preprocessing steps were implemented correctly.
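For reference, the preprocessing described above can be sanity-checked in plain NumPy. A subtle mismatch here (pixels not scaled to [0, 1], a missing channel dimension, or integer labels passed where categorical cross-entropy expects one-hot vectors) is a common cause of accuracy that plateaus at a low value. This is a minimal sketch with synthetic stand-in data, not the actual ECL/GNN pipeline:

```python
import numpy as np

def preprocess(images, labels, num_classes=10):
    # Scale uint8 pixels to [0, 1] and add a channel dimension.
    x = images.astype("float32") / 255.0
    x = x.reshape(-1, 28, 28, 1)
    # One-hot encode labels, as categorical cross-entropy requires.
    y = np.eye(num_classes, dtype="float32")[labels]
    return x, y

# Synthetic batch shaped like raw MNIST: uint8 pixels, integer labels.
raw_x = np.random.randint(0, 256, size=(16, 28, 28), dtype=np.uint8)
raw_y = np.random.randint(0, 10, size=16)

x, y = preprocess(raw_x, raw_y)
print(x.shape)                    # (16, 28, 28, 1)
print(float(x.min()), float(x.max()) <= 1.0)
print(y.shape)                    # (16, 10)
print(bool(np.allclose(y.sum(axis=1), 1.0)))  # each row sums to 1
```

If the arrays fed into the GNN bundle do not match these shapes and ranges, that alone can explain the behavior described.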

Upvotes: -1

Views: 38

Answers (1)

Bob Foreman

Reputation: 254

First, it would be helpful to post your ECL so we can analyze it. Second, the accuracy of the different models in the ECL GNN bundle has been verified by comparing results with the Python TensorFlow equivalents. If you could apply your settings to the Python equivalent and compare results, that would be helpful. It may also be that MNIST is simply not a good training set for the CNN implementation you are trying. There is an excellent CNN tutorial on the TensorFlow web site.
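To illustrate the comparison being suggested, here is a hypothetical Keras version of the architecture described in the question (three Conv2D + BatchNormalization + ReLU blocks with max pooling, then three dense layers). The filter counts and dense-layer sizes are assumptions, since the original ECL was not posted; the point is to have a Python baseline whose settings you can mirror:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_cnn(input_shape=(28, 28, 1), num_classes=10):
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=input_shape))
    # Three conv blocks: Conv2D -> BatchNorm -> ReLU -> MaxPool.
    # Filter counts (32, 64, 128) are assumptions, not from the question.
    for filters in (32, 64, 128):
        model.add(layers.Conv2D(filters, (3, 3), padding="same"))
        model.add(layers.BatchNormalization())
        model.add(layers.ReLU())
        model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Flatten())
    # Three dense layers, ending in a 10-way softmax.
    model.add(layers.Dense(128, activation="relu"))
    model.add(layers.Dense(64, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_cnn()
model.summary()
```

If this Keras model trains well on MNIST with the same learning rate, batch size, and epochs while the ECL/GNN version does not, the discrepancy points at the GNN layer definitions or data handoff rather than the architecture itself.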

Regards,

Bob

Upvotes: 0
