
MNIST_Notebooks

ML experiments with the MNIST Dataset

These notebooks apply techniques taught in the Deep Learning Specialization on Coursera to the MNIST dataset. The goal is to explore hyperparameter tuning and various optimization techniques, and their effects on the training time and accuracy of the models. The notebooks are implemented in NumPy in order to delve into the details and understand the models at a fundamental level.
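For a sense of what a from-scratch NumPy implementation involves, the sketch below shows the core of one softmax logistic regression step (forward pass, cross-entropy loss, and a single gradient-descent update) on MNIST-shaped data. It is an illustrative example with made-up random inputs, not code taken from the notebooks:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 784))        # batch of flattened 28x28 "images"
y = rng.integers(0, 10, size=64)          # integer class labels 0-9
Y = np.eye(10)[y]                         # one-hot targets, shape (64, 10)

W = np.zeros((784, 10))                   # weights
b = np.zeros(10)                          # biases
lr = 0.5                                  # illustrative learning rate

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Forward pass and average cross-entropy loss
probs = softmax(X @ W + b)
loss = -np.mean(np.sum(Y * np.log(probs + 1e-12), axis=1))

# Gradients w.r.t. W and b, followed by one gradient-descent update
dZ = (probs - Y) / X.shape[0]
W -= lr * (X.T @ dZ)
b -= lr * dZ.sum(axis=0)
print(f"loss: {loss:.4f}")
```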

As a reference, the first two notebooks implement the TensorFlow tutorials: the beginner tutorial, which uses softmax logistic regression, and the expert tutorial, which uses a multilayer convolutional network.
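The tutorial notebooks follow the original low-level TensorFlow API. For orientation only, here is a minimal sketch of the same softmax-regression model expressed with the modern tf.keras API; it is not the notebooks' code:

```python
import tensorflow as tf

# Load MNIST and flatten the 28x28 images into 784-dimensional vectors
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# Single dense layer with a softmax output: plain softmax logistic regression
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=5, batch_size=100)
model.evaluate(x_test, y_test)
```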

These notebooks were all trained on a MacBook Pro (Retina, 13-inch, Late 2013) with a 2.6 GHz Intel Core i5, 8 GB of 1600 MHz DDR3 RAM, and no GPU acceleration.

| Notebook | Epochs | Train Time (s) | Train Acc | Val Acc | Test Acc | Comment |
|---|---|---|---|---|---|---|
| MNIST_TF_Softmax_Tutorial | 5 | 3.1157 | 0.9248 | 0.9232 | 0.9246 | TensorFlow and softmax logistic regression |
| MNIST_TF_CNN_Tutorial | 7 | 1781.1032 | 0.9965 | 0.9912 | 0.9896 | TensorFlow and CNN |
| MNIST_NP_Softmax_Basic | 1000 | 659.55 | 0.9256 | 0.9274 | 0.9228 | NumPy and basic softmax logistic regression |
