Implementation of Artificial Neural Network from Scratch using Python and Jupyter Notebook
This notebook demonstrates a neural network implemented in pure NumPy, without TensorFlow or PyTorch. Trained on the MNIST dataset, the network consists of an input layer (784 neurons), two hidden layers (132 and 40 neurons), and an output layer (10 neurons), all with sigmoid activations.
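As a rough illustration, here is a minimal NumPy sketch of the forward pass for that layer layout. The layer sizes come from the description above; the initialization scheme, variable names, and random input are illustrative assumptions, not the notebook's actual code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes from the description: 784 -> 132 -> 40 -> 10
sizes = [784, 132, 40, 10]
rng = np.random.default_rng(0)

# Small random weights and zero biases (illustrative initialization)
weights = [rng.normal(0.0, 0.1, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((m, 1)) for m in sizes[1:]]

def forward(x):
    """Propagate a (784, 1) input column through every layer."""
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a  # (10, 1) vector of class activations

x = rng.random((784, 1))  # stand-in for a flattened 28x28 MNIST image
print(forward(x).shape)   # (10, 1)
```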
I build the micrograd autograd engine: a functioning neural network with a forward pass, backpropagation, and stochastic gradient descent, all built from scratch. This is derived from @karpathy's excellent micrograd lecture. Each notebook combines Andrej's lecture code and commentary with my own code, anecdotes, and additions.
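For flavor, here is a minimal sketch in the spirit of micrograd's scalar `Value` class, supporting addition, multiplication, and reverse-mode backpropagation. It is a simplified reconstruction under my own naming, not Andrej's lecture code.

```python
class Value:
    """A scalar that records its computation graph for backpropagation."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(out)/d(self) = other.data, d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(-3.0)
loss = a * b + a
loss.backward()
print(a.grad, b.grad)  # d(loss)/da = b + 1 = -2.0, d(loss)/db = a = 2.0
```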