The mathematics and computation that drive neural networks are often seen as esoteric and impenetrable. MLP.ipynb presents a clearly illustrated example of building a neural network for handwriting recognition from scratch, providing a step-by-step overview of the mathematics and code behind many modern machine learning algorithms.
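For a sense of what the notebook covers, here is a minimal NumPy sketch of the forward pass of a small multilayer perceptron that maps a 28x28 image to scores over 10 digit classes. The layer sizes, sigmoid/softmax activations, and variable names are illustrative assumptions and do not necessarily match the code in MLP.ipynb.

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic activation: squashes values into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    # Hidden layer: affine transform followed by a nonlinearity
    h = sigmoid(W1 @ x + b1)
    # Output layer: one raw score per class
    scores = W2 @ h + b2
    # Softmax turns scores into a probability distribution over the 10 digits
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

# Toy example (assumed sizes): 784-pixel input, 32 hidden units, 10 classes
rng = np.random.default_rng(0)
x = rng.random(784)
W1, b1 = rng.normal(scale=0.01, size=(32, 784)), np.zeros(32)
W2, b2 = rng.normal(scale=0.01, size=(10, 32)), np.zeros(10)
print(forward(x, W1, b1, W2, b2))  # 10 probabilities summing to 1
```

The notebook goes further, showing how the weights are actually learned rather than left random.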
To view this notebook in your browser, simply click the MLP.ipynb file above.
To run this notebook locally, make sure you have git, Python, and Jupyter installed. Then, in a terminal window:
$ git clone https://github.com/KirillShmilovich/MLP-Neural-Network-From-Scrath
$ cd MLP-Neural-Network-From-Scrath
$ jupyter-notebook MLP.ipynb