Learn how a neural network learns through backpropagation
- All the relevant derivatives are provided and the concepts clearly explained
- Code that trains and tests a neural network with some sample data
- Backpropagation via stochastic gradient descent
- Uses a learning-rate schedule to smooth the training process
- Scales the inputs so that the network does not suffer from vanishing gradients, at least in the first layer (in this case)
- Also compares the network's performance against standard library implementations of Support Vector Machines (SVM) and Logistic Regression
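To make the ideas above concrete, here is a minimal sketch (hypothetical, not the repository's actual code) of a one-hidden-layer network trained with stochastic gradient descent, a decaying learning-rate schedule, and standardized inputs; all names and hyperparameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (assumed for illustration):
# the two features have wildly different scales on purpose.
X = rng.normal(size=(200, 2)) * np.array([50.0, 0.1])
y = (X[:, 0] / 50.0 + X[:, 1] / 0.1 > 0).astype(float).reshape(-1, 1)

# Scale inputs to zero mean / unit variance so first-layer gradients stay healthy.
X = (X - X.mean(axis=0)) / X.std(axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units (an arbitrary choice for this sketch).
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def lr_schedule(epoch, lr0=0.5, decay=0.01):
    """Simple 1/t-style decay to smooth late training."""
    return lr0 / (1.0 + decay * epoch)

for epoch in range(50):
    lr = lr_schedule(epoch)
    for i in rng.permutation(len(X)):       # stochastic: one sample at a time
        x, t = X[i:i+1], y[i:i+1]
        # Forward pass
        h = sigmoid(x @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backward pass (chain rule; cross-entropy + sigmoid gives p - t)
        d2 = p - t
        d1 = (d2 @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d2; b2 -= lr * d2.ravel()
        W1 -= lr * x.T @ d1; b1 -= lr * d1.ravel()

acc = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

The same data could then be fed to off-the-shelf SVM and Logistic Regression classifiers (e.g. from scikit-learn) to reproduce the comparison described above.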
The topic is dense (though not as dense as the Newton-Raphson method), so get ready for some (simple) calculus.
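As a taste of that calculus, here is the standard derivative used when backpropagating through a sigmoid activation (a well-known result, shown here for illustration):

```
\sigma(z) = \frac{1}{1 + e^{-z}}

\sigma'(z)
  = \frac{d}{dz}\,(1 + e^{-z})^{-1}
  = \frac{e^{-z}}{(1 + e^{-z})^{2}}
  = \sigma(z)\bigl(1 - \sigma(z)\bigr)

\frac{\partial L}{\partial w}
  = \frac{\partial L}{\partial \sigma}\,\cdot\,\sigma'(z)\,\cdot\,\frac{\partial z}{\partial w}
```

The last line is the chain rule that gradient descent applies layer by layer: the loss gradient flows backward through each activation and weight in turn.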