Learn how a neural network learns through backpropagation (using stochastic gradient descent)

nirupampratap/Neuralnetwork_Backpropagation

Neural Network Backpropagation

Learn how a neural network learns through backpropagation

  1. All the required derivatives are worked out and the concepts clearly explained
  2. Code that trains and tests a neural network on sample data
  3. Backpropagation via stochastic gradient descent
  4. Uses a learning-rate schedule to smooth the training process
  5. Scales the inputs to ensure that the network does not suffer from vanishing gradients, at least in the first layer (in this case)
  6. Compares the network's performance against standard library implementations of Support Vector Machines (SVM) and Logistic Regression
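The core of items 3-5 can be sketched in a few lines of NumPy. This is a minimal illustration, not the repository's code: all names are illustrative, the data is synthetic, and it uses a tiny one-hidden-layer network with sigmoid activations, squared-error loss, per-sample SGD updates, a simple decaying learning-rate schedule, and standardized inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (stand-in for the repo's sample data)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# Scale inputs to zero mean / unit variance (helps avoid vanishing gradients)
X = (X - X.mean(axis=0)) / X.std(axis=0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

eta0 = 0.5  # initial learning rate
for epoch in range(50):
    eta = eta0 / (1 + 0.01 * epoch)       # simple decaying learning schedule
    for i in rng.permutation(len(X)):     # stochastic: one sample at a time
        x, t = X[i:i+1], y[i:i+1]
        # Forward pass
        h = sigmoid(x @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backward pass (squared-error loss; sigmoid'(z) = s(z)(1 - s(z)))
        d2 = (p - t) * p * (1 - p)        # gradient at output pre-activation
        d1 = (d2 @ W2.T) * h * (1 - h)    # backpropagated to hidden layer
        # SGD parameter updates
        W2 -= eta * h.T @ d2; b2 -= eta * d2.ravel()
        W1 -= eta * x.T @ d1; b1 -= eta * d1.ravel()

acc = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5) == y).mean()
```

The key design point is that each layer's error signal (`d2`, `d1`) is computed from the layer above it via the chain rule, so gradients flow backward through the same weights used in the forward pass.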

The topic is dense (though not as dense as the Newton-Raphson method), so get ready for some (simple) calculus.
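The baseline comparison against SVM and Logistic Regression can be sketched with scikit-learn. This is a hypothetical example on synthetic data, not the repository's dataset; `SVC` and `LogisticRegression` are scikit-learn's standard estimators, and `score` reports test-set accuracy.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic data standing in for the repository's sample data
X, y = make_classification(n_samples=300, n_features=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

svm_acc = SVC().fit(X_tr, y_tr).score(X_te, y_te)
logreg_acc = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).score(X_te, y_te)
```

Comparing against these off-the-shelf baselines is a quick sanity check: a hand-rolled network that trains correctly should land in the same accuracy range on simple data.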
