
Autograd engine

Backpropagation implementation. Every modern deep learning library hides it behind a single method call, backward(), where the magic happens. I want to un-magic that magic.
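To make that concrete, here is a minimal sketch of the core idea, a scalar-valued autograd engine. This is an illustrative example rather than this repository's actual code (which operates on numpy arrays): each operation records how to pass gradients to its inputs, and backward() walks the recorded graph in reverse, applying the chain rule.

```python
class Value:
    """A scalar that remembers how it was computed, so backward()
    can propagate gradients to everything it depends on."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to pass grad to children
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a + b)/da = 1 and d(a + b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a * b)/da = b and d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule
        # from the output back to the leaves.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: gradients of y = a * b + a with respect to a and b.
a, b = Value(2.0), Value(3.0)
y = a * b + a
y.backward()
print(a.grad)  # dy/da = b + 1 = 4.0
print(b.grad)  # dy/db = a = 2.0
```

The same bookkeeping generalizes from scalars to vectors and matrices; the gradient of each operation just becomes an array expression instead of a number.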

Thoughts

Originally I implemented all vector and matrix arithmetic from scratch, but then realised it was simply too slow when dealing with "larger" datasets such as MNIST. So bye ~60 commits, and hi import numpy as np.
