The objective of the Deep Learning bootcamp is to give participants enough theory and practical experience to build deep learning solutions in the space of computer vision and natural language processing. By the end of the bootcamp, all participants will be familiar with the following key concepts and will be able to apply them to a problem.
Key Deep Learning Concepts
- Theory: DL Motivation, Back-propagation, Activation
- Paradigms: Supervised Learning
- Models: Sequential Models, Pre-trained Models (Transfer Learning)
- Methods: Perceptron, Dense, Convolution, Pooling, Dropout, Recurrent, LSTM, Embedding
- Process: Setup, Encoding, Training, Serving
- Tools: python-data-stack, keras, tensorflow
- 001: Theory - Deep Learning, Universal Approximation, MLP for tabular data
- 002: Multi-layer Perceptron - Fashion MNIST (a minimal Keras sketch follows this list)
- 003: Theory - Convolution Neural Network
- 004: Convolution Neural Network - Fashion MNIST
- 005: Transfer Learning - Fashion MNIST
- 006: Data Augmentation - Fashion MNIST
- 007: MLP & CNN - Dosa/No Dosa
- 008: Data Augmentation - Dosa/No Dosa
- 009: Transfer Learning - Dosa/No Dosa
- 010: Theory & Concept - Natural Language Processing
- 011: Recurrent Neural Network - Toxic Classification
- 012: Convolution - 1D - Toxic Classification
- 013: Pre-Trained Embedding - Words - Toxic Classification
- 014: Pre-Trained Embedding - Sentences - Toxic Classification
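The notebooks above are the canonical material for the bootcamp. As a quick orientation, here is a minimal sketch of the kind of model built in notebook 002 - an MLP on Fashion MNIST in Keras. The layer sizes, dropout rate, and number of epochs are illustrative assumptions, not the notebook's exact settings.

```python
# Minimal MLP on Fashion MNIST (illustrative sketch, not the notebook code)
import tensorflow as tf
from tensorflow import keras

# Setup & Encoding: load the data and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Model: a simple multi-layer perceptron with dropout for regularisation
model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(10, activation="softmax"),   # 10 clothing classes
])

# Training
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)

# Evaluation on held-out data
model.evaluate(x_test, y_test)
```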
- Presentations
- Long Form Articles
- Logo Detection by @bargava
- Uncovering Hidden Pattern by @bargava
- How to learn Deep Learning in 6 months by @bargava
These are reference materials with excellent explanations - visual, interactive, math- or code-driven, in text, video, app or notebook format - about Machine Learning and Deep Learning. We have found them useful in our own learning journey. We hope they will help you in yours.
Basics: Python, Numpy and Math
- Don't know Python? Start with a crash course from @anadology using the Python Practice Book. (Text, Code)
- Don't know NumPy? Start with a good introduction from @jakevdp in Section 2: Introduction to NumPy of the Python Data Science Handbook (a tiny NumPy warm-up follows this list). (Notebook, Code)
- Want a refresher on Linear Algebra and Calculus? Watch these videos by @3blue1brown on the Essence of Linear Algebra and the Essence of Calculus. (Video, Visual)
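If you want a quick taste of what the NumPy and linear-algebra material above covers, here is a tiny, illustrative warm-up (the numbers themselves are arbitrary):

```python
# Tiny NumPy warm-up: arrays, dot products and matrix-vector products
import numpy as np

x = np.array([1.0, 2.0, 3.0])
W = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])

print(x @ x)      # dot product: 14.0
print(W @ x)      # matrix-vector product: [7. 5.]
print(W.shape)    # (2, 3) -- keeping track of shapes matters everywhere in DL
```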
Basics of Machine Learning
- Never done any Machine Learning? Start with the first four chapters of Section 5: Machine Learning in the Python Data Science Handbook. (Notebook, Code)
- How do you build, select and validate a Machine Learning model? Read these three blog posts by Sebastian Raschka on Model evaluation, model selection, and algorithm selection in machine learning: Part 1 - Basics, Part 2 - Holdout, Part 3 - Cross Validation & Hyper Parameter Tuning (a short scikit-learn sketch follows this list). (Text, Math & Visual)
- Want to know the math behind ML? Check out our repo HackerMath for ML, which uses code and visuals to understand the math. (Notebook, Visual & Code)
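As a companion to the model evaluation and selection posts above, here is a small, hedged scikit-learn sketch (scikit-learn is part of the python data stack) of a holdout split plus cross-validated hyperparameter tuning. The dataset, model and parameter grid are illustrative assumptions.

```python
# Illustrative sketch: holdout split + cross-validated hyperparameter search
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_digits(return_X_y=True)

# Holdout: keep a test set that the search never sees
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Cross-validation + hyperparameter tuning on the training set only
search = GridSearchCV(
    LogisticRegression(max_iter=5000),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=5)
search.fit(X_train, y_train)

print("best C:", search.best_params_)
print("held-out test accuracy:", search.best_estimator_.score(X_test, y_test))
```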
Deep Learning Basics
- Want a visual understanding of Deep Learning? Start with these four videos by @3blue1brown on Neural Networks. (Video, Visual)
- Want to learn how to create a neural network? Go and play with all the knobs and options to build and train a simple neural network at the TensorFlow Playground. (Website, Interactive)
- How can neural networks compute any function? Read this visual proof by Michael Nielsen in Chapter 4 of Neural Networks and Deep Learning. (Text, Visual)
- Why are deep, plain networks (like a many-layer MLP) hard to train? Here is a good explanation of vanishing and exploding gradients in Neural Networks and Deep Learning - Chapter 5 (an illustrative gradient-norm sketch follows this list). (Text, Visual & Code)
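To make the vanishing-gradient idea concrete, here is a small, illustrative TensorFlow sketch that measures per-layer gradient norms in a deliberately deep, sigmoid-activated MLP. The depth, widths and random data are assumptions purely for demonstration.

```python
# Illustrative sketch: gradient norms tend to shrink in the early layers
# of a deep, sigmoid-activated MLP (the vanishing gradient problem)
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential(
    [keras.Input(shape=(10,))] +
    [keras.layers.Dense(32, activation="sigmoid") for _ in range(8)] +
    [keras.layers.Dense(1)])

# Random toy data, just to get a loss and gradients
x = tf.random.normal((64, 10))
y = tf.random.normal((64, 1))

with tf.GradientTape() as tape:
    loss = tf.reduce_mean((model(x) - y) ** 2)
grads = tape.gradient(loss, model.trainable_weights)

# Print the gradient norm of each weight matrix: the early layers are
# typically orders of magnitude smaller than the later ones
for i, (var, grad) in enumerate(zip(model.trainable_weights, grads)):
    if "kernel" in var.name:
        print(f"dense layer {i // 2}: gradient norm = {float(tf.norm(grad)):.3e}")
```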
Learning & Optimization
- What is this back-propagation stuff? Here is an easy-to-understand visual and math explanation in The Calculus of Backpropagation. (Text, Visual & Math)
- How do optimizers work? Start with this interactive post by Ben Frederickson on Numerical Optimization. (Text, Visual & Interactive)
- What are all these optimizers? Read through this exhaustive explanation of SGD and its variants by Sebastian Ruder in Optimizing Gradient Descent. (Text, Visual & Math)
- Want to learn more about Stochastic Gradient Descent? Read through this interactive article on momentum in SGD: Why Momentum Really Works (a toy NumPy comparison follows this list). (Text, Interactive)
- Interested in more recent improvements in optimisation? Check out this article by Sebastian Ruder on DL Optimisation Trends and the Fast.ai post on Adam Weight Decay. (Text, Math)
- Want real practical advice for building and training DL models? Read this excellent post of practical advice on building Deep Neural Nets, as well as the fantastic walk-through by Andrej Karpathy on Managing the Learning Process. (Text, Visual)
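To ground the momentum discussion, here is a toy NumPy comparison of gradient descent with and without momentum on an ill-conditioned quadratic. The learning rate, momentum coefficient and objective are illustrative assumptions, not a recipe.

```python
# Toy comparison on f(x) = 0.5 * x^T A x: same learning rate, with and without momentum
import numpy as np

A = np.diag([1.0, 100.0])        # very different curvature along the two directions

def gradient(x):
    return A @ x                 # gradient of 0.5 * x^T A x

def run(momentum, lr=0.009, steps=200):
    x = np.array([1.0, 1.0])
    v = np.zeros(2)
    for _ in range(steps):
        v = momentum * v - lr * gradient(x)   # velocity accumulates past gradients
        x = x + v
    return np.linalg.norm(x)     # distance from the optimum at the origin

print("plain gradient descent:", run(momentum=0.0))
print("with momentum 0.9:     ", run(momentum=0.9))
```

With the same learning rate, the momentum run finishes much closer to the optimum - the intuition the momentum article above unpacks in detail.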
Deep Learning for Images
- What is a convolution? Get a basic understanding of convolution in this exemplar-driven post, Understanding Convolution. (Text, Visual)
- How do you build a Convolutional Neural Network? The CS231n course notes on Convolutional Networks are a concise read-up on this. (Text, Visual)
- Want to play with convolution filters? Check out the interactive convolution explainer on the ML4a demos site. (App, Interactive)
- Need more analogies for Convolutional Neural Nets? Check out this excellent post explaining CNNs through the different lenses of image processing, fluid mechanics, statistics, and information theory: Understanding Convolution in Deep Learning. (Text, Visual)
- Why are we doing transfer learning? Here is a good way to think about the possible approaches to adopt when using transfer learning with CNNs (a minimal Keras sketch follows this list). (Text)
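As a concrete companion to the transfer-learning item above, here is a minimal, illustrative Keras sketch of the most common approach: freeze a pre-trained convolutional base and train only a new classification head. The choice of MobileNetV2, the image size and the binary head (think Dosa/No Dosa) are assumptions for the sketch.

```python
# Illustrative transfer-learning sketch: frozen pre-trained base + new head
import tensorflow as tf
from tensorflow import keras

# Pre-trained convolutional base with ImageNet weights, minus its classifier
base = keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False            # freeze the base: only the new head is trained

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(1, activation="sigmoid"),   # binary: dosa / no dosa
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# model.fit(train_dataset, validation_data=val_dataset, epochs=5)
# Optionally, unfreeze the top layers of `base` afterwards and fine-tune
# with a much smaller learning rate.
```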
Deep Learning for NLP
- Confused by all this embedding stuff? Read this post on Representation and NLP to understand why embeddings are so effective in Deep Learning. (Text, Visual)
- Want to understand word embeddings? Start with this elegant post: Word is worth a thousand vectors. (Text, Visual)
- How does this word2vec stuff relate to statistical methods? This article with a click-bait title - Stop using word2vec - will help you place all these methods in a simple framework. (Text, Visual)
- Need a deeper dive into the math of word embeddings? Start with these four posts by Sebastian Ruder on word embeddings: Part 1 - Basics, Part 2 - Softmax, Part 3 - Word2Vec, Part 5 - Recent Trends. (Text, Math)
- Why are we using Recurrent Neural Networks? Karpathy's article The Unreasonable Effectiveness of RNNs is a wonderful introduction to this topic, with code to do fun things with them as well. (Text, Visual & Code)
- What are LSTMs? Start with this visual unpacking of what happens inside an LSTM node: Understanding LSTMs. (Text, Visual)
- Still confused by all these DL approaches to text? This post by spaCy's creator frames the DL process for NLP as four steps - Embed, Encode, Attend, Predict: Deep Learning Formula for NLP. (Text, Visual)
- Want practical steps for using Deep Learning for text classification? Check out how to build a DL model, along with consolidated best-practice advice, in Google's Text Classification Guide (a minimal Keras sketch follows this list). (Text, Visual & Code)
- Doing more exotic NLP stuff? Then check out this article on current Best approaches for Deep Learning in NLP. (Text, Math)
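To tie the embedding, LSTM and text-classification items together, here is a small, illustrative Keras sketch of the embed-encode-predict pattern for a toxic-comment style binary classifier. The vocabulary size, sequence length and layer sizes are assumptions.

```python
# Illustrative sketch: embed -> encode (LSTM) -> predict for binary text classification
import tensorflow as tf
from tensorflow import keras

vocab_size, seq_len, embed_dim = 20000, 200, 128   # illustrative sizes

model = keras.Sequential([
    keras.Input(shape=(seq_len,), dtype="int32"),        # padded token ids
    keras.layers.Embedding(vocab_size, embed_dim),       # embed
    keras.layers.Bidirectional(keras.layers.LSTM(64)),   # encode the sequence
    keras.layers.Dropout(0.3),
    keras.layers.Dense(1, activation="sigmoid"),         # predict: toxic or not
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()

# x_train would be integer-encoded, padded comment sequences of shape (n, seq_len);
# model.fit(x_train, y_train, epochs=3, validation_split=0.1)
```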
Visualisation
- Why do we want to visualise and understand NNs? This post will give you a basic understanding of the process of visualising NNs for human beings: Visualising Representations. (Text, Visual)
- Want to visualise networks and their learning? Use the TensorBoard callback to start doing that from your notebooks (a minimal sketch follows this list). (App, Interactive)
- Want to see visualisations of DL layers? Go and check out the demos on the Keras.js website. (App, Visual & Interactive)
- Want to understand why we need all these dimensionality reduction approaches? Start by reading the interactive piece by Christopher Olah on Visualising MNIST. (Text, Visual & Interactive)
- Want to look at your embeddings in 2D/3D? Check out the Embedding Projector; you can run it on your own data using TensorBoard. (App, Interactive)
- What is the neural network really learning in images? Check out these articles on Feature Visualisation and The Building Blocks of Interpretability. (Text & Notebooks, Visual & Interactive)
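For the TensorBoard item above, here is a minimal, illustrative sketch of wiring the Keras TensorBoard callback from a notebook; the log directory layout and the toy model are assumptions.

```python
# Illustrative sketch: logging a Keras training run to TensorBoard
import datetime
import tensorflow as tf
from tensorflow import keras

(x_train, y_train), _ = keras.datasets.mnist.load_data()
x_train = x_train / 255.0

model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# One run directory per experiment so runs can be compared side by side
log_dir = "logs/" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_cb = keras.callbacks.TensorBoard(log_dir=log_dir, histogram_freq=1)

model.fit(x_train, y_train, epochs=3, validation_split=0.1,
          callbacks=[tensorboard_cb])

# In a notebook:  %load_ext tensorboard   then   %tensorboard --logdir logs
```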
Continue (Your) Learning on (Deep) Learning
- Want to learn Deep Learning through notebooks? Explore the collection of interactive ML examples at Seedbank.
- More of a book person? Our recommendation for an applied book is the very practical Deep Learning with Python by François Chollet (the creator of Keras).