A curated list of awesome deep learning techniques for training, testing, optimizing, and regularizing deep neural networks.
## Contents

- Weight Initialization
- Data/Input Processing
- Data Augmentation
- Decreasing/Changing Learning Rate
- Regularization
- Optimization/Gradient Descent
- Normalization
- Activation Function
- Segmentation

## Weight Initialization

- Xavier Initialization
- He Initialization
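
A minimal NumPy sketch of both schemes (function and variable names are illustrative, not from any particular library): Xavier/Glorot scales the weight variance by 2/(fan_in + fan_out) to keep activation variance roughly constant, while He scales it by 2/fan_in, which compensates for ReLU zeroing half its inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    # Glorot/Xavier: variance 2 / (fan_in + fan_out), suited to
    # tanh/sigmoid layers.
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out):
    # He: variance 2 / fan_in, suited to ReLU layers.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W1 = xavier_init(784, 256)
W2 = he_init(256, 10)
```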

## Data/Input Processing

- Input pipelining
- Queues
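
A sketch of a queue-based input pipeline in plain Python, assuming a hypothetical `producer` that stands in for real file reading and decoding: a bounded queue lets data loading run on a background thread and overlap with the training loop.

```python
import queue
import threading

def producer(batch_queue, num_batches):
    # Stand-in for real I/O: read and decode a batch, then enqueue it.
    for i in range(num_batches):
        batch = f"batch-{i}"          # hypothetical payload
        batch_queue.put(batch)        # blocks when the queue is full
    batch_queue.put(None)             # sentinel: no more data

batch_queue = queue.Queue(maxsize=8)  # bounded queue limits memory use
threading.Thread(target=producer, args=(batch_queue, 100),
                 daemon=True).start()

while True:
    batch = batch_queue.get()         # consumer side: the training loop
    if batch is None:
        break
    # train_step(batch) would go here
```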

## Data Augmentation

- Random cropping
- Random padding
- Random horizontal flipping
- Random RGB color shifting
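
A NumPy sketch combining all four augmentations (the `augment` function and its parameters are illustrative); each transform is applied with fresh randomness per image at training time.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(img, crop=24, pad=4, shift=0.1):
    """img: float HxWx3 array in [0, 1]."""
    # Random padding, then a random crop back to `crop` x `crop`.
    img = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="constant")
    y = rng.integers(0, img.shape[0] - crop + 1)
    x = rng.integers(0, img.shape[1] - crop + 1)
    img = img[y:y + crop, x:x + crop]
    # Random horizontal flip with probability 0.5.
    if rng.random() < 0.5:
        img = img[:, ::-1]
    # Random per-channel RGB shift.
    img = img + rng.uniform(-shift, shift, size=(1, 1, 3))
    return np.clip(img, 0.0, 1.0)

img = rng.random((24, 24, 3), dtype=np.float32)
out = augment(img)
```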

## Decreasing/Changing Learning Rate

- Learning rate decay
- Cyclic learning rate
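
Two illustrative schedules in plain Python (all constants are placeholders to tune per task): step decay multiplies the rate by a fixed factor at regular intervals, while the cyclic schedule shown is the triangular variant, ramping linearly between a minimum and a maximum rate.

```python
def step_decay(step, base_lr=0.1, decay=0.5, every=10_000):
    # Halve the learning rate every `every` steps.
    return base_lr * decay ** (step // every)

def cyclic_lr(step, min_lr=1e-4, max_lr=1e-2, half_cycle=2_000):
    # Triangular schedule: ramp up for `half_cycle` steps,
    # then back down, and repeat.
    cycle_pos = step % (2 * half_cycle)
    frac = cycle_pos / half_cycle
    if frac > 1.0:
        frac = 2.0 - frac
    return min_lr + (max_lr - min_lr) * frac
```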

## Regularization

- Weight decay
  - L2 loss
  - L1 loss
- Dropout
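
Minimal NumPy sketches (names and coefficients are illustrative): the L1/L2 penalties are added to the data loss, and dropout uses the "inverted" formulation so that no rescaling is needed at test time.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_penalty(weights, lam=1e-4):
    # Weight decay via an L2 term added to the data loss.
    return lam * sum(np.sum(w ** 2) for w in weights)

def l1_penalty(weights, lam=1e-5):
    # L1 term: encourages sparse weights.
    return lam * sum(np.sum(np.abs(w)) for w in weights)

def dropout(x, p=0.5, training=True):
    # Inverted dropout: zero units with probability p at train time
    # and rescale by 1/(1-p), so test time is a no-op.
    if not training:
        return x
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask
```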

## Optimization/Gradient Descent

- Adam Optimizer
- SGD with momentum
- Nesterov Accelerated Gradient (NAG)
- Stochastic Gradient Descent (SGD)
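
NumPy sketches of two of the update rules above (signatures are illustrative; `t` is the 1-based step count): momentum accumulates a velocity over past gradients, and Adam additionally adapts per-parameter step sizes using bias-corrected first and second moment estimates.

```python
import numpy as np

def sgd_momentum(w, g, v, lr=0.01, mu=0.9):
    # Velocity: exponentially decaying sum of past gradients.
    v = mu * v - lr * g
    return w + v, v

def adam(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g            # first moment estimate
    v = b2 * v + (1 - b2) * g ** 2       # second moment estimate
    m_hat = m / (1 - b1 ** t)            # bias correction (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```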

## Normalization

- Batch Normalization
- Local Response Normalization (LRN)
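
A sketch of the batch-norm training-time forward pass in NumPy (names are illustrative); at inference, running averages of the batch statistics are used in place of the per-batch mean and variance.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """x: (N, D) activations; gamma, beta: (D,) learned scale and shift."""
    mu = x.mean(axis=0)                  # per-feature mean over the batch
    var = x.var(axis=0)                  # per-feature variance over the batch
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta          # restores representational capacity

x = np.random.default_rng(0).normal(size=(32, 64))
y = batch_norm(x, gamma=np.ones(64), beta=np.zeros(64))
```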

## Activation Function

- ReLU
- Sigmoid
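
Both activations as NumPy one-liners, with the practical trade-off noted inline (a sketch, not tied to any framework).

```python
import numpy as np

def relu(x):
    # max(0, x): cheap, and avoids saturation for positive inputs.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes to (0, 1); saturates for large |x|, which can slow learning.
    return 1.0 / (1.0 + np.exp(-x))
```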

## Segmentation

- Fully Convolutional models
- Conditional Random Fields (CRF)
- Skip Connections/Fusions
- Upsampling/Transpose Convolutions
- Atrous/Dilated Convolutions
- Multi-scale inputs/networks with weights
- Attention to scale
- Pixel-wise cross-entropy loss (see the sketch after this list)
- Datasets and annotations
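
Pixel-wise cross-entropy is ordinary softmax cross-entropy applied at every pixel and averaged over the image; a NumPy sketch, assuming class-score maps of shape (N, C, H, W) and integer label maps of shape (N, H, W):

```python
import numpy as np

def pixelwise_cross_entropy(logits, labels):
    """logits: (N, C, H, W) raw class scores; labels: (N, H, W) class ids."""
    # Numerically stable log-softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # Gather each pixel's log-probability for its true class, then average.
    picked = np.take_along_axis(log_probs, labels[:, None, :, :], axis=1)
    return -picked.mean()

rng = np.random.default_rng(0)
loss = pixelwise_cross_entropy(rng.normal(size=(2, 5, 8, 8)),
                               rng.integers(0, 5, size=(2, 8, 8)))
```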