# cs231n-seminars

Seminars for the computer vision course based on cs231n, taught at ABBYY, 2019.

## Seminars

### Seminar 1 (backprop in numpy) Open In Colab

- learn how backpropagation really works
- implement the backward pass for the main layers (Linear, ReLU, Sigmoid, Softmax)
- implement the backward pass for the most common criterion (cross-entropy)
- implement mini-batch stochastic gradient descent
- train your tiny neural network on MNIST (see the layer sketch after this list)
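
To give a flavor of the notebook, here is a minimal sketch of a Linear and a ReLU layer with explicit forward/backward methods (the interface is illustrative and may differ from the notebook's template):

```python
import numpy as np

class Linear:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * 0.01
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Gradients w.r.t. parameters, summed over the batch
        self.dW = self.x.T @ grad_out
        self.db = grad_out.sum(axis=0)
        # Gradient w.r.t. the input, passed on to the previous layer
        return grad_out @ self.W.T

class ReLU:
    def forward(self, x):
        self.mask = x > 0               # remember which units were active
        return x * self.mask

    def backward(self, grad_out):
        return grad_out * self.mask     # gradient flows only through active units
```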

#### Exercises

- Why can sigmoid/tanh activations be bad for deep networks?
- Can you train a classifier with MSE loss? Why might this not be a good idea?
- The "dead ReLU" problem: give an example of when it can happen.
- Derive analytically (on paper) the gradients w.r.t. all inputs and parameters for a small network (Linear -> ReLU -> Linear -> Softmax) with NLL loss. A numerical check for the last step is sketched below.
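
For the last exercise, the well-known closed form is that for softmax followed by NLL the gradient w.r.t. the logits is softmax(z) minus the one-hot targets. Here is a self-contained numerical check of that result (a sketch, not course code):

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)        # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll_loss(z, y):
    """Mean negative log-likelihood of the true classes y given logits z."""
    p = softmax(z)
    return -np.log(p[np.arange(len(y)), y]).mean()

rng = np.random.default_rng(0)
z = rng.normal(size=(4, 3))                     # 4 samples, 3 classes
y = np.array([0, 2, 1, 2])

# Analytic gradient: dL/dz = (softmax(z) - onehot(y)) / batch_size
p = softmax(z)
p[np.arange(len(y)), y] -= 1
analytic = p / len(y)

# Numerical gradient via central differences
eps = 1e-6
numeric = np.zeros_like(z)
for i in range(z.shape[0]):
    for j in range(z.shape[1]):
        zp, zm = z.copy(), z.copy()
        zp[i, j] += eps
        zm[i, j] -= eps
        numeric[i, j] = (nll_loss(zp, y) - nll_loss(zm, y)) / (2 * eps)

print(np.abs(analytic - numeric).max())         # tiny, on the order of 1e-9
```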

#### Homework 1

- implement the backward pass for many other layers (Conv, Pool, BatchNorm, Dropout, etc.); a Dropout sketch follows below
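
Of these layers, Dropout has the shortest backward pass. A minimal inverted-dropout sketch (the class interface is illustrative, not the homework template):

```python
import numpy as np

class Dropout:
    """Inverted dropout: scale at train time so inference is a no-op."""
    def __init__(self, p=0.5):
        self.p = p                      # probability of dropping a unit

    def forward(self, x, train=True):
        if not train:
            return x
        self.mask = (np.random.rand(*x.shape) >= self.p) / (1 - self.p)
        return x * self.mask

    def backward(self, grad_out):
        # Dropout is elementwise multiplication by a fixed random mask,
        # so the backward pass reuses exactly the same mask.
        return grad_out * self.mask
```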

### Seminar 2 (training your network with PyTorch) Open In Colab

- PyTorch in 5 minutes
- training your neural net in PyTorch
- training on a GPU
- reproducibility
- exploring different activations and loss functions (a minimal training loop is sketched below)
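
The loop the seminar builds toward looks roughly like this (a generic sketch with an illustrative MLP, not the notebook's exact code; full determinism needs more than a single seed):

```python
import torch
import torch.nn as nn

# Reproducibility: seed the RNG (full determinism also requires deterministic
# algorithms and fixed DataLoader worker seeds)
torch.manual_seed(0)

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10)).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

def train_step(x, y):
    x, y = x.to(device), y.to(device)
    optimizer.zero_grad()               # clear gradients from the previous step
    loss = criterion(model(x), y)
    loss.backward()                     # backprop through the whole graph
    optimizer.step()                    # update parameters
    return loss.item()

# e.g. one step on a fake MNIST-sized batch
loss = train_step(torch.randn(32, 784), torch.randint(0, 10, (32,)))
```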

### Seminar 3 (layers & architectures) Open In Colab

- Most confusing layers: Dropout and BatchNorm (see the train/eval demo below)
- Most common architectures: AlexNet, VGG, Inception, ResNet, ResNeXt, DenseNet, SENet
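
The confusion around Dropout and BatchNorm usually comes from their different behavior in train and eval mode, which a few lines of PyTorch make visible (shapes are illustrative):

```python
import torch
import torch.nn as nn

layer = nn.Sequential(nn.BatchNorm1d(8), nn.Dropout(p=0.5))
x = torch.randn(4, 8)

layer.train()                 # BatchNorm uses batch stats, Dropout is active
print(layer(x)[0])            # random zeros from Dropout

layer.eval()                  # BatchNorm uses running stats, Dropout is a no-op
with torch.no_grad():
    print(layer(x)[0])        # deterministic output

# A classic bug: forgetting model.eval() at test time, which leaves Dropout on
# and BatchNorm normalizing with per-batch statistics.
```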

### Seminar 4 (high-level frameworks)

- PyTorch ecosystem
- augmentations (a torchvision example follows below)
- fast.ai vs PyTorch Lightning vs Catalyst
- Catalyst
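
For augmentations, a typical torchvision pipeline looks like the following (the seminar may use another library such as albumentations; this is a common baseline with standard ImageNet normalization constants):

```python
import torchvision.transforms as T

# A typical augmentation pipeline for ImageNet-style training
train_transforms = T.Compose([
    T.RandomResizedCrop(224),
    T.RandomHorizontalFlip(),
    T.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406],     # ImageNet channel statistics
                std=[0.229, 0.224, 0.225]),
])

# Applied per sample by a Dataset, e.g.:
# dataset = torchvision.datasets.ImageFolder("train/", transform=train_transforms)
```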

### Seminar 5 (segmentation)

### Seminar 6 (detection)

### Seminar 7 (visualization) Open In Colab

- attribution with TorchRay (a minimal saliency sketch follows below)
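
TorchRay bundles many attribution methods; the simplest idea behind them, gradient saliency, fits in a few lines of plain PyTorch (shown here instead of TorchRay's own API to keep the sketch dependency-free; the class id is arbitrary):

```python
import torch
import torchvision.models as models

model = models.resnet18(pretrained=True).eval()

x = torch.randn(1, 3, 224, 224, requires_grad=True)  # stand-in for a real image
target_class = 243                                   # illustrative ImageNet class id

score = model(x)[0, target_class]
score.backward()                                     # gradients of the class score w.r.t. pixels

# Saliency map: max over channels of |d score / d pixel|
saliency = x.grad.abs().max(dim=1)[0]                # shape (1, 224, 224)
```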