Assessing Impact of Vicinal Risk Minimization on Teacher-Student Learning

Knowledge Distillation with VRM

This project analyses the impact of various VRM (Vicinal Risk Minimization) techniques, applied to teacher models, on the generalization performance of a student model.

This work was accepted at the ICML-UDL Workshop, 2020.


Step 1: Replicate Conda Environment

conda create -n ml                             # create an empty environment named "ml"
conda install --name ml --file spec-file.txt   # install the pinned dependencies
conda activate ml

Step 2: Train Teacher Models

Train a set of teacher models, each with a different VRM technique applied during training.
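
The training scripts in this repository define the exact set of VRM techniques; purely as an illustration, the sketch below shows how one common VRM technique, mixup, could be applied while training a teacher network in PyTorch. The model, data loader, and hyperparameters here are assumed placeholders, not the repository's actual code.

# Illustrative mixup training sketch (assumed placeholder, not the repository's script).
import numpy as np
import torch
import torch.nn.functional as F

def mixup_batch(x, y, alpha=1.0):
    """Blend a batch with a shuffled copy of itself to form vicinal samples."""
    lam = np.random.beta(alpha, alpha) if alpha > 0 else 1.0
    index = torch.randperm(x.size(0), device=x.device)
    return lam * x + (1 - lam) * x[index], y, y[index], lam

def train_teacher_epoch(teacher, train_loader, optimizer, device, alpha=1.0):
    teacher.train()
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        mixed_x, y_a, y_b, lam = mixup_batch(x, y, alpha)
        logits = teacher(mixed_x)
        # Mixup loss: convex combination of the losses on the two label sets.
        loss = lam * F.cross_entropy(logits, y_a) + (1 - lam) * F.cross_entropy(logits, y_b)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()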

Step 3: Train Student Models

Train student models using the dark knowledge (softened output distributions) produced by the teacher models trained in Step 2.
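
The repository's student-training code defines the actual objective; as a rough sketch, a standard Hinton-style distillation loss combines the temperature-softened teacher outputs (dark knowledge) with the hard labels. The temperature T and weighting alpha below are assumed placeholders.

# Illustrative knowledge-distillation loss (standard formulation; the repository's
# actual loss and hyperparameters may differ).
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """alpha weights the soft-target (dark knowledge) term against the hard labels."""
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy on the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard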

Step 4: Analyse generalization performance

Use different datasets and performance metrics to analyse the generalization performance of the different student models. To measure generalization, we evaluate the models on the unseen CIFAR test set. In addition, we consider the following datasets (a minimal evaluation sketch follows the list):

  • CIFAR 10.1 v6: small natural variations on the original dataset
  • CINIC (ImageNet fold): distributional shift in the images
  • CIFAR 10H: the CIFAR test set relabelled with human annotations, which helps in analysing prediction structure
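
The metrics and analysis scripts are defined in the repository; as a minimal sketch, per-dataset generalization can be summarised by top-1 accuracy over a DataLoader built for each evaluation set. The loader construction is assumed and not shown here.

# Illustrative top-1 accuracy evaluation (assumed sketch); build one DataLoader per
# evaluation set (CIFAR test, CIFAR 10.1 v6, CINIC ImageNet fold, CIFAR 10H).
import torch

@torch.no_grad()
def top1_accuracy(model, loader, device):
    model.eval()
    correct, total = 0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        preds = model(x).argmax(dim=1)
        correct += (preds == y).sum().item()
        total += y.size(0)
    return correct / total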
