# Modality-agnostic Self-Supervised Learning with Meta-Learned Masked Auto-Encoder

PyTorch implementation of "Modality-agnostic Self-Supervised Learning with Meta-Learned Masked Auto-Encoder" (accepted at NeurIPS 2023).

TL;DR: We interpret MAE through the lens of meta-learning and apply advanced meta-learning techniques to improve the unsupervised representations learned by MAE on arbitrary modalities.
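To make the masking side of MAE concrete, here is a minimal numpy sketch of random patch masking at the repo's default ratio (`--mask-ratio 0.85`). The function name and shapes are illustrative, not taken from this codebase:

```python
import numpy as np

def random_masking(patches, mask_ratio=0.85, rng=None):
    """Keep a random subset of patch tokens; mask out the rest.

    `patches` is an (N, D) array of patch embeddings. Returns the kept
    patches, the kept indices, and a boolean mask (True = masked).
    """
    rng = rng or np.random.default_rng(0)
    n = patches.shape[0]
    n_keep = max(1, int(round(n * (1 - mask_ratio))))
    perm = rng.permutation(n)
    keep_idx = np.sort(perm[:n_keep])
    mask = np.ones(n, dtype=bool)
    mask[keep_idx] = False
    return patches[keep_idx], keep_idx, mask

# 16 patches of dimension 8; with mask_ratio=0.85 only 2 tokens survive,
# so the encoder sees a small visible subset and the decoder must
# reconstruct the other 14.
x = np.arange(16 * 8, dtype=float).reshape(16, 8)
kept, idx, mask = random_masking(x, mask_ratio=0.85)
```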

## Install

```sh
conda create -n meta-mae python=3.9
conda activate meta-mae
conda install pytorch==1.12.0 torchvision==0.13.0 torchaudio==0.12.0 cudatoolkit=10.2 -c pytorch
pip install numpy==1.21.5
conda install ignite -c pytorch
pip install timm==0.6.12
pip install librosa
pip install pandas
pip install packaging tensorboard scikit-learn
```

## Download datasets

## Pretraining MetaMAE

E.g., on `pamap2`:

```sh
python pretrain.py --logdir ./logs_final/pamap2/metamae --seed 0 --model metamae \
	--datadir [DATA_ROOT] --dataset pamap2 \
	--inner-lr 0.5 --reg-weight 1 --num-layer-dec 4 --dropout 0.1 --mask-ratio 0.85
```
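Under the meta-learning view of MAE, reconstructing the masked patches of one example is treated as an inner task that the decoder adapts to. The `--inner-lr 0.5` flag above sets the inner-loop step size. A toy numpy sketch of one MAML-style inner gradient step on a linear decoder's reconstruction loss (all names and shapes are illustrative, not this repo's implementation):

```python
import numpy as np

def inner_adapt(W, h, target, inner_lr=0.5, steps=1):
    """A few gradient steps on decoder weights W for the reconstruction
    loss mean((target - h @ W)**2), i.e. a MAML-style inner loop."""
    n = target.size
    for _ in range(steps):
        pred = h @ W
        grad = 2.0 * h.T @ (pred - target) / n
        W = W - inner_lr * grad
    return W

rng = np.random.default_rng(0)
h = rng.normal(size=(32, 4))        # encoder features of visible patches
W_true = rng.normal(size=(4, 8))
target = h @ W_true                 # reconstruction targets (masked patches)

W0 = np.zeros((4, 8))
loss0 = np.mean((target - h @ W0) ** 2)
W1 = inner_adapt(W0, h, target, inner_lr=0.5)  # mirrors --inner-lr 0.5
loss1 = np.mean((target - h @ W1) ** 2)
```

In the meta-learning framing, the outer loop would then update the encoder so that this adapted decoder reconstructs well after only a step or two.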

## Evaluating MetaMAE

```sh
python linear_evaluation.py --ckptdir ./logs_final/pamap2/metamae --seed 0 --model metamae \
	--datadir [DATA_ROOT] --dataset pamap2 \
	--inner-lr 0.5 --reg-weight 1 --num-layer-dec 4 --dropout 0.1 --mask-ratio 0.85
```