forked from JetRunner/TuPaTE

Code for EMNLP 2022 paper "Efficiently Tuned Parameters are Task Embeddings"


TuPaTE

Figure: Workflow of TuPaTE.

Setup

We conduct our experiments with Anaconda3. If you have Anaconda3 installed, create the environment with:

conda create -n tupate python=3.8.5
conda activate tupate

After setting up the basic conda environment, install the PyTorch-related packages:

conda install -n tupate pytorch==1.7.1 torchvision==0.8.2 torchaudio==0.7.2 cudatoolkit=11.0 -c pytorch

Finally, install the other required Python packages:

pip install -r requirements.txt

Training

Run the training scripts in run_script (e.g., BERT on RTE):

bash run_script/run_rte_bert.sh

Extract Task Embedding

Functions for extracting task embeddings for the different parameter-efficient tuning methods are provided in extract_task_emb.py.
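The underlying idea, that the parameters learned by a parameter-efficient tuning method can themselves serve as a task embedding, can be sketched as follows. This is an illustrative stand-alone example, not the repository's actual extract_task_emb.py logic; the parameter names and shapes are hypothetical.

```python
import numpy as np

def extract_task_embedding(tuned_params):
    """Flatten a task's efficiently tuned parameters (e.g., prompt or
    adapter weights) into a single task-embedding vector.

    Sorting by parameter name keeps the ordering deterministic, so
    embeddings from different tasks are comparable position-by-position.
    """
    flat = [np.asarray(tuned_params[name]).ravel()
            for name in sorted(tuned_params)]
    return np.concatenate(flat)

# Toy example: a "prompt" of 4 vectors of size 3 plus a bias of size 3.
params = {
    "prompt_embeddings": np.zeros((4, 3)),
    "bias": np.ones(3),
}
emb = extract_task_embedding(params)
print(emb.shape)  # (15,)
```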

Pre-computed Task Embedding

We also release the embeddings for each task here.
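A typical use of such task embeddings is estimating task similarity (and hence transferability) by comparing them, for instance with cosine similarity. A minimal sketch, using made-up toy vectors rather than the released embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two task-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for two released task embeddings.
task_a = np.array([3.0, 4.0, 0.0])
task_b = np.array([0.0, 4.0, 3.0])

print(cosine_similarity(task_a, task_b))  # 0.64 -> moderately similar tasks
```

Higher similarity between two tasks' embeddings suggests that transferring tuned parameters between them is more likely to help.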
