This repository contains the scripts and models for the paper "hyy33 at WASSA 2024 Empathy and Personality Shared Task: Using the CombinedLoss and FGM for Enhancing BERT-based Models in Emotion and Empathy Prediction from Conversation Turns", presented at the WASSA 2024 workshop, co-located with ACL 2024.
Figure: Pre-trained BERT and DeBERTa models are fine-tuned with the CombinedLoss and FGM adversarial training for emotion and empathy prediction from conversation turns.
👉 Poster
Install the required dependencies as follows:
pip install -r dependencies.txt
The pre-trained BERT model can be downloaded at: bert-base-uncased

The pre-trained DeBERTa model can be downloaded at: deberta-base
Use the following repository scripts to fine-tune BERT and DeBERTa on the downstream classification and regression tasks:
- bert-class-fgm-comb.py: fine-tunes BERT on the classification task, with FGM and the CombinedLoss
- bert-reg-fgm-mse.py: fine-tunes BERT on the regression task, with FGM and the MSE loss
- deberta-class-fgm-comb.py: fine-tunes DeBERTa on the classification task, with FGM and the CombinedLoss
- deberta-reg-fgm-mse.py: fine-tunes DeBERTa on the regression task, with FGM and the MSE loss
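FGM (Fast Gradient Method) adversarial training adds an epsilon-scaled perturbation along the gradient direction to the embedding weights, runs an extra backward pass on the perturbed inputs, and then restores the clean weights. A minimal, framework-free sketch of that mechanic (the `FGM` class name and the attack/restore interface follow the common PyTorch recipe; in the actual scripts the parameters and gradients come from the model, while here they are plain lists supplied explicitly for illustration):

```python
import math


class FGM:
    """Minimal Fast Gradient Method (FGM) sketch.

    Parameters are a dict of name -> list[float]; gradients are supplied
    explicitly so the mechanics stay framework-free.
    """

    def __init__(self, params, epsilon=1.0):
        self.params = params
        self.epsilon = epsilon
        self.backup = {}

    def attack(self, grads, target="embedding"):
        # Perturb every parameter whose name contains `target` by
        # epsilon * grad / ||grad|| (the FGM step), saving clean copies.
        for name, p in self.params.items():
            if target in name and name in grads:
                self.backup[name] = list(p)
                g = grads[name]
                norm = math.sqrt(sum(x * x for x in g))
                if norm > 0:
                    for i in range(len(p)):
                        p[i] += self.epsilon * g[i] / norm

    def restore(self):
        # Put the clean weights back after the adversarial backward pass.
        for name, clean in self.backup.items():
            self.params[name][:] = clean
        self.backup = {}
```

In a training loop this is used as: normal backward pass, `attack()`, a second backward pass on the perturbed model (accumulating gradients), `restore()`, then the optimizer step.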
The results of the fine-tuned models submitted for Track 2 are as follows:
Model | Emotion | Emotional Polarity | Empathy | Avg |
---|---|---|---|---|
hyy-33/hyy33-WASSA-2024-Track-2 🤗 | 0.581 | 0.644 | 0.544 | 0.590 |
@inproceedings{huiyu2024using_combined_loss_and_fgm,
    title = "hyy33 at {WASSA} 2024 Empathy and Personality Shared Task: Using the {CombinedLoss} and {FGM} for Enhancing {BERT}-based Models in Emotion and Empathy Prediction from Conversation Turns",
    author = "Yang, Huiyu and Huang, Liting and Li, Tian and Rusnachenko, Nicolay and Liang, Huizhi",
    booktitle = "Proceedings of the 14th Workshop on Computational Approaches to Subjectivity, Sentiment, {\&} Social Media Analysis",
    year = "2024",
    month = aug,
    address = "Bangkok, Thailand",
}