[EMNLP 2023] RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal Sentiment Classification

Dataset and code for the paper "RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal Sentiment Classification"

Junjie Ye

jjye23@m.fudan.edu.cn

Oct. 12, 2023

Requirements

  • Python 3.7+
  • Run the following command to install the required packages:
    pip install -r requirements.txt

Download tweet images and ResNet-152

  • Step 1: Download each tweet's associated image.
  • Step 2: Save the images to data/Twitter15/images/ and data/Twitter17/images/, respectively.
  • Step 3: Download the pre-trained ResNet-152.
  • Step 4: Put the pre-trained ResNet-152 model under the folder named Code/resnet/.
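
The four steps above assume a specific directory layout. As a quick sanity check before moving on, a small stdlib-only script can verify that the expected folders and checkpoint are in place (note: the checkpoint filename resnet152.pth is an assumption for illustration, not something the repository specifies):

```python
from __future__ import annotations

from pathlib import Path


def check_layout(root: str, resnet_name: str = "resnet152.pth") -> list[str]:
    """Return the expected paths that are missing under `root`.

    An empty list means the layout from Steps 1-4 looks complete.
    `resnet_name` is a hypothetical filename for the ResNet-152 checkpoint.
    """
    base = Path(root)
    expected = [
        base / "data" / "Twitter15" / "images",   # Step 2
        base / "data" / "Twitter17" / "images",   # Step 2
        base / "Code" / "resnet" / resnet_name,   # Steps 3-4
    ]
    return [str(p) for p in expected if not p.exists()]
```

Running check_layout(".") from the repository root and getting back an empty list is a reasonable signal that training can proceed; otherwise the returned paths show what still needs to be downloaded or moved.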

Prepare image features

Since the feature files are too large to host, please extract them yourself.

  • Step 1: Download the pre-trained Faster R-CNN model (trained on Visual Genome with a ResNet-101 backbone in PyTorch) and save it to the folder Code/faster_rcnn/models/.

  • Step 2: Compile the CUDA dependencies with the following commands:

    cd Code/faster_rcnn
    python setup.py build develop
  • Step 3: Extract the features and save them:

    cd Code/faster_rcnn
    python data_process.py --source_path ../../data/Twitter15/images --save_path ../../data/Twitter15/faster_features
    python data_process.py --source_path ../../data/Twitter17/images --save_path ../../data/Twitter17/faster_features
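
After extraction, each image under images/ should have a matching file under faster_features/. As a hedged sketch (the .npy extension and the one-feature-file-per-image naming scheme are assumptions about data_process.py's output, matched by filename stem), a stdlib-only check for images whose features failed to extract could look like:

```python
from __future__ import annotations

from pathlib import Path


def find_missing_features(
    image_dir: str, feature_dir: str, feat_ext: str = ".npy"
) -> list[str]:
    """List image files with no matching feature file.

    Images and features are paired by filename stem, e.g. 123.jpg <-> 123.npy.
    `feat_ext` is an assumed extension for the extracted feature files.
    """
    feature_stems = {p.stem for p in Path(feature_dir).glob(f"*{feat_ext}")}
    return sorted(
        p.name
        for p in Path(image_dir).iterdir()
        if p.suffix.lower() in {".jpg", ".jpeg", ".png"}
        and p.stem not in feature_stems
    )
```

An empty result for both Twitter15 and Twitter17 suggests the extraction step completed; any names returned can be re-run through data_process.py.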

Code Usage

Training and Analysis

  • This script tunes hyperparameters on the dev set and tests on the test set for all models:

    cd scripts
    bash run.sh

Reminder

  • The results reported in our paper can be found directly in the output/ folder.

Acknowledgements

Cite

  • If you find our code helpful, please cite our paper:
@inproceedings{DBLP:conf/emnlp/YeZTWZG023,
  author       = {Junjie Ye and
                  Jie Zhou and
                  Junfeng Tian and
                  Rui Wang and
                  Qi Zhang and
                  Tao Gui and
                  Xuanjing Huang},
  editor       = {Houda Bouamor and
                  Juan Pino and
                  Kalika Bali},
  title        = {RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal
                  Sentiment Classification},
  booktitle    = {Findings of the Association for Computational Linguistics: {EMNLP}
                  2023, Singapore, December 6-10, 2023},
  pages        = {270--277},
  publisher    = {Association for Computational Linguistics},
  year         = {2023},
  url          = {https://aclanthology.org/2023.findings-emnlp.21},
  timestamp    = {Wed, 13 Dec 2023 17:20:20 +0100},
  biburl       = {https://dblp.org/rec/conf/emnlp/YeZTWZG023.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}
