FDUDSDE/ICPC2023MoCoCS

Code and data for the ICPC 2023 Technical Research track paper "Improving Code Search with Multi-Modal Momentum Contrastive Learning".

Dataset

We use the CodeSearchNet dataset, following the setup of GraphCodeBERT. You can download and preprocess the data with the following commands.

unzip dataset.zip
cd dataset
bash run.sh 
cd ..
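After preprocessing, the dataset directory contains JSON Lines files, one function per line, following the CodeSearchNet/GraphCodeBERT convention. The sketch below peeks at one record; the path and field names are assumptions for illustration and may differ from what run.sh actually generates.

# Illustrative only: inspect one preprocessed record.
# "dataset/python/train.jsonl" and the field names are hypothetical.
import json

with open("dataset/python/train.jsonl") as f:
    example = json.loads(f.readline())
print(list(example.keys()))
print(" ".join(example.get("docstring_tokens", []))[:80])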

Dependency

You can install the dependencies with the following commands.

pip install torch
pip install transformers
pip install tree_sitter
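As a convenience (not part of the repository), the snippet below checks that the three dependencies import correctly and that a GPU is visible to PyTorch.

# Quick environment check, not part of the repository's code.
import torch
import transformers
import tree_sitter

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("transformers", transformers.__version__)
print("tree_sitter imported OK")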

Fine-tune

We fine-tuned the model on a single V100-32G GPU and provide a script for fine-tuning. You can change the programming language and the initial pre-trained model in the script (a loading sketch for the initial models is given after the lists below). Fine-tune the model with the following command.

sh run.sh

Supported programming languages: ruby, javascript, python, java, php and go.

Supported initial pre-trained models: unixcoder-base, codebert-base and graphcodebert-base.
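To illustrate what "initial pre-trained model" refers to, the sketch below loads one of the supported checkpoints with transformers and encodes a query/code pair. This is not the repository's training code: the microsoft/* checkpoint names and the CLS-token pooling are assumptions for illustration; run.sh may wire the model in differently.

# Minimal sketch (not the repository's training code): load a supported
# initial model and embed a query and a code snippet.
from transformers import AutoTokenizer, AutoModel

model_name = "microsoft/unixcoder-base"  # or microsoft/codebert-base, microsoft/graphcodebert-base
tokenizer = AutoTokenizer.from_pretrained(model_name)
encoder = AutoModel.from_pretrained(model_name)

query = tokenizer("sort a list in descending order", return_tensors="pt")
code = tokenizer("def f(x): return sorted(x, reverse=True)", return_tensors="pt")
query_vec = encoder(**query).last_hidden_state[:, 0]  # CLS-style pooling, for illustration only
code_vec = encoder(**code).last_hidden_state[:, 0]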

Evaluation

We also provide a shell script for evaluating the fine-tuned model on the test set. Note that the programming language and initial model need to be consistent with the fine-tuning script.

sh test.sh
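Code search in this setting is commonly reported as Mean Reciprocal Rank (MRR). As a hedged illustration only (not the code behind test.sh), the sketch below computes MRR from bi-encoder embeddings under a cosine-similarity scoring assumption.

# Illustrative MRR computation (not the repository's evaluation code).
# Assumes query_vecs[i] should retrieve code_vecs[i] from the candidate pool.
import torch

def mean_reciprocal_rank(query_vecs: torch.Tensor, code_vecs: torch.Tensor) -> float:
    q = torch.nn.functional.normalize(query_vecs, dim=-1)
    c = torch.nn.functional.normalize(code_vecs, dim=-1)
    scores = q @ c.t()                      # (num_queries, num_candidates)
    target = scores.diag().unsqueeze(1)     # score of the ground-truth pair
    ranks = (scores > target).sum(dim=1) + 1  # 1-based rank of the ground truth
    return (1.0 / ranks.float()).mean().item()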

Cite

@inproceedings{shi2023mococs,
  title={Improving Code Search with Multi-Modal Momentum Contrastive Learning},
  author={Shi, Zejian and Xiong, Yun and Zhang, Yao and Jiang, Zhijie and Zhao, Jinjing and Wang, Lei and Li, Shanshan},
  booktitle={Proceedings of the 31st IEEE/ACM International Conference on Program Comprehension},
  year={2023},
  organization={IEEE/ACM}
}

Acknowledgement

We use the parser from GraphCodeBERT to convert code into abstract syntax trees (ASTs) and data flow graphs (DFGs).
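For readers unfamiliar with that parser: GraphCodeBERT's parsing utilities are built on tree-sitter. The sketch below shows plain tree-sitter usage with the older Language.build_library API (matching the tree_sitter pip dependency of that era); it only produces the raw syntax tree, while the DFG extraction lives in GraphCodeBERT's parser code that this repository reuses. The grammar path below is a hypothetical example.

# Minimal tree-sitter parsing sketch (the repository reuses GraphCodeBERT's
# parser on top of this). Requires a local clone of
# https://github.com/tree-sitter/tree-sitter-python next to this script.
from tree_sitter import Language, Parser

Language.build_library("build/my-languages.so", ["tree-sitter-python"])  # run once
PY_LANGUAGE = Language("build/my-languages.so", "python")

parser = Parser()
parser.set_language(PY_LANGUAGE)

code = b"def add(a, b):\n    return a + b\n"
tree = parser.parse(code)
print(tree.root_node.sexp())  # S-expression view of the AST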
