The CodeSearchNet dataset we use follows the settings of GraphCodeBERT. You can download and preprocess the data using the following commands.
```bash
unzip dataset.zip
cd dataset
bash run.sh
cd ..
```
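After preprocessing, the data should be available as JSONL files. As a quick sanity check, you can inspect one example; the path and field names below (`code_tokens`, `docstring_tokens`) follow GraphCodeBERT's code search setup and are assumptions about the layout, not guarantees.

```python
# Sanity-check sketch: inspect one preprocessed training example.
# Assumes GraphCodeBERT-style JSONL output with "code_tokens" and
# "docstring_tokens" fields; adjust the path to your actual layout.
import json

with open("dataset/python/train.jsonl") as f:
    example = json.loads(f.readline())

print(example["docstring_tokens"][:10])  # natural-language query tokens
print(example["code_tokens"][:10])       # code tokens of the matching snippet
```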
You can install the dependencies using the following commands.
```bash
pip install torch
pip install transformers
pip install tree_sitter
```
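A quick way to verify the installation (a minimal check, not part of the repository's scripts):

```python
# Verify that the dependencies import cleanly and that a GPU is visible.
import torch
import transformers
import tree_sitter  # noqa: F401  (import check only)

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("transformers:", transformers.__version__)
```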
We fine-tuned the model on a V100-32G GPU. We provide a script for fine-tuning the model; you can change the programming language and the initial model in the script. You can then fine-tune the model using the following command.
```bash
sh run.sh
```
Supported programming languages: ruby, javascript, python, java, php and go.
Supported initial pre-trained models: unixcoder-base, codebert-base and graphcodebert-base.
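The supported initial models correspond to the public Hugging Face checkpoints `microsoft/unixcoder-base`, `microsoft/codebert-base`, and `microsoft/graphcodebert-base`. As a minimal sketch of what fine-tuning starts from (the script handles this for you), one of them can be loaded like this:

```python
# Minimal sketch: load one of the supported initial pre-trained models.
# The Hugging Face IDs below are the standard public checkpoints.
from transformers import AutoModel, AutoTokenizer

checkpoint = "microsoft/unixcoder-base"
# alternatives: "microsoft/codebert-base", "microsoft/graphcodebert-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)
```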
We also provide a shell script for evaluating the fine-tuned model on the test set. Note that the programming language and the initial model need to be consistent with the fine-tuning script.
```bash
sh test.sh
```
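Code search on CodeSearchNet is typically scored with Mean Reciprocal Rank (MRR). The sketch below only illustrates the metric on a toy similarity matrix; it is not the repository's evaluation code.

```python
# Illustration of Mean Reciprocal Rank (MRR), the usual code search metric.
# scores[i, j] is the similarity between query i and code snippet j; the
# correct snippet for query i sits on the diagonal.
import torch

def mean_reciprocal_rank(scores: torch.Tensor) -> float:
    # Rank of the correct (diagonal) snippet among all candidates per query.
    ranks = (scores >= scores.diagonal().unsqueeze(1)).sum(dim=1)
    return (1.0 / ranks.float()).mean().item()

toy_scores = torch.tensor([[0.9, 0.2],
                           [0.1, 0.8]])
print(mean_reciprocal_rank(toy_scores))  # 1.0: both correct snippets rank first
```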
If you find this work helpful, please cite our paper:

```bibtex
@inproceedings{shi2023mococs,
  title={Improving Code Search with Multi-Modal Momentum Contrastive Learning},
  author={Shi, Zejian and Xiong, Yun and Zhang, Yao and Jiang, Zhijie and Zhao, Jinjing and Wang, Lei and Li, Shanshan},
  booktitle={Proceedings of the 31st IEEE/ACM International Conference on Program Comprehension},
  year={2023},
  organization={IEEE/ACM}
}
```
We use the parser in GraphCodeBERT to convert code into abstract syntax trees (ASTs) and data flow graphs (DFGs).
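For illustration, a minimal tree_sitter parse looks like the following. This sketch assumes the pre-0.22 tree_sitter Python API (the one GraphCodeBERT's parser targets) and a locally cloned tree-sitter-python grammar; DFG extraction is then built on top of such trees by GraphCodeBERT's DFG functions.

```python
# Minimal sketch: parse code into an AST with tree_sitter.
# Assumes the pre-0.22 tree_sitter API and a tree-sitter-python grammar
# cloned into the working directory.
from tree_sitter import Language, Parser

Language.build_library("build/languages.so", ["tree-sitter-python"])
parser = Parser()
parser.set_language(Language("build/languages.so", "python"))

tree = parser.parse(b"def add(a, b):\n    return a + b\n")
print(tree.root_node.sexp())  # S-expression view of the AST
```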