We provide the official TensorFlow & Keras implementation for training PiTE, our TCR-epitope binding affinity prediction pipeline built on a Transformer-based sequence encoder.
PiTE: TCR-epitope Binding Affinity Prediction Pipeline using Transformer-based Sequence Encoder
Pengfei Zhang (1,2), Seojin Bang (2), Heewook Lee (1,2)
(1) School of Computing and Augmented Intelligence, Arizona State University; (2) Biodesign Institute, Arizona State University
Published in: Pacific Symposium on Biocomputing (PSB), 2023.
Paper | Code | Poster | Slides | Presentation (YouTube)
Dependencies:
- Linux
- Python 3.6.13
- Keras 2.6.0
- TensorFlow 2.6.0
Installation:
git clone https://github.com/Lee-CBG/PiTE
cd PiTE/
pip install -r requirements.txt
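After installation, you can sanity-check that the pinned versions are active. A minimal check, assuming a standard TensorFlow/Keras install:

import tensorflow as tf
import keras

# Both should print 2.6.0, matching the dependency list above
print("TensorFlow:", tf.__version__)
print("Keras:", keras.__version__)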
Datasets:
- Data for the baseline model can be downloaded here. The size is 5.16 GB.
- Data for the other models (Transformer, BiLSTM, and CNN) can be downloaded here. The size is 68.93 GB. Preprocess this data using the preprocessing.ipynb notebook before initiating training; a minimal loading sketch follows below.
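Once preprocessing.ipynb has been run, the training script consumes per-residue embeddings and binding labels. A loading sketch is shown below; the file path and dictionary keys are hypothetical, since the actual layout is defined by the notebook:

import pickle

import numpy as np

# Hypothetical path and keys; preprocessing.ipynb determines the real format.
with open("datasets/tcr_split_train.pkl", "rb") as f:
    data = pickle.load(f)

# Assumed layout: per-residue embeddings for TCRs and epitopes, plus labels.
X_tcr = np.asarray(data["tcr"])    # (n_pairs, tcr_len, emb_dim)
X_epi = np.asarray(data["epi"])    # (n_pairs, epi_len, emb_dim)
y = np.asarray(data["label"])      # (n_pairs,) binary binding labels
print(X_tcr.shape, X_epi.shape, y.shape)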
An example of training the Transformer-based model:
python -W ignore main.py \
    --nns transformer \
    --split tcr \
    --gpu 0 \
    --run 0 \
    --seed 42
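Here --nns selects the encoder architecture (transformer, or the BiLSTM/CNN alternatives listed above), --split selects the data-splitting strategy, and --gpu, --run, and --seed pin the device and random seed; see main.py for the supported values.

For orientation, below is a minimal Keras sketch of a Transformer-based sequence encoder of the kind this pipeline trains, written against the pinned TensorFlow/Keras versions. The layer sizes, the 1024-dimensional embedding input, and the pooling/MLP head are illustrative assumptions, not the exact architecture in main.py:

import tensorflow as tf
from tensorflow.keras import layers, Model

def transformer_block(x, num_heads=4, key_dim=64, ff_dim=256, dropout=0.1):
    # Multi-head self-attention with a residual connection and layer norm
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=key_dim)(x, x)
    x = layers.LayerNormalization(epsilon=1e-6)(x + layers.Dropout(dropout)(attn))
    # Position-wise feed-forward sublayer, also residual
    ff = layers.Dense(ff_dim, activation="relu")(x)
    ff = layers.Dense(x.shape[-1])(ff)
    return layers.LayerNormalization(epsilon=1e-6)(x + layers.Dropout(dropout)(ff))

def build_model(tcr_len=20, epi_len=22, emb_dim=1024):
    # Inputs are pre-computed per-residue embeddings (see the datasets above),
    # not raw amino-acid tokens; emb_dim=1024 is an assumption.
    tcr_in = layers.Input(shape=(tcr_len, emb_dim), name="tcr")
    epi_in = layers.Input(shape=(epi_len, emb_dim), name="epitope")

    # Encode each sequence independently, then pool to fixed-size vectors
    tcr_vec = layers.GlobalAveragePooling1D()(transformer_block(tcr_in))
    epi_vec = layers.GlobalAveragePooling1D()(transformer_block(epi_in))

    # Concatenate and predict a binding probability with a small MLP head
    h = layers.Concatenate()([tcr_vec, epi_vec])
    h = layers.Dense(512, activation="relu")(h)
    h = layers.Dropout(0.3)(h)
    out = layers.Dense(1, activation="sigmoid", name="binding")(h)

    model = Model(inputs=[tcr_in, epi_in], outputs=out)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["AUC"])
    return model

model = build_model()
model.summary()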
If you use this code or the PiTE model in your research, please cite our paper:
@inproceedings{zhang2022pite,
  title={PiTE: TCR-epitope Binding Affinity Prediction Pipeline using Transformer-based Sequence Encoder},
  author={Zhang, Pengfei and Bang, Seojin and Lee, Heewook},
  booktitle={Pacific Symposium on Biocomputing 2023: Kohala Coast, Hawaii, USA, 3--7 January 2023},
  pages={347--358},
  year={2022},
  organization={World Scientific}
}
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.