A repository illustrating usage of Transformers in Chinese
Updated Aug 18, 2024 - Shell
Code for the paper "Are Sixteen Heads Really Better than One?"
🇮🇹 Italian BERT and ELECTRA models (incl. evaluation)
Reference PyTorch code for Hugging Face Transformers
Attentively Embracing Noise for Robust Latent Representation in BERT (COLING 2020)
Experiment scripts for BERT and ESM-1B models with a k-mer tokenizer