- Read the competition description and related material until you are comfortable with the problem, and form an initial hypothesis about it.
- Do an initial exploration of the data to get familiar with it and with how it relates to the problem.
- Build the first implementation: a simple baseline (see the sketch after this list).
- Iterate through the loop [Analyze -> Approach (model) -> Implement -> Evaluate].
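A minimal sketch of what that first baseline might look like, assuming a generic text-classification setup; the `train.csv` path and the `text`/`target` column names are hypothetical placeholders for the real competition schema:

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical file and column names; replace with the competition's schema.
train = pd.read_csv("train.csv")

# Fast, simple pipeline: TF-IDF features + a linear model.
baseline = make_pipeline(
    TfidfVectorizer(max_features=20_000, ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)

# The cross-validation score is the feedback signal for the next
# Analyze -> Approach -> Implement -> Evaluate iteration.
scores = cross_val_score(baseline, train["text"], train["target"], cv=5)
print(f"baseline CV score: {scores.mean():.4f} +/- {scores.std():.4f}")
```

The point of the baseline is not to be strong but to be cheap: it establishes an end-to-end pipeline and a score that every subsequent iteration can be measured against.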
- Encoder-decoders in Transformers: a hybrid pre-trained architecture for seq2seq
- Universal Sentence Encoder - TF Hub
- Semantic Similarity with TF-Hub Universal Encoder - Colab
- Multilingual Universal Sentence Encoder for Semantic Retrieval
- Universal Sentence Encoder - Paper
- BERT - Input
- HuggingFace Transformers
- Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT
- The Illustrated Transformer
- The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)
- Hugging Face: State-of-the-Art Natural Language Processing in ten lines of TensorFlow 2.0
- Simple BERT using TensorFlow 2.0
- TF Hub - bert_en_uncased_L-12_H-768_A-12
- GloVe: Global Vectors for Word Representation
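To make the Universal Sentence Encoder links above concrete, here is a small sketch of the semantic-similarity pattern the TF Hub tutorial and Colab describe; the model handle is the published one, and the example sentences are illustrative only:

```python
import numpy as np
import tensorflow_hub as hub

# Published TF Hub handle for the Universal Sentence Encoder (v4).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

sentences = [
    "How can I improve my baseline model?",
    "What is a good way to make my baseline better?",
    "The weather is nice today.",
]
embeddings = embed(sentences).numpy()

# USE embeddings are approximately unit-normalized, so the inner
# product between pairs behaves like a cosine similarity.
similarity = np.inner(embeddings, embeddings)
print(np.round(similarity, 2))
```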
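Similarly, a hedged sketch of the HuggingFace Transformers workflow covered by the BERT and DistilBERT links, using the TensorFlow model classes to match the TF 2.0 resources above (assumes a recent `transformers` version where outputs are returned as named objects):

```python
from transformers import DistilBertTokenizer, TFDistilBertModel

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = TFDistilBertModel.from_pretrained("distilbert-base-uncased")

# BERT-style input: token ids plus an attention mask.
inputs = tokenizer("Kaggle competitions are fun.", return_tensors="tf")
outputs = model(inputs)

# last_hidden_state: (batch, sequence_length, hidden_size=768);
# position 0 is the [CLS] token, a common pooled sentence feature.
print(outputs.last_hidden_state.shape)
```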