Visualization for simple attention and Google's multi-head attention.
Sentence encoder and training code for Mean-Max AAE
Text matching using several deep models.
TensorFlow implementation of AlexNet with multi-headed Attention mechanism
Collection of different types of transformers for learning purposes
Attention-based Induction Networks for Few-Shot Text Classification
Code and datasets for the paper "A deep learning framework for high-throughput mechanism-driven phenotype compound screening and its application to COVID-19 drug repurposing", published in Nature Machine Intelligence in 2021.
EMNLP 2018: Multi-Head Attention with Disagreement Regularization; NAACL 2019: Information Aggregation for Multi-Head Attention with Routing-by-Agreement
PyTorch implementation of Transformers
"Attention, Learn to Solve Routing Problems!"[Kool+, 2019], Capacitated Vehicle Routing Problem solver
Code for the runner-up entry on the English subtask of the Shared Task on Fighting the COVID-19 Infodemic, NLP4IF workshop, NAACL'21.
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras (a minimal additive-attention sketch appears after this list).
Attention is all you need: Discovering the Transformer model
Deep Xi: A deep learning approach to a priori SNR estimation implemented in TensorFlow 2/Keras. For speech enhancement and robust ASR.
A Transformer classifier implemented from scratch.
PyTorch implementation of several attention mechanisms for deep learning researchers.
This is the official repository of the original Point Transformer architecture.
Image captioning with an EfficientNet encoder and a Transformer decoder, combined with the attention mechanism.
A faster PyTorch implementation of multi-head self-attention (a minimal sketch follows below).
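As a point of reference for the multi-head entries above, here is a minimal multi-head self-attention sketch in PyTorch. It illustrates the general technique only and is not code from any repository listed here; the class name, the fused QKV projection, and all dimension choices are assumptions.

```python
# Illustrative multi-head self-attention sketch (assumed names and shapes,
# not taken from any repository on this page).
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must divide evenly by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # A single fused projection for queries, keys, and values is a common speed-up.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.out = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        b, t, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into heads: (batch, num_heads, seq_len, head_dim)
        q = q.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        k = k.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        v = v.view(b, t, self.num_heads, self.head_dim).transpose(1, 2)
        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / self.head_dim ** 0.5
        weights = scores.softmax(dim=-1)
        ctx = weights @ v
        # Concatenate heads and project back to embed_dim.
        ctx = ctx.transpose(1, 2).reshape(b, t, d)
        return self.out(ctx)
```

Usage: `MultiHeadSelfAttention(64, 8)(torch.randn(2, 10, 64))` returns a tensor of the same shape as its input.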
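For the additive (Bahdanau-style) mechanism mentioned above, a sketch under the same caveats: the class and layer names are illustrative assumptions, and the keys double as values for brevity.

```python
# Illustrative additive (Bahdanau-style) attention sketch (assumed names,
# not taken from any repository on this page).
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.w_q = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_k = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        # Score each key against the query through a tanh feed-forward layer.
        scores = self.v(torch.tanh(self.w_q(query).unsqueeze(1) + self.w_k(keys)))
        weights = scores.softmax(dim=1)        # attention weights over seq_len
        context = (weights * keys).sum(dim=1)  # weighted sum; keys reused as values
        return context, weights.squeeze(-1)
```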