Several types of attention modules implemented in PyTorch.
Topics: transformers, pytorch, transformer, attention, attention-mechanism, softmax-layer, multi-head-attention, multi-query-attention, grouped-query-attention, scale-dot-product-attention
Updated May 13, 2024 - Python
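The topics above cover the standard attention variants. As a minimal sketch (not this repository's actual code), scaled dot-product attention can be written in a few lines of PyTorch, and grouped-query attention (with multi-query attention as the single-KV-head special case) reduces to it by repeating the key/value heads; all function names here are illustrative:

```python
import torch
import torch.nn.functional as F


def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(q @ k^T / sqrt(d_k)) @ v over the last two dims.

    q, k, v: (batch, heads, seq_len, d_k)
    """
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k**0.5
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v


def grouped_query_attention(q, k, v):
    """Grouped-query attention: fewer KV heads than query heads.

    q: (batch, n_q_heads, seq_len, d_k)
    k, v: (batch, n_kv_heads, seq_len, d_k), n_q_heads % n_kv_heads == 0.
    n_kv_heads == 1 recovers multi-query attention.
    """
    group_size = q.size(1) // k.size(1)
    k = k.repeat_interleave(group_size, dim=1)
    v = v.repeat_interleave(group_size, dim=1)
    return scaled_dot_product_attention(q, k, v)
```

The hand-rolled version matches `F.scaled_dot_product_attention` (PyTorch 2.0+), which fuses the same computation into one optimized kernel.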