Attention temporal convolutional network for EEG-based motor imagery classification
A faster PyTorch implementation of multi-head self-attention
BabyGPT: build your own GPT large language model from scratch; a step-by-step guide to pre-training generative transformer models in PyTorch and Python
Transformer/Transformer-XL/R-Transformer examples and explanations
Transformer built from scratch using JAX
PyTorch implementation of transformers with multi-headed self-attention
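The repositories above all center on the same core building block. For orientation, here is a minimal sketch of multi-head self-attention in PyTorch, assuming standard scaled dot-product attention; the class and parameter names are illustrative and not taken from any of the projects listed.

import math
import torch
import torch.nn as nn


class MultiHeadSelfAttention(nn.Module):
    """Scaled dot-product self-attention split across several heads."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = embed_dim // num_heads
        # One linear layer produces queries, keys, and values in a single pass.
        self.qkv_proj = nn.Linear(embed_dim, 3 * embed_dim)
        self.out_proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        batch, seq_len, embed_dim = x.shape
        qkv = self.qkv_proj(x).view(batch, seq_len, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (batch, heads, seq_len, head_dim)

        # Scaled dot-product attention, computed independently per head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        weights = scores.softmax(dim=-1)      # (batch, heads, seq_len, seq_len)
        context = weights @ v                 # (batch, heads, seq_len, head_dim)

        # Re-assemble the heads and project back to the model dimension.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.out_proj(context)


if __name__ == "__main__":
    x = torch.randn(2, 16, 64)                           # (batch, seq_len, embed_dim)
    attn = MultiHeadSelfAttention(embed_dim=64, num_heads=8)
    print(attn(x).shape)                                 # torch.Size([2, 16, 64])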