attention-mechanism


This repository provides a simple implementation of an attention mechanism, both with and without PyTorch. The primary goal is to understand the forward pass of this architecture.

Architecture Details

  • Embedding Layer: Initialized randomly using a normal distribution.
  • Model Dimension: dim_model = 64
  • Sequence Length: seq_length = 10
  • Vocabulary Size: vocab_size = 100
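
The "without PyTorch" forward pass can be sketched in plain NumPy using the hyperparameters above. This is a minimal illustration, not the repository's exact code: the projection matrices `W_q`, `W_k`, `W_v` and the function names are assumptions.

```python
# Minimal scaled dot-product attention forward pass in NumPy.
# Hyperparameters follow the repository's stated configuration;
# the projection matrices and names are illustrative, not the repo's API.
import numpy as np

rng = np.random.default_rng(0)

vocab_size, seq_length, dim_model = 100, 10, 64

# Embedding layer initialized randomly from a normal distribution.
embedding = rng.normal(size=(vocab_size, dim_model))

# Query / key / value projections (assumed: simple random matrices).
W_q = rng.normal(size=(dim_model, dim_model))
W_k = rng.normal(size=(dim_model, dim_model))
W_v = rng.normal(size=(dim_model, dim_model))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_forward(token_ids):
    x = embedding[token_ids]                   # (seq_length, dim_model)
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    scores = q @ k.T / np.sqrt(dim_model)      # (seq_length, seq_length)
    weights = softmax(scores)                  # each row sums to 1
    return weights @ v                         # (seq_length, dim_model)

tokens = rng.integers(0, vocab_size, size=seq_length)
out = attention_forward(tokens)
print(out.shape)  # (10, 64)
```

The `1 / sqrt(dim_model)` scaling keeps the dot-product scores from growing with the model dimension, which would otherwise push the softmax into saturated, near-one-hot regions.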

Requirements

  • Python 3.9
  • PyTorch (optional)
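
When the optional PyTorch dependency is installed, the same forward pass can be written with `torch` tensors. Again, this is a hedged sketch under the stated hyperparameters, and the layer names are assumptions rather than the repository's actual API.

```python
# The same attention forward pass with PyTorch (optional dependency).
# Layer and variable names are illustrative, not the repository's code.
import torch

torch.manual_seed(0)
vocab_size, seq_length, dim_model = 100, 10, 64

# nn.Embedding weights are initialized from a normal distribution by default.
embedding = torch.nn.Embedding(vocab_size, dim_model)
W_q = torch.nn.Linear(dim_model, dim_model, bias=False)
W_k = torch.nn.Linear(dim_model, dim_model, bias=False)
W_v = torch.nn.Linear(dim_model, dim_model, bias=False)

def attention_forward(token_ids):
    x = embedding(token_ids)                            # (seq_length, dim_model)
    q, k, v = W_q(x), W_k(x), W_v(x)
    scores = q @ k.transpose(-2, -1) / dim_model ** 0.5 # (seq_length, seq_length)
    weights = torch.softmax(scores, dim=-1)
    return weights @ v                                  # (seq_length, dim_model)

tokens = torch.randint(0, vocab_size, (seq_length,))
out = attention_forward(tokens)
print(tuple(out.shape))  # (10, 64)
```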

License

This project is licensed under the MIT License.
