
Pinned

  1. flash-linear-attention (Public)

     🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton

     Python · 2.2k stars · 144 forks

  2. flame (Public)

     🔥 A minimal training framework for scaling FLA models

     Python · 94 stars · 15 forks

  3. native-sparse-attention (Public)

     🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"

     Python · 608 stars · 30 forks
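The linear attention models these repositories implement replace the softmax attention matrix with a kernel feature map, so causal attention can be computed with a constant-size recurrent state instead of a T×T matrix. A minimal NumPy sketch of the underlying recurrence (illustrative only — not the fla library's Torch/Triton API; the feature map `elu(x) + 1` is one common choice, an assumption here):

```python
import numpy as np

def causal_linear_attention(q, k, v, eps=1e-6):
    """O(T * d^2) recurrent form of causal linear attention.

    Instead of softmax(Q K^T) V, applies a positive feature map phi
    (here elu(x) + 1) to q and k, then maintains a running
    outer-product state S = sum_i phi(k_i) v_i^T and a running
    normalizer z = sum_i phi(k_i), updated once per timestep.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, always > 0
    q, k = phi(q), phi(k)
    T, d = q.shape
    S = np.zeros((d, v.shape[1]))  # running key/value summary
    z = np.zeros(d)                # running normalizer
    out = np.empty((T, v.shape[1]))
    for t in range(T):
        S += np.outer(k[t], v[t])
        z += k[t]
        out[t] = (q[t] @ S) / (q[t] @ z + eps)
    return out
```

This recurrence is exactly equivalent to the quadratic masked form `(tril(phi(Q) phi(K)^T) V) / rowsum`, which is how the chunked Triton kernels in the repo can trade between the two formulations.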

Repositories

Showing 8 of 8 repositories
  • fla-zoo (Public)

    Flash-Linear-Attention models beyond language

    Python · 9 stars · 1 fork · 0 open issues · 0 open pull requests · Updated Apr 4, 2025
  • flash-linear-attention (Public)

    🚀 Efficient implementations of state-of-the-art linear attention models in Torch and Triton

    Python · 2,207 stars · MIT license · 144 forks · 29 open issues · 6 open pull requests · Updated Apr 4, 2025
  • flame (Public)

    🔥 A minimal training framework for scaling FLA models

    Python · 94 stars · MIT license · 15 forks · 0 open issues · 1 open pull request · Updated Apr 1, 2025
  • fla-rl (Public)

    A minimal RL framework for scaling FLA models on long-horizon reasoning and agentic scenarios.

    4 stars · MIT license · 0 forks · 0 open issues · 0 open pull requests · Updated Apr 1, 2025
  • ThunderKittens (Public, forked from HazyResearch/ThunderKittens)

    Tile primitives for speedy kernels

    Cuda · 2 stars · MIT license · 134 forks · 0 open issues · 0 open pull requests · Updated Mar 27, 2025
  • native-sparse-attention (Public)

    🐳 Efficient Triton implementations for "Native Sparse Attention: Hardware-Aligned and Natively Trainable Sparse Attention"

    Python · 608 stars · MIT license · 30 forks · 9 open issues · 0 open pull requests · Updated Mar 19, 2025
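The core idea behind hardware-aligned sparse attention is to restrict each query to a small number of contiguous key/value blocks, so the kept entries form dense tiles that map well onto GPU kernels. A toy NumPy sketch of top-k block selection (illustrative only — this is a generic block-sparse scheme, not the NSA paper's full method or the repo's Triton kernels; the block-summary-by-mean scoring is an assumption):

```python
import numpy as np

def topk_block_sparse_attention(q, k, v, block=4, topk=2):
    """Toy top-k block-sparse attention (assumes T divisible by block).

    Keys are grouped into contiguous blocks; each query scores a
    per-block summary (the mean key), keeps only the `topk` highest
    scoring blocks, and masks everything else before the softmax.
    """
    T, d = q.shape
    nb = T // block
    k_blocks = k.reshape(nb, block, d).mean(axis=1)       # block summaries
    keep = np.argsort(q @ k_blocks.T, axis=-1)[:, -topk:]  # top-k blocks per query
    scores = q @ k.T / np.sqrt(d)
    mask = np.full((T, T), -np.inf)
    for t in range(T):
        for b in keep[t]:                                  # unmask selected blocks
            mask[t, b * block:(b + 1) * block] = 0.0
    scores = scores + mask
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v
```

With `topk` equal to the number of blocks this reduces to dense softmax attention, which makes the sparsification easy to sanity-check.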
  • flash-hybrid-attention (Public)

    7 stars · 0 forks · 0 open issues · 0 open pull requests · Updated Mar 5, 2025
  • flash-bidirectional-linear-attention (Public)

    Triton implementation of bidirectional (non-causal) linear attention

    Python · 44 stars · MIT license · 1 fork · 0 open issues · 0 open pull requests · Updated Feb 4, 2025
