# flash-attn
Here are 5 public repositories matching this topic.
Compile environment for PyTorch with CUDA.
Topics: `python` `cloud` `compiler` `docker-image` `cuda` `python3` `pytorch` `jupyterlab` `cuda-toolkit` `code-server` `flash-attn` `sage-attention`
Updated Aug 31, 2025 · Python
A list of uv environment templates for LLM development.
Updated Sep 19, 2025
🌐 Streamline LLM development with ready-to-use environment templates for efficient setup and deployment.
Updated Sep 8, 2025