1. Adds a CI test for 1D compile + selective op AC, which previously failed silently.
2. Enables the flag `torch._dynamo.config.inline_inbuilt_nn_modules` to speed up compilation (for Llama 3 8B on 8 H100s, compile time drops from 9+ seconds to 6+ seconds), per anijain2305's suggestion.
3. Per-TransformerBlock compile now appears to work without `dynamic=False` and `fullgraph=True`. Reflecting this progress in the test helps catch regressions, per bdhirsh's suggestion.
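A minimal sketch of points 2 and 3 above. This is illustrative only: `Block` is a hypothetical stand-in for torchtitan's `TransformerBlock`, the dimensions and block count are made up, and `backend="eager"` is used so the snippet runs on CPU without a compiler toolchain.

```python
import torch
import torch._dynamo
import torch.nn as nn

# Point 2: enable inlining of built-in nn.Module calls in dynamo,
# which the PR reports reduces compile time. Guarded with hasattr
# since the flag only exists in recent PyTorch builds.
if hasattr(torch._dynamo.config, "inline_inbuilt_nn_modules"):
    torch._dynamo.config.inline_inbuilt_nn_modules = True


class Block(nn.Module):
    """Hypothetical stand-in for torchtitan's TransformerBlock."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(x))


blocks = nn.ModuleList(Block(16) for _ in range(2))

# Point 3: compile each block individually (per-TransformerBlock compile),
# here with fullgraph=True and without forcing dynamic=False.
for i, block in enumerate(blocks):
    blocks[i] = torch.compile(block, backend="eager", fullgraph=True)

x = torch.randn(4, 16)
for block in blocks:
    x = block(x)
```

torch.compile on an nn.Module returns a wrapped module, so the compiled blocks drop back into the ModuleList and are called like the originals.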
[ghstack-poisoned]
README.md: +9 −1
```diff
@@ -18,6 +18,14 @@ Our guiding principles when building `torchtitan`:

 [](https://youtu.be/ee5DOEqD35I?si=_B94PbVv0V5ZnNKE "Welcome to torchtitan!")

+### Dive into the code
+
+You may want to see how the model is defined or how parallelism techniques are applied. For a guided tour, see these files first:
+* [train.py](https://github.com/pytorch/torchtitan/blob/main/train.py) - the main training loop and high-level setup code
+* [torchtitan/parallelisms/parallelize_llama.py](https://github.com/pytorch/torchtitan/blob/main/torchtitan/parallelisms/parallelize_llama.py) - helpers for applying Data / Tensor / Pipeline Parallelisms to the model
+* [torchtitan/checkpoint.py](https://github.com/pytorch/torchtitan/blob/main/torchtitan/checkpoint.py) - utils for saving/loading distributed checkpoints
+* [torchtitan/models/llama/model.py](https://github.com/pytorch/torchtitan/blob/main/torchtitan/models/llama/model.py) - the Llama model definition (shared for Llama2 and Llama3 variants)
+
 ## Pre-Release Updates:
 #### (4/25/2024): `torchtitan` is now public but in a pre-release state and under development.
 Currently we showcase pre-training **Llama 3 and Llama 2** LLMs of various sizes from scratch. `torchtitan` is tested and verified with the PyTorch nightly version `torch-2.4.0.dev20240412`. (We recommend latest PyTorch nightly).
@@ -66,7 +74,7 @@ Once you have confirmed access, you can run the following command to download the
 ```bash
 # Get your HF token from https://huggingface.co/settings/tokens
```