Update on "[BE] remove old PyTorch version warning on strided sharding since 2.5 is officially released"


#507 added a PyTorch version check when users try to use FSDP+TP, to make sure the PyTorch version in use includes DTensor strided sharding, which ensures correct DTensor checkpoints. Since PyTorch 2.5 is officially released and strided sharding is included in 2.5, we can safely remove this warning.
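For context, the removed warning gated FSDP+TP on a minimum PyTorch version. A minimal sketch of such a version gate follows; the function name and version parsing here are illustrative assumptions, not torchtitan's actual implementation:

```python
def strided_sharding_available(torch_version: str) -> bool:
    """Return True if the given PyTorch version string is >= 2.5.

    Hypothetical helper sketching the kind of check being removed:
    DTensor strided sharding (needed for correct FSDP+TP checkpoints)
    shipped in PyTorch 2.5, so older versions would trigger a warning.
    """
    # Take only the leading "major.minor" components; suffixes such as
    # "+cu121" or ".dev20241030" are ignored by the [:2] slice.
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    return (major, minor) >= (2, 5)


def maybe_warn_strided_sharding(torch_version: str) -> None:
    # Sketch of the warning path that is no longer needed once the
    # minimum supported version includes strided sharding.
    if not strided_sharding_available(torch_version):
        print(
            "WARNING: PyTorch < 2.5 lacks DTensor strided sharding; "
            "FSDP+TP DTensor checkpoints may be incorrect."
        )
```

With 2.5 as the baseline, the gate always passes, which is why the check can simply be deleted.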



[ghstack-poisoned]
XilunWu committed Oct 30, 2024
1 parent ab922b6 commit fd1cacc
Showing 1 changed file with 0 additions and 1 deletion.
1 change: 0 additions & 1 deletion torchtitan/parallelisms/parallelize_llama.py
@@ -34,7 +34,6 @@
 from torchtitan.config_manager import JobConfig, TORCH_DTYPE_MAP
 from torchtitan.logging import logger
 from torchtitan.parallelisms.parallel_dims import ParallelDims
-from torchtitan.parallelisms.utils import check_strided_sharding_enabled


def parallelize_llama(
