Don't import torch.distributed when it's not available
This is a continuation of 217c47e, but for another module. The issue
was spotted in nixpkgs (again) when building the lm-eval package,
which used a different path in the transformers library to reach the
same failure.

Related: #35133
booxter committed Jan 19, 2025
1 parent 5fa3534 commit db864b5
Showing 1 changed file with 5 additions and 0 deletions.
src/transformers/integrations/fsdp.py (+5 −0)
@@ -26,6 +26,11 @@ def is_fsdp_managed_module(module: nn.Module) -> bool:
     if not is_torch_available():
         return False
 
+    import torch
+
+    if not torch.distributed.is_available():
+        return False
+
     import torch.distributed.fsdp
 
     return isinstance(module, torch.distributed.fsdp.FullyShardedDataParallel) or getattr(
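For context, a minimal sketch of the guarded-import pattern this commit applies. The helper name is hypothetical and not part of transformers; only torch.distributed.is_available() and the torch.distributed.fsdp import come from the diff above.

# Hypothetical helper illustrating the guard added in this commit.
def distributed_fsdp_available() -> bool:
    try:
        import torch
    except ImportError:
        # torch is not installed at all
        return False
    if not torch.distributed.is_available():
        # torch was built without distributed support (e.g. some minimal
        # or nixpkgs builds), so torch.distributed.fsdp cannot be imported
        return False
    import torch.distributed.fsdp  # noqa: F401
    return True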
