Don't import torch.distributed when it's not available (huggingface#35777)

This is a continuation of 217c47e, but for another module. The issue was
spotted in nixpkgs (again) when building the lm-eval package, which used a
different path through the transformers library to reach the same failure.

Related: huggingface#35133
booxter authored and bursteratom committed Jan 28, 2025
1 parent c3bb29f commit 7c5c486
Showing 1 changed file with 5 additions and 0 deletions.
5 changes: 5 additions & 0 deletions src/transformers/integrations/fsdp.py
@@ -26,6 +26,11 @@ def is_fsdp_managed_module(module: nn.Module) -> bool:
     if not is_torch_available():
         return False
 
+    import torch
+
+    if not torch.distributed.is_available():
+        return False
+
     import torch.distributed.fsdp
 
     return isinstance(module, torch.distributed.fsdp.FullyShardedDataParallel) or getattr(
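
For context on the failure mode: torch.distributed is an optional component of PyTorch, and on builds compiled without it (as in some nixpkgs configurations) importing torch.distributed.fsdp fails at import time, while torch.distributed.is_available() can still be called safely. A minimal sketch of the guard pattern this commit applies; the helper name below is hypothetical and chosen only for illustration:

import torch


def fsdp_import_is_safe() -> bool:
    # Hypothetical helper, not part of transformers: returns True only when
    # torch.distributed.fsdp can be imported without raising.
    if not torch.distributed.is_available():
        # PyTorch was built without distributed support; importing
        # torch.distributed.fsdp here would fail.
        return False

    import torch.distributed.fsdp  # noqa: F401  # safe only after the check above

    return True

With the same check in place, is_fsdp_managed_module simply returns False on distributed-less builds instead of raising an ImportError.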
