
Commit 8cded57

Fix torch include in op_builder/mlu/fused_adam.py and update no-torch workflow triggers (#6584)

Changes from #6472 broke the no-torch workflow, which serves as an example of how we build the DeepSpeed release package (so we caught this before a release; see #6402 for more context). This change also copies the style used to import torch in other accelerator op_builder implementations, such as npu
[here](https://github.com/microsoft/DeepSpeed/blob/master/op_builder/npu/fused_adam.py#L8)
and hpu
[here](https://github.com/microsoft/DeepSpeed/blob/828ddfbbda2482412fffc89f5fcd3b0d0eba9a62/op_builder/hpu/fused_adam.py#L15).
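
For reference, the shared pattern is a guarded import: the op_builder module must import cleanly even when torch is absent (as in the release-build environment), so the ImportError is tolerated at import time. A minimal sketch of that pattern, where the `torch = None` fallback is an illustrative assumption rather than part of this commit:

```python
# Guarded torch import, in the style of the npu and hpu builders: the
# op_builder module must be importable without torch installed (e.g. when
# building the release package), so a missing torch is tolerated here
# instead of raising at import time.
try:
    import torch
except ImportError:
    torch = None  # illustrative fallback; the actual builders simply pass
```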

This also updates the no-torch workflow to run on all changes to the
op_builder directory. The test runs quickly and shouldn't add any
additional testing burden there.

Resolves: #6576
loadams authored Sep 27, 2024
1 parent 828ddfb commit 8cded57
Showing 2 changed files with 6 additions and 1 deletion.
1 change: 1 addition & 0 deletions .github/workflows/no-torch.yml
@@ -5,6 +5,7 @@ on:
   pull_request:
     paths:
       - '.github/workflows/no-torch.yml'
+      - 'op_builder/**'
   schedule:
     - cron: "0 0 * * *"

6 changes: 5 additions & 1 deletion op_builder/mlu/fused_adam.py
@@ -5,7 +5,11 @@
 # DeepSpeed Team

 from .builder import MLUOpBuilder
-import torch
+
+try:
+    import torch
+except ImportError as e:
+    pass


 class MLUFusedAdam:
