[Feature] Implement DeMo optimizer #301

Merged: 12 commits, Dec 2, 2024
update: no cover
kozistr committed Dec 2, 2024
commit 0b2ad892a9c8d5300b9a7c55f0c151bf4ff83eaa
pytorch_optimizer/optimizer/demo.py (1 addition, 1 deletion)

```diff
@@ -280,7 +280,7 @@ def get_smaller_split(n: int, close_to: int) -> int:
     return n


-class DeMo(torch.optim.SGD, BaseOptimizer):
+class DeMo(torch.optim.SGD, BaseOptimizer):  # pragma: no cover
     r"""Decoupled Momentum Optimization.

     :param params: PARAMETERS. iterable of parameters to optimize or dicts defining parameter groups.
```
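
For context, `# pragma: no cover` is the standard exclusion marker recognized by coverage.py. When the pragma sits on a clause line such as a `class` or `def` statement, coverage.py excludes the entire suite beneath it, so this one-line change removes the whole DeMo class from coverage reporting. A plausible motivation, given that DeMo involves distributed training, is that the class cannot be exercised in a single-process CI run. A minimal sketch of the mechanism, using a hypothetical function unrelated to this PR:

```python
# Sketch of coverage.py's exclusion pragma. `distributed_only_path` is a
# hypothetical example, not part of pytorch_optimizer.


def distributed_only_path():  # pragma: no cover
    # Because the pragma is on the `def` line, coverage.py excludes this
    # entire body from the coverage report, even if no test ever calls it.
    raise RuntimeError('requires an initialized torch.distributed process group')


def plain_helper(x: int) -> int:
    return x + 1  # measured as usual
```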