Missing get_last_lr() in Custom PolyLR Scheduler #2693

Open
SimoneBendazzoli93 opened this issue Feb 7, 2025 · 0 comments · May be fixed by #2694
SimoneBendazzoli93 commented Feb 7, 2025

The custom PolyLRScheduler implementation is missing the get_last_lr() method, which returns the most recently computed learning rate for each parameter group. This method is provided by all of PyTorch's built-in schedulers and is widely used for logging, debugging, and maintaining API consistency. Adding it would improve usability and integration with standard PyTorch workflows.
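
For context, this is the contract that PyTorch's built-in schedulers provide and that downstream handlers rely on (a minimal sketch, using torch.optim.lr_scheduler.StepLR as a stand-in for any built-in scheduler; the model and hyperparameters are illustrative):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Every built-in scheduler exposes get_last_lr() after each step():
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10)
optimizer.step()
scheduler.step()
print(scheduler.get_last_lr())  # [0.01]

# The same call fails on the custom PolyLRScheduler, because its step()
# never sets self._last_lr (see the 🚀 comments in the patch below),
# which breaks any handler that logs the learning rate this way.
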
One practical example is integrating nnUNet training into the MONAI Bundle standard (i.e. using PolyLRScheduler in the LearningRateSchedulerHandler): currently, to do so, the PolyLRScheduler has to be modified as follows:

from typing import Optional

from torch.optim.lr_scheduler import _LRScheduler


class PolyLRScheduler(_LRScheduler):
    def __init__(self, optimizer, initial_lr: float, max_steps: int, exponent: float = 0.9,
                 current_step: Optional[int] = None):
        self.optimizer = optimizer
        self.initial_lr = initial_lr
        self.max_steps = max_steps
        self.exponent = exponent
        self.ctr = 0
        super().__init__(optimizer, current_step if current_step is not None else -1, False)

    def step(self, current_step=None):
        if current_step is None or current_step == -1:
            current_step = self.ctr
            self.ctr += 1

        # Polynomial decay: lr = initial_lr * (1 - step / max_steps) ** exponent
        new_lr = self.initial_lr * (1 - current_step / self.max_steps) ** self.exponent
        for param_group in self.optimizer.param_groups:
            param_group['lr'] = new_lr

        self._last_lr = [group['lr'] for group in self.optimizer.param_groups]  # 🚀 Missing definition

    def get_last_lr(self):  # 🚀 Missing method that should be included
        return self._last_lr
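
With get_last_lr() in place, the scheduler can be dropped into standard PyTorch logging code (a minimal sketch; the model, optimizer, and hyperparameters are illustrative):

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
scheduler = PolyLRScheduler(optimizer, initial_lr=0.01, max_steps=1000)

for _ in range(3):
    optimizer.step()
    scheduler.step()

# Now works like any built-in scheduler: one entry per parameter group.
print(scheduler.get_last_lr())  # ≈ [0.009973]
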