Remove default optimizer, add None optimizer option #1279
Merged
williamFalcon merged 7 commits into Lightning-AI:master from ethanwharris:optimizer_warnings on Apr 2, 2020
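What this changes in practice (a minimal sketch, not taken from this diff; the module below is hypothetical): configure_optimizers may now return None, in which case training emits a UserWarning and runs with the internal _MockOptimizer instead of silently falling back to a default optimizer.

import torch
from pytorch_lightning.core.lightning import LightningModule


class NoOptimizerModel(LightningModule):
    # Hypothetical module used only to illustrate the new behaviour.
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(4, 1)

    def forward(self, x):
        return self.layer(x)

    def configure_optimizers(self):
        # Returning None is now allowed: the trainer warns and substitutes
        # _MockOptimizer (see pytorch_lightning/trainer/optimizers.py below).
        return None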
7 commits:
832f4e1 Add warning when using default optimizer (ethanwharris)
9547557 Refactor optimizer tests to test_optimizers (ethanwharris)
ceff1d9 Remove default optimizer, add option to use no optimizer (ethanwharris)
962cf8e Update CHANGELOG.md (ethanwharris)
993c648 Update pytorch_lightning/trainer/optimizers.py (ethanwharris)
c07adbd Merge branch 'master' into optimizer_warnings (ethanwharris)
d188cd6 Fix style (ethanwharris)
pytorch_lightning/trainer/optimizers.py (new file)
@@ -0,0 +1,102 @@
import warnings
from abc import ABC
from typing import List, Tuple

import torch
from torch import optim
from torch.optim.optimizer import Optimizer

from pytorch_lightning.core.lightning import LightningModule


class TrainerOptimizersMixin(ABC):

    def init_optimizers(
            self,
            model: LightningModule
    ) -> Tuple[List, List]:
        optimizers = model.configure_optimizers()

        if optimizers is None:
            warnings.warn('`LightningModule.configure_optimizers` is not overridden or returned `None`, '
                          'this fit will run with no optimizer', UserWarning)
            optimizers = _MockOptimizer()

        # single output, single optimizer
        if isinstance(optimizers, Optimizer):
            return [optimizers], []

        # two lists, optimizer + lr schedulers
        elif len(optimizers) == 2 and isinstance(optimizers[0], list):
            optimizers, lr_schedulers = optimizers
            lr_schedulers = self.configure_schedulers(lr_schedulers)
            return optimizers, lr_schedulers

        # single list or tuple, multiple optimizers
        elif isinstance(optimizers, (list, tuple)):
            return optimizers, []

        # unknown configuration
        else:
            raise ValueError('Unknown configuration for model optimizers. Output '
                             'from model.configure_optimizers() should either be: '
                             '* single output, single torch.optim.Optimizer '
                             '* single output, list of torch.optim.Optimizer '
                             '* two outputs, first being a list of torch.optim.Optimizer, '
                             'second being a list of torch.optim.lr_scheduler')

    def configure_schedulers(self, schedulers: list):
        # Convert each scheduler into a dict structure with relevant information
        lr_schedulers = []
        default_config = {'interval': 'epoch',  # default every epoch
                          'frequency': 1,  # default every epoch/batch
                          'reduce_on_plateau': False,  # most often not ReduceLROnPlateau scheduler
                          'monitor': 'val_loss'}  # default value to monitor for ReduceLROnPlateau
        for scheduler in schedulers:
            if isinstance(scheduler, dict):
                if 'scheduler' not in scheduler:
                    raise ValueError('Lr scheduler should have key `scheduler` '
                                     'with item being a lr scheduler')
                scheduler['reduce_on_plateau'] = isinstance(
                    scheduler['scheduler'], optim.lr_scheduler.ReduceLROnPlateau)

                lr_schedulers.append({**default_config, **scheduler})

            elif isinstance(scheduler, optim.lr_scheduler.ReduceLROnPlateau):
                lr_schedulers.append({**default_config, 'scheduler': scheduler,
                                      'reduce_on_plateau': True})

            elif isinstance(scheduler, optim.lr_scheduler._LRScheduler):
                lr_schedulers.append({**default_config, 'scheduler': scheduler})
            else:
                raise ValueError(f'Input {scheduler} to lr schedulers '
                                 'is an invalid input.')
        return lr_schedulers


class _MockOptimizer(Optimizer):
    """The `_MockOptimizer` will be used in place of an optimizer in the event that `None`
    is returned from `configure_optimizers`.
    """

    def __init__(self):
        super().__init__([torch.zeros(1)], {})

    def add_param_group(self, param_group):
        pass  # Do Nothing

    def load_state_dict(self, state_dict):
        pass  # Do Nothing

    def state_dict(self):
        return {}  # Return Empty

    def step(self, closure=None):
        if closure is not None:
            closure()

    def zero_grad(self):
        pass  # Do Nothing

    def __repr__(self):
        return 'No Optimizer'
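For illustration, a hedged sketch (not part of this diff) of the scheduler configurations that configure_schedulers above normalizes: a bare _LRScheduler, a ReduceLROnPlateau instance, or a dict with a 'scheduler' key whose extra keys override the defaults ('interval', 'frequency', 'reduce_on_plateau', 'monitor').

import torch
from torch import optim

params = [torch.nn.Parameter(torch.zeros(1))]
optimizer = optim.SGD(params, lr=0.1)

# Dict form: user-supplied keys override the defaults merged in configure_schedulers,
# e.g. step the scheduler every 10 batches instead of once per epoch.
scheduler_config = {
    'scheduler': optim.lr_scheduler.StepLR(optimizer, step_size=10),
    'interval': 'step',
    'frequency': 10,
}

# Equivalent of the {**default_config, **scheduler} merge performed above.
default_config = {'interval': 'epoch', 'frequency': 1,
                  'reduce_on_plateau': False, 'monitor': 'val_loss'}
print({**default_config, **scheduler_config})
# -> 'interval' and 'frequency' are overridden; 'monitor' and
#    'reduce_on_plateau' keep their default values.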
Review discussion:
what if we just make the whole function optional? instead of just the args? or is that optional syntax doing that?
or i guess if we make the method still required but optional return, then it maintains readability? on second thought i kind of like this better. that way it’s explicit in code that you used no optimizer
Yeah, this does that - or at least, PyCharm isn't prompting me to implement the method :) - the optional thing just means you won't get a type warning if you return None
^^ we can do that - currently method isn't required but happy to change that :) agree that this behaviour should be explicit
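A sketch of the option discussed above (assumed signature, not the final code in this PR): keep configure_optimizers as a method you implement, but give it an Optional return type so that returning None is an explicit, type-checked way to train without an optimizer.

from typing import List, Optional, Tuple, Union
from torch.optim.optimizer import Optimizer
from pytorch_lightning.core.lightning import LightningModule


class ExplicitNoOptimizer(LightningModule):
    # Hypothetical example: the choice to use no optimizer is visible in the module itself.
    def configure_optimizers(self) -> Optional[Union[Optimizer, List[Optimizer], Tuple[List, List]]]:
        return None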