In #1279, support for returning `None` from `configure_optimizers` was added to Lightning. The use case was training without an optimizer. This preceded support for manual optimization, in which the user controls the backward and optimizer steps directly inside their training step.
The `_MockOptimizer` leaks out like this, which could be very confusing for developers:
```python
class MyLightningModule(LightningModule):
    def configure_optimizers(self):
        return None

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        # opt is not None! what?!
```
1. Is training with no optimizer a valid use case Lightning supports? Are there examples/references one could share to learn more about these use cases?
2. If the Trainer creates a mock optimizer for users, should the mock optimizer ever be exposed back to the user?
3. If training with no optimizer is a valid use case, should we require users to use manual optimization for this, so we don't configure a mock optimizer instance for them? (See the sketch after this list.)
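For reference, here is a minimal sketch of the manual optimization path the last question refers to, using Lightning's documented `automatic_optimization`, `optimizers()`, and `manual_backward` hooks. The module name and the linear layer are hypothetical placeholders, not code from this issue:

```python
import torch
from pytorch_lightning import LightningModule


class ManualOptModule(LightningModule):
    """Hypothetical example: the user owns backward and the optimizer step."""

    def __init__(self):
        super().__init__()
        self.automatic_optimization = False  # opt out of automatic optimization
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()  # the real configured optimizer, no mock involved
        loss = self.layer(batch).sum()
        opt.zero_grad()
        self.manual_backward(loss)  # used instead of loss.backward() so precision/strategy plugins still apply
        opt.step()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```

In principle, a module that genuinely trains without an optimizer would return `None` from `configure_optimizers` here and skip the `opt` calls in `training_step`; whether Lightning should require that path instead of injecting a `_MockOptimizer` is exactly the question above.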
This issue has been automatically marked as stale because it hasn't had any recent activity. This issue will be closed in 7 days if no further activity occurs. Thank you for your contributions, PyTorch Lightning Team!
Originally posted by @ananthsub in #11155 (comment)
cc @tchaton @justusschock @awaelchli @Borda @rohitgr7