Fix toggle optimizer #5775
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master   #5775    +/-  ##
======================================
- Coverage      94%     93%     -0%
======================================
  Files         134     134
  Lines       10053    9913   -140
======================================
- Hits         9409    9255   -154
- Misses        644     658    +14
pytorch_lightning/core/lightning.py (Outdated)

    if param in param_requires_grad_state:
        param.requires_grad = param_requires_grad_state[param]
This forces all optimizers to have the same grad state for all params?
@ananthsub could you please add a test so we won't accidentally break it again...
Hey @ananthsub,
I apologise that this part wasn't better tested and that I introduced a critical bug.
I think there was still a bug in the code due to the missing continue.
I have added a test with 3 optimizers.
Would you mind checking whether everything works fine for you?
Best,
T.C
Hello @ananthsub! Thanks for updating this PR. There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻 Comment last updated at 2021-02-04 20:27:46 UTC
nice!
Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
thanks @tchaton for the test cases - confirming that this patch works for me
* Update lightning.py
* update changelog
* add a 3 optimizer test
* resolve flake8
* remove extra code
* typo
* resolve typo
* Apply suggestions from code review

Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: tchaton <thomas@grid.ai>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
What does this PR do?
Fixes #5292
This more closely follows the suggestions from the issue. Inside toggle_optimizer, we first scan the requires_grad values of all parameters, store them on the LightningModule, and set requires_grad=False. Then we iterate over the current optimizer's parameters and set their requires_grad values according to the dictionary created from the initial scan. The issue with the initial fix was that multiple optimizers with shared parameters were handled incorrectly.
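The two-pass logic described above can be sketched as follows. This is a minimal, dependency-free illustration, not the actual Lightning implementation: Param and Optim are hypothetical stand-ins for torch.nn.Parameter and a torch.optim optimizer's param_groups layout, used here so the shared-parameter case can be shown without PyTorch installed.

```python
class Param:
    """Stand-in for torch.nn.Parameter (hypothetical, for illustration)."""
    def __init__(self, requires_grad=True):
        self.requires_grad = requires_grad

class Optim:
    """Stand-in exposing the torch.optim param_groups layout."""
    def __init__(self, params):
        self.param_groups = [{"params": list(params)}]

def toggle_optimizer(optimizers, opt_idx, saved_state):
    # First pass: record every parameter's original requires_grad flag
    # (only once, so a shared parameter is not overwritten by a later
    # optimizer) and freeze everything.
    for opt in optimizers:
        for group in opt.param_groups:
            for p in group["params"]:
                if p not in saved_state:
                    saved_state[p] = p.requires_grad
                p.requires_grad = False
    # Second pass: restore the recorded flags for the current optimizer's
    # parameters only; parameters owned solely by other optimizers stay
    # frozen.
    for group in optimizers[opt_idx].param_groups:
        for p in group["params"]:
            p.requires_grad = saved_state[p]

def untoggle_optimizer(saved_state):
    # Restore every parameter to its pre-toggle state.
    for p, flag in saved_state.items():
        p.requires_grad = flag
    saved_state.clear()
```

With two optimizers sharing one parameter, toggling optimizer 0 keeps the shared parameter trainable (it belongs to the current optimizer) while freezing the parameter owned only by optimizer 1, and untoggling restores the original state.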
Before submitting
PR review
Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:
Did you have fun?
Make sure you had fun coding 🙃