Do not count hitting maxiter as optimization failure & update default maxiter #1478
Conversation
This pull request was exported from Phabricator. Differential Revision: D41053812
Codecov Report
@@            Coverage Diff            @@
##              main     #1478   +/-   ##
=========================================
  Coverage   100.00%   100.00%
=========================================
  Files          149       149
  Lines        12935     12945    +10
=========================================
+ Hits         12935     12945    +10
test/generation/test_gen.py (Outdated)
@@ -69,7 +69,7 @@ def _setUp(self, double=False, expand=False):
 class TestGenCandidates(TestBaseCandidateGeneration):
     def test_gen_candidates(self, gen_candidates=gen_candidates_scipy, options=None):
         options = options or {}
-        options = {**options, "maxiter": 5}
+        options["maxiter"] = options.get("maxiter", 5)
This is problematic: if options is passed in as a dict, this method will now modify that dict in-place. We should generally avoid these kinds of side effects on function arguments (unless we specifically want them to happen).
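For illustration only (this helper is not in the PR; the name _with_default_maxiter is made up here), one way to apply the test default without mutating the caller's dict is to copy it first and use setdefault:

def _with_default_maxiter(options=None, default_maxiter=5):
    # Shallow-copy so the caller's dict is left untouched.
    merged = dict(options or {})
    # Keep an explicitly provided maxiter; otherwise fall back to the default.
    merged.setdefault("maxiter", default_maxiter)
    return merged

caller_options = {"disp": False}
merged = _with_default_maxiter(caller_options)
assert merged == {"disp": False, "maxiter": 5}
assert caller_options == {"disp": False}  # the caller's dict is unchanged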
test/generation/test_gen.py (Outdated)
@@ -227,6 +213,15 @@ def test_gen_candidates_scipy_warns_opt_failure(self):
         )
         self.assertTrue(expected_warning_raised)

+    def test_gen_candidates_scipy_no_maxiter_warning(self):
+        # Check that no warnings are raised on hitting maxiter.
We should probably log this? Could be at the info or maybe even the warning logging level.
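A rough sketch of that suggestion (not the code merged in this PR; the function name _minimize_with_maxiter_logging is hypothetical, and the exact scipy status messages vary by method and version): inspect the OptimizeResult and log a maxiter stop instead of warning and retrying.

import logging

from scipy.optimize import minimize

logger = logging.getLogger(__name__)

def _minimize_with_maxiter_logging(fun, x0, maxiter=2000):
    res = minimize(fun, x0, method="L-BFGS-B", options={"maxiter": maxiter})
    if not res.success:
        msg = str(res.message)
        # Substrings used by L-BFGS-B / SLSQP when the iteration limit is hit;
        # these may differ across scipy versions.
        if "ITERATIONS REACHED LIMIT" in msg or "Iteration limit" in msg:
            # Hitting maxiter is not treated as an optimization failure.
            logger.info("Stopped after reaching maxiter=%d: %s", maxiter, msg)
        else:
            logger.warning("Optimization failed: %s", msg)
    return res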
Force-pushed from b0ac7c7 to 0767a55.
Force-pushed from 0767a55 to 46e83c0.
Summary:
We currently retry optimization upon hitting maxiter, which typically leads to optimizing the same thing twice. This diff updates the warning handling to avoid this behavior.
The default maxiter is also updated to 2_000, a significant departure from the scipy defaults of 15_000 for L-BFGS-B and 100 for SLSQP. This is a temporary setting to bridge the gap between the two until we come up with better stopping conditions for the optimizers.
Differential Revision: D41053812
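For context, an illustrative usage sketch (not part of this diff; model fitting is omitted for brevity) of how the iteration limit is typically supplied: the options dict passed to optimize_acqf is forwarded to gen_candidates_scipy and on to scipy, so per this diff's description the 2_000 default applies whenever no maxiter is given, and reaching it no longer triggers a retry.

import torch
from botorch.acquisition import ExpectedImprovement
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf

train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = (train_X * 6.28).sin().sum(dim=-1, keepdim=True)

# Hyperparameter fitting is skipped here to keep the sketch short.
model = SingleTaskGP(train_X, train_Y)
acqf = ExpectedImprovement(model, best_f=train_Y.max())

bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, value = optimize_acqf(
    acqf,
    bounds=bounds,
    q=1,
    num_restarts=4,
    raw_samples=32,
    # Override the default iteration limit if desired; omit to use the default.
    options={"maxiter": 200},
)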