Do not count hitting maxiter as optimization failure & update default maxiter #1478

Closed · saitcakmak wants to merge 1 commit

Conversation

saitcakmak (Contributor)

Summary:
We currently retry the optimization upon hitting maxiter, which typically leads to optimizing the same thing twice. This diff updates the warning handling so that hitting maxiter is no longer counted as an optimization failure and no longer triggers a retry.

The default maxiter is also updated to 2_000, which sits between the scipy defaults of 15_000 for L-BFGS-B and 100 for SLSQP. This is just a temporary setting to bridge the gap between the two until we come up with better stopping conditions for the optimizers.
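
As a rough illustration (not the actual BoTorch code, using only standard scipy calls), the intended behavior amounts to treating a maxiter stop as a usable result rather than a failure worth retrying:

```python
# Illustrative sketch only: treat hitting maxiter as a benign stop that
# still yields a usable candidate, rather than a failure that triggers
# a second optimization run.
import numpy as np
from scipy.optimize import minimize, rosen

# Deliberately tiny maxiter so the Rosenbrock problem stops early.
res = minimize(rosen, x0=np.zeros(5), method="L-BFGS-B", options={"maxiter": 3})

# For L-BFGS-B, status 1 signals that the iteration/evaluation limit was hit.
hit_maxiter = res.status == 1
if res.success or hit_maxiter:
    candidate = res.x  # the current iterate is still a usable candidate
else:
    raise RuntimeError(f"Optimization failed: {res.message}")
```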

Differential Revision: D41053812

@facebook-github-bot added the CLA Signed and fb-exported labels on Nov 5, 2022
@facebook-github-bot (Contributor):

This pull request was exported from Phabricator. Differential Revision: D41053812

codecov bot commented Nov 5, 2022

Codecov Report

Merging #1478 (000fba1) into main (1fe3989) will not change coverage.
The diff coverage is 100.00%.

❗ Current head 000fba1 differs from the pull request's most recent head 46e83c0. Consider uploading reports for the commit 46e83c0 to get more accurate results.

@@            Coverage Diff            @@
##              main     #1478   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          149       149           
  Lines        12935     12945   +10     
=========================================
+ Hits         12935     12945   +10     
Impacted Files               Coverage Δ
botorch/generation/gen.py    100.00% <100.00%> (ø)


@@ -69,7 +69,7 @@ def _setUp(self, double=False, expand=False):
 class TestGenCandidates(TestBaseCandidateGeneration):
     def test_gen_candidates(self, gen_candidates=gen_candidates_scipy, options=None):
         options = options or {}
-        options = {**options, "maxiter": 5}
+        options["maxiter"] = options.get("maxiter", 5)
Contributor:
This is problematic: if options is passed as a dict, this method will now modify the options dict in-place. We generally should avoid these kinds of side effects on function arguments (unless we specifically want this to happen).
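
To make the concern concrete, a minimal sketch (hypothetical names, not the PR's code) of the side effect and a side-effect-free alternative:

```python
# Hypothetical sketch of the side effect: the function mutates the
# caller's dict when one is passed in.
def configure(options=None):
    options = options or {}
    options["maxiter"] = options.get("maxiter", 5)  # in-place mutation
    return options


opts = {"disp": True}
configure(opts)
print(opts)  # {'disp': True, 'maxiter': 5} -- the caller's dict changed


# Side-effect-free alternative: build a new dict instead; a caller-supplied
# "maxiter" still overrides the default of 5.
def configure_safe(options=None):
    return {"maxiter": 5, **(options or {})}
```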

@@ -227,6 +213,15 @@ def test_gen_candidates_scipy_warns_opt_failure(self):
         )
         self.assertTrue(expected_warning_raised)
 
+    def test_gen_candidates_scipy_no_maxiter_warning(self):
+        # Check that no warnings are raised on hitting maxiter.
Contributor:
We should probably log this? Could be at the info or maybe even the warning logging level.
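
A minimal sketch of that suggestion (illustrative only, not the merged implementation), logging a maxiter stop at info level while keeping warnings for genuine failures:

```python
# Hypothetical sketch: log hitting maxiter instead of warning on it.
import logging

logger = logging.getLogger(__name__)


def report_termination(message: str) -> None:
    # Hitting maxiter is an expected stopping condition, not a failure;
    # note that the exact scipy message text varies across versions.
    if "ITERATIONS REACHED LIMIT" in message:
        logger.info("Optimization stopped at maxiter: %s", message)
    else:
        logger.warning("Optimization terminated abnormally: %s", message)
```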

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 7, 2022
… maxiter (pytorch#1478)

Summary:
Pull Request resolved: pytorch#1478

Differential Revision: D41053812

fbshipit-source-id: 88dff72b6ae06ae11b0e9cf70c715d2ccca7f8fc
@facebook-github-bot (Contributor):

This pull request was exported from Phabricator. Differential Revision: D41053812

saitcakmak added a commit to saitcakmak/botorch that referenced this pull request Nov 9, 2022
… maxiter (pytorch#1478)

Summary:
Pull Request resolved: pytorch#1478

Differential Revision: https://internalfb.com/D41053812

fbshipit-source-id: 0723841d9affb3f34cca92f127c1fbf100850c41
… maxiter (pytorch#1478)

Summary:
Pull Request resolved: pytorch#1478

Reviewed By: esantorella

Differential Revision: D41053812

fbshipit-source-id: a0f27fb72f3cab6343fb55b6041717706e2445ce
@facebook-github-bot (Contributor):

This pull request was exported from Phabricator. Differential Revision: D41053812
