Dedicate half constraint budget to a suboptimizer #1047
Merged
Changes from 37 commits
Commits (55 in total):
09b9774  cst_solver (teytaud)
748dd8c  warning (teytaud)
1aec317  warning (teytaud)
066fd8d  Merge branch 'master' of github.com:facebookresearch/nevergrad into c… (teytaud)
7b2ca76  fix (teytaud)
4b0c9d7  black (teytaud)
5ff0766  fix keras issue (teytaud)
08dd791  Update test_mlfunctionlib.py (teytaud)
2d51bfa  black (teytaud)
c3f46d0  Merge branch 'master' into cstsolve (jrapin)
7ee4d9a  [PR on cstsolve PR] Try to extract constraint solving (#968) (jrapin)
4501834  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
7e4ec60  Update utils.py (teytaud)
54dc137  Merge branch 'master' of github.com:facebookresearch/nevergrad into c… (teytaud)
5f59739  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
7651c97  Fix a bug. (teytaud)
c3671e1  Update base.py (teytaud)
22d589c  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
adceda1  Merge branch 'master' of github.com:facebookresearch/nevergrad into c… (teytaud)
3809c2b  fix (teytaud)
e10dd0c  fix (teytaud)
dc2dd86  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
4910f3d  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
f0fa604  Merge branch 'master' of github.com:facebookresearch/nevergrad (teytaud)
c86b6cf  fix (teytaud)
da89ea4  Merge branch 'master' into ctr_and_cstsolve (teytaud)
558c066  Update base.py (teytaud)
2289c15  Update base.py (teytaud)
43ed96b  Update base.py (teytaud)
dfec878  fix (teytaud)
a5756e6  fix (teytaud)
77b661b  fix (teytaud)
8d8aa9a  fix_comment (teytaud)
7bf71b1  Merge branch 'master' into ctr_and_cstsolve (teytaud)
b5184d8  fixes (teytaud)
8f28edb  black (teytaud)
a483223  fix (teytaud)
09fe55f  fix (teytaud)
5472f75  fix_typo (teytaud)
663bdd2  Update nevergrad/optimization/base.py (teytaud)
a790abf  Update nevergrad/optimization/base.py (jrapin)
240acd5  Clarify suggest arguments (#1072) (jrapin)
7d35a25  Skip rocket xp test for speed (#1075) (jrapin)
70ccc97  Update parametrization flatten function (#1074) (jrapin)
c49f2c0  Add configuration for PSO + simplifications (#1073) (jrapin)
9da6c9d  Remove deprecated stuff (#1041) (teytaud)
5493ec8  Morphing with Nevergrad (#1042) (teytaud)
0bfa4ed  Add MOO xp variants (#1004) (teytaud)
bbbfa2c  minor (jrapin)
db68498  fix (jrapin)
1ce89ba  CHANGELOG (jrapin)
b9effad  merge (jrapin)
0102ab1  black (teytaud)
da52dd7  fix (teytaud)
af0ef7c  [PR on PR] Fix constraints (#1079) (jrapin)
Diff of nevergrad/optimization/base.py:
@@ -433,19 +433,29 @@ def ask(self) -> p.Parameter:
        current_num_ask = self.num_ask
        # tentatives if a cheap constraint is available
        # TODO: this should be replaced by an optimization algorithm.
        max_trials = self._constraints_manager.max_trials
        max_trials = max(
            1,
            self._constraints_manager.max_trials // 2
            if self.budget is None
            else (self.budget - self.num_ask),
        )  # half will be used for sub-optimization --- if the optimization method does not need/use a budget.
        # TODO(oteytaud): actually we could do this even when the budget is known, if we are sure that exceeding the budget is not a problem.
        # Very simple constraint solver:
        # - we use a simple algorithm.
        # - no memory of previous iterations.
        # - just projection to constraint satisfaction.
        # We try using the normal tool during half constraint budget, in order to reduce the impact on the normal run.
        for k in range(max_trials):
            is_suggestion = False
            if self._suggestions:  # use suggestions if available
                is_suggestion = True
                candidate = self._suggestions.pop()
            else:
                candidate = self._internal_ask_candidate()
            # only register actual asked points
            if candidate.satisfies_constraints():
                break  # good to go!
            if self._penalize_cheap_violations:
                # TODO using a suboptimizer instead may help remove this
                # Warning! This might be a tell not asked.
                self._internal_tell_candidate(candidate, float("Inf"))  # DE requires a tell
            self._num_ask += (
                1  # this is necessary for some algorithms which need new num to ask another point
@@ -456,6 +466,9 @@ def ask(self) -> p.Parameter:
                "sending candidate anyway.",
                errors.FailedConstraintWarning,
            )
        if not candidate.satisfies_constraints():
            # still not solving, let's run sub-optimization
            candidate = _constraint_solver(candidate, budget=max_trials)
        if not is_suggestion:
            if candidate.uid in self._asked:
                raise RuntimeError(
@@ -736,3 +749,25 @@ def __eq__(self, other: tp.Any) -> tp.Any:
        if self._config == other._config:
            return True
        return False


def _constraint_solver(parameter: p.Parameter, budget: int) -> p.Parameter:
    """Runs a suboptimization to solve the parameter constraints"""
    parameter_without_constraint = parameter.copy()
    parameter_without_constraint._constraint_checkers.clear()
    opt = registry["OnePlusOne"](parameter_without_constraint, num_workers=1, budget=budget)
    opt._constraints_manager.max_trials = 1
    for _ in range(budget):
        cand = opt.ask()
        # Our objective function is minimum for the point the closest to
        # the original candidate under the constraints.
        penalty = cand._constraint_penalties()
        if not penalty > 0:  # constraints are satisfied
            return cand
        # TODO: this may not scale well with dimension
        distance = np.tanh(np.sum(cand.get_standardized_data(reference=parameter) ** 2))
        # TODO: because of the return whenever constraints are satisfied, the first case never arises
        loss = distance if penalty <= 0 else penalty + distance + 1.0
        opt.tell(cand, loss)
    data = opt.recommend().get_standardized_data(reference=parameter_without_constraint)
    return parameter.spawn_child().set_standardized_data(data)

Review comment on the "return cand" line: "don't return this, since this candidate does not have any constraint anymore listed, it could create weird bugs." Reply: "oh my god thx."
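For context, here is a minimal, hypothetical sketch of how this code path gets exercised from the user side. The objective and the constraint are made up for illustration, but ng.p.Array, register_cheap_constraint and the optimizer registry are the public nevergrad API: a cheap constraint registered on the parametrization is what triggers the rejection loop in ask(), and the sub-optimizer above when rejection fails.

import nevergrad as ng

# Toy problem (made up for illustration): minimize a quadratic
# while keeping the first coordinate non-negative.
param = ng.p.Array(shape=(2,))
param.register_cheap_constraint(lambda x: x[0] >= 0)  # cheap constraint, checked at ask() time

optimizer = ng.optimizers.registry["OnePlusOne"](parametrization=param, budget=100)
recommendation = optimizer.minimize(lambda x: float(sum((x - 0.5) ** 2)))
print(recommendation.value)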
Review discussion on the max_trials computation:

"Why self.budget - self.num_ask? Right now this does not make sense to me, and this seems unexpected."

"If the optimizer has a maximum budget, we want the budget we use to be smaller than the remaining budget before the optimizer crashes, hence self.budget - self.num_ask."

"So you probably want some kind of mean between self._constraints_manager.max_trials and self.budget - self.num_ask, so that you don't go beyond max_trials and you don't exhaust more than half of the remaining budget of the optimizer."

"min(self._constraints_manager.max_trials, self.budget - self.num_ask) could make sense. Eventually all that stuff is going to be equipped with parameters for combining the auxiliary solver and the simple rejection of mutations. Besides adding the famous constraint learning tool for expensive constraints, which would be useful for some of our users... still a long way to go on the path to constraint management :-)"
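To make the combination discussed above concrete, here is a small sketch of how the two caps could be merged; the helper name and signature are invented for illustration and are not part of the PR.

import typing as tp


def combined_max_trials(configured_max: int, budget: tp.Optional[int], num_ask: int) -> int:
    # Hypothetical helper (not in the PR): stay below the configured max_trials
    # and, when a budget is set, spend at most half of the remaining budget.
    if budget is None:
        return max(1, configured_max // 2)
    remaining = budget - num_ask
    return max(1, min(configured_max, remaining // 2))


# e.g. combined_max_trials(1000, budget=200, num_ask=150) returns 25:
# half of the 50 remaining asks, well below the configured maximum.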