
Update acquisition function scaling for constrained optimization #413

Merged (1 commit into bayesian-optimization:master) on May 8, 2023

Conversation

till-m (Member) commented on May 5, 2023

This makes the acquisition function equivalent to the unconstrained case if the GP is certain that the constraint is fulfilled.

NB: Should we add a note somewhere that constrained optimization is best used with the EI acquisition function?
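
For context, Gardner's approach multiplies EI by the probability (from a separate constraint model) that the constraint is fulfilled; since EI is non-negative, this scaling never flips the sign of the acquisition. A minimal sketch, with illustrative names (`constrained_ei`, `p_feasible`, etc. are not the library's API):

```python
import numpy as np
from scipy.stats import norm

def constrained_ei(mu, sigma, y_best, p_feasible, xi=0.0):
    """Gardner-style constrained EI (sketch, assumes sigma > 0).

    mu, sigma: posterior mean/std of the objective GP at the candidates.
    p_feasible: predicted probability that the constraint is fulfilled.
    """
    imp = mu - y_best - xi
    z = imp / sigma
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)  # always >= 0
    # Scaling a non-negative value by p in [0, 1] keeps it non-negative,
    # and leaves it unchanged when the model is certain (p = 1).
    return ei * p_feasible
```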

till-m requested review from fmfn and bwheelz36 on May 5, 2023, 08:27
@codecov-commenter commented

Codecov Report

Patch coverage: 100.00% and no project coverage change.

Comparison is base (33b99ec) 98.57% compared to head (ff087a0) 98.57%.


Additional details and impacted files
@@           Coverage Diff           @@
##           master     #413   +/-   ##
=======================================
  Coverage   98.57%   98.57%           
=======================================
  Files           8        8           
  Lines         560      560           
  Branches       79       79           
=======================================
  Hits          552      552           
  Misses          4        4           
  Partials        4        4           
Impacted Files      Coverage Δ
bayes_opt/util.py   97.84% <100.00%> (ø)


bwheelz36 (Collaborator) commented

Hi @till-m, I don't 100% understand this, but all checks are passing and it seems like you do, so from this perspective I'm happy to approve :-) You can merge this yourself now, right?

till-m (Member, Author) commented on May 8, 2023

> Hi @till-m, I don't 100% understand this, but all checks are passing and it seems like you do, so from this perspective I'm happy to approve :-) You can merge this yourself now, right?

Fair point, I'll just outline my thinking for the record:
Gardner derived constrained optimization using EI, which is non-negative. For an acquisition function that can take negative values, however, we cannot simply scale by the probability of constraint fulfilment: doing so would make negative-valued points *more* attractive when the model is confident the constraint is violated. Because of that, I came up with a scaling rule for negative acquisition values when I made the original PR. The idea was to preserve the same properties as in the positive case:

  1. Make exploring values more attractive if the model is rather sure of the constraint being fulfilled.
  2. Make exploring values less attractive if the model is rather sure of the constraint not being fulfilled.

However, Gardner's original formulation has a third nice property: it leaves the acquisition unchanged (compared to unconstrained optimization) when the model is certain the constraint is fulfilled. This probably makes little difference in practice, but I felt it would be nice to have this property in the negative case as well, hence the small change in rescaling.
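
A minimal sketch of a piecewise scaling with all three properties; `p_feasible` is again the constraint model's predicted probability of fulfilment, and the denominator in the negative branch is one choice that satisfies the properties, not necessarily the exact expression merged into `bayes_opt/util.py`:

```python
import numpy as np

def scale_acquisition(values, p_feasible):
    """Rescale acquisition values by the probability of feasibility (sketch)."""
    return np.where(
        values > 0,
        # Positive values: Gardner's scaling. p = 1 leaves the value
        # unchanged, p = 0 drives it to zero.
        values * p_feasible,
        # Negative values: multiplying by p would raise them towards zero,
        # rewarding likely-infeasible points. Dividing by (0.5 + 0.5 * p)
        # instead leaves the value unchanged at p = 1 and doubles its
        # magnitude (more negative, less attractive) at p = 0.
        values / (0.5 + 0.5 * p_feasible),
    )
```

At `p_feasible = 1` both branches reduce to the unconstrained values, which is exactly the third property described above.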

till-m merged commit 0baf8e1 into bayesian-optimization:master on May 8, 2023
till-m deleted the till-m-patch-1 branch on July 20, 2024