
How to set the beta coefficient of generation strategy? #2525

Closed
omrisch opened this issue Jun 17, 2024 · 7 comments

omrisch commented Jun 17, 2024

I want to generate explore/exploit experimental suggestions based on different beta parameters.
How can I specify this in the generation strategy?

@bernardbeckerman (Contributor)

@omrisch thanks for asking this! I'm following up internally to find the right person to help you here.

bernardbeckerman self-assigned this Jun 17, 2024
bernardbeckerman added the question (Further information is requested) label Jun 17, 2024
@Balandat (Contributor)

Can you provide some more context? I assume by beta you are referring to the exploration component in UCB algorithms / acquisition functions? Are you already using a UCB-type algorithm? Note that by default Ax uses an Expected Improvement-based acquisition function, not UCB.
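
For reference, if a UCB-type acquisition is what you're after, here is a minimal sketch of how one might point a generation strategy at qUpperConfidenceBound and pass beta. The exact way acquisition_options are plumbed through Models.BOTORCH_MODULAR is an assumption here and may differ across Ax versions:

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from botorch.acquisition.monte_carlo import qUpperConfidenceBound

# Sketch: Sobol initialization followed by a BoTorch step using a UCB acquisition.
# "beta" is the UCB exploration coefficient; larger values explore more.
gs = GenerationStrategy(
    steps=[
        GenerationStep(model=Models.SOBOL, num_trials=5),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,  # run this step for all remaining trials
            model_kwargs={
                "botorch_acqf_class": qUpperConfidenceBound,
                # assumed to be forwarded to the acquisition input constructor
                "acquisition_options": {"beta": 2.0},
            },
        ),
    ],
)
```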

omrisch commented Jun 18, 2024 via email

@Balandat (Contributor)

@omrisch looks like your photo didn't come through (probably because you responded via email and not on GitHub).

There is no explicit way to control the explore-exploit tradeoff; for (non-noisy) EI, a manual way to achieve this is to artificially deflate the value f* of the incumbent (the best observed point so far). Are your observations noiseless? If not, are you providing observation noise variances, or are you letting Ax infer the noise level?
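
A minimal sketch of that manual trick in plain BoTorch, assuming a noiseless single-output toy problem (the data, the `offset` value, and the variable names are illustrative only):

```python
import torch
from botorch.acquisition.analytic import ExpectedImprovement
from botorch.fit import fit_gpytorch_mll
from botorch.models import SingleTaskGP
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy noiseless data: 10 points in [0, 1]^2 with a smooth outcome.
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True).sin()

gp = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

# Deflate the incumbent by a manual offset to push EI toward exploration.
offset = 0.1  # hypothetical exploration knob, on the outcome scale
acqf = ExpectedImprovement(model=gp, best_f=train_Y.max() - offset)

candidate, _ = optimize_acqf(
    acqf,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=1,
    num_restarts=5,
    raw_samples=64,
)
```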

This discussion is relevant in this context: pytorch/botorch#373

omrisch commented Jun 19, 2024

My observations are noisy; currently I let Ax infer the noise.

@Balandat (Contributor)

So by default this will use BoTorch's qLogNoisyExpectedImprovement acquisition function. This implementation does not use an explicit incumbent (since observations are noisy), but instead integrates out the uncertainty in the incumbent by sampling its values from the posterior (see Sec. A.8 in our LogEI paper). What that means is that there is no parameter that directly adjusts the exploration tradeoff. One could introduce one, but that would require some thought and research.
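
To make the contrast concrete, here is a minimal constructor call (reusing the fitted `gp` and `train_X` from the earlier sketch): qLogNEI is built from the observed inputs rather than from a `best_f` value, which is why there is no beta-like knob to turn.

```python
from botorch.acquisition.logei import qLogNoisyExpectedImprovement

# qLogNEI takes the observed inputs (X_baseline), not an explicit best_f;
# the incumbent value is integrated out via posterior samples.
acqf = qLogNoisyExpectedImprovement(model=gp, X_baseline=train_X)
```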

You could instead specify a non-noisy acquisition function that uses an explicit incumbent: register a new acquisition function and implement an input constructor for it, following the general setup in our modular BoTorch model tutorial. My strong expectation, though, is that this will likely result in worse performance compared to qLogNEI.
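
Roughly, that registration could look like the sketch below; the class name, constructor signature, and `deflation` option are all hypothetical, and the modular BoTorch model tutorial plus BoTorch's input_constructors module are the authoritative reference for the exact setup:

```python
from botorch.acquisition.analytic import LogExpectedImprovement
from botorch.acquisition.input_constructors import acqf_input_constructor


class DeflatedLogEI(LogExpectedImprovement):
    """LogEI with the incumbent shifted down by a fixed margin (hypothetical)."""

    def __init__(self, model, best_f, deflation: float = 0.0, **kwargs):
        super().__init__(model=model, best_f=best_f - deflation, **kwargs)


@acqf_input_constructor(DeflatedLogEI)
def construct_inputs_deflated_log_ei(model, training_data, deflation: float = 0.0, **kwargs):
    # Assumes a single SupervisedDataset; the incumbent is the best observed outcome.
    return {"model": model, "best_f": training_data.Y.max(), "deflation": deflation}
```

You would then pass `botorch_acqf_class=DeflatedLogEI` (plus, e.g., `acquisition_options={"deflation": 0.1}`) in the Models.BOTORCH_MODULAR step of your generation strategy, analogous to the UCB sketch above.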

Do you have any specific concerns that make you think you need to adjust the exploration component?

@mgarrard (Contributor)

@omrisch closing due to inactivity; please feel free to re-open or start a new issue for additional help :)
