
fatmax and logsumexp for infinities #1999

Closed
wants to merge 1 commit

Conversation

SebastianAment (Contributor)

Summary:
This commit improves the robustness of `fatmax` and `logsumexp` for inputs with infinities.

- In contrast to `torch.logsumexp`, `logsumexp` does not give rise to `NaN`s in its backward pass even if infinities are present.
- `fatmax` is updated to exhibit the same behavior in the presence of infinities, and now allows for the specification of an `alpha` parameter, which controls the asymptotic power decay of the fat-tailed approximation. (Both behaviors are sketched below.)
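A minimal sketch of both points, assuming the import path `botorch.utils.safe_math` (the file changed in this diff); exact signatures and printed values are illustrative rather than checked against the merged commit.

```python
import torch
from botorch.utils.safe_math import fatmax, logsumexp  # module touched by this PR

x = torch.tensor([0.0, float("inf")], requires_grad=True)

# torch.logsumexp: the forward value is inf, but the backward pass evaluates
# exp(x - result), and exp(inf - inf) = exp(nan) poisons the gradient.
torch.logsumexp(x, dim=-1).backward()
print(x.grad)  # expected: tensor([0., nan])

# The robust variant returns the same forward value with a NaN-free gradient,
# routing the sensitivity to the infinite maximum instead.
x2 = x.detach().clone().requires_grad_(True)
logsumexp(x2, dim=-1).backward()
print(x2.grad)  # NaN-free, e.g. tensor([0., 1.])

# fatmax: a smooth, fat-tailed approximation of max that now accepts `alpha`,
# the exponent of the asymptotic power decay of the approximation's tails.
x3 = x.detach().clone().requires_grad_(True)
fatmax(x3, dim=-1, alpha=2.0).backward()
print(x3.grad)  # likewise NaN-free despite the infinite input
```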

In addition, the commit introduces helper functions derived from `logsumexp` and `fatmax`, e.g. `logplusexp`, `fatminimum`, and `fatmaximum`, fixes a similar infinity issue with `logdiffexp`, and improves the associated test suite.
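A short sketch of the helpers named above, under the same import-path assumption; the argument order of `logdiffexp` and the smoothing defaults are inferred from the summary rather than read off the merged diff.

```python
import torch
from botorch.utils.safe_math import fatmaximum, fatminimum, logdiffexp, logplusexp

a = torch.tensor([0.0, 2.0])
b = torch.tensor([1.0, 1.0])

# Fat-tailed, smooth counterparts of torch.maximum / torch.minimum:
print(fatmaximum(a, b))  # close to tensor([1., 2.])
print(fatminimum(a, b))  # close to tensor([0., 1.])

# log(exp(a) + exp(b)) without overflowing intermediate exponentials:
print(logplusexp(a, b))

# log(exp(log_b) - exp(log_a)), assuming log_b >= log_a elementwise:
log_a, log_b = torch.tensor(-1.0), torch.tensor(0.0)
print(logdiffexp(log_a, log_b))  # log(1 - e^{-1}) ≈ -0.4587
```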

Differential Revision: D48878020

facebook-github-bot added the CLA Signed label on Aug 31, 2023
facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D48878020

SebastianAment added a commit to SebastianAment/botorch that referenced this pull request on Sep 1, 2023, carrying the same summary as above (Reviewed By: Balandat; Differential Revision: D48878020).


codecov bot commented Sep 1, 2023

Codecov Report

Merging #1999 (1d23f8c) into main (748b46a) will increase coverage by 0.00%.
The diff coverage is 100.00%.

❗ Current head 1d23f8c differs from pull request most recent head c7b1f0a. Consider uploading reports for the commit c7b1f0a to get more accurate results

@@           Coverage Diff           @@
##             main    #1999   +/-   ##
=======================================
  Coverage   99.99%   99.99%           
=======================================
  Files         179      179           
  Lines       15738    15770   +32     
=======================================
+ Hits        15737    15769   +32     
  Misses          1        1           
Files Changed                 Coverage Δ
botorch/utils/safe_math.py    100.00% <100.00%> (ø)



facebook-github-bot (Contributor)

This pull request has been merged in 9649b1c.
