`fatmax` and `logsumexp` for infinities #1999

Conversation
This pull request was exported from Phabricator. Differential Revision: D48878020
Summary: This commit improves the robustness of `fatmax` and `logsumexp` for inputs with infinities.

- In contrast to `torch.logsumexp`, `logsumexp` does not give rise to `NaN`s in its backward pass even if infinities are present.
- `fatmax` is updated to exhibit the same behavior in the presence of infinities, and now allows for the specification of an `alpha` parameter, which controls the asymptotic power decay of the fat-tailed approximation.

In addition, the commit introduces helper functions derived from `logsumexp` and `fatmax`, e.g. `logplusexp`, `fatminimum`, and `fatmaximum`, fixes a similar infinity issue with `logdiffexp`, and improves the associated test suite.

Reviewed By: Balandat

Differential Revision: D48878020
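The failure mode and the masking fix are easy to sketch. The code below is a minimal, hypothetical illustration of the idea described in the summary, not this PR's actual implementation; the name `safe_logsumexp` and its structure are invented for the example.

```python
import torch
from torch import Tensor


def safe_logsumexp(x: Tensor, dim: int = -1, keepdim: bool = False) -> Tensor:
    # torch.logsumexp shifts by the per-slice max before exponentiating; when
    # that max is +/-inf, the shift yields inf - inf = NaN in the backward pass.
    m = x.amax(dim=dim, keepdim=True)
    is_inf_max = m.isinf() & (x == m)  # entries achieving an infinite max
    has_inf_max = is_inf_max.any(dim=dim, keepdim=True)
    # Infinite-max slices: sum only the maximal entries, so gradients reach
    # them without any inf - inf subtraction.
    inf_branch = x.masked_fill(~is_inf_max, 0.0).sum(dim=dim, keepdim=True)
    # Finite slices: the usual max-shifted logsumexp, with infinities masked
    # out so the unselected branch of torch.where stays NaN-free.
    m_safe = m.masked_fill(m.isinf(), 0.0)
    shifted = (x - m_safe).masked_fill(is_inf_max, 0.0)
    finite_branch = m_safe + shifted.exp().sum(dim=dim, keepdim=True).log()
    out = torch.where(has_inf_max, inf_branch, finite_branch)
    return out if keepdim else out.squeeze(dim)


x = torch.tensor([0.0, float("inf")], requires_grad=True)
safe_logsumexp(x, dim=0).backward()
print(x.grad)  # tensor([0., 1.]) -- finite, where torch.logsumexp yields NaN
```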
Force-pushed from 750170b to 08a965b.
Force-pushed from 08a965b to 55392c1.
Force-pushed from 55392c1 to fec94c5.
Codecov Report

```diff
@@           Coverage Diff           @@
##             main    #1999   +/-   ##
=======================================
  Coverage   99.99%   99.99%
=======================================
  Files         179      179
  Lines       15738    15770    +32
=======================================
+ Hits        15737    15769    +32
  Misses          1        1
```
Force-pushed from fec94c5 to c7b1f0a.
This pull request has been merged in 9649b1c.
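For reference, a small usage sketch of the merged behavior. It assumes the helpers are exposed from `botorch.utils.safe_math` and that `fatmax` accepts the `alpha` keyword described in the summary; check that module for the exact signatures.

```python
import torch
from botorch.utils.safe_math import fatmax, logsumexp

# Infinity-robust logsumexp: finite gradients even with -inf entries.
x = torch.tensor([0.0, 1.0, -float("inf")], requires_grad=True)
logsumexp(x, dim=-1).backward()
assert not x.grad.isnan().any()

# Fat-tailed smooth max; per the summary, alpha controls the asymptotic
# power decay of the approximation (keyword assumed, not verified here).
y = torch.tensor([[-1.0, 0.0, 2.0]])
print(fatmax(y, dim=-1, alpha=2.0))  # smooth approximation to y.amax(dim=-1)
```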