Orthogonal Additive Kernels #1869
Conversation
This pull request was exported from Phabricator. Differential Revision: D45217852
Summary: Pull Request resolved: pytorch#1869

OAKs were introduced in [Additive Gaussian Processes Revisited](https://arxiv.org/pdf/2206.09861.pdf) but were limited to Gaussian kernels with Gaussian data densities (which required applying normalizing flows to the input data to make it look Gaussian).

This commit introduces a generalization of OAKs that works with arbitrary base kernels by leveraging Gauss-Legendre quadrature rules for the associated one-dimensional integrals.

OAKs could be more sample-efficient than canonical kernels in higher dimensions, and allow for more efficient relevance determination, because dimensions or interactions of dimensions can be pruned by setting their associated coefficients -- not just their lengthscales -- to zero.

Reviewed By: Balandat

Differential Revision: D45217852

fbshipit-source-id: 8af544145b0c3d0c280accc6d8192971f6f22a72
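The quadrature approach in the summary can be sketched in plain Python. This is a hypothetical standalone illustration, not the code from this PR: `leggauss`, `rbf`, and `kernel_mean` are names invented here. It computes Gauss-Legendre nodes and weights via Newton's method on the Legendre recurrence, maps them to the unit interval, and uses them to approximate the one-dimensional integral of a base kernel against a uniform density, which is the kind of integral an OAK needs in order to orthogonalize the base kernel against constant functions.

```python
import math

def leggauss(n):
    """n-point Gauss-Legendre nodes/weights on [-1, 1] via Newton's method."""
    nodes, weights = [], []
    for i in range(1, n + 1):
        x = math.cos(math.pi * (i - 0.25) / (n + 0.5))  # standard initial guess
        for _ in range(100):
            p0, p1 = 1.0, x  # P_0(x), P_1(x)
            for k in range(2, n + 1):  # three-term Legendre recurrence
                p0, p1 = p1, ((2 * k - 1) * x * p1 - (k - 1) * p0) / k
            dp = n * (x * p1 - p0) / (x * x - 1.0)  # P_n'(x)
            dx = p1 / dp
            x -= dx
            if abs(dx) < 1e-14:
                break
        nodes.append(x)
        weights.append(2.0 / ((1.0 - x * x) * dp * dp))
    return nodes, weights

def rbf(x, z, ell=0.3):
    # Example base kernel; the point of the PR is that any 1-d kernel works here.
    return math.exp(-0.5 * ((x - z) / ell) ** 2)

deg = 20
raw_nodes, raw_weights = leggauss(deg)
# Affine map from [-1, 1] to the unit interval [0, 1] assumed for the inputs.
zs = [(t + 1.0) / 2.0 for t in raw_nodes]
ws = [w / 2.0 for w in raw_weights]

def kernel_mean(x):
    """Quadrature estimate of kbar(x) = integral of k(x, z) over z in [0, 1],
    the one-dimensional integral needed to orthogonalize the base kernel."""
    return sum(w * rbf(x, z) for z, w in zip(zs, ws))
```

Because the base kernel only enters through pointwise evaluations at the quadrature nodes, no closed-form integral (and hence no Gaussianity assumption on the data density) is required, which is what lifts the restriction of the original OAK formulation.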
Codecov Report
@@            Coverage Diff            @@
##               main     #1869   +/-  ##
=========================================
  Coverage   100.00%   100.00%
=========================================
  Files          171       172     +1
  Lines        15006     15093    +87
=========================================
+ Hits         15006     15093    +87
This pull request has been merged in 6d329a8.
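The relevance-determination argument in the summary -- that a dimension can be pruned by zeroing its additive coefficient rather than its lengthscale -- can be illustrated with a minimal first-order additive kernel in plain Python. This is a hypothetical sketch, not the merged implementation; `rbf1d` and `additive_kernel` are names invented here, and the per-dimension orthogonalization step is omitted to keep the coefficient structure visible.

```python
import math

def rbf1d(x, z, ell=0.5):
    # Squared-exponential base kernel on one dimension (ell chosen for illustration).
    return math.exp(-0.5 * ((x - z) / ell) ** 2)

def additive_kernel(x, z, bias, coeffs):
    """First-order additive kernel: bias + sum_i coeffs[i] * k_i(x_i, z_i).

    In an OAK each k_i would additionally be orthogonalized against the
    constant function; this sketch only shows the coefficient structure.
    """
    return bias + sum(c * rbf1d(xi, zi) for c, xi, zi in zip(coeffs, x, z))

x = [0.1, 0.4, 0.9]
z = [0.2, 0.8, 0.9]
full = additive_kernel(x, z, bias=1.0, coeffs=[0.5, 0.3, 0.2])
# Zeroing a coefficient removes dimension 1 entirely, regardless of its lengthscale;
# a large lengthscale would only flatten its term, never eliminate it exactly.
pruned = additive_kernel(x, z, bias=1.0, coeffs=[0.5, 0.0, 0.2])
```

With the coefficient zeroed, the kernel value is exactly invariant to arbitrary changes in the pruned coordinate, which is what makes coefficient-based pruning a sharper relevance-determination tool than driving a lengthscale toward infinity.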