
Orthogonal Additive Kernels #1869

Closed

SebastianAment wants to merge 1 commit

Conversation

SebastianAment (Contributor)

Summary:
OAKs were introduced in [Additive Gaussian Processes Revisited](https://arxiv.org/pdf/2206.09861.pdf), but were limited to Gaussian kernels with Gaussian data densities (which required applying normalizing flows to the input data to make it approximately Gaussian).

This commit introduces a generalization of OAKs that works with arbitrary base kernels by leveraging Gauss-Legendre quadrature rules for the associated one-dimensional integrals.
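
As an illustration of the quadrature step (a minimal sketch, not the code in this diff), a Gauss-Legendre rule on [-1, 1] can be mapped to [0, 1] and used to approximate the one-dimensional kernel means that the orthogonality constraint requires; the RBF integrand and the uniform input density on the unit interval are illustrative assumptions:

```python
import numpy as np
import torch


def gauss_legendre_01(deg: int, dtype=torch.double):
    """Gauss-Legendre nodes and weights mapped from [-1, 1] to [0, 1]."""
    z, w = np.polynomial.legendre.leggauss(deg)
    z = torch.as_tensor((z + 1.0) / 2.0, dtype=dtype)  # affine map to [0, 1]
    w = torch.as_tensor(w / 2.0, dtype=dtype)  # Jacobian of the affine map
    return z, w


def kernel_mean(k, x, z, w):
    """Quadrature approximation of m(x) = integral of k(x, t) over t in [0, 1]."""
    return k(x.unsqueeze(-1), z.unsqueeze(-2)) @ w  # (n, deg) @ (deg,) -> (n,)


def rbf(a, b, lengthscale=0.2):
    """Illustrative Gaussian base kernel on scalar inputs."""
    return torch.exp(-((a - b) ** 2) / (2 * lengthscale**2))


z, w = gauss_legendre_01(deg=32)
x = torch.linspace(0, 1, 5, dtype=torch.double)
print(kernel_mean(rbf, x, z, w))  # per-point kernel means used in the orthogonalization
```

In the standard OAK construction, a rank-one correction built from these means (normalized by the kernel's double integral) is subtracted from the base kernel so that each one-dimensional component integrates to zero against the input density.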

OAKs could be more sample-efficient than canonical kernels in higher dimensions and allow for more efficient relevance determination, because individual dimensions or interactions of dimensions can be pruned by setting their associated coefficients -- not just their lengthscales -- to zero.
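
For reference, a hedged usage sketch of the new kernel class: the module path matches the file touched by this diff, but the constructor arguments shown below (base_kernel, dim, quad_deg, second_order) are assumptions inferred from this summary rather than a confirmed signature, so check the module for the exact API.

```python
import torch
from gpytorch.kernels import MaternKernel
from botorch.models.kernels.orthogonal_additive_kernel import OrthogonalAdditiveKernel

d = 6  # number of input dimensions; inputs are assumed to lie in the unit cube
oak = OrthogonalAdditiveKernel(
    base_kernel=MaternKernel(nu=2.5),  # an arbitrary base kernel, not just Gaussian
    dim=d,
    quad_deg=32,        # Gauss-Legendre nodes per 1D integral (assumed argument name)
    second_order=True,  # include pairwise interaction terms (assumed argument name)
)

X = torch.rand(10, d)  # dtypes may need to be aligned with the kernel's internals
K = oak(X).to_dense()  # 10 x 10 covariance matrix
print(K.shape)
```

Pruning a dimension (or a pairwise interaction) then amounts to driving its additive coefficient to zero rather than inflating a lengthscale.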

Reviewed By: Balandat

Differential Revision: D45217852

@facebook-github-bot added the CLA Signed and fb-exported labels on Jun 6, 2023
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D45217852

SebastianAment added a commit to SebastianAment/botorch that referenced this pull request on Jun 6, 2023

codecov bot commented on Jun 6, 2023

Codecov Report

Merging #1869 (a620fff) into main (667b2eb) will not change coverage.
The diff coverage is 100.00%.

❗ The current head a620fff differs from the pull request's most recent head c75ac05. Consider uploading reports for commit c75ac05 to get more accurate results.

@@            Coverage Diff            @@
##              main     #1869   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          171       172    +1     
  Lines        15006     15093   +87     
=========================================
+ Hits         15006     15093   +87     
| Impacted Files | Coverage Δ |
| --- | --- |
| ...torch/models/kernels/orthogonal_additive_kernel.py | 100.00% <100.00%> (ø) |


SebastianAment added three more commits to SebastianAment/botorch that referenced this pull request on Jun 6, 2023, each repeating the summary above and re-exported from Phabricator (Differential Revision: D45217852).

SebastianAment added four more commits to SebastianAment/botorch that referenced this pull request on Jun 7, 2023, again with the same summary and the same Phabricator export notices.

@facebook-github-bot (Contributor)

This pull request has been merged in 6d329a8.

Balandat (Contributor) commented on Jun 8, 2023

cc @alexisboukouvalas, @jameshensman

Labels: CLA Signed, fb-exported, Merged