Extend Posterior API to support torch distributions & overhaul MCSampler API #1486
Conversation
This pull request was exported from Phabricator. Differential Revision: D39759489
Extend Posterior API to support torch distributions & overhaul MCSampler API (#1486)

Summary:
X-link: pytorch/botorch#1486

The main goal here is to broadly support non-Gaussian posteriors.

- Adds a generic `TorchPosterior` that wraps a torch `Distribution`. It defines a few properties that we commonly expect and delegates to the `distribution` for the rest.
- For a unified plotting API, shifts away from mean & variance to a quantile function. Most torch distributions implement the inverse CDF, which is used as the quantile; for the rest, the user should implement it at either the distribution or the posterior level.
- Hands the burden of base-sample handling off from the posterior to the samplers. Using a dispatcher-based `get_sampler` method, we can support SAA with mixed posteriors without having to shuffle base samples in a `PosteriorList`, as long as all base distributions have a corresponding sampler and support base samples.
- Adds `ListSampler` for sampling from `PosteriorList`.
- Adds `ForkedRNGSampler` and `StochasticSampler` for sampling from posteriors without base samples.
- Adds `rsample_from_base_samples` for sampling with `base_samples` / with a `sampler`.
- Absorbs `FullyBayesianPosteriorList` into `PosteriorList`.
- For MC acqfs, introduces `get_posterior_samples` for sampling from the posterior with base samples / a sampler. If a sampler was not specified, this constructs the appropriate sampler for the posterior using `get_sampler`, eliminating the need to construct a sampler in `__init__`, which we used to do under the assumption of Gaussian posteriors.

TODOs:
- Relax the Gaussian assumption in acquisition functions & utilities. Some of this might be addressed in a follow-up diff.
- Update the website / docs & tutorials to clear up some of the Gaussian assumption and introduce the new, relaxed API. Likely a follow-up diff.
- Some more listed in T134364907.
- Test fixes and new unit tests.

Other notables:
- See D39760855 for usage of TorchDistribution in SkewGP.
- `TransformedPosterior` could serve as the fallback option for derived posteriors.
- MC samplers no longer support `resample` or `collapse_batch_dims(=False)`. These use cases can be handled by (i) not using base samples, or (ii) using `torch.fork_rng` and sampling without base samples under it. Samplers are only meant to support SAA. Introduces `ForkedRNGSampler` and `StochasticSampler` as convenience samplers for these use cases.
- Introduces `batch_range_override` for the sampler to support edge cases where we may want to override `posterior.batch_range` (needed in `qMultiStepLookahead`).
- Removes the unused sampling utilities `construct_base_samples(_from_posterior)`, which assume a Gaussian posterior.
- Moves the main logic of the `_set_sampler` method of `CachedCholesky` subclasses to an `_update_base_samples` method on the samplers, and simplifies these classes a bit more.

Reviewed By: Balandat

Differential Revision: D39759489

fbshipit-source-id: 6eacc9ab011c5a5dfa4209f127b225d1e7d0ca9f
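The quantile-based plotting API described in the summary can be illustrated with a minimal sketch. The `SketchPosterior` class below is hypothetical, and the stdlib `statistics.NormalDist` (whose inverse CDF is `inv_cdf`) stands in for a torch `Distribution`, where the same method is called `icdf`:

```python
# Minimal sketch, assuming a distribution object that exposes an inverse CDF.
# `statistics.NormalDist` stands in for a torch `Distribution`; the class and
# method names here are illustrative, not BoTorch's actual API.
from statistics import NormalDist


class SketchPosterior:
    """Toy stand-in for a posterior wrapping a distribution: it forwards
    quantile queries to the distribution's inverse CDF."""

    def __init__(self, distribution: NormalDist) -> None:
        self.distribution = distribution

    def quantile(self, q: float) -> float:
        # torch distributions call this `icdf`; NormalDist calls it `inv_cdf`.
        return self.distribution.inv_cdf(q)


posterior = SketchPosterior(NormalDist(mu=0.0, sigma=1.0))
median = posterior.quantile(0.5)  # 0.0 for a symmetric distribution
lower, upper = posterior.quantile(0.025), posterior.quantile(0.975)
```

Plotting a median plus a central 95% band this way works for any posterior with an inverse CDF, Gaussian or not, which is the point of moving away from mean & variance.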
Force-pushed from 683e2f5 to f626884 (compare)
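The dispatcher-based `get_sampler` described in the summary can be sketched with the stdlib `functools.singledispatch`, which gives the same register-per-type pattern (BoTorch uses its own `Dispatcher`; all `Sketch*` names below are hypothetical):

```python
# Sketch of a dispatcher-based `get_sampler`, assuming one sampler class per
# posterior type. `functools.singledispatch` stands in for BoTorch's Dispatcher.
from functools import singledispatch


class SketchPosterior: ...


class SketchPosteriorList:
    def __init__(self, *posteriors): self.posteriors = posteriors


class SketchNormalSampler:
    def __init__(self, sample_shape): self.sample_shape = sample_shape


class SketchListSampler:
    def __init__(self, *samplers): self.samplers = samplers


@singledispatch
def get_sampler(posterior, sample_shape):
    raise NotImplementedError(f"No sampler registered for {type(posterior)}")


@get_sampler.register
def _(posterior: SketchPosterior, sample_shape):
    return SketchNormalSampler(sample_shape)


@get_sampler.register
def _(posterior: SketchPosteriorList, sample_shape):
    # A list sampler pairs one sampler with each posterior in the list, so
    # mixed posteriors work as long as each element type has a registration.
    return SketchListSampler(
        *(get_sampler(p, sample_shape) for p in posterior.posteriors)
    )


sampler = get_sampler(SketchPosteriorList(SketchPosterior(), SketchPosterior()), (64,))
```

Registering a new posterior type is then a local change (one `register` call) rather than an edit to a central if/else chain, which is what makes mixed `PosteriorList`s tractable.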
Codecov Report

@@            Coverage Diff            @@
##              main     #1486   +/-   ##
=========================================
  Coverage   100.00%   100.00%
=========================================
  Files          143       149    +6
  Lines        12752     12887   +135
=========================================
+ Hits         12752     12887   +135
Force-pushed from f626884 to 15d6cdc (compare)
Force-pushed from 15d6cdc to a41ad1b (compare)
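The `ForkedRNGSampler` idea mentioned in the summary (deterministic sampling without base samples) can be sketched with the stdlib `random` module standing in for `torch.fork_rng`: save the RNG state, seed, draw, then restore, so repeated calls give identical draws without disturbing the surrounding random stream. All names here are illustrative:

```python
# Sketch of the forked-RNG sampling pattern, assuming stdlib `random` as a
# stand-in for torch's RNG (torch provides `torch.fork_rng` for the real thing).
import random
from contextlib import contextmanager


@contextmanager
def forked_rng(seed: int):
    state = random.getstate()    # save the global RNG state
    try:
        random.seed(seed)        # sample under a fixed seed inside the block
        yield
    finally:
        random.setstate(state)   # restore, leaving the outer stream untouched


class SketchForkedRNGSampler:
    """Draws n samples under a fixed seed; deterministic across calls."""

    def __init__(self, n: int, seed: int):
        self.n, self.seed = n, seed

    def __call__(self, sample_fn):
        with forked_rng(self.seed):
            return [sample_fn() for _ in range(self.n)]


sampler = SketchForkedRNGSampler(n=4, seed=0)
first = sampler(random.random)
second = sampler(random.random)  # identical to `first`
```

This gives the repeatability that acquisition optimization needs from SAA even for posteriors that cannot accept base samples.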
Force-pushed from a41ad1b to f5b95cb (compare)
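The role of fixed base samples in SAA, which `rsample_from_base_samples` supports, can be sketched via the inverse-CDF reparameterization: draw uniform base samples once, then push the same draws through each posterior's inverse CDF. The helper name mirrors the method in the summary but the implementation is a toy, with `statistics.NormalDist` standing in for a torch distribution:

```python
# Sketch of SAA with fixed base samples, assuming the reparameterization
# x = icdf(u): the MC estimate becomes a deterministic, differentiable-in-
# principle function of the posterior parameters.
import random
from statistics import NormalDist


def rsample_from_base_samples(dist: NormalDist, base_samples):
    # Transform fixed base draws instead of drawing fresh randomness.
    return [dist.inv_cdf(u) for u in base_samples]


rng = random.Random(0)
base = [rng.random() for _ in range(1024)]  # drawn once, then reused

post_a = NormalDist(mu=0.0, sigma=1.0)
post_b = NormalDist(mu=0.1, sigma=1.0)

mean_a = sum(rsample_from_base_samples(post_a, base)) / len(base)
mean_b = sum(rsample_from_base_samples(post_b, base)) / len(base)
# Sharing base samples makes comparisons low-variance: here mean_b - mean_a
# recovers the 0.1 location shift up to floating-point rounding.
```

Because the base samples are shared, differences between candidates reflect the posteriors rather than sampling noise, which is the whole point of SAA and of handing base-sample bookkeeping to the samplers.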
Force-pushed from f5b95cb to c8ec647 (compare)
Summary: Deprecated since pytorch#1486 Differential Revision: D56799082
Summary:

The main goal here is to broadly support non-Gaussian posteriors.

- Adds a generic `TorchPosterior` which wraps a Torch `Distribution`. This defines a few properties that we commonly expect, and calls the `distribution` for the rest.
- For a unified plotting API, this shifts away from mean & variance to a quantile function. Most torch distributions implement the inverse CDF, which is used as the quantile. For others, the user should implement it either at the distribution or the posterior level.
- Hands off the burden of base sample handling from the posterior to the samplers. Using a dispatcher-based `get_sampler` method, we can support SAA with mixed posteriors without having to shuffle base samples in a `PosteriorList`, as long as all base distributions have a corresponding sampler and support base samples.
- Adds `ListSampler` for sampling from `PosteriorList`.
- Adds `ForkedRNGSampler` and `StochasticSampler` for sampling from posteriors without base samples.
- Adds `rsample_from_base_samples` for sampling with `base_samples` / with a `sampler`.
- Absorbs `FullyBayesianPosteriorList` into `PosteriorList`.
- For MC acqfs, introduces a `get_posterior_samples` for sampling from the posterior with base samples / a sampler. If a sampler was not specified, this constructs the appropriate sampler for the posterior using `get_sampler`, eliminating the need to construct a sampler in `__init__`, which we used to do under the assumption of Gaussian posteriors.

TODOs:

- Relax the Gaussian assumption in acquisition functions & utilities. Some of this might be addressed in a follow-up diff.
- Updates to website / docs & tutorials to clear up some of the Gaussian assumption and introduce the new relaxed API. Likely a follow-up diff.
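The core `TorchPosterior` idea above — wrap a torch `Distribution`, expose the common properties, and use the distribution's inverse CDF as the quantile — can be sketched in plain torch. The `WrappedPosterior` class below is a hypothetical stand-in for illustration, not the actual BoTorch class:

```python
import torch
from torch.distributions import Normal

class WrappedPosterior:
    """Illustrative stand-in for the TorchPosterior idea: wrap a torch
    Distribution, delegate common properties, and use its inverse CDF
    as the quantile function."""

    def __init__(self, distribution):
        self.distribution = distribution

    @property
    def mean(self):
        return self.distribution.mean

    @property
    def variance(self):
        return self.distribution.variance

    def quantile(self, q):
        # Most torch distributions implement icdf; for those that don't,
        # a quantile would have to be supplied at the distribution or
        # posterior level.
        return self.distribution.icdf(q)

posterior = WrappedPosterior(Normal(torch.tensor(0.0), torch.tensor(1.0)))
median = posterior.quantile(torch.tensor(0.5))  # median of N(0, 1) is 0
```

A plotting utility written against `mean`/`quantile` then works unchanged for any wrapped distribution that provides an inverse CDF, Gaussian or not.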
Other notables:

- See D39760855 for usage of TorchDistribution in SkewGP.
- TransformedPosterior could serve as the fallback option for derived posteriors.
- MC samplers no longer support resample or collapse_batch_dims(=False). These can be handled by i) not using base samples, or ii) using torch.fork_rng and sampling without base samples from that. Samplers are only meant to support SAA. Introduces `ForkedRNGSampler` and `StochasticSampler` as convenience samplers for these use cases.
- Introduces `batch_range_override` for the sampler to support edge cases where we may want to override `posterior.batch_range` (needed in `qMultiStepLookahead`).
- Removes unused sampling utilities `construct_base_samples(_from_posterior)`, which assume a Gaussian posterior.
- Moves the main logic of the `_set_sampler` method of CachedCholesky subclasses to a `_update_base_samples` method on samplers, and simplifies these classes a bit more.

Reviewed By: Balandat
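The torch.fork_rng approach mentioned above — reproducible draws for posteriors that have no base samples — can be approximated in a few lines of plain torch. The class name and interface below are illustrative, not BoTorch's `ForkedRNGSampler`:

```python
import torch
from torch.distributions import Poisson

class ForkedRNGSamplerSketch:
    """Illustrative sketch: draw inside torch.random.fork_rng with a fixed
    seed, so repeated evaluations see identical samples without polluting
    the global RNG state."""

    def __init__(self, sample_shape, seed=0):
        self.sample_shape = sample_shape
        self.seed = seed

    def __call__(self, distribution):
        with torch.random.fork_rng():
            torch.manual_seed(self.seed)
            return distribution.sample(self.sample_shape)

sampler = ForkedRNGSamplerSketch(torch.Size([4]), seed=1234)
dist = Poisson(torch.tensor([3.0, 5.0]))  # no reparameterized base samples
s1 = sampler(dist)
s2 = sampler(dist)  # identical draws: RNG is forked and re-seeded each call
```

Repeated calls returning identical samples is the property that makes deterministic acquisition-function optimization possible even without base samples.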
Differential Revision: D39759489
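The SAA pattern behind sampling with base samples can be illustrated with a plain Gaussian reparameterization; the helper below is a sketch under that assumption, not the BoTorch `rsample_from_base_samples` function:

```python
import torch

# Sketch of sample average approximation (SAA) with fixed base samples:
# draw N(0, 1) base samples once, then reparameterize them through any
# Gaussian posterior, so repeated evaluations of an MC objective are
# deterministic and can be optimized with standard gradient methods.
torch.manual_seed(0)
base_samples = torch.randn(128, 3)  # fixed across evaluations

def rsample_with_base(mean, std):
    # mean / std broadcast against the base-sample shape
    return mean + std * base_samples

samples_a = rsample_with_base(torch.zeros(3), torch.ones(3))
samples_b = rsample_with_base(torch.zeros(3), torch.ones(3))
```

Because the base samples are fixed, `samples_a` and `samples_b` are identical; only a change in `mean` or `std` (i.e., in the posterior) changes the draws.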