
Removing custom BlockDiagLazyTensor logic when using Standardize (Take 2) #1414

Closed
wants to merge 1 commit

Conversation

SebastianAment
Contributor

Summary: Due to [this linear operator PR](cornellius-gp/linear_operator#14), we should now be able to remove the custom logic in `Standardize` without performance impact.

Differential Revision: D39746709
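For context, here is a rough numpy sketch of the operation the custom logic was special-casing. This is illustrative only, not BoTorch's actual implementation, and the function names are hypothetical: un-standardizing a posterior covariance amounts to elementwise scaling by the outer product of the per-output standard deviations, which preserves block-diagonal structure, so generic linear-operator handling (after the linked PR) can do the job.

```python
# Illustrative sketch (NOT BoTorch's implementation; names are hypothetical).
# Standardize rescales each output to zero mean / unit variance; "untransforming"
# a posterior covariance scales it back by the per-output stdvs.
import numpy as np

def standardize_fit(Y):
    """Per-output mean and std, stand-ins for Standardize's learned buffers."""
    means = Y.mean(axis=0)
    stdvs = Y.std(axis=0, ddof=1)
    return means, stdvs

def untransform_covariance(cov, stdvs):
    """Rescale an (m*n x m*n) covariance over m outputs at n points,
    ordered output-major, by the outer product of broadcast stdvs."""
    n = cov.shape[0] // stdvs.shape[0]
    scale = np.repeat(stdvs, n)          # one stdv per (output, point) row
    return cov * np.outer(scale, scale)  # elementwise scaling

# Key point: if cov is block-diagonal (independent outputs), the off-diagonal
# blocks are zero and stay zero after scaling, so no special-casing is needed.
```

The takeaway is structural: scaling cannot introduce nonzero entries where the covariance had zeros, so a block-diagonal covariance remains block-diagonal through the untransform.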

@facebook-github-bot added labels on Sep 22, 2022: CLA Signed (Do not delete this pull request or issue due to inactivity.), fb-exported
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D39746709

@codecov

codecov bot commented Sep 22, 2022

Codecov Report

Merging #1414 (d9fabef) into main (a6cc512) will not change coverage.
The diff coverage is 100.00%.

@@            Coverage Diff            @@
##              main     #1414   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          134       134           
  Lines        12408     12403    -5     
=========================================
- Hits         12408     12403    -5     
Impacted Files                          Coverage Δ
botorch/models/transforms/outcome.py    100.00% <100.00%> (ø)


SebastianAment added a commit to SebastianAment/botorch that referenced this pull request Nov 7, 2022
Removing custom BlockDiagLazyTensor logic when using Standardize (Take 2) (pytorch#1414)

Summary:
Pull Request resolved: pytorch#1414

Due to [this linear operator PR](cornellius-gp/linear_operator#14), we should now be able to remove the custom logic in `Standardize` without performance impact.

Differential Revision: D39746709

fbshipit-source-id: 506d1fc3a34778fa5bb0d91779ca5f73b24f4146
SebastianAment added a commit to SebastianAment/botorch that referenced this pull request Nov 8, 2022
Removing custom BlockDiagLazyTensor logic when using Standardize (Take 2) (pytorch#1414)

Summary:
Pull Request resolved: pytorch#1414

Due to [this linear operator PR](cornellius-gp/linear_operator#14), we should now be able to remove the custom logic in `Standardize` without performance impact.

Differential Revision: D39746709

fbshipit-source-id: 1c68d8033d18c4a489600dd743ad5ab24efc5fb0

Removing custom BlockDiagLazyTensor logic when using Standardize (Take 2) (pytorch#1414)

Summary:
Pull Request resolved: pytorch#1414

Due to [this linear operator PR](cornellius-gp/linear_operator#14), we should now be able to remove the custom logic in `Standardize` without performance impact.

Reviewed By: saitcakmak

Differential Revision: D39746709

fbshipit-source-id: c1477bcc14ec145583a5d0501fbe1cdac5bfe9bd

esantorella added a commit to esantorella/botorch that referenced this pull request Feb 10, 2023
…ytorch#1414)

Summary: Pull Request resolved: facebook/Ax#1414

Differential Revision: D43178298

fbshipit-source-id: 05441f9140d7971439ed931d66e78a9cb0d9aebf