
add ER/IR-L0 Acquisition function #1916

Closed · wants to merge 1 commit

Conversation

qingfeng10
Contributor

Summary: Implement internal regularization with the L0 norm (IR-L0) and external regularization with the L0 norm (ER-L0) in MBM.
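Both variants hinge on a differentiable stand-in for the L0 "norm" (the count of nonzero entries), since the exact count has zero gradient almost everywhere and cannot be optimized directly. A minimal pure-Python sketch of one common smoothing, 1 - exp(-x²/(2a²)) summed over coordinates, is shown below; the function name and the smoothing parameter `a` are illustrative, not necessarily the names used in `botorch/acquisition/penalized.py`.

```python
import math

def smoothed_l0(x, a=1.0):
    """Differentiable approximation of the L0 norm of a vector x.

    Each coordinate contributes 1 - exp(-x_i^2 / (2 a^2)), which goes to 0
    as x_i -> 0 and to 1 as |x_i| >> a, so the sum approaches the number of
    nonzero entries as the smoothing parameter a -> 0.
    """
    return sum(1.0 - math.exp(-(xi ** 2) / (2.0 * a ** 2)) for xi in x)

print(smoothed_l0([0.0, 0.0, 5.0], a=0.1))   # ≈ 1.0: one clearly nonzero entry
print(smoothed_l0([3.0, -4.0, 2.0], a=0.1))  # ≈ 3.0: all entries nonzero
```

Smaller `a` gives a sharper approximation of the true count at the cost of steeper (harder to optimize) gradients near zero.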

Differential Revision: D46059263

@facebook-github-bot facebook-github-bot added CLA Signed Do not delete this pull request or issue due to inactivity. fb-exported labels Jul 5, 2023
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D46059263

@codecov

codecov bot commented Jul 5, 2023

Codecov Report

Merging #1916 (0d2e0c3) into main (4b79331) will increase coverage by 0.00%.
The diff coverage is 100.00%.

❗ Current head 0d2e0c3 differs from the pull request's most recent head 2e74876. Consider uploading reports for commit 2e74876 to get more accurate results.

@@           Coverage Diff           @@
##             main    #1916   +/-   ##
=======================================
  Coverage   99.94%   99.94%           
=======================================
  Files         176      176           
  Lines       15467    15480   +13     
=======================================
+ Hits        15459    15472   +13     
  Misses          8        8           
Impacted Files                     Coverage Δ
botorch/acquisition/penalized.py   100.00% <100.00%> (ø)


qingfeng10 pushed a commit to qingfeng10/botorch that referenced this pull request Jul 11, 2023
Summary:
Pull Request resolved: pytorch#1916

Implement internal regularization with L0 norm (IR-L0) and external regularization with L0 norm (ER-L0) in MBM.

Differential Revision: D46059263

fbshipit-source-id: e7a06c6813c203e01610b6d6276453123e56ec5e
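The two variants differ in where the penalty enters: external regularization (ER-L0) subtracts a weighted L0 penalty from the acquisition value itself, while internal regularization (IR-L0) folds the penalty into the objective the acquisition function is built on. A hedged sketch of the external variant follows; `externally_penalized` and `regularization_parameter` are hypothetical names for illustration, not the BoTorch API.

```python
import math

def smoothed_l0(x, a=0.1):
    # Smooth surrogate for the number of nonzero entries of x.
    return sum(1.0 - math.exp(-(xi ** 2) / (2.0 * a ** 2)) for xi in x)

def externally_penalized(acqf, regularization_parameter=0.5):
    """Wrap an acquisition function so sparse candidates are rewarded.

    `acqf` maps a candidate point (list of floats) to a scalar value; the
    wrapper subtracts a weighted smoothed-L0 penalty on the point, which is
    the ER (external regularization) pattern: the penalty acts on the
    acquisition value, not on the modeled objective.
    """
    def penalized(x):
        return acqf(x) - regularization_parameter * smoothed_l0(x)
    return penalized

# Toy acquisition that prefers large coordinates, for demonstration only.
raw = lambda x: sum(x)
pen = externally_penalized(raw, regularization_parameter=0.5)
print(pen([1.0, 0.0, 0.0]))  # ≈ 0.5: value 1.0 minus 0.5 * (≈1 nonzero entry)
```

The internal variant would instead apply the same penalty inside the objective transform before the acquisition value is computed.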
qingfeng10 pushed a commit to qingfeng10/botorch that referenced this pull request Jul 12, 2023
Summary:
Pull Request resolved: pytorch#1916

Implement internal regularization with L0 norm (IR-L0) and external regularization with L0 norm (ER-L0) in MBM.

Reviewed By: lena-kashtelyan

Differential Revision: D46059263

fbshipit-source-id: a1e041be540d64b76c7bd36e4d19cbc2448f7cbf
@facebook-github-bot
Contributor

This pull request has been merged in 36c8c02.
