
Fast Gradient Clipping and Ghost Clipping #656

Closed
wants to merge 1 commit

Conversation


@EnayatUllah (Contributor) commented Jul 15, 2024

Introducing Fast Gradient Clipping and Ghost Clipping to Opacus for memory-efficient training with DP SGD.

@facebook-github-bot added the CLA Signed label (authors need to sign the CLA before a PR can be reviewed) on Jul 15, 2024
@facebook-github-bot
This pull request was exported from Phabricator. Differential Revision: D58210796

EnayatUllah added a commit to EnayatUllah/opacus that referenced this pull request Jul 18, 2024
Summary:
Pull Request resolved: pytorch#656

Changes:
- GradSampleModuleGC now inherits from GradSampleModule, removing redundant code
- DPOptimizerGC now inherits from DPOptimizer, removing redundant code
- Modified PrivacyEngine to work with Ghost Clipping. Usage: pass grad_sample_mode="ghost" and clipping="ghost" to the make_private function (see the usage sketch below)

Differential Revision: D58210796
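
A minimal usage sketch based on the summary above. The model, optimizer, data loader, and the noise_multiplier / max_grad_norm values are placeholders, and the exact make_private signature and return values in the released Opacus API may differ:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Placeholder model, optimizer, and data for illustration only.
model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
data_loader = DataLoader(dataset, batch_size=8)

privacy_engine = PrivacyEngine()

# Per the summary above, Ghost Clipping is enabled by passing
# grad_sample_mode="ghost" and clipping="ghost" to make_private.
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,  # illustrative value
    max_grad_norm=1.0,     # illustrative value
    grad_sample_mode="ghost",
    clipping="ghost",
)
```

The memory savings mentioned in the PR description come from computing per-sample gradient norms without materializing full per-sample gradients, which is what Fast Gradient Clipping and Ghost Clipping provide.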
EnayatUllah added a commit to EnayatUllah/opacus that referenced this pull request Jul 20, 2024
Summary:
Pull Request resolved: pytorch#656

Introducing Fast Gradient Clipping and Ghost Clipping to Opacus for memory-efficient training with DP SGD.

Reviewed By: HuanyuZhang

Differential Revision: D58210796

@facebook-github-bot
This pull request has been merged in 670fde6.

@EnayatUllah changed the title from "Ghost Clipping - cleaned up code" to "Fast Gradient Clipping and Ghost Clipping" on Jul 22, 2024