Debug Transformer Engine FP8 support with Megatron-core infrastructure #6739

Conversation

timmoon10
Collaborator

What does this PR do?

Debugs Transformer Engine FP8 support with Megatron-core infrastructure

Collection: NLP

Changelog

  • Construct FP8 amax reduction group (see the sketch below)
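
For context, here is a minimal sketch of one way such an amax reduction group could be constructed and handed to Transformer Engine. It is illustrative only (not the code in this PR) and assumes plain data parallelism, so the group spans all ranks; with tensor or pipeline parallelism the group would instead cover the appropriate subset of ranks.

```python
# Illustrative sketch only (not this PR's code): build a process group over which
# FP8 amax values are reduced, then pass it to Transformer Engine.
import torch
import transformer_engine.pytorch as te
from transformer_engine.common import recipe

torch.distributed.init_process_group(backend="nccl")
torch.cuda.set_device(torch.distributed.get_rank() % torch.cuda.device_count())

# Assumption: pure data parallelism, so every rank holds a full model replica
# and amax statistics are reduced across all ranks.
amax_group = torch.distributed.new_group(
    ranks=list(range(torch.distributed.get_world_size()))
)

fp8_recipe = recipe.DelayedScaling(margin=0, amax_history_len=16,
                                   amax_compute_algo="max")

layer = te.Linear(1024, 1024).cuda()
inp = torch.randn(16, 1024, device="cuda")
with te.fp8_autocast(enabled=True, fp8_recipe=fp8_recipe, fp8_group=amax_group):
    out = layer(inp)
```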

Usage

Set transformer_engine: True and fp8: True in the config.
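
For example, here is a hedged sketch of setting these flags programmatically; the config path and the placement of the keys under model are assumptions based on the usual megatron_gpt_config.yaml layout:

```python
# Sketch: enable Transformer Engine and FP8 by overriding the model config.
# The YAML path and the "model.*" key locations are assumptions.
from omegaconf import OmegaConf

cfg = OmegaConf.load("examples/nlp/language_modeling/conf/megatron_gpt_config.yaml")
cfg.model.transformer_engine = True  # build the model with Transformer Engine layers
cfg.model.fp8 = True                 # run supported matmuls in FP8 (needs FP8-capable GPUs)
print(OmegaConf.to_yaml(cfg.model))
```

The same flags can also be set directly in the YAML file or passed as Hydra-style overrides on the training command line.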

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed the Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g. Numba, Pynini, Apex, etc.)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs in various areas.

Pinging @aklife97 @yen-shi @erhoo82 @ksivaman

Additional Information

Signed-off-by: Tim Moon <tmoon@nvidia.com>
github-actions bot added the NLP label on May 26, 2023
@timmoon10
Collaborator Author

Closed in favor of #6740

timmoon10 closed this on May 26, 2023
timmoon10 deleted the fp8-amax-group-debug branch on May 26, 2023