
Tensor-parallel communication overlap with userbuffer backend #6792

Merged

Conversation

github-actions[bot]
Contributor

@github-actions github-actions bot commented Jun 1, 2023

What does this PR do ?

Adds (1) interfaces to TE and (2) initialized process-group settings to support tensor-parallel communication overlap with the userbuffer backend.

Changelog

  • Add interfaces to TE for tensor-parallel communication overlap.
  • Add an interface to provide custom userbuffer communicator settings via a YAML file.
  • Construct an MPI process group for userbuffers support.

Usage

Set ub_tp_comm_overlap to True in the model config.
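As a minimal sketch of the flag named above (the exact placement under the `model` section is an assumption based on the usual NeMo Megatron config layout, not something this PR spells out):

```yaml
# Hypothetical fragment of a NeMo Megatron model config.
model:
  # Enable tensor-parallel communication overlap with the userbuffer backend.
  ub_tp_comm_overlap: True
```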

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed the Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • [x] New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

* add interfaces for tp_communication overlap
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* Interface to provide custom userbuffer communicator settings by yaml file
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* Construct MPI process group for userbuffers support

Signed-off-by: Tim Moon <tmoon@nvidia.com>

---------

Signed-off-by: Tim Moon <tmoon@nvidia.com>
Co-authored-by: Tim Moon <tmoon@nvidia.com>
Co-authored-by: Abhinav Khattar <aklife97@gmail.com>
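The last commit above constructs an MPI process group for userbuffers support. As an illustrative sketch only (this is not the PR's code; the function name is hypothetical), process-group construction typically starts by partitioning consecutive ranks into tensor-parallel groups, so that ranks sharing a TP domain land in the same communicator:

```python
# Hypothetical helper: partition ranks into contiguous tensor-parallel groups.
# In practice each group would then back a communicator (e.g. an MPI process
# group) used by the userbuffer overlap machinery.

def tensor_parallel_groups(world_size: int, tp_size: int) -> list[list[int]]:
    """Split ranks 0..world_size-1 into contiguous groups of tp_size ranks."""
    assert world_size % tp_size == 0, "world size must be divisible by TP size"
    return [list(range(start, start + tp_size))
            for start in range(0, world_size, tp_size)]

# 8 GPUs with tensor parallelism of 2 -> 4 groups of 2 adjacent ranks,
# matching the common convention that TP ranks share a node.
print(tensor_parallel_groups(8, 2))  # [[0, 1], [2, 3], [4, 5], [6, 7]]
```

Contiguous grouping is the usual choice because tensor-parallel communication is bandwidth-heavy and benefits from NVLink-connected ranks on the same node.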
@erhoo82
Collaborator

erhoo82 commented Jun 5, 2023

LGTM

erhoo82
erhoo82 previously approved these changes Jun 5, 2023
@github-actions
Contributor Author

This PR is stale because it has been open for 14 days with no activity. Remove the stale label, comment, or update the PR, or it will be closed in 7 days.

@github-actions github-actions bot added the stale label Jun 20, 2023
@ericharper
Collaborator

@erhoo82 could you help resolve the conflicts?

@github-actions github-actions bot removed the stale label Jun 21, 2023
Signed-off-by: ericharper <complex451@gmail.com>
Signed-off-by: Abhinav Khattar <aklife97@gmail.com>
Collaborator

@ericharper ericharper left a comment


LGTM. Thanks!

@ericharper ericharper merged commit 29b9b8a into main Jun 28, 2023
@ericharper ericharper deleted the cherry-pick-main-e4460d1a8e728251aae87049ddeaf9af328cbc9c branch June 28, 2023 22:14