
V5e8 ray #159

Merged
merged 5 commits into from
Jul 30, 2024

Conversation

FanhaiLu1
Collaborator

@FanhaiLu1 FanhaiLu1 commented Jul 30, 2024

This PR adds Ray support for v5e-8 VMs.

Thanks Richard for adding 8-chip TPU support in Ray. However, get_current_pod_worker_count does not return a usable worker count for v5e-8, so this PR adds a parameter that lets the engineer specify the worker count explicitly.

num_hosts = (
    num_hosts if is_disaggregated else tpu.get_current_pod_worker_count()
)
num_hosts = num_hosts if num_hosts > 0 else tpu.get_current_pod_worker_count()
print(f"pod_name:{pod_name}, number of host: {num_hosts}")
assert (
Collaborator


Nit: consider adding an assertion to check that number of hosts * chips per host == total devices.
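The fallback in the diff plus the reviewer's suggested check could be sketched as follows. This is a hedged illustration, not the actual PR code: `resolve_num_hosts` and `check_topology` are hypothetical helper names, and `chips_per_host` is an assumed parameter standing in for the per-worker chip count.

```python
# Sketch of the PR's worker-count fallback and the reviewer's suggested
# topology assertion. Helper names and parameters are illustrative only.

def resolve_num_hosts(num_hosts: int, pod_worker_count: int) -> int:
    """Use the caller-supplied host count when positive; otherwise fall
    back to the worker count reported by the TPU runtime."""
    return num_hosts if num_hosts > 0 else pod_worker_count

def check_topology(num_hosts: int, chips_per_host: int, total_devices: int) -> None:
    # Reviewer's nit: hosts * chips per host must equal the total device count.
    assert num_hosts * chips_per_host == total_devices, (
        f"topology mismatch: {num_hosts} hosts * {chips_per_host} chips "
        f"!= {total_devices} devices"
    )

# Example: a v5e-8 slice is a single host with 8 chips, and the runtime
# may report 0 workers, so the caller supplies num_hosts=1 explicitly.
num_hosts = resolve_num_hosts(num_hosts=1, pod_worker_count=0)
check_topology(num_hosts, chips_per_host=8, total_devices=8)
```

The explicit parameter keeps the runtime query as a fallback rather than removing it, so existing multi-host pod configurations keep working unchanged.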

Collaborator Author


Thanks, added.

@FanhaiLu1 FanhaiLu1 merged commit c2bce76 into AI-Hypercomputer:main Jul 30, 2024
4 checks passed
@FanhaiLu1 FanhaiLu1 deleted the v5e8-ray branch July 31, 2024 15:37