Node hash new #3514
Conversation
This LGTM.
I see you can torch-pin from this PR to #75324, but does that run enough CI steps to be sure?
Let me know what the best sequence for landing is, or whether we should rebase/pin #75324 somehow.
@wconstab pinning to #75324 will check out that pytorch branch when building in pt/xla CI. We will run both CPU and GPU tests, so it should be enough to cover :D.
Thanks, @JackCaoG!
This is to fix the build error in pytorch/pytorch#75324. PyTorch/XLA needs to implement two hash functions and maintain the hash object. From PyTorch/XLA's perspective, `hash` and `shape_hash` are the same thing.
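To illustrate the point above, here is a minimal sketch of how an IR node could satisfy both hash interfaces with one cached value. All names (`Node`, `Combine`, the constructor arguments) are hypothetical and chosen for illustration; this is not the actual PyTorch/XLA code.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch: a lazy-tensor IR node that computes one hash
// at construction time and exposes it through both hash() and
// shape_hash(), since PyTorch/XLA treats the two as the same thing.
class Node {
 public:
  Node(uint64_t op_kind, uint64_t shape_fingerprint)
      : hash_(Combine(op_kind, shape_fingerprint)) {}

  // Full node hash, e.g. for trace/graph caching.
  uint64_t hash() const { return hash_; }

  // Shape hash required by the upstream interface; PyTorch/XLA can
  // simply return the same cached value.
  uint64_t shape_hash() const { return hash_; }

 private:
  static uint64_t Combine(uint64_t a, uint64_t b) {
    // Simple hash-combine for illustration only; real code would use
    // the project's existing hashing utilities.
    return a ^ (b + 0x9e3779b97f4a7c15ULL + (a << 6) + (a >> 2));
  }

  uint64_t hash_;
};
```

Usage: `Node(kind, shape).hash()` and `Node(kind, shape).shape_hash()` always agree, so only one value needs to be maintained per node.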