internalize _parse_xla_device #7675
Conversation
Would the deprecation wrapper here work? https://github.com/pytorch/xla/blob/master/torch_xla/experimental/deprecation.py
Also did some updates to the deprecation decorator function. Previously it would print the warning message whenever we …
This was actually not written as a decorator. It essentially aliases a function from the new location to the deprecated location. We don't need to keep the old implementation around in most cases. Check out some of the old usage that Jack removed in #7579
Oh, I see. I've changed it back to use the wrapper.
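For illustration, here is a rough sketch of what that aliasing helper could look like; this is not the actual code in torch_xla/experimental/deprecation.py, and the helper name, signature, and message format are assumptions based on how it is used in this thread:

import functools
import logging


def deprecated(module, new_fn):
  # Return an alias for new_fn that logs a deprecation warning pointing from
  # the old location (module.__name__) to the new one, then forwards the call.
  @functools.wraps(new_fn)
  def wrapper(*args, **kwargs):
    logging.warning('%s.%s is deprecated. Use %s.%s instead.', module.__name__,
                    new_fn.__name__, new_fn.__module__, new_fn.__name__)
    return new_fn(*args, **kwargs)

  return wrapper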
Otherwise LGTM
torch_xla/core/xla_model.py (outdated)
    iutils.parse_xla_device,
]
for alias in aliases:
  register_deprecated(torch_xla.core, alias)
Thanks! I think this should actually be torch_xla.core.xla_model, looking at the implementation of register_deprecated. Please double-check that you get the expected message when you call torch_xla.core.xla_model.parse_xla_device(...).
Updated to parse_xla_device = deprecated(torch_xla.core, iutils.parse_xla_device) instead.
The error message still says torch_xla.core.parse_xla_device instead of torch_xla.core.xla_model.parse_xla_device:
root@t1v-n-bf2f726f-w-0:/workspaces/ptxla# python
...
>>> import torch_xla.core.xla_model as xm
>>> xm.parse_xla_device('TPU:0')
WARNING:root:torch_xla.core.parse_xla_device is deprecated. Use torch_xla._internal.utils.parse_xla_device instead.
('TPU', 0)
You'll actually need to pass the xla_model module into register_deprecated. That was the only way I could find to print the qualified name of the aliased method consistently, but feel free to refactor it if you find a better way.
I think we need to add a unit test here to confirm that the right warning is getting printed, since you're going to be adding a lot of these aliases and this is an easy mistake to make. You can create a new unit test file for all of these public to internal aliases and we can remove it later.
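A minimal sketch of such a test, assuming the warning goes through the root logger as in the session above (the class and test names here are illustrative):

import logging
import unittest

import torch_xla.core.xla_model as xm


class TestDeprecatedAliases(unittest.TestCase):

  def test_parse_xla_device_warns_with_qualified_name(self):
    # The alias should report its old, fully qualified location in the warning.
    with self.assertLogs(level=logging.WARNING) as logs:
      xm.parse_xla_device('TPU:0')
    self.assertTrue(
        any('torch_xla.core.xla_model.parse_xla_device is deprecated' in line
            for line in logs.output))


if __name__ == '__main__':
  unittest.main()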
This should work:
import torch_xla._internal.utils as _utils
from . import xla_model as this_module
parse_xla_device = deprecated(this_module, _utils.parse_xla_device)
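With that change, calling the alias should presumably report the fully qualified name, along the lines of the earlier session:

>>> xm.parse_xla_device('TPU:0')
WARNING:root:torch_xla.core.xla_model.parse_xla_device is deprecated. Use torch_xla._internal.utils.parse_xla_device instead.
('TPU', 0)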
Thanks!
Resolves #7652
Along with some nit fixes.