
backward compatible EnsureType #4637

Closed
wyli opened this issue Jul 6, 2022 · 2 comments · Fixed by #4638

wyli (Contributor) commented Jul 6, 2022

Is your feature request related to a problem? Please describe.
The current EnsureType transform removes any metadata:

output_type = torch.Tensor if self.data_type == "tensor" else np.ndarray

Given that it's used extensively in the existing tutorials, it would be great to add MetaTensor output support and remove the lossy inverse, so that metadata is preserved when the transform is inverted.

# FIXME: currently, only convert tensor data to numpy array or scalar number,
# need to also invert numpy array but it's not easy to determine the previous data type
d[key] = convert_to_numpy(d[key])
# Remove the applied transform
self.pop_transform(d, key)
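The FIXME above points out that the inverse always falls back to a numpy array because the original data type was never recorded. A minimal, hypothetical sketch (not MONAI's actual implementation; all names here are illustrative) of recording the input type so the inverse can restore it:

```python
import numpy as np

class EnsureNdarraySketch:
    """Hypothetical transform: convert inputs to np.ndarray, remembering
    what they were so inverse() can restore the original type."""

    def __init__(self):
        # stack of original types, analogous to MONAI's applied-transform stack
        self._applied = []

    def __call__(self, data):
        self._applied.append(type(data))  # push the original type
        return np.asarray(data)           # forward pass: ensure ndarray

    def inverse(self, data):
        orig_type = self._applied.pop()   # pop the recorded type
        if orig_type is list:
            return np.asarray(data).tolist()
        return data                       # already an ndarray (or unknown): leave as-is

sketch = EnsureNdarraySketch()
out = sketch([1, 2, 3])          # forward converts the list to an ndarray
restored = sketch.inverse(out)   # inverse recovers the original list
```

The same idea extends to metadata: if the forward pass records (or carries) the metadata instead of discarding it, the inverse no longer has to guess.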

@wyli wyli self-assigned this Jul 6, 2022
Nic-Ma (Contributor) commented Jul 6, 2022

Hi @wyli ,

Thanks for raising this ticket.
@KumoLiu and I also ran into this issue when updating the tutorials. Perhaps we can simply change the default value of the data_type arg to data_type: str = "metatensor" instead of "tensor"?

Thanks.
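The suggestion above amounts to making metadata-preserving output the default. A hedged, self-contained sketch of that dispatch (MetaTensorStub and ensure_type_sketch are hypothetical stand-ins, not MONAI's actual API):

```python
import numpy as np

class MetaTensorStub:
    """Stand-in for MONAI's MetaTensor: array data plus a meta dict."""

    def __init__(self, array, meta=None):
        self.array = np.asarray(array)
        self.meta = dict(meta or {})  # metadata travels with the data

def ensure_type_sketch(data, meta=None, data_type: str = "metatensor"):
    """Convert data to the requested type; "metatensor" keeps metadata."""
    if data_type == "metatensor":
        return MetaTensorStub(data, meta)  # metadata stays attached
    if data_type == "numpy":
        return np.asarray(data)            # metadata is dropped
    raise ValueError(f"unsupported data_type: {data_type}")

out = ensure_type_sketch([1.0, 2.0], meta={"affine": "identity"})
```

With "metatensor" as the default, downstream code gets the metadata for free, while callers that need a plain array can still opt in with data_type="numpy".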

wyli (Contributor, Author) commented Jul 6, 2022

sure @Nic-Ma, I'm working on a fix for this one.

wyli added a commit to wyli/MONAI that referenced this issue Jul 6, 2022
Signed-off-by: Wenqi Li <wenqil@nvidia.com>
@wyli wyli closed this as completed in #4638 Jul 7, 2022