
Lower celu, celu_, selu, and selu_ #3547

Merged — 2 commits merged into master on May 4, 2022

Conversation

@wonjoolee95 (Collaborator) commented May 2, 2022

Lower celu, celu_, selu, and selu_.

This fixes the test failures newly introduced by #3539 by adding a separate lowering for each of celu, celu_, selu, and selu_.

Once this PR merges, I'll rebase the above PR and fix its tests.

@wonjoolee95 wonjoolee95 self-assigned this May 2, 2022
@wonjoolee95 wonjoolee95 marked this pull request as ready for review May 2, 2022 17:00
@wonjoolee95 (Collaborator, Author) commented:

This should be ready for a quick review, @JackCaoG. Thanks!

@wonjoolee95 wonjoolee95 requested a review from JackCaoG May 3, 2022 00:40
@JackCaoG (Collaborator) left a review:

Mostly LGTM.

@wonjoolee95 (Collaborator, Author) commented:

Thanks for the comments, @JackCaoG! For some reason, if I remove celu_ and selu_, the codegen doesn't generate the in-place functions. The CI tests on the previous commit failed because the celu and selu XLA counters did not change when the in-place functions were called, and I don't see celu_ or selu_ being generated in RegisterXLA.cpp. I've added the in-place operators back for now.

@JackCaoG (Collaborator) commented May 3, 2022

Can you check torch_xla/csrc/RegisterXLA.cpp after removing the in-place ops?

For example, for acos I see:

at::Tensor wrapper__acos(const at::Tensor & self) {
    // No device check


  // DeviceGuard omitted
  return torch_xla::XLANativeFunctions::acos(self);
}

} // anonymous namespace
at::Tensor & wrapper_out_acos_out(const at::Tensor & self, at::Tensor & out) {
  auto wrapper_out_acos_out_tmp = wrapper__acos(self);
  at::_copy_from_and_resize(wrapper_out_acos_out_tmp, out);
  return out;
}
at::Tensor & wrapper__acos_(at::Tensor & self) {
  auto wrapper__acos__tmp = wrapper__acos(self);
  at::_copy_from(wrapper__acos__tmp, self);
  return self;
} 

This allows us to implement just one lowering.

@wonjoolee95 (Collaborator, Author) commented:

Interestingly, I don't see the in-place versions getting automatically generated. Here is how RegisterXLA.cpp looks for celu and selu:

namespace {

at::Tensor wrapper__selu(const at::Tensor & self) {
    // No device check


  // DeviceGuard omitted
  return torch_xla::XLANativeFunctions::selu(self);
}

} // anonymous namespace
namespace {

at::Tensor wrapper__celu(const at::Tensor & self, const at::Scalar & alpha) {
    // No device check


  // DeviceGuard omitted
  return torch_xla::XLANativeFunctions::celu(self, alpha);
}

} // anonymous namespace

However, I don't see one auto-generated for celu_ or selu_.

@JackCaoG (Collaborator) commented May 3, 2022

@bdhirsh, do we know what the prerequisite is for an in-place op to be generated?

@bdhirsh (Collaborator) commented May 3, 2022

Oh, it's because celu and selu don't have out= variants. The codegen needs out= variants in order to group the different variants properly so it can generate the in-place kernels.

Manually writing the two celu_() and selu_() in-place kernel lowerings for XLA seems like a good short-term solution?

Caveat: the functionalization pass in core will still correctly functionalize celu_() and selu_(), so once XLA is finally using it you won't need to worry about writing them manually and we can delete them. I know I keep making it sound like some beautiful end state in the distant future 😛
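To make the grouping point concrete: the variants are declared in aten/src/ATen/native/native_functions.yaml, and the codegen can only synthesize in-place/out= wrappers when the functional, in-place, and out= schemas all exist to group together. Sketched from memory (check the yaml for the exact entries at this commit):

```yaml
# acos has all three variants, so the codegen groups them and derives
# the in-place and out= wrappers from the single functional lowering.
- func: acos(Tensor self) -> Tensor
- func: acos_(Tensor(a!) self) -> Tensor(a!)
- func: acos.out(Tensor self, *, Tensor(a!) out) -> Tensor(a!)

# celu has no out= variant, so the codegen cannot group its variants,
# and celu_ needs its own hand-written lowering.
- func: celu(Tensor self, Scalar alpha=1.0) -> Tensor
- func: celu_(Tensor(a!) self, Scalar alpha=1.0) -> Tensor(a!)
```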

@JackCaoG (Collaborator) commented May 3, 2022

oh, ok. Let's keep celu_ and selu_ for now.

@wonjoolee95 (Collaborator, Author) commented:

Thanks for the clarification, @bdhirsh! Just did a hard reset to the commit before I removed the in-place ops.

@wonjoolee95 wonjoolee95 requested a review from JackCaoG May 4, 2022 00:30
@JackCaoG (Collaborator) left a review:

Thanks!

@JackCaoG JackCaoG merged commit d1de247 into master May 4, 2022
@JackCaoG JackCaoG deleted the selu-celu branch May 4, 2022 01:16
@miladm miladm linked an issue May 17, 2022 that may be closed by this pull request
4 participants