[Hexagon] Slice op relu #11449
Conversation
    def relu_te_compute(Input, out_shape, dtype):
        x = tvm.tir.const(0, dtype)
        Output = te.compute(
            out_shape, lambda n, h, w, c: tvm.te.max(Input[n, h, w, c], x), name="reluf16"
        )
        return Output
Instead of using out_shape as an argument to te.compute, I'd recommend using Input.shape. That way, the out_shape parameter could be removed, and the user wouldn't need to specify it independently of the Input.
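To illustrate the suggestion, here is a minimal pure-Python sketch (deliberately not using TVM, so it runs standalone): the output shape is implied by the input, mirroring the proposed `te.compute(Input.shape, ...)` change. The nested-list representation is an assumption for illustration only.

```python
def relu(input_4d):
    """Element-wise max(x, 0) over an NHWC nested list.

    The output shape is derived entirely from the input, so no separate
    out_shape parameter is needed -- the point of the review suggestion.
    """
    return [
        [[[max(v, 0.0) for v in c_row] for c_row in w_row] for w_row in h_row]
        for h_row in input_4d
    ]

data = [[[[-1.0, 2.0], [3.0, -4.0]]]]  # shape (1, 1, 2, 2)
print(relu(data))  # [[[[0.0, 2.0], [3.0, 0.0]]]]
```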
Thank you for pointing this out.
    j3, j4 = sch.split(j2, [None, 2])
    sch.reorder(n, i1, j1, k1, i2, j3, k2, j4)
    sch.transform_layout(block, 0, "read", transform_crouton_activation)
    sch.set_axis_separator(block, 0, "read", [4])
FYI, after #11269 lands, the set_axis_separator will be set automatically based on IndexMap.AXIS_SEPARATOR, similar to how it is handled in TE-based schedules.
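For context, here is a hedged pure-Python sketch of what an axis-separated layout transform like the patch's transform_crouton_activation does. The tile sizes (8, 4, 32) are assumptions for illustration and are not taken from the patch; the real mapping is an IndexMap passed to sch.transform_layout, with the separator marking where the buffer splits into separately addressed dimensions.

```python
def transform_crouton_activation(n, h, w, c, th=8, tw=4, tc=32):
    """Map a flat NHWC index into a tiled ("crouton") layout.

    Returns outer block coordinates followed by inner within-block
    coordinates. An axis separator after index 3 (as in
    set_axis_separator(..., [4])) would split the result into the
    outer group (n, h//th, w//tw, c//tc) and the inner group
    (h%th, w%tw, c%tc). Tile sizes here are illustrative assumptions.
    """
    return (n, h // th, w // tw, c // tc,  # outer (block) indices
            h % th, w % tw, c % tc)        # inner (within-block) indices

print(transform_crouton_activation(0, 9, 5, 33))  # (0, 1, 1, 1, 1, 1, 1)
```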
@arangasa there are a couple of lint issues. Please fix those so we can see the test pipeline.
Force-pushed from 4cada30 to 12ba658
Looks like there are still some lint errors. The CI's lint checks can be reproduced locally.
Force-pushed from 12ba658 to 4027afb
@mehrdadh @Lunderberg: Hi, can you please let me know if any more changes are needed? If not, can you please approve the PR?
LGTM! I'll wait for @Lunderberg to take another look.
@Lunderberg: Hi Eric, can you please review this patch again? Thanks.
LGTM!
* Add support for relu slice op.
* Format code
* removing out_shape in relu def and lint issues
* Changes as per the new format

Co-authored-by: Venkat Rasagna Komatireddy <89959097+rasagna-quic@users.noreply.github.com>
Co-authored-by: Venkat Rasagna Reddy Komatireddy <rasagna@hu-rasagna-hyd.qualcomm.com>
Raising PR on behalf of rasagna-quic (author)
Thanks for contributing to TVM! Please refer to the guidelines at https://tvm.apache.org/docs/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from reviewers by @-mentioning them in the pull request thread.
cc @mehrdadh