[Relay] Conv2D padding representation #4787
Conversation
Leave some comments.
Also, I found that we now have get_pad_tuple in both Relay and TOPI. I feel we should remove the TOPI one to enforce the TOPI conv2d input type. @icemelon9 @yzhliu do you have any suggestions?
topi/python/topi/nn/conv2d.py (outdated)
```diff
@@ -62,6 +61,8 @@ def conv2d(input, filter, strides, padding, dilation, layout='NCHW', out_dtype=N
     output : tvm.Tensor
         4-D with shape [batch, out_channel, out_height, out_width]
     """
+    # only accepts 4-way padding
+    assert len(padding) == 4, "only accepts 4-way padding"
```
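For context, the 4-way convention this assertion enforces is (pad_top, pad_left, pad_bottom, pad_right). The sketch below illustrates the kind of normalization the PR adds on the Relay side; the function name and exact behavior are hypothetical, not TVM's actual implementation of get_pad_tuple2d.

```python
def normalize_padding_2d(padding):
    """Expand an int, 2-tuple, or 4-tuple padding spec to the 4-way
    form (pad_top, pad_left, pad_bottom, pad_right).

    Hypothetical helper mirroring what a get_pad_tuple2d-style
    legalization does; not TVM's actual implementation.
    """
    if isinstance(padding, int):
        # a single int pads all four sides equally
        return (padding, padding, padding, padding)
    padding = tuple(padding)
    if len(padding) == 2:
        # (pad_h, pad_w) applied symmetrically to both sides
        pad_h, pad_w = padding
        return (pad_h, pad_w, pad_h, pad_w)
    if len(padding) == 4:
        # already 4-way; pass through unchanged
        return padding
    raise ValueError("padding must be an int, 2-tuple, or 4-tuple")
```

With this normalization done once in Relay, every TOPI conv2d compute can simply assert `len(padding) == 4` instead of re-interpreting the spec.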
I'm afraid you have to add this assertion to all conv2d compute functions. Specifically, every conv2d function with the @autotvm.register_topi_compute(nn.conv2d, ...) decorator should have this assertion. @icemelon9 could you help confirm?
@zxy844288792 how about we revert this file first and add a note in python/tvm/relay/op/nn/nn.py
to remind us to add it back when #4644 is merged?
@comaniac Agree
LGTM.
You should also add …
I have added get_pad_tuple2d for contrib_conv2d_nchwc, contrib_conv2d_nchwc_int8, contrib_conv2d_winograd_without_weight_transform, and contrib_conv2d_winograd_nnpack_without_weight_transform.
LGTM
@zxy844288792 I see that you didn't modify the …
Thanks @zxy844288792 @icemelon9 @comaniac
* enforce 4-way padding
* add util with get_pad_tuple
* delete unnecessary arguments
* fix lint
* add container.Array case
* fix cudnn conv2d asymmetric padding logic
* rename get_pad_tuple to get_pad_tuple2d
* revert change for topi/python/topi/nn/conv2d.py
* add get_pad_tuple2d for several contrib conv2d ops
* add get_pad_tuple2d for all conv2d ops
As discussed in the RFC (https://discuss.tvm.ai/t/rfc-conv2d-padding-representation/5394), we agreed that topi.nn.conv2d should be enforced to accept 4-way padding, and that relay.nn.conv2d should legalize its padding to 4-way.
get_pad_tuple comes from TOPI's util.py; I deleted some unused code and reused it for relay.op.nn.conv2d.
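To make the legalization step concrete, here is a simplified, self-contained sketch of what a get_pad_tuple-style helper computes: given a padding spec and a 2-D kernel size, it produces explicit (pad_top, pad_left, pad_down, pad_right). This is an illustration under assumed semantics ("SAME"/"VALID" string handling included), not TOPI's actual code.

```python
def get_pad_tuple_sketch(padding, kernel):
    """Simplified illustration of a get_pad_tuple-style helper.

    Returns (pad_top, pad_left, pad_down, pad_right) for a kernel
    of shape (kh, kw). Not TVM's actual implementation.
    """
    kh, kw = kernel
    if isinstance(padding, str):
        if padding.upper() == "VALID":
            pad_h = pad_w = 0
        elif padding.upper() == "SAME":
            # total padding needed to preserve spatial size at stride 1
            pad_h, pad_w = kh - 1, kw - 1
        else:
            raise ValueError("unknown padding mode: %s" % padding)
    elif isinstance(padding, int):
        pad_h = pad_w = padding * 2  # total padding on each axis
    elif len(padding) == 2:
        pad_h, pad_w = padding[0] * 2, padding[1] * 2
    elif len(padding) == 4:
        return tuple(padding)  # already explicit 4-way padding
    else:
        raise ValueError("bad padding spec: %r" % (padding,))
    # split total padding, putting the extra pixel on the bottom/right
    pad_top, pad_left = pad_h // 2, pad_w // 2
    return (pad_top, pad_left, pad_h - pad_top, pad_w - pad_left)
```

Once relay.nn.conv2d applies something like this at op-construction time, asymmetric padding reaches TOPI already expanded, which is why the TOPI kernels can assume 4-way input.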