
[Frontend][Paddle] [PaddlePaddle Hackathon 4] add convert support for p_norm/roi_align/softmax_with_cross_entropy #14826

Merged · 2 commits · May 17, 2023

Conversation

@AndPuQing (Contributor) commented May 11, 2023

Add support for converting the following Paddle OPs.

cc @jiangjiajun

- affine_channel
- p_norm
- roi_align
- softmax_with_cross_entropy
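For reference, the `p_norm` op computes the vector p-norm, \(\lVert x \rVert_p = (\sum_i |x_i|^p)^{1/p}\). The sketch below is a minimal pure-Python illustration of that semantic only — the actual frontend converter in this PR instead maps the Paddle op to equivalent Relay expressions:

```python
import math

def p_norm(x, p=2.0):
    """Compute the p-norm (sum(|x_i|**p)) ** (1/p) of a flat list of floats.

    Illustrative only: this mirrors the math of Paddle's p_norm op for a
    1-D input, not the TVM converter implementation.
    """
    return sum(abs(v) ** p for v in x) ** (1.0 / p)

# Example: the Euclidean (p=2) norm of [3, 4] is 5.
print(p_norm([3.0, 4.0]))       # -> 5.0
# The p=1 case degenerates to the sum of absolute values.
print(p_norm([1.0, -2.0], p=1.0))  # -> 3.0
```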
@tvm-bot (Collaborator) commented May 11, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

Generated by tvm-bot

@jiangjiajun (Contributor) left a comment

Please modify the title to "[Frontend][Paddle] [PaddlePaddle Hackathon 4] add convert support for p_norm/roi_align/softmax_with_cross_entropy", since affine_channel has been removed.

@AndPuQing AndPuQing changed the title [Frontend][Paddle] [PaddlePaddle Hackathon 4] add convert support for affine_channel/p_norm/roi_align/softmax_with_cross_entropy [Frontend][Paddle] [PaddlePaddle Hackathon 4] add convert support for p_norm/roi_align/softmax_with_cross_entropy May 16, 2023
@jiangjiajun (Contributor) commented May 17, 2023

@junrushao Could you help approve this PR? I found that I'm still a contributor, not a reviewer...

4 participants