[ONNX] Enable GPU in ONNX importer tests #7438
Merged
Commits (17):
1595841  remove hardcoded target and ctx (electriclilies)
43668a9  fix c-codgen for floating point mod
8655ce5  Merge pull request #3 from mbrookhart/onnx_gpu_tests
c02d9fb  Disable onnx gpu test for argmin / argmax so we can get this fix mer… (electriclilies)
371faff  lint (electriclilies)
fcbc54e  fix black (electriclilies)
dc4b083  Add flag to task_python_frontend.sh to only run GPU enabled tests on GPU (electriclilies)
2f60c43  black again (electriclilies)
b58dc24  Enable GPU for test_nonzero (electriclilies)
e765689  Respond to comments (electriclilies)
e583e9d  Don't test batch matmul on CUDA (electriclilies)
e6e5cf0  Turn cuda off for dynamic batch matmul test (electriclilies)
4f5a41c  Fix task script (electriclilies)
9f31e99  Merge (electriclilies)
dabc571  Flaky test (electriclilies)
96bf1ba  another flaky test (electriclilies)
9268af2  Merge remote-tracking branch 'upstream/main' into onnx_gpu_enabled (electriclilies)
Conversations
Are we sure we want this? I think this is skipping a lot of tests that were previously running.
It seems there are many tests that don't use @tvm.testing.uses_gpu.
(tvm/tests/python/frontend/pytorch/test_object_detection.py, lines 93 to 94 in 96b0981)
Probably we should use the targets option in verify_with_ort_with_inputs for now
(tvm/tests/python/frontend/onnx/test_forward.py, line 126 in 3beec22).
See for example:
(tvm/tests/python/frontend/onnx/test_forward.py, lines 3421 to 3423 in 3beec22)
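A minimal sketch of what that suggestion might look like at a call site; only the targets keyword comes from the comment above, it assumes the snippet lives in (or imports from) test_forward.py where verify_with_ort_with_inputs is defined, and the tiny Relu graph is just a stand-in for whichever op is being tested:

```python
# Sketch only: restrict one ONNX op test to CPU while its GPU schedule is broken.
# verify_with_ort_with_inputs is the existing helper in
# tests/python/frontend/onnx/test_forward.py; the Relu graph below is a placeholder.
import numpy as np
import onnx
from onnx import TensorProto, helper


def test_relu_cpu_only():
    node = helper.make_node("Relu", inputs=["x"], outputs=["y"])
    graph = helper.make_graph(
        [node],
        "relu_test",
        inputs=[helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 4])],
        outputs=[helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 4])],
    )
    model = helper.make_model(graph, producer_name="relu_test")
    data = np.random.uniform(-1, 1, size=(1, 4)).astype("float32")
    # Run only on llvm for now; drop the targets argument once the GPU path works.
    verify_with_ort_with_inputs(model, [data], targets=["llvm"])
```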
Right now, all the tests are running on GPU, which causes the argmin / argmax onnx test to fail. The ONNX test file uses @tvm.testing.uses_gpu to indicate whether a test should run on GPU or not, but right now those decorators don't actually do anything, which is definitely not good. I think we should try to move towards enabling the @tvm.testing.uses_gpu decorators in the long run.
What if I set PYTEST_ADDOPTS before the onnx tests, and reset it after? Since the file uses the @tvm.testing.uses_gpu decorator a lot, I think we should either remove the decorators or let them actually do something.
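A rough sketch of that PYTEST_ADDOPTS idea, written here in Python rather than as the actual task_python_frontend.sh change; the "-m gpu" marker expression is an assumption about how @tvm.testing.uses_gpu marks tests, not something stated in this thread:

```python
# Sketch: scope PYTEST_ADDOPTS to the ONNX test invocation only, so the rest of the
# frontend tests keep their current behavior. The "-m gpu" filter is an assumption
# about the marker @tvm.testing.uses_gpu applies; adjust to whatever marker is used.
import os
import subprocess

env = dict(os.environ)
env["PYTEST_ADDOPTS"] = ("-m gpu " + env.get("PYTEST_ADDOPTS", "")).strip()

# Only this pytest run sees the extra options; the parent environment is untouched.
subprocess.run(
    ["python", "-m", "pytest", "tests/python/frontend/onnx/test_forward.py"],
    env=env,
    check=True,
)
```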
Overall, I think that hardcoding targets is bad because it's easy to miss; that's what caused none of the ONNX tests to be run on GPU in the first place.
ok, surprised to hear that use_gpu doesn't do anything. Let's make it
This should only modify env vars when running the onnx test.
Also, can you go through the TODO(mbrookhart) comments in the onnx frontend test and add use_gpus? There are about 10-15 of them.
(tvm/tests/python/frontend/onnx/test_forward.py, line 277 in 3beec22)
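For reference, this is roughly the pattern those decorators are meant to enable; a minimal sketch assuming the standard tvm.testing helpers (uses_gpu, enabled_targets, assert_allclose) and a recent Relay executor API, so the exact calls may differ from the code in this PR:

```python
# Sketch of a GPU-enabled frontend test: the decorator marks the test as GPU-capable,
# and enabled_targets() iterates only over targets enabled in the current build/CI
# image, so the same test body covers llvm and, when available, cuda.
import numpy as np
import tvm
import tvm.testing
from tvm import relay


@tvm.testing.uses_gpu
def test_relu_example():
    x = relay.var("x", shape=(1, 4), dtype="float32")
    func = relay.Function([x], relay.nn.relu(x))
    mod = tvm.IRModule.from_expr(func)
    data = np.random.uniform(-1, 1, size=(1, 4)).astype("float32")
    for target, dev in tvm.testing.enabled_targets():
        result = relay.create_executor(
            "graph", mod=mod, device=dev, target=target
        ).evaluate()(data)
        tvm.testing.assert_allclose(result.numpy(), np.maximum(data, 0.0))
```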
Doh! I thought I had removed all of those comments, but I must have only done it for the dynamic test files and missed the onnx file.
I can open a separate PR to remove these.
I already did it!