This repository has been archived by the owner on Jan 15, 2024. It is now read-only.
[TVM] TVM Integration Issue after changing to Boolean Mask. #1425
Labels: bug

Comments
On my EC2 instance I commented out 'google_albert_base_v2' and the rest of the models seemed to work fine. I created this PR to re-enable the tests: #1437
The error I got:
For me, one potential cause is that TVM does not allow mixed data types in the where operator, e.g., https://github.com/apache/incubator-tvm/blob/7649075fbb71ecab0a41c6fe4d41a86724e42e7a/python/tvm/relay/frontend/mxnet.py#L2419-L2434. Thus, we may print the dtypes of the inputs to the where operator to verify.
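The dtype mismatch above can be illustrated with a small NumPy sketch (the names `scores`, `mask`, and the fill value are hypothetical, not from the issue): NumPy's `where` silently accepts a boolean condition with float branches, but a frontend converter that requires a single dtype across the operands would reject the same graph. One workaround is to cast the mask to the score dtype before the masked fill.

```python
import numpy as np

# Hypothetical attention-style masking: float scores plus a boolean mask.
scores = np.array([0.5, -1.2, 3.0], dtype=np.float32)
mask = np.array([True, False, True])

# NumPy happily mixes a bool condition with float32 branches here;
# a converter that insists on one dtype throughout would not.
out = np.where(mask, scores, np.float32(-1e9))

# Workaround sketch: cast the mask explicitly so every operand shares
# the score dtype, expressing the select as arithmetic.
mask_f = mask.astype(scores.dtype)
out2 = scores * mask_f + (1.0 - mask_f) * np.float32(-1e9)

assert np.allclose(out, out2)
```

Printing `scores.dtype` and `mask.dtype` at the point of the failing `where` call, as suggested above, would confirm whether this mismatch is what the TVM frontend is rejecting.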
#1437 passed the CPU CI, but on GPU the remaining three models all still failed, including 'google_en_cased_bert_base',
This was referenced Jan 9, 2021
Description
I'm changing the mask to use the boolean type in #1405 to pass AMP. However, it's causing issues in the TVM integration. I created this issue to track the error and will skip the TVM test for now.
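The change described above can be sketched as follows (a minimal illustration, not the actual #1405 diff; the variable names are hypothetical): a padding mask previously stored as float32 is switched to bool, so AMP no longer tries to down-cast it, while downstream masked fills use it as a condition.

```python
import numpy as np

# Before: mask kept as float32 (1.0 = keep, 0.0 = pad), which AMP may
# try to cast to float16 along with the rest of the network.
float_mask = np.array([1.0, 1.0, 0.0], dtype=np.float32)

# After: the same mask as bool, which mixed-precision rewrites leave alone.
bool_mask = float_mask.astype(bool)

logits = np.array([2.0, 0.5, 1.0], dtype=np.float32)

# Masked positions get a large negative value so softmax drives their
# probability toward zero. This boolean-condition where() is the pattern
# the TVM frontend then has to handle.
masked = np.where(bool_mask, logits, np.float32(-1e9))
```

The trade-off tracked in this issue is that the boolean mask satisfies AMP but surfaces the dtype handling problem in the TVM importer.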