onnx.If errors when the two branches yield with different shapes #696
Comments
I'm not sure if we want to support this, but if we do, I think we need to insert a type conversion from `vtensor<[1],si64>` to `vtensor<[?],si64>`. Not sure how to materialize that. |
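For intuition on what that conversion has to compute, here is a hypothetical sketch (not torch-mlir's actual implementation) of joining the two branches' result shapes into a common supertype: equal static dims are kept, any disagreement generalizes to a dynamic dim, and a rank mismatch has no ranked supertype at all.

```python
from typing import List, Optional

def join_dim(a: Optional[int], b: Optional[int]) -> Optional[int]:
    """Least general dim covering both; None means dynamic (?)."""
    return a if a == b else None

def join_shapes(a: List[Optional[int]],
                b: List[Optional[int]]) -> List[Optional[int]]:
    if len(a) != len(b):
        # Different ranks (as in KeypointRCNN) cannot be unified this way.
        raise ValueError("rank mismatch: cannot unify branch result types")
    return [join_dim(x, y) for x, y in zip(a, b)]

# vtensor<[1],si64> vs vtensor<[?],si64> joins to vtensor<[?],si64>,
# so each branch's yield would then need a cast to the joined type.
```

Under this rule the shapes-only case is mechanical; the different-ranks case discussed below is the one with no clean answer.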
KeypointRCNN_vaiq_int8 has If statements that return not just different shapes but different ranks. We can't support that; something will need to be done with the model. |
Stella mentions a way to deal with similar problems (https://discord.com/channels/973663919757492264/1173330951791706113/1246504997269798945), and @zjgarvey's comment in this morning's meeting got me to put two and two together. I'll try to just remove the small branch of these onnx.If ops. |
I vaguely remember the consensus being that this is a model issue and should be fixed by editing the model. Perhaps ask the sharktank folks for help? |
Reproducer extracted from the coat_mini model:

```
torch-mlir-opt --convert-torch-onnx-to-torch if.onnx.mlir --debug
```
|
We have 6 models failing: https://github.com/pdhirajkumarprasad/SHARK-TestSuite/blob/feature/qa/issue/onnx-to-torch/onnx.if. Not sure if it is feasible to edit all the models; will ask tomorrow. |
Yeah, we could try adding the models to the basic_opt list in the test suite `azure_models.py`.
|
Adding them to the basic_opt list fixed 2 models.
|
Ah, I see. For the NLP models, we need to register those differently. |
I tried running the basic optimizations on those NLP models locally, but the problematic If statement remains, so the optimizer is likely unable to fold it when dynamic dims are present. I'll add a PR to run those two static-dim models as basic_opt tests for now. |
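A hypothetical sketch of why the optimizer folds the If only for static-dim models: when the If condition is derived from a comparison on a tensor dim, a static dim makes the condition a compile-time constant and the If collapses to one branch; a dynamic dim leaves the condition unknown, so the If must be kept.

```python
from typing import Optional

def fold_if_on_dim(dim: Optional[int], threshold: int,
                   then_result: str, else_result: str) -> Optional[str]:
    """Return the surviving branch's result, or None if the If must be kept.
    `dim is None` models a dynamic (?) dimension."""
    if dim is None:
        # Dynamic dim: the comparison cannot be resolved at compile time.
        return None
    return then_result if dim > threshold else else_result
```

This matches the observation above: the two static-dim models fold under basic_opt, while the dynamic ones keep their If.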
|
Fixed the onnx.If different-shapes issue nod-ai/SHARK-ModelDev#696; 4 dynamic models are still left unfixed.
4 related models, such as model--splinter-large-few-shot-k-16-finetuned-squad-seed-0--anas-awadalla, show a similar failure pattern:
->
|
@jinchen62 you can take these 2. retinanet_resnet50_fpn_vaiq_int8 and KeypointRCNN_vaiq_int8 have the same failure pattern: one branch dynamic, one branch static, returning dynamic.
|
With onnx-modifier.exe, this can be fixed by removing the If node, hoisting the Squeeze node from the If's then branch, and passing its output directly to the following Add node as input. Here is the test output.
|
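The edit described above can be sketched schematically. This toy Python version works on a plain dict-based node list rather than the real onnx GraphProto API (the node names and graph shape are illustrative assumptions): delete the If, splice in its inner Squeeze, and rewire consumers of the If's output to read the Squeeze's output.

```python
def remove_if_keep_squeeze(nodes, if_name):
    """Toy graph surgery: nodes is a list of dicts with 'name', 'op',
    'inputs', 'outputs' (and 'then_branch' on the If node)."""
    if_node = next(n for n in nodes if n["name"] == if_name)
    squeeze = if_node["then_branch"][0]   # the Squeeze from the then branch
    assert squeeze["op"] == "Squeeze"
    if_out = if_node["outputs"][0]
    # Splice the Squeeze in at the If's position to keep topological order.
    idx = nodes.index(if_node)
    out = nodes[:idx] + [squeeze] + nodes[idx + 1:]
    # Consumers of the If's output (e.g. the following Add) now read the
    # Squeeze's output directly.
    for n in out:
        n["inputs"] = [squeeze["outputs"][0] if i == if_out else i
                       for i in n["inputs"]]
    return out
```

On a real model the same rewiring is done over `model.graph.node` with the onnx Python API (or interactively in onnx-modifier, as above).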
I was working on #566.
Found the problem in KeypointRCNN (gist with stripped IR) at %5503 (the above link takes you to the correct line).
When lowering an If op with two branches returning different types, we encounter:
Reproducer: