Export-to-ExecuTorch via Optimum integration #2128
#2090 is the initial integration to introduce
Let me know if and how I can help with this!
@ProjectProgramAMark Thank you for showing interest in this work! There is a lot to contribute now that we've landed the initial integration. For example, you can follow the guidance to extend it to support new task types, new recipes, or new models. Or simply try things out and share your feedback!
This issue has been marked as stale because it has been open for 30 days with no activity. This thread will be automatically closed in 5 days if no further activity occurs.
Feature request
Integrate ExecuTorch into `optimum`, enabling a new "Export to ExecuTorch" workflow.

Motivation
Enable a new e2e workflow for on-device ML use-cases via ExecuTorch
Your contribution
Drive initial integration
Provide default recipes for delegates, e.g. XNNPACK, CoreML, QNN, MPS, etc.
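As a sketch of what the resulting workflow could look like, the commands below show how a model might be exported with a per-delegate recipe. This is an illustration only: the exact CLI name, flags, and recipe identifiers are assumptions based on how `optimum` exposes other export backends, and are not confirmed by this thread.

```shell
# Hypothetical sketch -- command names and flags are assumptions,
# not a confirmed optimum API.

# Install the packages (names assumed):
pip install optimum executorch

# Export a Hugging Face model to an ExecuTorch program,
# choosing a backend delegate recipe such as XNNPACK:
optimum-cli export executorch \
  --model <model-id> \
  --task text-generation \
  --recipe xnnpack \
  --output_dir ./executorch_model
```

Swapping `--recipe xnnpack` for another delegate recipe (e.g. CoreML, QNN, or MPS, as listed above) would then target a different on-device backend without changing the rest of the workflow.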