[BUG] Create ML Inference Ingest Processor for Local Model without MODEL_INPUT success #2601

Closed
mingshl opened this issue Jul 1, 2024 · 0 comments
Labels
bug, v2.15.0

Comments

@mingshl
Collaborator

mingshl commented Jul 1, 2024

What is the bug?
In 2.15, if we create an ML Inference Ingest Processor for a local model without MODEL_INPUT, it returns success.

Because local models require MODEL_INPUT, the factory should throw an IllegalArgumentException instead.
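
For reference, here is a minimal sketch of the kind of check the factory could perform at pipeline-creation time. The class, method, and parameter names are illustrative assumptions, not the actual ml-commons factory code:

```java
// Hypothetical sketch of the missing validation; class, method, and
// parameter names are assumptions for illustration only.
public final class MLInferenceProcessorValidator {

    // Assumption: a function_name other than "remote" indicates a local
    // model, and local models need model_input to build the prediction
    // request. Failing fast here surfaces the misconfiguration when the
    // pipeline is created instead of silently returning success.
    public static void validateModelInput(String functionName, String modelInput) {
        boolean isLocalModel = functionName != null
                && !functionName.equalsIgnoreCase("remote");
        if (isLocalModel && (modelInput == null || modelInput.trim().isEmpty())) {
            throw new IllegalArgumentException(
                    "model_input is required when using a local model with the ml_inference processor");
        }
    }
}
```

With a check along these lines in the factory, creating the processor for a local model without MODEL_INPUT would be rejected rather than reported as a success.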

@mingshl added the bug, untriaged, and v2.15.0 labels on Jul 1, 2024
@mingshl moved this to In Progress in ml-commons projects on Jul 2, 2024
@b4sjoo removed the untriaged label on Jul 16, 2024
@b4sjoo closed this as completed on Jul 16, 2024
@github-project-automation bot moved this from In Progress to Done in ml-commons projects on Jul 16, 2024
@b4sjoo added the v2.16.0 and v2.15.0 labels and removed the v2.15.0 and v2.16.0 labels on Jul 26, 2024
Development

No branches or pull requests

3 participants