
Use isolates to run inference #52

Closed
luiscib3r opened this issue May 16, 2023 · 1 comment
Comments

@luiscib3r
Contributor

When inference runs on a model that takes a long time, the UI freezes because everything executes on a single thread.

Consider implementing Isolate.spawn to run the model. The developer could do this externally, but it would be better if this were handled within the package. The run method could report the execution status on a stream [loading, done].
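As a rough sketch of the idea in plain Dart (the model call here is a hypothetical stand-in, not the package's API), Isolate.spawn can offload a long computation and hand the result back over a port, keeping the main isolate responsive:

```dart
import 'dart:isolate';

// Hypothetical long-running inference; stands in for a real model run.
List<double> _runModel(List<double> input) {
  return input.map((x) => x * 2).toList();
}

// Entry point executed inside the spawned isolate.
void _inferenceEntry(List<Object> args) {
  final sendPort = args[0] as SendPort;
  final input = args[1] as List<double>;
  sendPort.send(_runModel(input));
}

// Runs inference off the main isolate and awaits the result.
Future<List<double>> runInferenceInIsolate(List<double> input) async {
  final receivePort = ReceivePort();
  await Isolate.spawn(_inferenceEntry, [receivePort.sendPort, input]);
  return await receivePort.first as List<double>;
}
```

The package could wrap exactly this pattern and expose the status transitions on a stream.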

@luiscib3r
Contributor Author

luiscib3r commented May 18, 2023

I solved this and added it in PR #49.

final interpreter = await Interpreter.fromAsset('assets/yourmodel.tflite');
final isolateInterpreter = IsolateInterpreter(address: interpreter.address);

The interpreter continues to work the same way as before. On isolateInterpreter, however, the isolateInterpreter.run and isolateInterpreter.runForMultipleInputs methods are asynchronous: internally they use an isolate to execute the inference, so the UI does not freeze. You can also use isolateInterpreter.state and isolateInterpreter.stateChanges (a Stream) to observe the interpreter's state (idle or loading).
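A short usage sketch of the API described above (the input and output buffers are placeholders whose shapes depend on your model):

```dart
// Assumes the IsolateInterpreter API from PR #49.
final interpreter = await Interpreter.fromAsset('assets/yourmodel.tflite');
final isolateInterpreter = IsolateInterpreter(address: interpreter.address);

// Observe state transitions (idle <-> loading) without blocking the UI.
isolateInterpreter.stateChanges.listen((state) {
  print('interpreter state: $state');
});

// run is asynchronous, so inference no longer blocks the UI thread.
// input/output are placeholder tensors shaped for your model.
await isolateInterpreter.run(input, output);
```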
