
[WIP] Feature add runSignature function #165

Closed · wants to merge 8 commits

Conversation

@ghost commented Nov 2, 2023

This pull request adds the ability to run signatures in Flutter TFLite.

List of functions added and their testing status (a short usage sketch follows the list):

  • getSignatureInputCount(String signatureKey)
  • getSignatureOutputCount(String signatureKey)
  • getSignatureInputName(String signatureKey, int index)
  • getSignatureOutputName(String signatureKey, int index)
  • getSignatureInputTensorShape(String signatureKey, String inputName)
  • getSignatureOutputTensorShape(String signatureKey, String outputName)
  • runSignature(Map<String, Object> inputs, Map<String, Object> outputs, String signatureKey)
  • deleteSignatureRunner(String signatureKey)
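
A minimal usage sketch of the proposed API (not part of the PR itself). It assumes these functions are exposed as methods on tflite_flutter's `Interpreter`; the model path, signature key, tensor names, and shapes below are placeholders:

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

Future<void> runSignatureExample() async {
  // Hypothetical model and signature key; adjust to your own model.
  final interpreter = await Interpreter.fromAsset('assets/movinet.tflite');
  const signatureKey = 'serving_default';

  // Inspect the signature's inputs using the functions added in this PR.
  final inputCount = interpreter.getSignatureInputCount(signatureKey);
  for (var i = 0; i < inputCount; i++) {
    final name = interpreter.getSignatureInputName(signatureKey, i);
    final shape = interpreter.getSignatureInputTensorShape(signatureKey, name);
    print('input $name: $shape');
  }

  // Buffers keyed by tensor name; the names and shapes here are examples only.
  final inputs = <String, Object>{
    'image': [
      List.generate(224, (_) => List.generate(224, (_) => List.filled(3, 0.0))),
    ],
  };
  final outputs = <String, Object>{
    'logits': [List.filled(600, 0.0)],
  };

  // Run the signature, then release its runner and the interpreter.
  interpreter.runSignature(inputs, outputs, signatureKey);
  interpreter.deleteSignatureRunner(signatureKey);
  interpreter.close();
}
```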

@ghost changed the title from "Feature add runSignature func" to "[WIP] Feature add runSignature function" on Nov 2, 2023
@CaptainDario (Contributor)

@st-duymai are you still working on this? It would be an awesome addition!

@ghost (Author) commented Jan 28, 2024

> @st-duymai are you still working on this? It would be an awesome addition!

I'm still working on it, using the MoviNet sample. When I finish, it would be great if you could help me review it.

@CaptainDario (Contributor)

Yes, sure. Ping me once you are ready. Due to some personal matters I am not very active here, so you may need to ping me twice.

@michaeltys

Hi @st-duymai, thanks for your contribution. I was testing the MoviNet sample and found that inference takes 1-2 seconds to complete for a single frame. I checked multiple frame dimensions (from 20×20 to 1000×1000) and the result is the same. Do you see a possible way to improve the performance here?

I took the Android implementation as a reference; inference there takes 20-50 milliseconds on the same data, even when using a Flutter channel for the native code integration.

@CaptainDario (Contributor)

In my experience, TFLite Flutter is close to native C performance.

Are you sure that you are measuring only the inference latency, and not the frame decoding from the camera?

@michaeltys

> In my experience, TFLite Flutter is close to native C performance.
>
> Are you sure that you are measuring only the inference latency, and not the frame decoding from the camera?

Yes, I measured just the inference time. Frame decoding takes very little time compared to the inference in this case.
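
For reference, a minimal sketch of measuring only the inference call (not from the PR; it assumes the `runSignature` signature listed above is exposed on `Interpreter`, and that decoding/preprocessing happen before this function is called):

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

/// Returns the wall-clock time of a single signature invocation in
/// milliseconds. Frame decoding and preprocessing are deliberately kept
/// outside this function so they are not counted.
int timeInferenceMs(Interpreter interpreter, Map<String, Object> inputs,
    Map<String, Object> outputs, String signatureKey) {
  final stopwatch = Stopwatch()..start();
  interpreter.runSignature(inputs, outputs, signatureKey);
  stopwatch.stop();
  return stopwatch.elapsedMilliseconds;
}
```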

@CaptainDario (Contributor)

@michaeltys, did you try running it in the native TFLite benchmark app?

Also, could you provide a link to the model?

@ghost (Author) commented Feb 15, 2024

@michaeltys Yes, I'm aware of this issue. The reason is that when I use an isolate, I can't retain the tensors and signature runners between inferences.
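
To illustrate the general pattern being discussed (this is not the PR's implementation): a long-lived isolate can create the interpreter once and reuse it, together with its signature runner, for every request instead of rebuilding them per frame. The message format and model-loading path below are hypothetical, and `runSignature` is assumed to be exposed as in the list above:

```dart
import 'dart:io';
import 'dart:isolate';

import 'package:tflite_flutter/tflite_flutter.dart';

/// Entry point of a long-lived inference isolate. The interpreter is created
/// once from the model path sent as the first message and reused afterwards.
void inferenceIsolate(SendPort toMain) {
  final fromMain = ReceivePort();
  toMain.send(fromMain.sendPort);

  Interpreter? interpreter;

  fromMain.listen((message) {
    if (message is String) {
      // First message: the path to the .tflite model file.
      interpreter = Interpreter.fromFile(File(message));
    } else if (message is List) {
      // Later messages: [inputs, outputs, signatureKey, replyPort].
      final inputs = message[0] as Map<String, Object>;
      final outputs = message[1] as Map<String, Object>;
      final signatureKey = message[2] as String;
      final replyPort = message[3] as SendPort;

      // The signature runner stays alive between calls, so each frame only
      // pays for the inference itself.
      interpreter!.runSignature(inputs, outputs, signatureKey);
      replyPort.send(outputs);
    }
  });
}
```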

@ghost (Author) commented Feb 16, 2024

I am closing this pull request as I will no longer be using this GitHub account. I may create a new pull request later.

@ghost closed this on Feb 16, 2024