
TFL Detect app crashing with an error for an SSD MobileNet v2 model trained on a custom dataset. Please help me out #281

Closed
M1thun opened this issue Jan 21, 2019 · 7 comments

Comments

M1thun commented Jan 21, 2019

This is the error I get when the app crashes:

java.lang.IllegalArgumentException: Cannot convert between a TensorFlowLite buffer with 1080000 bytes and a ByteBuffer with 270000 bytes.
at org.tensorflow.lite.Tensor.throwExceptionIfTypeIsIncompatible(Tensor.java:251)
at org.tensorflow.lite.Tensor.setTo(Tensor.java:110)
at org.tensorflow.lite.NativeInterpreterWrapper.run(NativeInterpreterWrapper.java:145)
at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:275)
at org.tensorflow.demo.TFLiteObjectDetectionAPIModel.recognizeImage(TFLiteObjectDetectionAPIModel.java:193)
at org.tensorflow.demo.DetectorActivity$3.run(DetectorActivity.java:247)
at android.os.Handler.handleCallback(Handler.java:742)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:157)
at android.os.HandlerThread.run(HandlerThread.java:61)
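The two byte counts in the exception are consistent with a float/quantized mismatch: assuming the standard 300×300×3 SSD-MobileNet input, a float32 model needs 4 bytes per channel (1,080,000 bytes) while the app allocated a 1-byte-per-channel buffer as if the model were quantized (270,000 bytes). A minimal sketch of that arithmetic (the 300×300 input size is an assumption based on the usual SSD config, not stated in the trace):

```java
// Sketch: how a TFLite input buffer size follows from tensor shape and dtype.
public class BufferSize {
    static int bufferSizeBytes(int h, int w, int channels, boolean quantized) {
        int bytesPerChannel = quantized ? 1 : 4; // uint8 vs float32
        return 1 * h * w * channels * bytesPerChannel; // batch size 1
    }

    public static void main(String[] args) {
        System.out.println(bufferSizeBytes(300, 300, 3, false)); // float model: 1080000
        System.out.println(bufferSizeBytes(300, 300, 3, true));  // quantized model: 270000
    }
}
```

The interpreter's 1,080,000 bytes match the float case exactly, so the model is float but the app's ByteBuffer was sized for a quantized model.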

M1thun changed the title TFL detect getting crashed with an error for ssd mobilenet v2 model trained for custom dataset. Please help me out TFL detect app getting crashed with an error for ssd mobilenet v2 model trained for custom dataset. Please help me out Jan 21, 2019
simpeng (Contributor) commented Jan 21, 2019

We don't actually support models built by TFLite.

M1thun (Author) commented Jan 21, 2019

Thanks for the fast response.
I tried deploying the frozen graph directly, without optimizing it, into the Android example (not TFLite), and the app crashed as soon as I opened it, even though the same graph works fine with my laptop webcam feed.
So I then optimized the .pb graph and deployed the optimized graph into the Android example (not TFLite), but I hit the error below: a mismatch in the input data type that I couldn't resolve. After that I moved on to exporting the TFLite SSD graph, converting it to .tflite, and deploying it in TFL Detect, where I hit the error mentioned in my first comment.

When trying to get output from the optimized graph on the webcam feed, I got the following error:

$ python3 Object_detection_webcam.py
Traceback (most recent call last):
File "/home/mithun/anaconda3/envs/tfp3.6/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 418, in import_graph_def
graph._c_graph, serialized, options) # pylint: disable=protected-access
tensorflow.python.framework.errors_impl.InvalidArgumentError: Input 0 of node ToFloat was passed float from image_tensor:0 incompatible with expected uint8.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "Object_detection_webcam.py", line 66, in
tf.import_graph_def(od_graph_def, name='')
File "/home/mithun/anaconda3/envs/tfp3.6/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 488, in new_func
return func(*args, **kwargs)
File "/home/mithun/anaconda3/envs/tfp3.6/lib/python3.6/site-packages/tensorflow/python/framework/importer.py", line 422, in import_graph_def
raise ValueError(str(e))
ValueError: Input 0 of node ToFloat was passed float from image_tensor:0 incompatible with expected uint8.

pengwa (Collaborator) commented Jan 21, 2019

Firstly, the error occurs when your local TF loads the frozen .pb file. I observed something similar in #77, but your case seems quite different. I guess this may be caused by TF compatibility: which version of TF did you use to export the file, and which version of TF are you running the Python script with?

Secondly, from the command line it seems you are trying to convert an object detection model, which is not well supported by the converter tool. There are still significant gaps (in either the ONNX spec or the converter logic) before we can get it fully converted and runnable.

M1thun (Author) commented Jan 22, 2019

Voila! I have successfully solved this issue and am hence closing it.
I changed some parameters in DetectorActivity.java for TFLite to make an exception for the FLOAT model.
There are still compatibility issues with TF Android (not TFLite) when deploying the model: the data type of the input parameters has to be changed somewhere deep in the code to match the data type the model accepts, and that problem is not yet resolved :(.
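The comment above doesn't show the exact change, but in the TensorFlow demo the input buffer is allocated according to a quantized-model flag (TF_OD_API_IS_QUANTIZED in DetectorActivity.java), and for a float model that flag must be false so four bytes per channel are reserved and values are written with putFloat. A hedged sketch of that buffer-filling logic, with the 128/128 mean/std normalization the demo uses for float SSD models (parameter names here are illustrative, not the demo's exact code):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch of the demo-style input buffer: a float model needs 4 bytes per
// channel and normalized floats; a quantized model needs raw uint8 bytes.
public class InputBuffer {
    static ByteBuffer buildInput(int[] pixels, int inputSize, boolean isQuantized) {
        int bytesPerChannel = isQuantized ? 1 : 4;
        ByteBuffer imgData =
            ByteBuffer.allocateDirect(inputSize * inputSize * 3 * bytesPerChannel);
        imgData.order(ByteOrder.nativeOrder());
        for (int pixel : pixels) {
            int r = (pixel >> 16) & 0xFF, g = (pixel >> 8) & 0xFF, b = pixel & 0xFF;
            if (isQuantized) {
                imgData.put((byte) r);
                imgData.put((byte) g);
                imgData.put((byte) b);
            } else {
                // Mean/std of 128 follows the demo's convention for float models.
                imgData.putFloat((r - 128f) / 128f);
                imgData.putFloat((g - 128f) / 128f);
                imgData.putFloat((b - 128f) / 128f);
            }
        }
        return imgData;
    }
}
```

With isQuantized set to false for a 300×300 input, the buffer comes out at 1,080,000 bytes, matching what the interpreter expected in the original crash.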

M1thun closed this as completed Jan 22, 2019
pengwa (Collaborator) commented Jan 22, 2019

Too bad about the compatibility between different TF builds.

So the object detection logic is converted? I can hardly believe it... anyway, this is cool.

ramyrao commented May 20, 2019

> Voila! I have successfully solved this issue and am hence closing it.
> I changed some parameters in DetectorActivity.java for TFLite to make an exception for the FLOAT model.
> There are still compatibility issues with TF Android (not TFLite) when deploying the model: the data type of the input parameters has to be changed somewhere deep in the code to match the data type the model accepts, and that problem is not yet resolved :(.

ramyrao commented May 20, 2019

@M1thun

I am facing the same error. Can you please let me know exactly what you did to make the error go away? I will be extremely grateful for your reply.


4 participants