
Unable to load the hand detection model #35

Closed
srishtigoelroposo opened this issue Aug 21, 2019 · 31 comments
Labels
legacy:hands Hand tracking/gestures/etc

Comments

@srishtigoelroposo

I am trying to test the given model in my sample Android application.
When trying to load the model, I get this error:

java.lang.IllegalStateException: Internal error: Unexpected failure when preparing tensor allocations: Encountered unresolved custom op: Convolution2DTransposeBias.Node number 165 (Convolution2DTransposeBias) failed to prepare.

Code:
AssetFileDescriptor fileDescriptor = activity.getAssets().openFd("palm_detection.tflite");

@jiuqiant
Contributor

We have some custom ops in mediapipe/util/tflite/operations.

@srishtigoelroposo
Author

Is there any way I can use these custom operations and include them in my TFLite setup?

I am very new to this and would like help getting the model running on an Android device.

@akamzin

akamzin commented Aug 25, 2019

Can anyone give a bit more detail on how to incorporate these custom ops so the model can run in a regular Python environment?

@mgyong mgyong added the legacy:hands Hand tracking/gestures/etc label Aug 27, 2019
@Anton-Prab

Got the same error in Python when I tried to load the tflite model into the Interpreter.

ValueError: Didn't find custom op for name 'Convolution2DTransposeBias' with version 1
Registration failed. 

Code:

import tensorflow as tf
interpreter = tf.lite.Interpreter(model_path="palm_detection.tflite")

I can see that this can be resolved using tflite::InterpreterBuilder in the C++ API, but in Python only tf.lite.Interpreter is available; there is no InterpreterBuilder.

Environment:
Python - 3.7
Tensorflow - 1.14

@mmxuan18

mmxuan18 commented Sep 4, 2019

Is it possible to compile only the TFLite code in MediaPipe and use that instead of the official TFLite?

@lamarrr

lamarrr commented Sep 7, 2019

To avoid stress, you can use it with TensorFlow Lite C++ and build with Bazel.
That is faster than compiling the whole of TensorFlow. I've done that and it works fine for me.
Just copy the custom ops from MediaPipe's tensorflow subfolder,
then create a class that inherits from BuiltinOpResolver and register the custom ops there, just as MediaPipe does.

@metalwhale

metalwhale commented Sep 16, 2019

> To avoid stress, you can use it with TensorFlow Lite C++ and build with Bazel.
> That is faster than compiling the whole of TensorFlow. I've done that and it works fine for me.
> Just copy the custom ops from MediaPipe's tensorflow subfolder,
> then create a class that inherits from BuiltinOpResolver and register the custom ops there, just as MediaPipe does.

@lamarrr, and can I then load the compiled op in Python using something like tf.load_op_library?

@lamarrr

lamarrr commented Sep 16, 2019

I really don't think so, because it's a TensorFlow Lite package, not TensorFlow's.
And I don't think there is a Python API for loading ops into TensorFlow Lite yet, but you can do your own research.
cc: @fanzhanggoogle @jiuqiant

@metalwhale

Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

@kulievvitaly

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

It works for me. It is important to use Python 3.7 and install the specified version of TensorFlow.

@CC10010

CC10010 commented Oct 22, 2019

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

> It works for me. It is important to use Python 3.7 and install the specified version of TensorFlow.

The repo has been deleted. Can you tell me what the 'specified version of TensorFlow' is, please?

@junhwanjang

junhwanjang commented Oct 22, 2019

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

> It works for me. It is important to use Python 3.7 and install the specified version of TensorFlow.

> The repo has been deleted. Can you tell me what the 'specified version of TensorFlow' is, please?

@caocao1989 TensorFlow 1.13.1 and Python 3.7.3

@CC10010

CC10010 commented Oct 22, 2019

@junhwanjang thank you very much!!!

@manchengfenxu

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

@junhwanjang, apart from the 'specified version of TensorFlow', are there any other modifications needed to how the TFLite interpreter is used? I still encounter the error unresolved custom op: Convolution2DTransposeBias with "tensorflow-lite:1.13.0".

@scm-ns

scm-ns commented Nov 18, 2019

@jiuqiant The ops in mediapipe/util/tflite/operations seem to be the CPU implementations. I am wondering how the model works on the GPU.

@fanzhanggoogle I looked at the op_resolver that MediaPipe appears to use with the GPU delegate, but the registration for ConvTranspose is (nullptr, nullptr, nullptr, nullptr), which means there is no function in the registration. I am curious where the ConvTranspose GPU kernel is implemented.

I am trying to run the TFLite model using the GPU delegate and I am running into some errors: an EXC_BAD_ACCESS in the function Convolution2DTransposeBianParser::IsSupported. Do you have any info?

@shoutOutYangJie

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

> @junhwanjang, apart from the 'specified version of TensorFlow', are there any other modifications needed to how the TFLite interpreter is used? I still encounter the error unresolved custom op: Convolution2DTransposeBias with "tensorflow-lite:1.13.0".

Did you solve this problem? I have the same question.

@shoutOutYangJie

> To avoid stress, you can use it with TensorFlow Lite C++ and build with Bazel.
> That is faster than compiling the whole of TensorFlow. I've done that and it works fine for me.
> Just copy the custom ops from MediaPipe's tensorflow subfolder,
> then create a class that inherits from BuiltinOpResolver and register the custom ops there, just as MediaPipe does.

Can TensorFlow Lite C++ be deployed in a Visual Studio project on Windows?

@shoutOutYangJie

> Thanks to @junhwanjang and his awesome repo, I can now load the palm detection model without the custom op.

Can you tell me how you solved the problem? I get the same error when running interpreter.allocate_tensors(). The repo has been deleted. Please tell me how to avoid the problem, or could you share your .tflite model that can be used directly with the TFLite Python API, without C++?

@shoutOutYangJie

> @junhwanjang thank you very much!!!

Can you share how you resolved this problem?

@shoutOutYangJie

> Can anyone give a bit more detail on how to incorporate these custom ops so the model can run in a regular Python environment?

Have you solved this problem?

@shoutOutYangJie

> Got the same error in Python when I tried to load the tflite model into the Interpreter.
>
> ValueError: Didn't find custom op for name 'Convolution2DTransposeBias' with version 1
> Registration failed.
>
> Code:
>
> import tensorflow as tf
> interpreter = tf.lite.Interpreter(model_path="palm_detection.tflite")
>
> I can see that this can be resolved using tflite::InterpreterBuilder in the C++ API, but in Python only tf.lite.Interpreter is available; there is no InterpreterBuilder.
>
> Environment:
> Python - 3.7
> Tensorflow - 1.14

So if I just want to run this .tflite model in a Python environment, how can I avoid this problem?

@junhwanjang

@shoutOutYangJie I closed my repo for personal reasons.
You can use "palm_detection_without_custom_op.tflite" from the link below:
https://github.com/metalwhale/hand_tracking

@shoutOutYangJie

> @shoutOutYangJie I closed my repo for personal reasons.
> You can use "palm_detection_without_custom_op.tflite" from the link below:
> https://github.com/metalwhale/hand_tracking

Thank you very much, you saved my life!! Could you roughly describe the important steps you took? How did you generate this new .tflite model without the custom ops?

@shoutOutYangJie

shoutOutYangJie commented Nov 27, 2019 via email

@lip-realmax


@lamarrr Hi, could you please share a bit more on the TF Lite C++ part? E.g. how to build it and link it with applications for C++ usage?

@Akshaysharma29

@junhwanjang
Hi, thanks for sharing the model. I have the same question:
#35 (comment)

@ragavendranbala

@Akshaysharma29 @shoutOutYangJie Did you find the solution?

@Akshaysharma29

@ragavendranbala No, I have not changed their custom ops. Have you tried that?

@junhwanjang

@Akshaysharma29 @shoutOutYangJie @ragavendranbala
Sorry for the late reply.
When I checked their operations manually, their custom op (Conv2dTransposeBiasAdd) was actually Conv2DTranspose and BiasAdd combined, so I "rephrased" it as separate Conv2DTranspose and BiasAdd ops. That was the simplest solution I found.
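The "rephrasing" described above can be illustrated with a toy NumPy sketch (single channel, no batch; this is an illustration of the idea only, not MediaPipe's actual kernel, and the function names are made up for the example). The fused op computes a transposed convolution with the bias folded into the same kernel; splitting it into a plain transposed convolution followed by a separate bias add gives the same result:

```python
import numpy as np

def conv2d_transpose(x, w, stride=2):
    """Naive 2D transposed convolution (single channel, no padding)."""
    h_in, w_in = x.shape
    kh, kw = w.shape
    out = np.zeros(((h_in - 1) * stride + kh, (w_in - 1) * stride + kw))
    for i in range(h_in):
        for j in range(w_in):
            # Each input pixel "stamps" a scaled copy of the kernel.
            out[i * stride:i * stride + kh, j * stride:j * stride + kw] += x[i, j] * w
    return out

def conv2d_transpose_bias(x, w, b, stride=2):
    """Stand-in for the fused op: transposed conv with bias in one kernel."""
    return conv2d_transpose(x, w, stride) + b

x = np.arange(9, dtype=float).reshape(3, 3)  # toy input feature map
w = np.ones((2, 2))                          # toy kernel
b = 0.5                                      # scalar bias

fused = conv2d_transpose_bias(x, w, b)       # one fused custom op
split = conv2d_transpose(x, w) + b           # "rephrased" as two standard ops
assert np.allclose(fused, split)
```

Since both Conv2DTranspose and BiasAdd exist as TFLite builtins, a model rewritten this way should load in the stock Python interpreter without any custom-op registration.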

@smahadwale2001

> When I checked their operations manually, their custom op (Conv2dTransposeBiasAdd) was actually Conv2DTranspose and BiasAdd combined, so I "rephrased" it as separate Conv2DTranspose and BiasAdd ops. That was the simplest solution I found.

Can you share your code?
