Android hand tracking error: use aar in my own project #310
For the aar question, we do rename the model from hand_landmark_3d.tflite to hand_landmark.tflite in the source code when we build the 3d hand tracking demo. It's intended behavior.
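If you bundle the 3d model with your own app, the same rename has to happen in your assets. A minimal sketch of that step (the paths here are illustrative stand-ins, not MediaPipe's actual build logic; it only shows that the file must end up under the fixed name `hand_landmark.tflite` that the demo loads):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class RenameModel {
    // Copy hand_landmark_3d.tflite to the fixed name the demo actually loads.
    static Path renameModel(Path assetsDir) throws IOException {
        Path source = assetsDir.resolve("hand_landmark_3d.tflite");
        Path target = assetsDir.resolve("hand_landmark.tflite");
        return Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
    }

    public static void main(String[] args) throws IOException {
        // Stand-in assets directory with a dummy model file for the sketch.
        Path assets = Files.createTempDirectory("assets");
        Files.write(assets.resolve("hand_landmark_3d.tflite"), new byte[] {0});
        System.out.println(Files.exists(renameModel(assets))); // true
    }
}
```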
For the question regarding the
@jiuqiant Got it. Thank you.
@eknight7 I didn't change the graph, so it is the same as in v0.6.6.
@eknight7 I got the binary graph and aar with the command below:
Okay, from the error message you pasted, it looks like the error is because the packet being pushed into the graph contains a `mediapipe::ImageFrame`, while the graph expects a `mediapipe::GpuBuffer`. Can you share your camera code if that is not the case?
@eknight7 I see, I made a mistake. Actually I would prefer to use the GPU version, but I don't know how to convert the original camera data, or a bitmap, to a TextureFrame or SurfaceTexture.
Converting a bitmap to a TextureFrame or SurfaceTexture is out of scope of MediaPipe framework usage. Can you get the camera data as a SurfaceTexture directly instead of converting a bitmap? What are you using to get the camera data?
I tried to look for the aar_example you mentioned:
I looked through v0.6.6, but I don't see the aar_example or mp_hand_tracking_aar.
@Sara533
Have you solved your problem? I have encountered the same problem, and I am very anxious to solve it.
@liaohong
Thank you. Does your code work? I'm stuck at "nativeMovePacketToInputStream" and unable to get past this function.
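One common cause of failures at this call, and consistent with the timestamp fix mentioned later in the thread, is pushing packets with non-increasing timestamps: MediaPipe requires each packet timestamp on an input stream to be strictly greater than the previous one. This is a hedged guess at the cause, not a confirmed diagnosis; a minimal sketch of a strictly monotonic microsecond timestamp source:

```java
// Sketch: derive each packet timestamp from a monotonic clock and bump it
// by one microsecond if two frames arrive within the same microsecond, so
// timestamps are always strictly increasing.
public class MonotonicTimestamps {
    private long lastUs = 0;

    synchronized long nextTimestampUs() {
        long nowUs = System.nanoTime() / 1000; // microseconds
        if (nowUs <= lastUs) {
            nowUs = lastUs + 1; // enforce strict monotonicity
        }
        lastUs = nowUs;
        return nowUs;
    }

    public static void main(String[] args) {
        MonotonicTimestamps ts = new MonotonicTimestamps();
        long a = ts.nextTimestampUs();
        long b = ts.nextTimestampUs();
        System.out.println(b > a); // true: always strictly increasing
    }
}
```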
I modified my project following the steps above.
Can you share your code?
Hey, if you want to use a Bitmap instead of SurfaceTexture input, then you need to change `fragmentShaderCode`:

```java
private final String fragmentShaderCode =
    // The "#extension GL_OES_EGL_image_external : require" line from the
    // original shader is only needed for samplerExternalOES; a plain sampler2D
    // is used here because the bitmap is uploaded as a regular 2D texture.
    "varying mediump vec2 sample_coordinate;\n"
        + "uniform sampler2D video_frame;\n"
        + "\n"
        + "void main() {\n"
        + "  gl_FragColor = texture2D(video_frame, sample_coordinate);\n"
        + "}";
```

Now you also need to change the implementation of `render`:

```java
public void render(int textureName, Bitmap bmp) {
  GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
  GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
  // Bind the destination texture before uploading the bitmap into it.
  GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureName);
  GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);
  GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
  GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
  GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
  GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
  GLES20.glUseProgram(program);
  GLES20.glUniform1i(frameUniform, 0);
  GLES20.glUniformMatrix4fv(textureTransformUniform, 1, false, textureTransformMatrix, 0);
  ShaderUtil.checkGlError("glUniformMatrix4fv");
  GLES20.glEnableVertexAttribArray(ATTRIB_POSITION);
  GLES20.glVertexAttribPointer(
      ATTRIB_POSITION, 2, GLES20.GL_FLOAT, false, 0, CommonShaders.SQUARE_VERTICES);
  GLES20.glEnableVertexAttribArray(ATTRIB_TEXTURE_COORDINATE);
  GLES20.glVertexAttribPointer(
      ATTRIB_TEXTURE_COORDINATE,
      2,
      GLES20.GL_FLOAT,
      false,
      0,
      flipY ? FLIPPED_TEXTURE_VERTICES : TEXTURE_VERTICES);
  ShaderUtil.checkGlError("program setup");
  GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
  ShaderUtil.checkGlError("glDrawArrays");
}
```

You can call this method from
@afsaredrisy
@Sara533
@afsaredrisy Here is how I set it up:

```java
processor = new FrameProcessor(
renderer = new TextureRenderer();
destinationTextureId = ShaderUtil.createRgbaTexture(previewWidth, previewHeight);
```

When rgbFrameBitmap is available:

Here is the TextureRenderer:

```java
public class TextureRenderer {
}
```
Hello, thank you for your help. I used your code to get the timestamp correctly. Now I want to get the landmark points detected on my hand.
Hi |
@Sanerly
@afsaredrisy
@afsaredrisy Thank you very much for your reply.
@afsaredrisy
@afsaredrisy Thank you so much. You've saved my day.
Using the new version works.
I use my own Android Studio project to get the camera data.
Then I convert it to an ARGB_8888 bitmap.
Finally, I use that bitmap as input to call the public void onNewFrame(final Bitmap bitmap, long timestamp) method in FrameProcessor.class.
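For reference, the camera-data-to-ARGB_8888 step can be sketched without any Android dependency. This assumes the old `android.hardware.Camera` API's default NV21 preview format (an assumption about the setup); on-device, the resulting `int[]` would be fed to `Bitmap.createBitmap(...)`:

```java
// Sketch: convert one NV21 camera preview frame to ARGB_8888 pixel values.
public class Nv21ToArgb {
    static int[] nv21ToArgb(byte[] nv21, int width, int height) {
        int[] argb = new int[width * height];
        int frameSize = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int yVal = nv21[y * width + x] & 0xFF;
                // Interleaved VU plane: one V/U pair covers a 2x2 pixel block.
                int uvIndex = frameSize + (y >> 1) * width + (x & ~1);
                int v = (nv21[uvIndex] & 0xFF) - 128;
                int u = (nv21[uvIndex + 1] & 0xFF) - 128;
                int r = clamp((int) (yVal + 1.370705f * v));
                int g = clamp((int) (yVal - 0.698001f * v - 0.337633f * u));
                int b = clamp((int) (yVal + 1.732446f * u));
                argb[y * width + x] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
        }
        return argb;
    }

    private static int clamp(int c) {
        return c < 0 ? 0 : Math.min(c, 255);
    }

    public static void main(String[] args) {
        int w = 4, h = 4;
        byte[] frame = new byte[w * h * 3 / 2];
        java.util.Arrays.fill(frame, (byte) 128); // uniform mid-gray frame
        int[] px = nv21ToArgb(frame, w, h);
        System.out.println(Integer.toHexString(px[0])); // ff808080 (opaque gray)
    }
}
```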
But I got errors like:

```
E/FrameProcessor: Mediapipe error:
com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors:
Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
    at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
    at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:380)
    at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:286)
    at org.tensorflow.demo.DetectorActivity.processImage(DetectorActivity.java:509)
    at org.tensorflow.demo.CameraActivity.onPreviewFrame(CameraActivity.java:267)
    at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1124)
    at android.os.Handler.dispatchMessage(Handler.java:105)
    at android.os.Looper.loop(Looper.java:164)
    at android.app.ActivityThread.main(ActivityThread.java:6600)
    at java.lang.reflect.Method.invoke(Native Method)
    at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:772)
```
So does it use GPU by default?
How can I solve this problem?
By the way, I found that in the v0.6.6 multi-hand aar example, if I just use hand_landmark_3d.tflite instead of hand_landmark.tflite, it goes wrong: the phone shows a black window. If I rename hand_landmark_3d.tflite to hand_landmark.tflite, I get the correct 3d result.
It seems the code only loads a tflite file named hand_landmark.tflite.
I'm not sure whether this is a problem.
Can anybody help? Thank you.