Android hand tracking error: use aar in my own project #310

Closed
Sara533 opened this issue Dec 10, 2019 · 28 comments
Labels
platform:android Issues with Android as Platform

Comments

@Sara533

Sara533 commented Dec 10, 2019

I use my own Android Studio project to get the camera data, and then transform it into an ARGB_8888 bitmap.
Finally, I use that bitmap as input to call the public void onNewFrame(final Bitmap bitmap, long timestamp) method in FrameProcessor.class.
But I get errors like:
demo E/FrameProcessor: Mediapipe error: com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors: Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:380)
at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:286)
at org.tensorflow.demo.DetectorActivity.processImage(DetectorActivity.java:509)
at org.tensorflow.demo.CameraActivity.onPreviewFrame(CameraActivity.java:267)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1124)
at android.os.Handler.dispatchMessage(Handler.java:105)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6600)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.Zygote$MethodAndArgsCaller.run(Zygote.java:240)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:772)
So does it use the GPU by default?
How can I solve this problem?
By the way, I found that in the v0.6.6 multi-hand AAR example, if I just supply hand_landmark_3d.tflite instead of hand_landmark.tflite, it goes wrong: the phone shows a black window. If I rename hand_landmark_3d.tflite to hand_landmark.tflite, I get the correct 3D result.
It seems the code only loads a tflite file named hand_landmark.tflite.
I'm not sure whether that is a problem.
Can anybody help? Thank you.

@jiuqiant
Contributor

jiuqiant commented Dec 10, 2019

For the AAR question: we do rename the model from hand_landmark_3d.tflite to hand_landmark.tflite in the source code when we build the 3D hand tracking demo. This is intended behavior.

@eknight7

For the question regarding the FrameProcessor, can you share your graph, @Sara533?

@Sara533
Author

Sara533 commented Dec 11, 2019

@jiuqiant Got it. Thank you.

@Sara533
Author

Sara533 commented Dec 11, 2019

@eknight7 I didn't change the graph, so it was the same as V0.6.6.

@Sara533
Author

Sara533 commented Dec 11, 2019

@eknight7 I generated the binary graph and AAR with the commands below:
bazel build -c opt --config=android_arm64 --define 3D=true mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu:binary_graph
bazel build -c opt --fat_apk_cpu=arm64-v8a,armeabi-v7a //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:mp_hand_tracking_aar

@eknight7

Okay, from the error message you pasted, i.e.:

demo E/FrameProcessor: Mediapipe error: com.google.mediapipe.framework.MediaPipeException: invalid argument: Graph has errors: Packet type mismatch on calculator outputting to stream "input_video": The Packet stores "mediapipe::ImageFrame", but "mediapipe::GpuBuffer" was requested.
at com.google.mediapipe.framework.Graph.nativeMovePacketToInputStream(Native Method)
at com.google.mediapipe.framework.Graph.addConsumablePacketToInputStream(Graph.java:380)
at com.google.mediapipe.components.FrameProcessor.onNewFrame(FrameProcessor.java:286)
at org.tensorflow.demo.DetectorActivity.processImage(DetectorActivity.java:509)
at org.tensorflow.demo.CameraActivity.onPreviewFrame(CameraActivity.java:267)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1124) at

it looks like the error is because the packet being pushed into the graph contains a mediapipe::ImageFrame, whereas the graph expects it to contain a mediapipe::GpuBuffer. In your custom camera code, are you passing the image to FrameProcessor via a GPU texture container like SurfaceTexture?

Can you share your camera code if that is not the case?

eknight7 added the platform:android (Issues with Android as Platform) label Dec 11, 2019
@Sara533
Author

Sara533 commented Dec 11, 2019

@eknight7 I see. I made a mistake.
There are two methods with the same name in FrameProcessor:
public void onNewFrame(final TextureFrame frame)
public void onNewFrame(final Bitmap bitmap, long timestamp)
I used the second one with the bitmap, but I didn't switch the binary graph and AAR to the CPU version, so I got the errors above.

Actually I would prefer to use the GPU version, but I don't know how to convert the original camera data or a bitmap into a TextureFrame or SurfaceTexture.

@eknight7

eknight7 commented Dec 15, 2019

Converting a bitmap to a TextureFrame or SurfaceTexture is outside the scope of MediaPipe framework usage. Can you get the camera data as a SurfaceTexture directly instead of converting a bitmap? What are you using to get the camera data?
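
For reference, here is a minimal sketch of the SurfaceTexture-based GPU input path used by the stock MediaPipe Android example apps (CameraXPreviewHelper + ExternalTextureConverter feeding the FrameProcessor). The graph/stream name constants and previewWidth/previewHeight are placeholders; check the exact wiring against the example's MainActivity:

import android.graphics.SurfaceTexture;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.CameraXPreviewHelper;
import com.google.mediapipe.components.ExternalTextureConverter;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.glutil.EglManager;

// In onCreate(): set up the graph runner.
eglManager = new EglManager(null);
processor = new FrameProcessor(
        this,
        eglManager.getNativeContext(),
        BINARY_GRAPH_NAME,          // placeholder constant
        INPUT_VIDEO_STREAM_NAME,    // e.g. "input_video"
        OUTPUT_VIDEO_STREAM_NAME);

// In onResume(): the converter turns the camera's SurfaceTexture into GpuBuffer
// packets and pushes them into the FrameProcessor.
converter = new ExternalTextureConverter(eglManager.getContext());
converter.setFlipY(true);
converter.setConsumer(processor);   // FrameProcessor implements TextureFrameConsumer

cameraHelper = new CameraXPreviewHelper();
cameraHelper.setOnCameraStartedListener(
        (SurfaceTexture surfaceTexture) -> {
            // Attach the camera's SurfaceTexture to the converter's GL context.
            converter.setSurfaceTextureAndAttachToGLContext(
                    surfaceTexture, previewWidth, previewHeight);
        });
cameraHelper.startCamera(this, CameraHelper.CameraFacing.FRONT, /*surfaceTexture=*/ null);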

@billhung

@Sara533

I tried to look for the aar_example you mentioned:

bazel build -c opt --fat_apk_cpu=arm64-v8a,armeabi-v7a //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:mp_hand_tracking_aar

I looked through v0.6.6, but I don't see the aar_example or mp_hand_tracking_aar.
Can you tell me where to find it please?

@billhung

@Sara533
Thanks for the links and quick reply. I will try those.

@liao00

liao00 commented Dec 19, 2019

(Quoting @Sara533's original issue description above.)

Have you solved your problem? I have encountered the same problem, and I am very anxious to solve it.

@Sara533
Author

Sara533 commented Dec 19, 2019

@liaohong
I tried transforming a bitmap into a texture using the function in ShaderUtil.class named:
public static int createRgbaTexture(Bitmap bitmap).
Then I used that texture to construct an AppTextureFrame, which implements TextureFrame.
Finally I called public void onNewFrame(final TextureFrame frame) in FrameProcessor.class.
But now I have a new problem: I get empty results for handPresence and handLandmarks.
I'm not sure whether the approach above is right.
I'm still checking.

@liao00

liao00 commented Dec 20, 2019

(Quoting @Sara533's reply above.)

Thank you. Does your code work? I'm unable to get past "nativeMovePacketToInputStream"; that call fails for me.

@Sara533
Author

Sara533 commented Dec 26, 2019

(Quoting my earlier comment above.)

I modified my project following the steps above.
The program builds and runs successfully, but the results are not OK.
The log always shows "Hand presence is false, no hands detected",
and the hand landmark values are always the initial values.
The bitmap displays correctly, and there are no errors in the log.
I don't know what's wrong. Can anybody help me?

@liao00

liao00 commented Dec 26, 2019

(Quoting @Sara533's update above.)

Can you share your code?

@afsaredrisy

afsaredrisy commented Dec 26, 2019

Hey, if you want to use a Bitmap instead of SurfaceTexture input, then you need to change samplerExternalOES to sampler2D. Here is the fragment shader code you can use in ExternalTextureRenderer:
...

 private final String fragmentShaderCode =
            "#extension GL_OES_EGL_image_external : require\n"+
                    "varying mediump vec2 sample_coordinate;\n"
                    + "uniform sampler2D video_frame;\n"
                    + "\n"
                    + "void main() {\n"
                    + "  gl_FragColor = texture2D(video_frame, sample_coordinate);\n"
                    + "}";

Now you also need to change the implementation of the render(SurfaceTexture surfaceTexture) method of ExternalTextureRenderer, like this:

public void render(int textureName, Bitmap bmp){
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp
                , 0);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glUseProgram(program);
        GLES20.glUniform1i(frameUniform, 0);
        GLES20.glUniformMatrix4fv(textureTransformUniform, 1, false, textureTransformMatrix, 0);
        ShaderUtil.checkGlError("glUniformMatrix4fv");
        GLES20.glEnableVertexAttribArray(ATTRIB_POSITION);
        GLES20.glVertexAttribPointer(
                ATTRIB_POSITION, 2, GLES20.GL_FLOAT, false, 0, CommonShaders.SQUARE_VERTICES);
        GLES20.glEnableVertexAttribArray(ATTRIB_TEXTURE_COORDINATE);
        GLES20.glVertexAttribPointer(
                ATTRIB_TEXTURE_COORDINATE,
                2,
                GLES20.GL_FLOAT,
                false,
                0,
                flipY ? FLIPPED_TEXTURE_VERTICES : TEXTURE_VERTICES);
        ShaderUtil.checkGlError("program setup");
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureName);
        ShaderUtil.checkGlError("glDrawArrays");
    }

You can call this method from ExternalTextureConverter as:
renderer.render(outputFrames[0].getTextureName(), bitmapObject);

@Sara533
Author

Sara533 commented Dec 30, 2019

@afsaredrisy
Hi, thanks for your reply.
I did what you said above, but I still get hand presence false and hand landmark xyz values of (0.0, 0.0, 0.0).
It seems that the function public void render(int textureName, Bitmap bmp) renders the bitmap to a texture.
About the first step: besides the fragment shader, does the vertex shader need to change as well?

@afsaredrisy

afsaredrisy commented Dec 30, 2019

@Sara533
Yes, it renders the Bitmap into the texture of the framebuffer. Can you please share your TextureConverter and TextureRenderer code? Then the problem can be addressed.
The vertex shader doesn't need to change.

@Sara533
Author

Sara533 commented Dec 31, 2019

@afsaredrisy
Here is my code:

processor = new FrameProcessor(
        this,
        eglManager.getNativeContext(),
        BINARY_GRAPH_NAME,
        INPUT_VIDEO_STREAM_NAME,
        OUTPUT_VIDEO_STREAM_NAME);

renderer = new TextureRenderer();
renderer.setFlipY(FLIP_FRAMES_VERTICALLY);
renderer.setup();

destinationTextureId = ShaderUtil.createRgbaTexture(previewWidth, previewHeight);
outputFrame = new AppTextureFrame(destinationTextureId, previewWidth, previewHeight);

When rgbFrameBitmap is available:

renderer.render(outputFrame.getTextureName(), rgbFrameBitmap);
outputFrame.setTimestamp(timestamp);
outputFrame.setInUse();
processor.onNewFrame(outputFrame);

Here is the TextureRenderer:

public class TextureRenderer {
private static final FloatBuffer TEXTURE_VERTICES = ShaderUtil.floatBuffer(new float[]{0.0F, 0.0F, 1.0F, 0.0F, 0.0F, 1.0F, 1.0F, 1.0F});
private static final FloatBuffer FLIPPED_TEXTURE_VERTICES = ShaderUtil.floatBuffer(new float[]{0.0F, 1.0F, 1.0F, 1.0F, 0.0F, 0.0F, 1.0F, 0.0F});
private static final String TAG = "TextureRenderer";
private static final int ATTRIB_POSITION = 1;
private static final int ATTRIB_TEXTURE_COORDINATE = 2;
private int program = 0;
private int frameUniform;
private int textureTransformUniform;
private float[] textureTransformMatrix = new float[16];
private boolean flipY;

private final String vertexShaderCode =
        "uniform mat4 texture_transform;\n"
                + "attribute vec4 position;\n"
                + "attribute mediump vec4 texture_coordinate;\n"
                + "varying mediump vec2 sample_coordinate;\n"
                + "\n"
                + "void main() {\n"
                + "  gl_Position = position;\n"
                + "  sample_coordinate = (texture_transform * texture_coordinate).xy;\n"
                + "}";

private final String fragmentShaderCode =
        "#extension GL_OES_EGL_image_external : require\n"
                + "varying mediump vec2 sample_coordinate;\n"
                + "uniform sampler2D video_frame;\n"
                + "\n"
                + "void main() {\n"
                + "  gl_FragColor = texture2D(video_frame, sample_coordinate);\n"
                + "}";

public TextureRenderer() {
}

public void setup() {
    Map<String, Integer> attributeLocations = new HashMap();
    attributeLocations.put("position", 1);
    attributeLocations.put("texture_coordinate", 2);
    this.program = ShaderUtil.createProgram(vertexShaderCode, fragmentShaderCode, attributeLocations);
    this.frameUniform = GLES20.glGetUniformLocation(this.program, "video_frame");
    this.textureTransformUniform = GLES20.glGetUniformLocation(this.program, "texture_transform");
    ShaderUtil.checkGlError("glGetUniformLocation");
}

public void setFlipY(boolean flip) {
    this.flipY = flip;
}

public void render(int textureName, Bitmap bmp) {
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
    ShaderUtil.checkGlError("glActiveTexture");
    GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bmp, 0);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER,GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER,GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    ShaderUtil.checkGlError("glTexParameteri");
    GLES20.glUseProgram(program);
    ShaderUtil.checkGlError("glUseProgram");
    GLES20.glUniform1i(frameUniform, 0);
    ShaderUtil.checkGlError("glUniform1i");
    GLES20.glUniformMatrix4fv(textureTransformUniform, 1, false, textureTransformMatrix, 0);
    ShaderUtil.checkGlError("glUniformMatrix4fv");
    GLES20.glEnableVertexAttribArray(ATTRIB_POSITION);
    GLES20.glVertexAttribPointer(
            ATTRIB_POSITION, 2, GLES20.GL_FLOAT, false, 0, CommonShaders.SQUARE_VERTICES);
    GLES20.glEnableVertexAttribArray(ATTRIB_TEXTURE_COORDINATE);
    GLES20.glVertexAttribPointer(
            ATTRIB_TEXTURE_COORDINATE,
            2,
            GLES20.GL_FLOAT,
            false,
            0,
            flipY ? FLIPPED_TEXTURE_VERTICES : TEXTURE_VERTICES);
    ShaderUtil.checkGlError("program setup");
    GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
    ShaderUtil.checkGlError("glDrawArrays");
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureName);
    ShaderUtil.checkGlError("glBindTexture");
    GLES20.glFinish();
}

public void release() {
    GLES20.glDeleteProgram(this.program);
}

}

@Sanerly

Sanerly commented Dec 31, 2019

@afsaredrisy
(Quoting @Sara533's code above.)

Hello, thank you for your help. I used your code and got the timestamp correctly. Now I want to get the landmark nodes detected on my hand.
[screenshot]
I don't know how to parse the "Packet".

@afsaredrisy

afsaredrisy commented Dec 31, 2019

Hi,
Have a look at this:
https://github.com/afsaredrisy/MediapipeHandtracking_GPU_Bitmap_Input
I uploaded the same work with Bitmap input.

@afsaredrisy

@Sanerly
Are you looking for this?

// This callback parses the landmarks packet. The output stream name constant below is
// an assumption; use whatever stream name you registered for the landmarks output
// (e.g. "hand_landmarks" in the hand tracking example).
processor.addPacketCallback(
    OUTPUT_HAND_LANDMARKS_STREAM_NAME,
    (packet) -> {
        byte[] landmarksRaw = PacketGetter.getProtoBytes(packet);
        try {
            NormalizedLandmarkList landmarks = NormalizedLandmarkList.parseFrom(landmarksRaw);
            if (landmarks == null) {
                Log.d(TAG, "[TS:" + packet.getTimestamp() + "] No hand landmarks.");
                return;
            }
            // Note: If hand_presence is false, these landmarks are useless.
            Log.d(
                TAG,
                "[TS:"
                    + packet.getTimestamp()
                    + "] #Landmarks for hand: "
                    + landmarks.getLandmarkCount());
            Log.d(TAG, getLandmarksDebugString(landmarks));
        } catch (Exception e) {
            Log.e(TAG, "Couldn't parse landmarks packet - " + e);
            return;
        }
    });

@Sara533
Author

Sara533 commented Jan 3, 2020

@afsaredrisy
Thanks a lot for your help. I have solved my problem.

@Sanerly

Sanerly commented Jan 4, 2020

@afsaredrisy Thank you very much for your reply

@Sanerly

Sanerly commented Jan 4, 2020

@afsaredrisy
[screenshot]
Thanks for your help last time. Now I want to recognize gestures such as "OK" and "Rock" based on the landmarks. How do I get each landmark's x and y?
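
For reference, a minimal sketch of reading each landmark's normalized x/y from the NormalizedLandmarkList parsed in the callback above, plus a crude thumb-to-index distance check that could be a starting point for an "OK"-style gesture. The landmark indices follow the MediaPipe hand model (4 = thumb tip, 8 = index fingertip); the helper name and threshold are illustrative only:

import com.google.mediapipe.formats.proto.LandmarkProto.NormalizedLandmark;
import com.google.mediapipe.formats.proto.LandmarkProto.NormalizedLandmarkList;

// x and y on each landmark are normalized to [0, 1] relative to the image
// width and height; z is a relative depth value.
static boolean looksLikeOkGesture(NormalizedLandmarkList landmarks) {
    NormalizedLandmark thumbTip = landmarks.getLandmark(4);  // thumb tip
    NormalizedLandmark indexTip = landmarks.getLandmark(8);  // index fingertip
    float dx = thumbTip.getX() - indexTip.getX();
    float dy = thumbTip.getY() - indexTip.getY();
    double distance = Math.sqrt(dx * dx + dy * dy);
    // Crude heuristic: thumb tip and index fingertip close together.
    // The 0.05 threshold is arbitrary and needs tuning for your setup.
    return distance < 0.05;
}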

mgyong closed this as completed Jan 6, 2020
@billhung

@afsaredrisy Thank you so much. You've saved my day.
Much appreciation for spending your New Year's Eve answering questions here too. You are a coding hero who deserves my tremendous respect.

@putdoor

putdoor commented Mar 4, 2023

Using the new version worked for me.
