
hand tracking, desktop version #56

Closed
omnishore1 opened this issue Aug 26, 2019 · 8 comments

@omnishore1

Hi, is there a desktop version? I followed some instructions for desktop, but some files are missing:

ERROR: Skipping 'mediapipe/examples/desktop/hand_tracking:hand_tracking_tflite': no such package 'mediapipe/examples/desktop/hand_tracking': BUILD file not found in any of the following directories.

Is there any solution for Windows desktop?
Thanks

@mgyong mgyong assigned mgyong and unassigned mgyong Aug 26, 2019
@mgyong

mgyong commented Aug 26, 2019

@omnishore1 There are only Android & iOS versions of hand tracking. We do not have a desktop [Linux] example for hand tracking yet. We plan to release one shortly and will update this issue when we do.
Regarding Windows, please refer to this comment: #44

@mgyong mgyong closed this as completed Aug 26, 2019
@mgyong mgyong added the platform:desktop desktop label Aug 27, 2019
@LeviWadd

LeviWadd commented Sep 3, 2019

Any idea when this might land (days/weeks/months), @mgyong? It would be really cool if we could have the .pb files for hand_tracking, instead of just the .tflite.

@solarjoe

I am also very interested in hand tracking for desktop. Is there a way to get this running using, e.g., //mediapipe/models:hand_landmark.tflite and //mediapipe/models:palm_detection.tflite from the Android example?

@LeviWadd

@solarjoe You can get the hand_landmark.tflite model running in Linux by using an interpreter. Not sure if it's possible to get it running in Windows. The palm_detection model is a little more tricky, as you need to provide custom ops. Looking at the git history, it seems the TF team took steps toward making this easy, but reverted the changes.

@solarjoe

solarjoe commented Sep 19, 2019

Thanks! And I found https://github.com/wolterlw/hand_tracking and #62

@solarjoe

@LeviWadd, what do you mean by interpreter? TensorFlow? Can you point me to some sample code?

@LeviWadd

LeviWadd commented Sep 19, 2019

@solarjoe Thanks for dropping the link to that repo here. That looks very useful.

Sure, I mean something like this (EDIT: the interpreter has since moved to tf.lite.Interpreter, so the snippet below uses that path):

import numpy as np
import tensorflow as tf

# Load the TFLite model and allocate tensors.
interpreter = tf.lite.Interpreter(model_path="converted_model.tflite")
interpreter.allocate_tensors()

# Get input and output tensor details.
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Test the model on random input data.
input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], input_data)

interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]['index'])
print(output_data)

Taken from Nupur Garg's answer on https://stackoverflow.com/questions/50902067/how-to-import-the-tensorflow-lite-interpreter-in-python

It's also worth mentioning that if you play around with the indices of the input/output details, you can get the probability that a hand is present in the supplied image.
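To make the "play around with the indices" part concrete, here is a minimal sketch of how one might locate the hand-presence score among the interpreter's outputs. The dicts below only mimic the structure that `tf.lite.Interpreter.get_output_details()` returns; the tensor names and shapes are illustrative assumptions, not taken from the actual model, and the real ordering is model-specific, so inspect your own model's details.

```python
def find_scalar_output(output_details):
    """Return the tensor index of the first output holding a single
    element, which for a model like this would be the hand-presence
    probability (landmark outputs have many more elements)."""
    for detail in output_details:
        # Count the total number of elements in this output tensor.
        n_elements = 1
        for dim in detail['shape']:
            n_elements *= dim
        if n_elements == 1:
            return detail['index']
    return None

# Mocked output details, shaped like get_output_details() results
# (names/shapes are hypothetical):
fake_details = [
    {'name': 'landmarks', 'shape': [1, 63], 'index': 0},
    {'name': 'hand_flag', 'shape': [1, 1], 'index': 1},
]
print(find_scalar_output(fake_details))  # -> 1
```

With a real model you would pass `interpreter.get_output_details()` instead of `fake_details`, then read the score with `interpreter.get_tensor(...)` after `invoke()`.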

@mgyong

mgyong commented Oct 2, 2019

@omnishore1 @solarjoe We have just released v0.6.2 that has hand tracking desktop example. Pls check it out!
