The application uses a Leap Motion controller to track the user's hand and applies machine learning to convert standard American Sign Language (ASL) hand gestures to text and audio.
Given the provided training set, the team demonstrated correct gesture predictions for 24 letters and one phrase over 90% of the time.
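The core of such a pipeline is a classifier that maps a hand-feature vector (e.g. fingertip and palm positions from a Leap Motion frame) to a gesture label. The sketch below is a minimal, hypothetical illustration of that idea, not the team's actual model: it fabricates synthetic feature vectors for two ASL letters and fits a k-nearest-neighbors classifier, a common baseline for this kind of gesture recognition.

```python
# Hypothetical sketch of the gesture-to-text step. Real input would be
# feature vectors extracted from Leap Motion frames; here we fabricate
# small synthetic clusters for two gestures, 'A' and 'B'.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Each row: a flattened 15-dimensional feature vector
# (e.g. five fingertip x/y/z coordinates).
gesture_a = rng.normal(loc=0.0, scale=0.1, size=(20, 15))
gesture_b = rng.normal(loc=1.0, scale=0.1, size=(20, 15))
X = np.vstack([gesture_a, gesture_b])
y = ["A"] * 20 + ["B"] * 20

clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X, y)

# A new frame near the 'B' cluster should be labelled 'B';
# the predicted letter would then be passed to text and audio output.
sample = np.full((1, 15), 1.0)
predicted = clf.predict(sample)[0]
print(predicted)
```

In the full application, the predicted letter would be appended to the output text and fed to a text-to-speech engine; the actual feature set, model, and training data used by the team are not specified here.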