Follow these instructions to install MediaPipe, and see the mobile, desktop, web, and Google Coral examples.
We will soon add documentation explaining the changes we made to this framework. In the meantime, see MediaPipe Read-the-Docs or docs.mediapipe.dev.
Check out the Examples page for tutorials on how to use MediaPipe, and the Concepts page for basic definitions.
A web-based visualizer is hosted at viz.mediapipe.dev; please also see the accompanying instructions.
Search the MediaPipe GitHub repository using Google Open Source code search.
This project is part of a senior design project at Purdue University. My team and I are building an ASL translator using the MediaPipe framework. So far we have been able to translate ASL letters; next we will attempt to interpret ASL words as well.
"MediaPipe has made it extremely easy to track the user's hands. We take the tracking data and interpret it as the ASL letter it most likely represents."
- Hand Tracking: demonstrates the hand-tracking solution MediaPipe already provides.
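As a rough sketch of how the tracking output feeds a letter classifier: MediaPipe Hands reports 21 landmarks per detected hand, each with x, y, z coordinates. One common approach (shown below as an illustration, not our exact implementation) is to flatten those landmarks into a feature vector, translated so the wrist landmark is the origin, before handing it to a classifier. The landmark count is MediaPipe's; the function name and feature scheme here are hypothetical.

```python
# Sketch: turning MediaPipe hand landmarks into a classifier feature vector.
# MediaPipe Hands emits 21 landmarks per hand (landmark 0 is the wrist);
# the helper and feature layout below are illustrative only.

NUM_LANDMARKS = 21  # fixed by the MediaPipe hand model


def landmarks_to_features(landmarks):
    """Flatten 21 (x, y, z) landmarks into a 63-element feature vector,
    translated so the wrist (landmark 0) sits at the origin. This makes
    the features invariant to where the hand appears in the frame."""
    if len(landmarks) != NUM_LANDMARKS:
        raise ValueError(f"expected {NUM_LANDMARKS} landmarks, got {len(landmarks)}")
    wx, wy, wz = landmarks[0]
    features = []
    for x, y, z in landmarks:
        features.extend([x - wx, y - wy, z - wz])
    return features


# Toy usage with synthetic landmarks; a real pipeline would read them from
# results.multi_hand_landmarks after calling hands.process(frame).
fake_hand = [(0.5 + 0.01 * i, 0.5, 0.0) for i in range(NUM_LANDMARKS)]
vec = landmarks_to_features(fake_hand)
print(len(vec))  # 63
```

The wrist-relative translation is the simplest normalization; a production classifier would likely also scale by hand size so near and far hands produce comparable vectors.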
GIFs for each category come from the ASL-LEX database: https://asl-lex.github.io/asl-lex/index.html
The architecture is based on this repo: https://github.com/Tachionstrahl/SignLanguageRecognition
We converted the model to run on the CPU, extended it to translate ASL phrases, and made it work on macOS.