


Google Open-Sources Hand Tracking AI "MediaPipe" for Smartphones


Google open-sourced an AI capable of recognizing hand shapes and motions in real time earlier this week. This move by Google will help a lot of aspiring developers add gesture recognition capabilities to their apps.

The software giant first showed off the feature at the Computer Vision and Pattern Recognition (CVPR) 2019 conference, which took place in June. The source code for the AI is now available on GitHub, which you can check out from here. You may also download the arm64 APK here, and a version with 3D mode here.

MediaPipe is a cross-platform framework for building pipelines that process perceptual data in different modalities, such as audio and video. Hand tracking is made possible by applying machine learning techniques to identify 21 3D keypoints of a hand from a single image frame.
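To get a feel for what those 21 keypoints look like in practice, here is a minimal sketch using MediaPipe's Python bindings. Note that the Python "Hands" solution API shipped after this announcement, so class names and default parameters may differ between versions; treat this as an illustration rather than the exact interface Google released.

```python
# Minimal sketch: reading the 21 3D hand keypoints with MediaPipe's Python
# bindings (the Hands solution API, released after this announcement --
# names and defaults may vary by version).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Each detected hand comes back as 21 landmarks with
            # normalized x, y coordinates and a relative depth z.
            for hand in results.multi_hand_landmarks:
                for i, lm in enumerate(hand.landmark):
                    print(i, lm.x, lm.y, lm.z)
cap.release()
```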

"The ability to perceive the shape and motion of easily can be a vital component in improving the user experience across a variety of technological domains and platforms," reads Google AI blog post.

Google employs three AI models in MediaPipe, which it calls BlazePalm, the hand landmark model, and the gesture recognizer. The palm detector model (BlazePalm) analyzes a frame and returns an oriented hand bounding box, the hand landmark model returns 3D hand keypoints from the cropped image region, and the gesture recognizer classifies the previously computed keypoint configuration into a set of gestures.

[Image: Google's palm detector]

The coolest part of this hand-tracking AI is its ability to identify gestures. Researchers say that the AI is able to recognize common hand signs like "Thumbs up", closed fist, "OK", "Rock", and "Spiderman". Pretty cool, right? Take a look at the GIF below to watch the AI in action.
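For illustration, here is a toy, rule-based take on that last stage: deciding which fingers are extended from the 21 keypoints and mapping the result to a gesture name. This is an assumption-laden sketch, not Google's trained recognizer; it assumes the standard MediaPipe landmark indexing (wrist at 0, fingertips at 4, 8, 12, 16, 20) and uses a crude distance heuristic rather than whatever classification Google actually ships.

```python
# Illustrative only: a naive rule-based gesture classifier over the 21
# keypoints. The indices follow MediaPipe's standard hand landmark layout;
# the thresholds and gesture mapping below are assumptions for demonstration.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) landmark

FINGER_TIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"thumb": 3, "index": 6, "middle": 10, "ring": 14, "pinky": 18}


def _dist(a: Point, b: Point) -> float:
    # Compare in the image plane only; z is a relative depth estimate.
    return math.dist(a[:2], b[:2])


def extended_fingers(pts: List[Point]) -> set:
    """A finger counts as extended if its tip lies farther from the wrist
    than its middle joint does."""
    wrist = pts[0]
    return {name for name in FINGER_TIPS
            if _dist(pts[FINGER_TIPS[name]], wrist) > _dist(pts[FINGER_PIPS[name]], wrist)}


def classify_gesture(pts: List[Point]) -> str:
    up = extended_fingers(pts)
    if not up:
        return "fist"
    if up == {"thumb"}:
        return "thumbs up"
    if up == {"index", "pinky"}:
        return "rock"
    if up == {"thumb", "index", "pinky"}:
        return "spiderman"
    return "unknown"
```

In practice you would feed `classify_gesture` the landmark list produced by the earlier sketch; a real recognizer would also smooth results over several frames instead of judging a single one.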

"We believe that publishing this technology can give an impulse to new creative ideas and applications by the members of the enquiry and developer community at large.", wrote Valentin Bazarevsky and Fan Zhang, Inquiry Engineers at Google.

The future goals of the researchers at Google AI are to enhance the functionality and efficiency of the AI. This might include extended support for more gestures, faster and more accurate tracking, and support for dynamic gestures.

Source: https://beebom.com/google-open-sources-hand-tracking-ai/
