Google Open-Sources Real-Time Gesture Recognition Tech

Google has used computer vision and machine learning to develop a better way to perceive hand shapes and motions in real time, for use in gesture control systems, sign language recognition and augmented reality. The result is the ability to infer 21 3D keypoints of a hand (or multiple hands) from a single frame on a mobile phone. Google, which demonstrated the technique at the 2019 Conference on Computer Vision and Pattern Recognition, has also published the source code and a complete usage scenario on GitHub.
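
For readers who want to experiment, the open-sourced hand tracking ships as part of Google's MediaPipe framework. The minimal sketch below assumes the later-released MediaPipe Python package and OpenCV with an attached webcam; it prints the 21 landmarks (normalized x, y plus a relative depth z) for each hand detected in every frame.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # default webcam
with mp_hands.Hands(max_num_hands=2,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Each detected hand yields 21 landmarks with normalized
                # x, y coordinates and a relative depth value z.
                for lm in hand.landmark:
                    print(lm.x, lm.y, lm.z)
cap.release()
```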

Google, Infineon Prototype New Gesture Tracking Technology

A new generation of chips is making gesture tracking more accurate. German company Infineon Technologies AG has paired its radar chips with Google's algorithms to create Soli technology, enabling devices to detect smaller gestures from several meters away. The first devices with Soli technology, presented at the Google I/O developer conference, are prototypes of an LG Electronics smartwatch and a Harman Kardon loudspeaker. Recognizable gestures include subtle hand movements, such as the motion used to wind a watch.
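
Google has not published a general Soli programming interface, so the sketch below is purely a hypothetical illustration of the underlying idea: small hand motions modulate the Doppler spectrum of a reflected radar signal, and a simple classifier can tell those modulations apart. The signal model, gesture labels and nearest-centroid classifier are invented stand-ins, not Soli's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)  # one second of simulated radar samples

def doppler_features(samples, n_bins=32):
    """Magnitude spectrum of the received signal: a crude Doppler profile."""
    spectrum = np.abs(np.fft.rfft(samples))
    return spectrum[:n_bins] / (np.linalg.norm(spectrum[:n_bins]) + 1e-9)

def simulate(freq):
    """Simulated return signal: a motion at `freq` Hz plus receiver noise."""
    return np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)

# Simulated training data: a slow "swipe" vs. a faster "crown-wind" motion.
train = {"swipe": [doppler_features(simulate(4)) for _ in range(20)],
         "wind":  [doppler_features(simulate(20)) for _ in range(20)]}
centroids = {label: np.mean(feats, axis=0) for label, feats in train.items()}

def classify(samples):
    """Nearest-centroid classification over Doppler features."""
    f = doppler_features(samples)
    return min(centroids, key=lambda label: np.linalg.norm(f - centroids[label]))

print(classify(simulate(19)))  # expected to report "wind"
```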

Myriad Applications Envisioned for Facial Recognition Tech

New technology allows computers to recognize facial expressions, even the most subtle and fleeting ones. Using frame-by-frame video analysis, software can read the muscular changes in people's faces that signal a range of emotions. Many predict such software will be used via computer webcams to gauge how users respond to content such as games or videos, and to tailor offerings to those users' perceived needs and desires.
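
As a rough illustration of the frame-by-frame loop such software runs, the sketch below applies OpenCV's stock face detector to webcam frames and calls a placeholder classify_expression() function. That placeholder stands in for whatever proprietary expression model a real product would use; it is not part of any system named in the article.

```python
import cv2

# OpenCV ships this pretrained frontal-face Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_roi):
    """Placeholder: a real system would infer emotion from facial muscle movements."""
    return "neutral"

cap = cv2.VideoCapture(0)  # webcam, as in the viewer-response use case
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Analyze every frame: detect faces, then rate each one's expression.
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        label = classify_expression(gray[y:y + h, x:x + w])
        print("frame expression:", label)
cap.release()
```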