
Tuesday 9 May 2017

Google's Project Soli to take gestures to another level

Google is living in the future right now. They're in the middle of developing a new interaction sensor that uses radar technology to track movement with crazy accuracy. It's only the size of a small computer chip, so it can be embedded into the everyday objects we use.

Project Soli is that wearable, but it's not the wearable you might think it is. It's not a watch; it's you.
Google ATAP knows your hand is the best tool you have for interacting with devices, but not everything is a device. Project Soli wants to make your hands and fingers the only user interface you'll ever need.

To make that happen, Project Soli is essentially a radar small enough to fit into a wearable like a smartwatch. The tiny radar picks up on your hand movements in real time, and the motions you make alter the signal it receives back.
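
To get a rough feel for how much a moving hand alters that signal, here is a quick back-of-the-envelope sketch in Python. It assumes the 60 GHz band Google has cited for Soli and the standard two-way radar Doppler formula; the hand speed is just an illustrative number, not a Soli specification.

C = 3.0e8          # speed of light, m/s
CARRIER_HZ = 60e9  # Soli's radar operates around 60 GHz

def doppler_shift(velocity_mps, carrier_hz=CARRIER_HZ):
    """Two-way Doppler shift for a target moving toward the radar."""
    wavelength = C / carrier_hz        # roughly 5 mm at 60 GHz
    return 2.0 * velocity_mps / wavelength

# A fingertip creeping along at 10 cm/s already shifts the echo by ~40 Hz,
# which stands out clearly against the nearly static background.
print(doppler_shift(0.10))  # -> 40.0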

Even at rest, the hand is moving slightly, and that shows up as a baseline response on the radar. Moving the hand away from the radar or side to side changes the signal and its amplitude. Making a fist or crossing your fingers changes the signal too.
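
A minimal sketch of that idea, assuming we already have a stream of per-frame amplitude samples from the sensor; the window size and threshold below are made-up illustration values, not figures from Google.

import numpy as np

def detect_motion(amplitudes, baseline_window=64, threshold=3.0):
    """Flag frames whose amplitude strays noticeably from the resting baseline."""
    baseline = amplitudes[:baseline_window]   # "hand at rest" frames
    mean, std = baseline.mean(), baseline.std()
    # Even a resting hand jitters a little, so std is non-zero;
    # a real gesture pushes samples well outside that band.
    return np.abs(amplitudes - mean) > threshold * std

# Fake data: a resting hand, then a swipe that swings the amplitude.
rest = np.random.normal(1.0, 0.02, 200)
swipe = np.random.normal(1.5, 0.10, 50)
print(detect_motion(np.concatenate([rest, swipe]))[200:].mean())  # close to 1.0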

To make that signal meaningful to an app or service, ATAP will provide APIs that tap into Project Soli's deep machine learning.
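
The SDK wasn't public at the time of writing, so the snippet below is only a guess at the shape such an API might take: a made-up FakeSoliSensor class standing in for the real radar-plus-machine-learning pipeline, with apps registering a callback per gesture instead of touching raw radar data.

# Hypothetical sketch only -- FakeSoliSensor stands in for whatever API
# ATAP eventually ships; it is not the real SDK.
class FakeSoliSensor:
    def __init__(self):
        self._handlers = {}

    def on_gesture(self, name, handler):
        # An app registers one callback per named gesture.
        self._handlers[name] = handler

    def _emit(self, name, confidence):
        # In the real pipeline this would be driven by the ML classifier.
        handler = self._handlers.get(name)
        if handler and confidence > 0.8:
            handler(confidence)

sensor = FakeSoliSensor()
sensor.on_gesture("dial_turn", lambda conf: print(f"volume up ({conf:.0%} sure)"))
sensor._emit("dial_turn", 0.93)   # -> volume up (93% sure)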

So, as we've seen, Project Soli tracks movements with a tiny radar, giving devices the ability to recognize human gestures and trigger corresponding responses, as demonstrated in Google's demo video. Google has already shrunk the radar emitter down to a chip no bigger than a dime, and it will be available for developers to test next year.
