- Google’s next-gen Pixel smartphone will come with the “Project Soli” Aware sensor.
- Android Q already has two new hand gestures for media playback control, but more could be added along the way.
- All major manufacturers are focusing on hand gestures right now, and accuracy will be key to winning consumers over.
Back in January, we reported that Google’s “Project Soli” had received FCC approval and was moving forward at speed. Truthfully, we didn’t expect to see its first, even tentative, implementation in a smartphone as soon as the Pixel 4, which is expected to be unveiled very soon. Apparently we will: rumors claim that the next-gen Pixel 4 is equipped with the radar sensor (Aware) developed in the “Project Soli” labs, which can register hand gesture inputs with impressive precision.
Interacting with a device without ever touching the screen is undoubtedly where smartphones are headed. LG has already released models featuring its new “Air Motion” interaction system, which gives users touch-free controls such as screen unlocking, device muting, volume adjustment, and more. We are only at the beginning, as these systems can potentially do a lot more, and with the Soli chip finding its way into the Pixel 4, the signs are auspicious. So the question now is: what new hand gestures will the new Pixel phone be able to recognize?
A look into the code of the also-upcoming Android Q reveals two newly added hand gestures, “Skip” and “Silence”, clearly pointing to media playback control. Android Q beta testers cannot try these gestures yet, as the required hardware isn’t available to them, but the additions indicate that the functionality is coming soon rather than in the distant future. Of course, the Soli chip will open up a host of novel capabilities, so we may well see a lot more than hands-free media playback control. These are all assumptions, though, and we’ll have to wait and see what lands with the actual device.
Google is not playing alone in this field. Apple is developing its own TrueDepth camera system, Microsoft has been refining gesture recognition for the Kinect for many years now, and Samsung is also working feverishly to perfect its own hand gesture recognition. LG was the first manufacturer to present something working to the smartphone audience, but it’s still too early to tell what impact this technology will have on smartphone sales in general, and whether people will pay more to control their devices without ever touching the screen.
Do you see air gestures as a real game-changer, or as yet another needless gimmick? Let us know where you stand in the comments below, and don’t forget to follow us on Facebook and Twitter for more tech news like this.