After it bucked the rumor mill and “leaked” an official image of the Pixel 4 in June, Google continued its unorthodox strategy by dropping a 22-second video about the Pixel 4 on Monday, well before the phone’s anticipated October launch. The video and accompanying blog post showed off two new features: face unlock (similar to Apple’s Face ID) and Motion Sense — the latter of which lets you “skip songs, snooze alarms, and silence phone calls” by waving a hand in front of the phone’s embedded radar chip. It’s a feature we don’t need but deeply want, judging by the number of gestural interfaces referenced in sci-fi movies such as Minority Report.
My first conscious encounter with touchless controls was with the iMotion CarPlay Direct Connect. Created by Monster Cable in 2011, it was a car charger that plugged into a cigarette lighter and let me control my iPod or iPhone. By waving my hand left or right in front of its motion sensor, I could skip and pause songs without having to pick up my iPod. But it was incredibly bulky and unreliable. Sometimes it wouldn’t register my motions at all, yet it would maddeningly pause my music any time I reached for a drink in my cup holder. Still, I told just about everybody who stepped into my car what I used it for, and continued to use the stupid thing for a solid month before I became too frustrated and packed it away in my center console to collect dust.
But touchless controls are compelling, and the idea of using them with our computers, TVs, home appliances, and phones still endures. And for people with mobility or physical disabilities, gesture controls can be even more of a boon. Getting rid of physical barriers makes our interactions with technology that much more frictionless, transforming them into something both natural and supernatural. Intuitively controlling a device with our bodies or minds is the ideal, and the end result will feel not unlike magic. That’s why I forgave that touchless car charger for so long: It gave me a flawed but alluring glimpse of a future we’ve been inching toward.
And inching we are. Samsung is currently working on software that lets you control its smart TVs with your brain, Comcast wants you to change the TV channel with your eyes, and we may soon be navigating Netflix with our eyes too.
Phone companies, which make perhaps the most handled and looked-at devices in our daily lives, are also keen on gestural interfaces. We’re far from the day when we can type out text messages with our eyes, accept or reject calls with a simple nod or shake of our heads, or remotely trigger a phone’s camera shutter with a literal snap of our fingers. But companies are trying. Back in 2013, Samsung equipped its Galaxy S4 with Air Gesture. By slowly gliding your whole hand over the phone’s camera, you could navigate through photos in the Gallery. It was limited and worked slowly, and the feature never quite took off.
In that same vein, the more recent LG G8 ThinQ has Air Motion. Equipped with an IR sensor and transmitter, the G8 can track and read hand movements. By pinching your fingers and thumb together, you can swipe to launch certain apps or pause and play media. And you can adjust the volume by miming the twist of a jog dial. Novel as it may be, the feature is clumsy to use, and LG didn’t carry it over to the follow-up LG V50 ThinQ. (Though who knows, LG may bring it back in the G8’s successor.)
Apple is yet another high-profile phone maker reportedly working on a touchless interface. In April 2018, Bloomberg reported that the tech giant was working on a way to control some phone tasks by moving your finger closer to the screen. The technology would build off Apple’s 3D Touch, which calls up additional controls and menu items based on how hard you press the display.
From what we can tell, the Pixel 4’s Motion Sense is still relatively limited — controlling music playback, snoozing alarms, and muting calls is a ways away from the completely touchless future we all aspire to. But since Google is putting this in its next flagship and not, say, a Pixel 4A, where the stakes would be lower, I expect it to work more reliably and smoothly than Samsung’s and LG’s implementations, even with its limitations.
With Pixel phone sales low year over year, the Pixel 4’s Motion Sense must not only be persuasive enough for Google to remain competitive in the phone industry; it must also work exceptionally well to keep Google two steps ahead of Apple. Looking to the future is one thing, but executing on that vision is a different kind of success, one Google both wants and needs.