While Wednesday’s Google I/O event largely hyped the company’s biggest AI initiatives, the company also announced updates to the machine learning suite that powers Google Lens and Google Meet features like object tracking and recognition, gesture control and, of course, face detection. The latest update lets app developers create Snapchat-like face filters and hand tracking, among other things, which the company showed off with a GIF of something that is definitely not Memoji.
This update supports a special project announced at the I/O Developers Keynote: an open-source accessibility application called Project Gameface that lets you play games… with your face. During the keynote, Google played a very Wes Anderson-esque mini-documentary revealing a tragedy that led the company to design Gameface.
Game streamer Lance Carr, who goes by the name GimpyG and whose rare form of muscular dystrophy left him unable to use his hands, was streaming a Hearthstone session when a fire broke out in his garage. He was able to escape before it spread to the rest of his house, but the equipment that allowed him to enjoy his favorite pastime – his head-tracking mouse and gaming PC – had to be left behind and was destroyed.
Replacing this stuff isn’t cheap. Head-tracking mouse gear can cost hundreds of dollars, and that’s to say nothing of his gaming setup. Google’s new software aims to remove one of those costly barriers.
The company says it worked with Carr to design a piece of software that would allow anyone with a webcam to control their computer using head movements and gestures, all translated to the screen by the Windows-only software.
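Gameface’s actual source isn’t quoted here, but the core idea – turning the tracked position of a facial landmark in the webcam frame into cursor movement, with separate speeds per direction – can be sketched in a few lines of Python. Every name and number below is an illustrative assumption, not the project’s real code:

```python
# Illustrative sketch: translate tracked head (nose-tip) position into a
# cursor delta. All names, values and defaults here are assumptions made
# for illustration; Gameface's own implementation will differ.

def cursor_delta(nose_x, nose_y, center_x, center_y,
                 speed_left=1.0, speed_right=1.0,
                 speed_up=1.0, speed_down=1.0, deadzone=0.02):
    """Map a tracked nose position (normalized 0..1 webcam coordinates)
    to a cursor movement, with an independent speed for each direction."""
    dx = nose_x - center_x
    dy = nose_y - center_y
    # Ignore tiny offsets so the pointer doesn't jitter while at rest.
    if abs(dx) < deadzone:
        dx = 0.0
    if abs(dy) < deadzone:
        dy = 0.0
    # Scale each axis by the speed configured for that direction.
    dx *= speed_right if dx > 0 else speed_left
    dy *= speed_down if dy > 0 else speed_up
    return dx, dy

# Looking slightly right and down of center moves the cursor right and down.
print(cursor_delta(0.5625, 0.625, 0.5, 0.5))  # -> (0.0625, 0.125)
```

A real implementation would feed this a landmark stream from the webcam every frame and hand the deltas to the OS cursor, but the per-direction speed knobs are exactly what the app’s settings expose.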
I’ve tried it out, and it’s very cool, even in its nascent state. Installing it isn’t quite as easy as downloading an installer – it’s on GitHub, after all. You need to download the 0.3.30 release from the right side of the page, extract the resulting .zip file, and launch the app by browsing to its run.exe in Windows. Easy, see?
However, once you open the app, the interface is refreshingly simple. The home screen offers four options. There is Camera, where you select your webcam. Cursor Speed lets you adjust mouse speed (in each of the four directions), pointer jitter, flicker reduction, and even how long a gesture must be held before it triggers an action. Two other menus let you bind gestures to various mouse and keyboard buttons and other actions, including re-centering the mouse – extremely necessary, because it often loses track of center.
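That hold-before-trigger setting is worth unpacking: a gesture only fires its bound action once it has been held above a confidence threshold for long enough, which prevents an accidental twitch from clicking. A minimal sketch of that logic, assuming a made-up `GestureTrigger` class and example bindings (none of this is Gameface’s actual code):

```python
# Illustrative sketch of binding a facial gesture to an action with a
# hold duration, as the settings describe. Class name, thresholds and
# action strings are assumptions for illustration only.

class GestureTrigger:
    def __init__(self, action, threshold=0.6, hold_seconds=0.25):
        self.action = action          # e.g. "left_click" or a key name
        self.threshold = threshold    # minimum gesture confidence score
        self.hold_seconds = hold_seconds
        self._held_since = None
        self._fired = False

    def update(self, score, now):
        """Feed the latest gesture score (0..1) and the current time in
        seconds. Returns the bound action once per continuous hold."""
        if score < self.threshold:
            # Gesture released: reset so it can fire again next time.
            self._held_since = None
            self._fired = False
            return None
        if self._held_since is None:
            self._held_since = now
        if not self._fired and now - self._held_since >= self.hold_seconds:
            self._fired = True
            return self.action
        return None

# Example: "mouth open" bound to the fire button, with a quarter-second hold.
fire = GestureTrigger("left_click", hold_seconds=0.25)
print(fire.update(0.9, now=0.0))   # None - just started holding
print(fire.update(0.9, now=0.3))   # left_click - held long enough
print(fire.update(0.9, now=0.4))   # None - already fired this hold
```

Tuning `threshold` and `hold_seconds` per gesture is essentially what the app’s sensitivity settings let you do from the UI.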
I immediately knew which game I wanted to try it with: one of my all-time favorites, Rez. Rez is an on-rails shooter that debuted on the Dreamcast in the heady days of the early 2000s, and it only has three in-game actions: move a cursor; fire your weapon; and activate the oh-no-panic-shoot-everything special ability. (That the game’s battle to infiltrate a cutting-edge AI is thematically appropriate for Google I/O week somehow didn’t occur to me until I was writing this.)
It took some tinkering, but after three passes through Gameface’s settings, I had the sensitivities dialed in for all the gestures, and I dove in. The learning curve wasn’t as steep as I expected! Turning my head to move the reticle around and opening my mouth to hold down the fire button while selecting my targets was easy (and surprisingly low latency), and re-centering the cursor quickly became second nature, perhaps because I had to do it so often. I wasn’t nearly as good as I am with a traditional keyboard-and-mouse setup in the Steam version, but I expected that.
Gameface is very neat, but for now it’s also limited. With just six facial gestures to work with, you’ll quickly run out of inputs – don’t expect to be playing intense, complicated FPS games with it any time soon. But I could see combining it with something like a voice-input app for more control. And who knows? Perhaps Google will add more gesture recognition options later to give players more ways to play.