Like Kinect, but better: WiSee startup uses WiFi signals to read hand gestures

Imagine sitting on your couch at home and, with the wave of an arm, turning up the volume on your stereo, or flipping the channel on a television set across the room. A few more hand waves could shut off the lights in distant rooms, or cause a fireplace to blaze to life.

This is the promise of WiSee, a sensing system designed by a team at the University of Washington that uses WiFi signals to read gestures throughout the home—it can even “see” through walls.

Unlike Microsoft’s Kinect, WiSee needs no dedicated sensor hardware. It works with any WiFi-connected device, such as an iPhone or a laptop, plus a WiSee-enabled router that picks up movements. After testing nine different gestures in an office and a two-bedroom apartment, the team found the system was accurate 94 per cent of the time.
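The team’s research describes reading the tiny Doppler shifts that a moving hand imprints on ambient WiFi signals and matching their pattern to a known gesture. The sketch below is a heavily simplified, hypothetical illustration of that idea in Python; the gesture codebook, threshold and function names are invented for the example and are not WiSee’s actual code.

    # Hypothetical illustration: each gesture is modelled as a short sequence of
    # Doppler-shift signs (+1 = motion toward the receiver, -1 = motion away).
    # This codebook is invented for the example, not WiSee's actual mapping.
    GESTURE_PATTERNS = {
        "push": [+1, -1],
        "pull": [-1, +1],
        "punch": [+1, -1, +1, -1],
    }

    def doppler_signs(shifts_hz, threshold_hz=2.0):
        """Collapse a stream of estimated Doppler shifts (Hz) into a sign
        sequence, ignoring shifts below a noise threshold and merging
        consecutive samples with the same sign into one segment."""
        signs = []
        for f in shifts_hz:
            if abs(f) < threshold_hz:
                continue
            s = 1 if f > 0 else -1
            if not signs or signs[-1] != s:
                signs.append(s)
        return signs

    def classify(shifts_hz):
        """Return the gesture whose sign pattern matches the observed one."""
        observed = doppler_signs(shifts_hz)
        for name, pattern in GESTURE_PATTERNS.items():
            if observed == pattern:
                return name
        return "unknown"

    # A hand moving toward the receiver, then away: classified as "push".
    print(classify([0.5, 4.0, 6.0, 5.0, -3.0, -5.0, -1.0]))

The real system presumably works with far richer signal features than this toy example, but matching motion-induced patterns in the wireless signal to a small gesture vocabulary is the gist of the approach.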

Of course, flailing an arm to change the music risks making the user look a little ridiculous—but that’s a risk early adopters must be willing to take. Just look at the people wearing Google Glass.