
Stuck between invention and implementation

How good intentions got in the way of a revolutionary new technology
Pranav Mistry demos SixthSense

Earlier this week I wrote about SixthSense, an incredibly cool invention by Pranav Mistry of MIT’s Media Lab. SixthSense is an inexpensive “wearable gestural interface” that employs a projector and a camera to read human gestures. It sees what you see, and turns any surface into a screen.

In his excellent 2009 TED talk, Mistry shares a few possible applications for SixthSense, all of which merely hint at the device’s potential to truly integrate the physical and digital worlds. Mistry saved the best for last: the whole thing was to be released as open-source technology. If you wanted to make your own SixthSense at home, he’d provide the plans for free. If you wanted to improve upon it, or write an app for it, then have at it! The standing ovation Mistry received at TED was followed by gushing press attention and two major awards. Clearly, this was the next big thing.

So what happened? SixthSense seems to have disappeared right after Mistry unveiled it! The “instructions” link on the SixthSense website, promising to show you how to make your own prototype for $350, leads to a “coming soon” screen. No information is given on what’s happening with the device or when we can expect it.

I promised to find out what happened between invention and implementation, and I did.

I reached Pranav Mistry at his MIT office and did my best to find a polite way of asking, “Dude, what’s taking so long?” Mistry began his explanation somewhat defensively. “Things take time,” he said, and assured me that much progress was being made. Such as? Well, for one thing, projectors have become a lot smaller and cheaper. And those silly colored finger thimbles he uses in the TED video? They’re history. Thanks to a depth-sensing camera similar to the one used in the Xbox Kinect, SixthSense can now read hand gestures without the gaudy little accessories. Pilot tests are underway, and the public’s patience will be rewarded.

But as we talked on, a more complicated picture emerged. I asked him why he had not at least released the plans for the SixthSense prototype, so that others could start developing it. It turns out that Mistry and his team hacked the prototype together using proprietary Microsoft code libraries, and they were now in the process of writing new code from scratch that would be theirs to give away. “These things will be solved,” he promised.

But there are other obstacles. MIT’s Media Lab has sponsors—Samsung, Intel, and Google among them. According to its website, high-level sponsors of the Media Lab have “royalty-free license rights, in perpetuity, to patents registered during their period of sponsorship.” Exactly what these sponsors intend to do with SixthSense (if anything) remains unclear.

“I am a student. It’s not my expertise to market products,” Mistry explained. “I’m not interested in making money from this. SixthSense is not a product, it’s a vision. There will not be one SixthSense device. If Steve Jobs comes out with a SixthSense product tomorrow, that would be a good day for me.”

I’ve known enough technologists to believe that Mistry is 100% sincere about this. It’s common for developers to favor progress over a paycheck. They’re confident that if they can deploy their ideas as widely as possible, they’ll find a way to get compensated in time. The greater threat is obscurity. Many an ingenious invention has died on the laboratory shelf awaiting the right marketing moment or patent approval.

Given this, I asked, wouldn’t it have been better if SixthSense hadn’t had to wait for redundant code to be written, or for MIT’s corporate sponsors to see a business angle? Wouldn’t Mistry have been better off developing SixthSense in his garage rather than at the world’s foremost technological institution?

“It would have been better,” he said. “But I cannot easily escape from reality.”