The “natural user interface” is coming, and perhaps sooner than expected.
Besides its 40,000 software developers, Microsoft (NASDAQ: MSFT) has a well-regarded basic-research organization staffed with some 850 scientists.
Every year, as part of an ongoing initiative to cross-pollinate between the two very different types of organizations, Microsoft holds a three-day "TechFest," where the scientists put on a mini trade show to demonstrate their latest brainstorms and other projects to developers, who might put them to use on the product side of the house.
At TechFest 2011 this week, much of the emphasis is on technologies and experiments meant to make using computers more intuitive. The idea is to make interacting with a world of machines more natural, hence the term natural user interface, or "NUI."
For example, think of creating photo-realistic 3D talking heads, recording 3D scans with an ordinary digital camera, or performing facial recognition within video.
“In the near future, a television or an Xbox will be able to recognize people in the living room, home video will be annotated automatically and become searchable, and TV watchers will be able to get information about an unfamiliar actor, athlete or singer just by pointing to the person on the screen,” Microsoft Research said in a statement.
NUI can encompass many different technologies, such as voice recognition and response, gesture recognition, multi-touch screens and facial identification, as well as various combinations of them.
Craig Mundie, Microsoft’s chief research and strategy officer, showed some demonstrations of NUI concepts in late January at the World Economic Forum in Davos, Switzerland.
Perhaps the most visible contributions from MSR to date can be found in Microsoft’s Kinect wireless 3D game controller, which began selling before Thanksgiving and has already sold some 10 million units, the company said this week.
Gesture recognition in 3D underlies Kinect, and some observers view the technology, which is being adapted to work with computers as well as with game consoles, as a key near-term improvement to the usability of computers.
Another example with a strong contribution from MSR is the company’s Surface table-top computer.
At TechFest this year, Microsoft researchers are also demonstrating technologies that may have an impact on the life sciences. For instance, MSR researchers showed high-performance 3D graphics processing used to aid in the analysis of images generated for colon cancer screenings, complete with a gesture-based control interface.
In late January, a team of engineering students at the University of Washington adapted a Kinect to a medical robot to provide force feedback to the surgeon.
Microsoft Research is also experimenting with what might be called an old-generation NUI: the pen.
“In the real world, people hold tools such as pens, paintbrushes, sketching pencils, knives and compasses differently, and we enable a user to alter the grip on a digital pen to switch between functionalities — this enables a natural UI on the pen,” MSR said.