FEATURE SERIES: FOCUS
Beyond the Touchscreen: The Human Body as User Interface
A look at how researchers are testing "imaginary interfaces" as an alternative to the usual mode of mobile-device interaction.
October 20, 2013
Imagine making a phone call or checking your e-mail by tapping the palm of your hand or tugging your earlobe.
It may sound like the stuff of science fiction, but two separate groups of researchers in Germany are testing technologies that use the human body as an integrated component of the user interface. Researchers at the Hasso Plattner Institute have designed an interface dubbed "the Imaginary Phone" situated within the palm of the user's hand. Meanwhile, researchers at the Technical University of Darmstadt have created a prototype called "EarPut" that uses the ear's surface to input commands from user to computer.
"This is likely to be something that could actually work for real users, not just something that sounds good or looks cool in a demo," says usability and design expert Jakob Nielsen, Ph.D., a principal at the Nielsen Norman Group in Fremont, Calif. "It's part of a more general trend to move the user interface away from the traditional computer."
Look, Mom—No Eyes!
The Imaginary Phone researchers designated points on the palm representing number pad elements, news, e-mail and so on. The prototype requires a camera mounted on the user's chest to "watch" the interaction performed by the hand and a Bluetooth earpiece to provide audio feedback to the user. As each point on the palm is touched, a specific mobile function is activated and announced by a computerized voice.
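The mapping described above — a detected tap location resolved to the nearest designated palm point, then announced aloud — can be sketched in a few lines. This is purely illustrative: the region names and coordinates below are hypothetical, not the prototype's actual layout, and the camera tracking and Bluetooth audio are stubbed out.

```python
import math

# Hypothetical palm layout: each function is a named point with an (x, y)
# center in normalized palm coordinates (0..1). The real prototype's
# layout is not published here; these positions are illustrative only.
PALM_LAYOUT = {
    "dial_1": (0.2, 0.2),
    "dial_2": (0.5, 0.2),
    "dial_3": (0.8, 0.2),
    "email":  (0.2, 0.6),
    "news":   (0.5, 0.6),
    "music":  (0.8, 0.6),
}

def resolve_tap(x, y, layout=PALM_LAYOUT):
    """Return the name of the palm region nearest the detected tap.

    In the prototype this (x, y) would come from the chest-mounted
    camera tracking the fingertip against the palm.
    """
    return min(layout, key=lambda name: math.dist((x, y), layout[name]))

def announce(region):
    """Stand-in for the audio feedback spoken through the Bluetooth earpiece."""
    return f"Opening {region.replace('_', ' ')}"
```

For example, a tap detected at (0.52, 0.61) falls nearest the "news" point, so `announce(resolve_tap(0.52, 0.61))` would produce the spoken prompt "Opening news". Nearest-point resolution is one simple way to tolerate the imprecision of eyes-free tapping; the researchers' actual recognition approach may differ.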
The team began researching what they call "imaginary interfaces" as an alternative to the usual mode of mobile-device interaction. Specifically, they wanted to create an interface that didn't require the user to look at the device at all—or, in researcher Sean Gustafson's words, that was "less greedy" with the user's visual channel.
"Interaction with mobile phones today involves taking a physical device out of your pocket, looking directly at it and using your fingers to interact with it," says Gustafson, a Ph.D. student in human-computer interaction at the Institute near Berlin. "This is all fine, except that it takes time and removes us from the environment. We wanted to leave the user's vision available to experience the world."
The Imaginary Phone prototype appears promising. Gustafson and fellow researchers Bernhard Rabe and Patrick Baudisch found that users could select targets on the palm extremely accurately and could readily associate device functionality with parts of the body. "This suggests body-based interfaces could be the foundation of future mobile interactive systems," Gustafson says.
The "EarPut" team also focused on eyes-free mobile interaction, but used the surface of the human ear as the interface, supporting interactions such as touching and grasping the ear as well as mid-air gestures like hovering or swiping the hand near it. The team's main goal was to improve accessories worn behind the ear, such as headsets or glasses, in an unobtrusive way, says researcher Roman Lissermann.
Why the ear? Because, Lissermann says, the ear readily lends itself to the task of augmenting devices we already wear. Moreover, the ear makes it easy for people to input commands using just one hand, which they can do reliably without visual attention thanks to "proprioception"—the innate human ability to sense the position of our body parts in relation to each other.
Here, too, the research looks promising: the Darmstadt researchers, who included Jochen Huber, Aristotelis Hadjakos and Max Mühlhäuser, found that people can target up to four salient regions on their ear effectively and precisely.
A 20-Year Trend
According to Nielsen, another benefit of the research is that we never go anywhere without our bodies, making it nearly impossible to forget, lose or misplace a mobile device with a body-based user interface. Also beneficial is the more detailed degree of feedback that comes with touching one's own body versus, say, a touchscreen.
"Feedback is a crucial point in usability because knowing what's happening is key to allowing the human to control the system," Nielsen says. "Visual feedback is definitely very important, but you're not getting that tactile feedback that is a strength of making literally our own bodies the user interface."
Nielsen says he expects mobile devices that use this so-called haptic feedback to become a long-term trend, noting it took 20 years for the computer mouse to evolve to the point where it was ready for use in popular systems like the Macintosh. "It's not something we expect to see in systems in the next two to three years," he says. "This is something that's more of a 20-year trend. That's just how history is."
A Remote Controller for the Internet of Everything?
So how will this development mesh with another major trend, The Internet of Everything, in which the networked connections that make up the Web expand to include people, process, data and things?
In this sensor-rich world, Lissermann says, an EarPut device could be used as a remote controller. The wearer could use it to do everything from adjusting the volume on her iPod to switching on a computer to surfing TV channels—all with a few taps or swipes of the ear.
"Your remote controller will always be with you on a device you already wear, such as glasses or earphones," Lissermann says. "EarPut would be a sensor that would augment these normal, behind-the-ear devices by allowing you to use your own ear for interaction."
Nielsen says designers of such devices face two main hurdles: one is the technical challenge of making the devices small and cheap enough for mass adoption; the other is designing the devices in a user-friendly way.
"History shows that when we get these new capabilities, the first designs tend to be overboard and not that good," Nielsen says, noting that, in developing countries, a mobile device may be the only connected device a user has. "So let's remember that we're designing for billions of people. Something like this can expand the user base even further."
The contents or opinions in this feature are independent and may not necessarily represent the views of Cisco. They are offered in an effort to encourage continuing conversations on a broad range of innovative technology subjects. We welcome your comments and engagement.
We welcome the re-use, republication, and distribution of "The Network" content. Please credit us with the following information: Used with the permission of http://thenetwork.cisco.com/.