- Natural user interface
In computing, a natural user interface, or NUI, is the term used by designers and developers of computer interfaces for a user interface that is effectively invisible, or that becomes invisible through successive learned interactions. The word "natural" is used because most computer interfaces rely on artificial control devices whose operation has to be learned. A NUI relies on a user being able to quickly transition from novice to expert. While the interface requires learning, that learning is eased by design that gives the user the feeling of being instantly and continuously successful. This can be aided by technology that lets users carry out relatively natural motions, movements or gestures which they quickly discover control the computer application or manipulate the on-screen content. A common misunderstanding of the natural user interface is that it is somehow a mimicry of nature, or that some inputs to a computer are inherently more "natural" than others. In truth, the goal is to make the user feel like a natural.
History
In 2006, Christian Moore established an open research community with the goal of expanding discussion and development related to NUI technologies.[1] In a 2008 conference presentation, "Predicting the Past", August de los Reyes, a Principal User Experience Director of Surface Computing at Microsoft, described the NUI as the next evolutionary phase following the shift from the command-line interface (CLI) to the graphical user interface (GUI).[2]
With the CLI, users had to learn an artificial means of input, the keyboard, and a series of codified commands with a limited range of responses and a strict syntax.
Then, when the mouse enabled the GUI, users could more easily learn the mouse movements and actions, and were able to explore the interface much more. The GUI relied on metaphors for interacting with on-screen content or objects: the 'desktop' and 'drag', for example, are metaphors for a visual interface that is ultimately translated back into the strict codified language of the computer.
The misunderstanding of the term NUI was on display at the Consumer Electronics Show in 2010: "Now a new wave of products is poised to bring "natural user interfaces"—as these methods of controlling electronics devices are called—to an even broader audience."[3]
In 2010, Microsoft's Bill Buxton reiterated the importance of the NUI within Microsoft Corporation in a video discussing the technologies and their future potential.[4]
Early examples
- Multi-Touch
When Bill Buxton was asked about the iPhone's interface, he responded "Multi-touch technologies have a long history. To put it in perspective, the original work undertaken by my team was done in 1984, the same year that the first Macintosh computer was released, and we were not the first."[5]
Multi-touch is a technology that can enable a natural user interface. However, most UI toolkits used to construct interfaces for such technology are traditional GUI toolkits.
Examples of interfaces commonly referred to as NUI
- Perceptive Pixel
One example is the work done by Jefferson Han on multi-touch interfaces. In a demonstration at TED in 2006, he showed a variety of means of interacting with on-screen content using both direct manipulation and gestures. For example, to shape an on-screen glutinous mass, Han literally 'pinches', prods and pokes it with his fingers. In a GUI design application, by contrast, a user would work through the metaphor of 'tools': selecting a prod tool, say, or selecting two parts of the mass to apply a 'pinch' action to. Han showed that user interaction can be much more intuitive when the interaction devices we are used to are replaced by a screen capable of detecting a much wider range of human actions and gestures.
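The direct-manipulation pinch that Han demonstrated can be illustrated with a short sketch: a two-finger pinch or spread maps straight onto a scale factor computed from the change in distance between the two touch points, with no intermediate 'tool' metaphor. This is a minimal illustrative sketch under assumed names and data layout, not code from any particular multi-touch toolkit.

```python
import math


def pinch_scale(prev_points, curr_points):
    """Return the zoom factor implied by a pinch/spread gesture.

    prev_points and curr_points are each a pair of (x, y) touch
    coordinates: the two fingers' positions on consecutive frames.
    A factor > 1 means the fingers spread apart (zoom in); < 1 means
    they pinched together (zoom out).
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    d_prev = dist(*prev_points)
    d_curr = dist(*curr_points)
    if d_prev == 0:  # degenerate frame: fingers coincide, no scaling
        return 1.0
    return d_curr / d_prev


# Fingers move from 10 units apart to 20 units apart: content doubles in size.
print(pinch_scale([(0, 0), (10, 0)], [(0, 0), (20, 0)]))  # → 2.0
```

Applied every frame, the running product of these factors gives the cumulative zoom, which is why the gesture feels continuous rather than command-like.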
- Microsoft Surface
Microsoft Surface takes similar ideas on how users interact with content, but adds the ability for the device to optically recognise objects placed on top of it. In this way, users can trigger actions on the computer through the same gestures and motions as Jeff Han's touchscreen allowed, but objects themselves also become part of the control mechanism. So, for example, when a wine glass is placed on the table, the computer recognises it as such and displays content associated with that glass. Placing a wine glass on a table is a natural thing for a person to do, hence it fits as part of a natural user interface.
- 3D Immersive Touch
3D Immersive Touch is defined as the direct manipulation of 3D virtual environment objects using single- or multi-touch surface hardware in multi-user 3D virtual environments. The term was first coined in 2007 to describe the 3D natural user interface learning principles associated with Edusim. The Immersive Touch natural user interface now appears to be taking on a broader meaning with the wider adoption of surface- and touch-driven hardware such as the iPhone, iPod Touch, iPad and a growing list of other devices. Apple also seems to have taken a keen interest in "Immersive Touch" 3D natural user interfaces over the past few years.
- Xbox Kinect
Kinect is a motion-sensing input device by Microsoft for the Xbox 360 that uses spatial gestures for interaction instead of a game controller. According to Microsoft's page, Kinect is designed for "a revolutionary new way to play: no controller required."[6]
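Controller-free interaction of this kind is typically built on body tracking: the sensor reports the positions of the player's joints each frame, and gestures are recognised from how those positions change over time. The sketch below is a hypothetical illustration of that idea, not Kinect SDK code; the class name, threshold and window size are all assumptions. It flags a horizontal swipe when the hand's x-coordinate travels far enough within a short window of recent frames.

```python
from collections import deque


class SwipeDetector:
    """Illustrative sketch: detect a horizontal hand swipe from a
    stream of per-frame hand x-coordinates (e.g. in metres)."""

    def __init__(self, threshold=0.4, window=10):
        self.threshold = threshold          # distance the hand must travel
        self.history = deque(maxlen=window)  # recent x positions

    def update(self, hand_x):
        """Feed one frame's hand position; return a gesture name or None."""
        self.history.append(hand_x)
        dx = self.history[-1] - self.history[0]
        if dx > self.threshold:
            self.history.clear()   # reset so one swipe fires only once
            return "swipe_right"
        if dx < -self.threshold:
            self.history.clear()
            return "swipe_left"
        return None


detector = SwipeDetector()
# Hand glides rightward across six frames; the swipe fires once.
events = [detector.update(x / 10) for x in range(6)]
print([e for e in events if e])  # → ['swipe_right']
```

Real systems layer smoothing and per-joint confidence on top of this, but the core pattern, thresholding joint motion over a sliding window, is the same.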
- Dragon NaturallySpeaking
Dragon NaturallySpeaking is a speech recognition software package developed and sold by Nuance Communications for Windows and Mac personal computers.
See also
- Kinetic user interface
- Organic user interface
- Tangible user interface
- Post-WIMP
- Spatial navigation
- Touch user interface
- Eye tracking
- Edusim
- Graphical User Interface (GUI)
Notes
- ^ Moore, Christian (2006-07-15). "New Community Open". NUI Group Community. http://nuigroup.com/log/comments/forums_launched/.
- ^ de los Reyes, August (2008-09-25). "Predicting the Past". Web Directions South 2008. Sydney Convention Centre: Web Directions. http://www.webdirections.org/resources/august-de-los-reyes-predicting-the-past/.
- ^ Wingfield, Nick (2010-01-05). "Body in Motion: CES to Showcase Touch Gizmos". Wall Street Journal.
- ^ Buxton, Bill (2010-01-06). "CES 2010: NUI with Bill Buxton". Microsoft Research. http://channel9.msdn.com/posts/LarryLarsen/CES-2010-NUI-with-Bill-Buxton/.
- ^ Buxton, Bill. "Multi-Touch Systems that I Have Known and Loved". Bill Buxton. http://www.billbuxton.com/multitouchOverview.html.
- ^ "Xbox.com Project Natal". http://www.xbox.com/en-US/live/projectnatal/.
External links
- Examples and Further Reading
- http://www.vimeo.com/channels/nui - Vimeo's NUI Channel showcasing NUI Interfaces
- http://nuigroup.com/faq/ - NUI Group FAQ
- http://wiki.nuigroup.com/Natural_User_Interface - NUI Group's Wiki Entry on Natural User Interface
- Academic research
- NiCE (Natural User Interfaces for Collaborative Environments) - NUI project at the Media Interaction Lab of the Upper Austria University of Applied Sciences
- Natural user interaction in virtual/augmented reality - NUI research at the University of North Carolina at Chapel Hill Department of Computer Science
- Multimodal Human-Computer Interaction: a constructive and empirical study - Research from the University of Tampere
Categories:
- User interfaces
- History of human–computer interaction
Wikimedia Foundation. 2010.