Gesture recognition
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, posture and proxemics can also be the object of gesture recognition. [Matthias Rehm, Nikolaus Bee, Elisabeth André, [http://mm-werkstatt.informatik.uni-augsburg.de/files/publications/199/wave_like_an_egyptian_final.pdf Wave Like an Egyptian - Accelerometer Based Gesture Recognition for Culture Specific Interactions], British Computer Society, 2007]
Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to keyboard and mouse. Gesture recognition enables humans to interface with the machine (HMI) and interact naturally without any mechanical devices. Using the concept of gesture recognition, it is possible to point a finger at the computer screen so that the cursor will move accordingly. This could potentially make conventional input devices such as mice, keyboards and even touch-screens redundant.
Gesture recognition can be conducted with techniques from computer vision and image processing.
"Gesture recognition and pen computing:"
* Historically (prior to about 2001) the term gesture recognition was generally used to refer specifically to handwriting gestures, such as inking on a graphics tablet or mouse gesture recognition. This is computer interaction through the drawing of symbols with a pointing-device cursor (see the discussion at Pen computing). Strictly speaking, the term mouse strokes could be used instead of mouse gestures, since this implies written communication: making a mark to represent a symbol.
Uses of Gesture Recognition
Gesture recognition is useful for processing information from humans which is not conveyed through speech or typing. Various types of gestures can be identified by computers.
*Sign language recognition. Just as speech recognition can transcribe speech to text, certain types of gesture recognition software can transcribe the symbols represented through sign language into text. [Thad Starner, Alex Pentland, [http://citeseer.comp.nus.edu.sg/cache/papers/cs/405/ftp:zSzzSzwhitechapel.media.mit.eduzSzpubzSztech-reportszSzTR-306.ps.gz/starner95visual.ps.gz Visual Recognition of American Sign Language Using Hidden Markov Models], Massachusetts Institute of Technology]
*Directional indication through pointing. Pointing has a very specific purpose in our society: to reference an object or location based on its position relative to ourselves. The use of gesture recognition to determine where a person is pointing is useful for identifying the context of statements or instructions. This application is of particular interest in the field of robotics. [Kai Nickel, Rainer Stiefelhagen, [http://isl.ira.uka.de/~stiefel/papers/nickel_journal_article_in_press.pdf Visual recognition of pointing gestures for human-robot interaction], Image and Vision Computing, vol. 25, issue 12, December 2007, pp. 1875-1884]
*Control through facial gestures. Controlling a computer through facial gestures is a useful application of gesture recognition for users who may not physically be able to use a mouse or keyboard. Eye tracking in particular may be of use for controlling cursor motion or focusing on elements of a display.
*Alternative computer interfaces. Robust gesture recognition could let users forgo the traditional keyboard-and-mouse setup entirely, accomplishing frequent or common tasks with hand or face gestures performed in front of a camera.
*Immersive game technology. Gestures can be used to control interactions within video games, making the player's experience more interactive or immersive.
*Virtual controllers. For systems where the act of finding or acquiring a physical controller could require too much time, gestures can be used as an alternative control mechanism. Controlling secondary devices in a car, or controlling a television set are examples of such usage. [William Freeman, Craig Weissman, [http://www.merl.com/reports/docs/TR1994-024.pdf Television control by hand gestures] , Mitsubishi Electric Research Lab, 1995]
*Affective computing. In affective computing, gesture recognition is used in the process of identifying emotional expression through computer systems.
Input devices
The ability to track a person's movements and determine what gestures they may be performing can be achieved through various tools. Although a large amount of research has been done in image- and video-based gesture recognition, the tools and environments used vary between implementations.
*Depth-aware cameras. Using specialized cameras, one can generate a depth map of what is being seen through the camera at short range, and use this data to approximate a 3D representation of the scene. These can be effective for detection of hand gestures due to their short-range capabilities. [Yang Liu, Yunde Jia, A Robust Hand Tracking and Gesture Recognition Method for Wearable Visual Interfaces and Its Applications, Proceedings of the Third International Conference on Image and Graphics (ICIG'04), 2004]
*Stereo cameras. Using two cameras whose relations to one another are known, a 3D representation can be approximated from the output of the cameras. This method uses more traditional cameras and thus does not suffer the distance limitations of current depth-aware cameras. To determine the cameras' relation, one can use a positioning reference such as a lexian-stripe or infrared emitters. [Kue-Bum Lee, Jung-Hyun Kim, Kwang-Seok Hong, An Implementation of Multi-Modal Game Interface Based on PDAs, Fifth International Conference on Software Engineering Research, Management and Applications, 2007]
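The geometry behind such stereo setups can be made concrete with the standard triangulation relation: for a rectified pair of cameras with focal length f (in pixels) and baseline B, a feature whose horizontal image positions differ by disparity d lies at depth Z = f·B/d. A minimal sketch (the focal length, baseline and pixel coordinates below are illustrative placeholders, not values from any cited system):

```python
def stereo_depth(focal_px, baseline_m, x_left, x_right):
    """Depth from disparity for a rectified stereo pair.

    focal_px   : focal length in pixels (from camera calibration)
    baseline_m : distance between the camera centres in metres
    x_left, x_right : horizontal pixel coordinate of the same
                      feature in the left and right images
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

# Example: a hand feature at x=420 px (left) and x=380 px (right),
# with a 700 px focal length and a 6 cm baseline, sits about 1.05 m away.
depth = stereo_depth(700, 0.06, 420, 380)
```

Note the inverse relation between disparity and depth: nearby hands produce large disparities, which is why stereo rigs resolve close-range gestures more precisely than distant ones.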
*Controller-based gestures. These controllers act as an extension of the body, so that when gestures are performed, some of their motion can be conveniently captured by software. Mouse gestures are one such example, where the motion of the mouse is correlated to a symbol being drawn by a person's hand; another is the Wii Remote, which can track changes in acceleration over time to represent gestures. [Per Malmestig, Sofie Sundberg, [http://www.tricomsolutions.com/academic_reports.html SignWiiver - implementation of sign language technology]] [Thomas Schlomer, Benjamin Poppinga, Niels Henze, Susanne Boll, [http://wiigee.sourceforge.net/download_files/gesture_recognition_with_a_wii_controller-schloemer_poppinga_henze_boll.pdf Gesture Recognition with a Wii Controller], Proceedings of the 2nd International Conference on Tangible and Embedded Interaction, 2008]
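Accelerometer-based recognition of this kind is commonly framed as matching an incoming acceleration trace against stored gesture templates; published Wii Remote systems typically use hidden Markov models, but a simpler nearest-template scheme with dynamic time warping (a substitute technique, shown only to illustrate the idea; all names and sample values are hypothetical) conveys the essence:

```python
import math

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two sequences of
    3-axis acceleration samples (lists of (ax, ay, az) tuples).
    Warping absorbs differences in gesture speed."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = cost of best alignment of a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])  # Euclidean step cost
            D[i][j] = cost + min(D[i - 1][j],      # skip a sample in a
                                 D[i][j - 1],      # skip a sample in b
                                 D[i - 1][j - 1])  # match both
    return D[n][m]

def classify(trace, templates):
    """Return the label of the stored template closest to the trace."""
    return min(templates, key=lambda label: dtw_distance(trace, templates[label]))
```

A recorded shake, for instance, would be compared against each template and labelled by the smallest warped distance; real systems add filtering, normalization and a rejection threshold for non-gestures.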
*Single camera. A normal camera can be used for gesture recognition where the resources or environment would not be suitable for other forms of image-based recognition. Although not necessarily as effective as stereo or depth-aware cameras, using a single camera makes the technology accessible to a much wider audience. [Wei Du, Hua Li, Vision based gesture recognition system with single camera, 5th International Conference on Signal Processing Proceedings, 2000]
Challenges of Gesture Recognition
There are many challenges associated with the accuracy and usefulness of gesture recognition software. For image-based gesture recognition there are limitations on the equipment used and on image noise. Images or video may not be under consistent lighting or captured in the same location, and items in the background or distinct features of the users may make recognition more difficult.
The variety of implementations for image-based gesture recognition may also limit the viability of the technology for general usage. For example, recognition using stereo or depth-detecting cameras is not currently commonplace, while video or web cameras can give less accurate results because of their limited resolution.
"Gorilla arm"
"Gorilla arm" was a side-effect that destroyed vertically-oriented touch-screens as a mainstream input technology despite a promising start in the early 1980s. [ [http://community.zdnet.co.uk/blog/0,1000000567,10008314o-2000331777b,00.htm Windows 7? No arm in it - Mixed Signals - Rupert Goodwins's Blog at ZDNet.co.uk Community ] ]
Designers of touch-menu systems failed to notice that humans are not built to hold their arms in front of their faces making small motions. After more than a very few selections, the arm begins to feel sore, cramped and oversized; the operator looks like a gorilla while using the touch screen and feels like one afterwards. This is now considered a classic cautionary tale for human-factors designers; "Remember the gorilla arm!" is shorthand for "How is this going to fly in real use?".
Gorilla arm is not a problem for specialist short-term uses, since these involve only brief interactions that do not last long enough to cause fatigue.
See also
*Pen computing - discussion of gesture recognition for tablet computers
*Mouse gesture
*Computer vision
*Gestures
*Hidden Markov model
*Language technology
External links
* [http://www.youtube.com/watch?v=FqzmfeUg8Oc SignWiiver] --A gesture recognition system using a Nintendo Wii controller
* [http://www.cybernet.com Cybernet Systems Corporation] --Commercial gesture recognition products
* [http://www.igesture.org iGesture Framework for Pen and Mouse-based Gesture Recognition] --Free Java-based gesture recognition framework
* [http://perception.inrialpes.fr/people/Cuzzolin/review.html A Gesture Recognition Review] --Compendium of references
* [http://ls7-www.cs.uni-dortmund.de/research/gesture/vbgr-table.html Vision Based Hand Gesture Recognition Systems] --List of available gesture recognition systems and their features
* [http://www.movesinstitute.org/~kolsch/HandVu/HandVu.html HandVu] --Open-source vision-based hand gesture interface
* [http://www.bruceongames.com/2007/10/02/the-future-it-is-all-a-gesture/ The future, it is all a Gesture] --Gesture interfaces and video gaming
* [http://www.luminvision.co.uk/interactive.htm Interactive Projection Systems] --Commercial motion tracking and gesture recognition products