Visual odometry
In robotics and computer vision, visual odometry is the process of determining the position and orientation of a robot by analyzing the associated camera images. It has been used in a wide variety of robotic applications, such as on the Mars Exploration Rovers (Maimone et al., 2007).

Overview
In navigation, odometry is the process of determining the position and orientation of a vehicle by measuring the wheel rotations through devices such as rotary encoders. While useful for many wheeled or tracked vehicles, traditional odometry techniques cannot be applied to robots with non-standard locomotion methods, such as legged robots. In addition, odometry universally suffers from precision problems, since wheels tend to slip and slide on the floor and the error increases further when the vehicle runs on non-smooth surfaces. As the errors accumulate over time, the odometry readings become increasingly unreliable.

Visual odometry is the process of determining equivalent odometry information using only camera images. Compared to traditional odometry techniques, visual odometry is not restricted to a particular locomotion method and can be used on any robot with a sufficiently high-quality camera.
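The drift described above can be made concrete with a short dead-reckoning sketch for a differential-drive robot. This is purely an illustrative example; the wheel and encoder constants below are assumptions, not parameters of any particular platform. Because each pose update simply integrates the measured wheel travel, any slip or measurement error is folded into the estimate and never corrected.

import math

# Illustrative constants for a hypothetical differential-drive robot (assumed values).
TICKS_PER_REV = 4096      # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.05       # wheel radius in metres
WHEEL_BASE = 0.30         # distance between the two wheels in metres

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one pair of encoder readings into the pose (x, y, theta)."""
    metres_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * metres_per_tick
    d_right = d_ticks_right * metres_per_tick
    d_center = 0.5 * (d_left + d_right)         # forward travel of the robot
    d_theta = (d_right - d_left) / WHEEL_BASE   # change in heading
    x += d_center * math.cos(theta + 0.5 * d_theta)
    y += d_center * math.sin(theta + 0.5 * d_theta)
    return x, y, theta + d_theta

# Wheel slip shows up as spurious or missing ticks; the resulting pose error
# is integrated at every step, so it accumulates instead of averaging out.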
Algorithm
Most existing approaches to visual odometry are based on the following stages; a minimal end-to-end code sketch is given after the list.
# Acquire input images, using either stereo cameras or omnidirectional cameras (Corke et al., 2004).
# Image correction: apply image processing techniques for lens distortion removal, etc.
# Feature detection: define interest operators, match features across frames, and construct the optical flow field.
## Use correlation to establish the correspondence of two images, without long-term feature tracking.
## Feature extraction and correlation (Lucas–Kanade method).
## Construct the optical flow field.
# Check flow field vectors for potential tracking errors and remove outliers (Campbell et al., 2004).
# Estimation of the camera motion from the optical flow (Sunderhauf et al., 2005; Konolige et al., 2006; Cheng et al., 2006).
## Choice 1: Kalman filter for state estimate distribution maintenance.
## Choice 2: find the geometric and 3D properties of the features that minimize a cost function based on the re-projection error between two adjacent images. This can be done by mathematical minimization or random sampling.
# Periodic repopulation of trackpoints to maintain coverage across the image.
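To show how these stages fit together, here is a minimal monocular sketch built on OpenCV. The library calls (cv2.goodFeaturesToTrack, cv2.calcOpticalFlowPyrLK, cv2.findEssentialMat, cv2.recoverPose) and the camera matrix K are assumptions of this example, chosen for brevity; they are not prescribed by any of the cited systems, and a single camera can only recover translation up to an unknown scale.

import numpy as np
import cv2

# Assumed example intrinsics; a real system would use calibrated values and
# undistort the images first (stage 2, image correction).
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

def detect_features(gray):
    """Stage 3: detect corner features to track (interest operator)."""
    return cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                   qualityLevel=0.01, minDistance=7)

def track_features(prev_gray, gray, prev_pts):
    """Stage 3: sparse Lucas-Kanade optical flow between two frames."""
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                      prev_pts, None)
    ok = status.reshape(-1).astype(bool)
    return prev_pts[ok], next_pts[ok]

def estimate_motion(pts0, pts1):
    """Stages 4-5: RANSAC outlier rejection and relative pose recovery.
    With a single camera the translation t is known only up to scale."""
    E, inliers = cv2.findEssentialMat(pts0, pts1, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    if E is None or E.shape != (3, 3):
        return None, None
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inliers)
    return R, t

def visual_odometry(frames):
    """Accumulate the camera pose over a sequence of grayscale frames."""
    pose_R, pose_t = np.eye(3), np.zeros((3, 1))
    prev_gray = frames[0]
    prev_pts = detect_features(prev_gray)
    for gray in frames[1:]:
        if prev_pts is not None and len(prev_pts) >= 8:
            pts0, pts1 = track_features(prev_gray, gray, prev_pts)
            if len(pts0) >= 8:
                R, t = estimate_motion(pts0, pts1)
                if R is not None:
                    # recoverPose maps points from the previous to the current
                    # camera frame; convert that into a camera-to-world update.
                    pose_R = pose_R @ R.T
                    pose_t = pose_t - pose_R @ t
        # Stage 6: repopulate trackpoints (here naively on every frame).
        prev_pts = detect_features(gray)
        prev_gray = gray
    return pose_R, pose_t

Re-detecting corners on every frame is a deliberate simplification of stage 6; a real system would track features over several frames and repopulate only when image coverage drops, and would use stereo imagery or other sensors to recover absolute scale.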
See also
* Optical flow
* Odometry
* Dead reckoning

References
* Maimone, M.; Cheng, Y.; Matthies, L. (2007). "Two years of Visual Odometry on the Mars Exploration Rovers". Journal of Field Robotics 24 (3): 169–186. http://www-robotics.jpl.nasa.gov/publications/Mark_Maimone/rob-06-0081.R4.pdf. Retrieved 2008-07-10.
* Corke, P.; Strelow, D.; Singh, S. (2004). "Omnidirectional visual odometry for a planetary rover". Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 4. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1390041.
* Campbell, J.; Sukthankar, R.; Nourbakhsh, I.; Pittsburgh, I.R. (2004). "Techniques for evaluating optical flow for visual odometry in extreme terrain". Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), vol. 4. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1389991.
* Sunderhauf, N.; Konolige, K.; Lacroix, S.; Protzel, P. (2005). "Visual odometry using sparse bundle adjustment on an autonomous outdoor vehicle". In Levi, Schanz, Lafrenz, and Avrutin (eds.), Tagungsband Autonome Mobile Systeme, pp. 157–163. http://www.tu-chemnitz.de/etit/proaut/index.download.df493a7bc2c27263f7d8ff467ea84879.pdf. Retrieved 2008-07-10.
* Konolige, K.; Agrawal, M.; Bolles, R.C.; Cowan, C.; Fischler, M.; Gerkey, B.P. (2006). "Outdoor mapping and navigation using stereo vision". Proc. of the Intl. Symp. on Experimental Robotics (ISER). http://www.springerlink.com/index/g442h0p7n313w1g2.pdf. Retrieved 2008-07-10.
* Cheng, Y.; Maimone, M.W.; Matthies, L. (2006). "Visual Odometry on the Mars Exploration Rovers". IEEE Robotics and Automation Magazine 13 (2): 54–62. http://ieeexplore.ieee.org/iel5/100/31467/101109RA2006CHENG.pdf?arnumber=101109RA2006CHENG. Retrieved 2008-07-10.