Particle filter
Particle filters, also known as sequential Monte Carlo (SMC) methods, are sophisticated model estimation techniques based on simulation. They are usually used to estimate Bayesian models and are the sequential ('on-line') analogue of Markov chain Monte Carlo (MCMC) batch methods; they are often similar to importance sampling methods. If well designed, particle filters can be much faster than MCMC. They are often an alternative to the Extended Kalman filter (EKF) or Unscented Kalman filter (UKF), with the advantage that, with sufficient samples, they approach the Bayesian optimal estimate, so they can be made more accurate than either the EKF or UKF. The approaches can also be combined by using a version of the Kalman filter as a proposal distribution for the particle filter.

Goal
The particle filter aims to estimate the sequence of hidden parameters x_k, for k = 0, 1, 2, …, based only on the observed data y_k, for k = 0, 1, 2, …. All Bayesian estimates of x_k follow from the posterior distribution p(x_k | y_0, y_1, …, y_k). In contrast, the MCMC or importance sampling approach would model the full posterior p(x_0, x_1, …, x_k | y_0, y_1, …, y_k).

Model
Particle methods assume x_k and the observations y_k can be modeled in this form:

* x_0, x_1, x_2, … is a first-order Markov process such that
:: x_k | x_{k-1} ~ p(x_k | x_{k-1})
: with an initial distribution p(x_0).
* The observations y_0, y_1, y_2, … are conditionally independent provided that x_0, x_1, x_2, … are known. In other words, each y_k depends only on x_k:
:: y_k | x_k ~ p(y_k | x_k)

One example form of this scenario is

:: x_k = f(x_{k-1}) + v_k
:: y_k = g(x_k) + w_k

where both v_k and w_k are mutually independent and identically distributed sequences with known probability density functions, and f(·) and g(·) are known functions. These two equations can be viewed as state space equations and look similar to the state space equations for the Kalman filter. If the functions f and g were linear, and if both v_k and w_k were Gaussian, the Kalman filter finds the exact Bayesian filtering distribution. If not, Kalman filter based methods are a first-order approximation. Particle filters are also an approximation, but with enough particles they can be much more accurate.

Monte Carlo approximation
Particle methods, like all sampling-based approaches (e.g., MCMC), generate a set of samples that approximate the filtering distribution p(x_k | y_0, …, y_k). So, with P samples, expectations with respect to the filtering distribution are approximated by

:: ∫ f(x_k) p(x_k | y_0, …, y_k) dx_k ≈ (1/P) Σ_{L=1}^{P} f(x_k^(L))

and f(·), in the usual way for Monte Carlo, can give all the moments etc. of the distribution up to some degree of approximation.
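Both the example state-space form above and this Monte Carlo moment approximation can be sketched in a few lines. The specific f, g, noise scales, and seed below are illustrative assumptions, not anything prescribed by the text:

```python
import random

# Hypothetical state-space model (illustrative choices of f, g, and noise):
#   x_k = 0.5*x_{k-1} + 25*x_{k-1}/(1 + x_{k-1}^2) + v_k,  v_k ~ N(0, 1)
#   y_k = x_k^2 / 20 + w_k,                                w_k ~ N(0, 0.5^2)
def f(x):
    return 0.5 * x + 25.0 * x / (1.0 + x * x)

def g(x):
    return x * x / 20.0

rng = random.Random(0)

# Simulate the first-order Markov process and its observations.
x = rng.gauss(0.0, 1.0)                    # initial distribution p(x_0) = N(0, 1)
xs, ys = [], []
for _ in range(100):
    x = f(x) + rng.gauss(0.0, 1.0)         # next state depends only on x_{k-1}
    xs.append(x)
    ys.append(g(x) + rng.gauss(0.0, 0.5))  # each y_k depends only on x_k

# Monte Carlo moment approximation: given P samples from a distribution,
# an expectation is approximated by the sample average over the samples.
P = 100_000
samples = [rng.gauss(0.0, 1.0) for _ in range(P)]
mean_est = sum(samples) / P                # first moment of N(0, 1), approx. 0
var_est = sum(s * s for s in samples) / P  # second moment of N(0, 1), approx. 1
```

In a real filter the samples would come from the filtering distribution rather than being drawn directly; the averaging step is the same.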
Sampling Importance Resampling (SIR)
Sampling importance resampling (SIR) is a very commonly used particle filtering algorithm, which approximates the filtering distribution p(x_k | y_0, …, y_k) by a weighted set of P particles

:: {(x_k^(L), w_k^(L)) : L = 1, …, P}.

The importance weights w_k^(L) are approximations to the relative posterior probabilities (or densities) of the particles such that Σ_{L=1}^{P} w_k^(L) = 1.
SIR is a sequential (i.e., recursive) version of importance sampling. As in importance sampling, the expectation of a function f(·) can be approximated as a weighted average:

:: ∫ f(x_k) p(x_k | y_0, …, y_k) dx_k ≈ Σ_{L=1}^{P} w_k^(L) f(x_k^(L)).

For a finite set of particles, the algorithm performance is dependent on the choice of the proposal distribution

:: π(x_k | x_{0:k-1}, y_{0:k}).
The optimal proposal distribution is given as the target distribution:

:: π(x_k | x_{0:k-1}, y_{0:k}) = p(x_k | x_{k-1}, y_k).
However, the transition prior is often used as importance function, since it is easier to draw particles (or samples) and perform subsequent importance weight calculations:

:: π(x_k | x_{0:k-1}, y_{0:k}) = p(x_k | x_{k-1}).

Sampling importance resampling (SIR) filters with the transition prior as importance function are commonly known as the bootstrap filter and condensation algorithm.

Resampling is used to avoid the problem of degeneracy of the algorithm, that is, avoiding the situation in which all but one of the importance weights are close to zero. The performance of the algorithm can also be affected by the proper choice of resampling method. The stratified resampling proposed by Kitagawa (1996) is optimal in terms of variance.

A single step of sequential importance resampling is as follows:
:1) For L = 1, …, P, draw samples from the proposal distribution
:: x_k^(L) ~ π(x_k | x_{0:k-1}^(L), y_{0:k})
:2) For L = 1, …, P, update the importance weights up to a normalizing constant:
:: ŵ_k^(L) = w_{k-1}^(L) p(y_k | x_k^(L)) p(x_k^(L) | x_{k-1}^(L)) / π(x_k^(L) | x_{0:k-1}^(L), y_{0:k})
:3) For L = 1, …, P, compute the normalized importance weights:
:: w_k^(L) = ŵ_k^(L) / Σ_{J=1}^{P} ŵ_k^(J)
:4) Compute an estimate of the effective number of particles as
:: N_eff = 1 / Σ_{L=1}^{P} (w_k^(L))²
:5) If the effective number of particles is less than a given threshold, N_eff < N_thr, then perform resampling:
::a) Draw P particles from the current particle set with probabilities proportional to their weights. Replace the current particle set with this new one.
::b) For L = 1, …, P, set w_k^(L) = 1/P.
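The draw in step 5a can be implemented with the stratified scheme mentioned above. A minimal sketch (the particle values and weights are made-up inputs):

```python
import random

def stratified_resample(particles, weights, rng):
    """Stratified resampling (Kitagawa, 1996): draw one uniform per stratum
    [(L-1)/P, L/P) and invert the cumulative weight function, which lowers
    resampling variance compared with P independent uniform draws."""
    P = len(particles)
    positions = [(L + rng.random()) / P for L in range(P)]  # one draw per stratum
    cumulative, total = [], 0.0
    for w in weights:
        total += w
        cumulative.append(total)
    resampled, j = [], 0
    for pos in positions:          # positions are increasing, so j only advances
        while cumulative[j] < pos:
            j += 1
        resampled.append(particles[j])
    return resampled

rng = random.Random(7)
particles = [0.0, 1.0, 2.0, 3.0]
weights = [0.1, 0.2, 0.3, 0.4]   # normalized importance weights
new = stratified_resample(particles, weights, rng)
```

After this draw, step 5b resets every weight to 1/P.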
The term "Sequential Importance Resampling" is also sometimes used when referring to SIR filters.
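A single SIR step can be sketched as follows for a bootstrap filter, where the transition prior is the proposal, so the weight update in step 2 reduces to multiplying by the likelihood. The Gaussian model, observation value, and resampling threshold below are illustrative assumptions, not prescribed by the text:

```python
import math
import random

def sir_step(particles, weights, propose, likelihood, rng, threshold=None):
    """One sequential-importance-resampling step (bootstrap variant)."""
    P = len(particles)
    # 1) draw each particle from the proposal (here: the transition prior)
    particles = [propose(x, rng) for x in particles]
    # 2) update importance weights; with the transition prior as proposal
    #    this is just multiplication by the likelihood p(y_k | x_k)
    weights = [w * likelihood(x) for w, x in zip(weights, particles)]
    # 3) normalize the weights
    total = sum(weights)
    weights = [w / total for w in weights]
    # 4) effective number of particles
    n_eff = 1.0 / sum(w * w for w in weights)
    # 5) resample with replacement, proportional to weight, if degenerate
    if threshold is None:
        threshold = P / 2
    if n_eff < threshold:
        particles = rng.choices(particles, weights=weights, k=P)
        weights = [1.0 / P] * P
    return particles, weights

# Hypothetical model: x_k = 0.9*x_{k-1} + N(0,1), y_k = x_k + N(0, 0.5^2)
def propose(x, rng):
    return 0.9 * x + rng.gauss(0.0, 1.0)

def likelihood(x, y=1.2, obs_std=0.5):   # y = 1.2 is an illustrative observation
    return math.exp(-0.5 * ((y - x) / obs_std) ** 2)

rng = random.Random(0)
particles = [rng.gauss(0.0, 1.0) for _ in range(500)]
weights = [1.0 / 500] * 500
particles, weights = sir_step(particles, weights, propose, likelihood, rng)
x_hat = sum(w * x for w, x in zip(weights, particles))  # weighted-average estimate
```

The weighted average at the end is the SIR approximation of the filtered mean E[x_k | y_{0:k}].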
Sequential Importance Sampling (SIS)
Sequential importance sampling (SIS) is the same as sampling importance resampling, but without the resampling stage.
"Direct version" algorithm
The "direct version" algorithm is rather simple (compared to other particle filtering algorithms) and uses composition and rejection. To generate a single sample x at k from p(x_k | y_{1:k}):
:1) Set p = 1
:2) Uniformly generate L from {1, …, P}
:3) Generate a test x̂ from its distribution p(x_k | x_{k-1}^(L))
:4) Compute p(y_k | x̂), the likelihood of the measured value y_k given the test sample
:5) Generate another uniform u from [0, m_k], where m_k = sup_x p(y_k | x)
:6) Compare u and p(y_k | x̂)
::6a) If u is larger, then repeat from step 2
::6b) If u is smaller, then save x̂ as x_k^(p) and increment p
:7) If p > P then quit
The goal is to generate P "particles" at k using only the particles from k − 1. This requires that a Markov equation can be written (and computed) to generate an x_k based only upon x_{k-1}. This algorithm uses composition of the P particles from k − 1 to generate a particle at k and repeats (steps 2-6) until P particles are generated at k.

This can be more easily visualized if x is viewed as a two-dimensional array. One dimension is k and the other dimension is the particle number. For example, x(k, L) would be the Lth particle at k and can also be written x_k^(L) (as done above in the algorithm). Step 3 generates a potential x_k based on a randomly chosen particle x_{k-1}^(L) at time k − 1 and rejects or accepts it in step 6. In other words, the x_k values are generated using the previously generated x_{k-1}.
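The composition-and-rejection loop can be sketched as below. The transition and likelihood are illustrative Gaussian assumptions; m_k must upper-bound p(y_k | x) over all x for the rejection step to be valid:

```python
import math
import random

def direct_step(prev_particles, y, transition, likelihood, m_k, rng, P=None):
    """Generate P particles at time k from those at k-1 by composition
    (pick a random parent and propagate it) and rejection (accept the
    test sample with probability p(y_k | x_hat) / m_k)."""
    if P is None:
        P = len(prev_particles)
    new_particles = []
    while len(new_particles) < P:              # loop over steps 2-7
        parent = rng.choice(prev_particles)        # 2) uniform choice of L
        x_hat = transition(parent, rng)            # 3) test sample from p(x_k | x_{k-1})
        p_y = likelihood(x_hat, y)                 # 4) likelihood p(y_k | x_hat)
        u = rng.uniform(0.0, m_k)                  # 5) uniform on [0, m_k]
        if u <= p_y:                               # 6) accept, otherwise retry
            new_particles.append(x_hat)
    return new_particles

# Hypothetical model: x_k = 0.9*x_{k-1} + N(0,1), y_k = x_k + N(0, 0.5^2)
def transition(x, rng):
    return 0.9 * x + rng.gauss(0.0, 1.0)

def likelihood(x, y, obs_std=0.5):
    return math.exp(-0.5 * ((y - x) / obs_std) ** 2) / (obs_std * math.sqrt(2 * math.pi))

m_k = 1.0 / (0.5 * math.sqrt(2 * math.pi))  # sup over x of p(y | x) for this model
rng = random.Random(1)
prev = [rng.gauss(0.0, 1.0) for _ in range(200)]
particles = direct_step(prev, y=0.8, transition=transition,
                        likelihood=likelihood, m_k=m_k, rng=rng)
```

Tighter bounds m_k give higher acceptance rates; a loose bound still gives correct samples but wastes proposals.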
Other Particle Filters
* Auxiliary particle filter
* Gaussian particle filter
* Unscented particle filter
* Monte Carlo particle filter
* Gauss-Hermite particle filter
* Cost Reference particle filter
* Rao-Blackwellized particle filter

See also
* Ensemble Kalman filter
* Recursive Bayesian estimation

External links
* [http://www-sigproc.eng.cam.ac.uk/smc/ Sequential Monte Carlo Methods (Particle Filtering)] homepage at the University of Cambridge
* [http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/ Dieter Fox's MCL Animations]
* [http://web.engr.oregonstate.edu/~hess/ Rob Hess' free software]
* [http://babel.isa.uma.es/mrpt/index.php/Particle_Filters Tutorial on particle filtering] with the MRPT C++ library, and a [http://babel.isa.uma.es/mrpt/index.php/Paper:An_Optimal_Filtering_Algorithm_for_Non-Parametric_Observation_Models_in_Robot_Localization_(ICRA_2008)#Video mobile robot localization video].