by Thomas Reineking
Abstract:
This paper derives a particle filter algorithm within the Dempster-Shafer framework. Particle filtering is a well-established Bayesian Monte Carlo technique for estimating the current state of a hidden Markov process using a fixed number of samples. When dealing with incomplete information or qualitative assessments of uncertainty, however, Dempster-Shafer models with their explicit representation of ignorance often turn out to be more appropriate than Bayesian models. The contribution of this paper is twofold. First, the Dempster-Shafer formalism is applied to the problem of maintaining a belief distribution over the state space of a hidden Markov process by deriving the corresponding recursive update equations, which turn out to be a strict generalization of Bayesian filtering. Second, it is shown how the solution of these equations can be efficiently approximated via particle filtering based on importance sampling, which makes the Dempster-Shafer approach tractable even for large state spaces. The performance of the resulting algorithm is compared to exact evidential as well as Bayesian inference.
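To give context for the filtering scheme the abstract describes, here is a minimal sketch of a standard Bayesian bootstrap particle filter (importance sampling with resampling), i.e. the baseline that the paper generalizes to the Dempster-Shafer setting. All names, the toy transition/observation models, and the parameters below are illustrative assumptions, not the paper's actual algorithm; in the paper's evidential version, particles carry belief masses over sets of states rather than point weights.

# Minimal bootstrap particle filter sketch (Bayesian baseline).
# Toy illustration only; the paper derives a Dempster-Shafer
# generalization of this predict-update-resample cycle.
import numpy as np

def transition(x, rng):
    """Toy random-walk transition model p(x_t | x_{t-1})."""
    return x + rng.normal(0.0, 1.0, size=x.shape)

def likelihood(z, x, sigma=1.0):
    """Toy Gaussian observation likelihood p(z_t | x_t)."""
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

def particle_filter_step(particles, weights, z, rng):
    """One predict-update-resample cycle via importance sampling."""
    # Predict: propagate each particle through the transition model.
    particles = transition(particles, rng)
    # Update: reweight particles by the observation likelihood and normalize.
    weights = weights * likelihood(z, particles)
    weights /= weights.sum()
    # Resample: draw particles proportionally to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    particles = rng.normal(0.0, 1.0, size=n)   # initial samples
    weights = np.full(n, 1.0 / n)
    for z in [0.5, 1.2, 2.0, 2.4]:             # toy observation sequence
        particles, weights = particle_filter_step(particles, weights, z, rng)
        print("state estimate:", particles.mean())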
Reference:
Particle filtering in the Dempster-Shafer theory (Thomas Reineking), In International Journal of Approximate Reasoning, Elsevier, volume 52, 2011.
Bibtex Entry:
@Article{Reineking2011,
author = {Thomas Reineking},
title = {Particle filtering in the Dempster-Shafer theory},
journal = {International Journal of Approximate Reasoning},
year = {2011},
volume = {52},
number = {8},
pages = {1124--1135},
month = {nov},
abstract = {This paper derives a particle filter algorithm within the Dempster-Shafer framework. Particle filtering is a well-established Bayesian Monte Carlo technique for estimating the current state of a hidden Markov process using a fixed number of samples. When dealing with incomplete information or qualitative assessments of uncertainty, however, Dempster-Shafer models with their explicit representation of ignorance often turn out to be more appropriate than Bayesian models. The contribution of this paper is twofold. First, the Dempster-Shafer formalism is applied to the problem of maintaining a belief distribution over the state space of a hidden Markov process by deriving the corresponding recursive update equations, which turn out to be a strict generalization of Bayesian filtering. Second, it is shown how the solution of these equations can be efficiently approximated via particle filtering based on importance sampling, which makes the Dempster-Shafer approach tractable even for large state spaces. The performance of the resulting algorithm is compared to exact evidential as well as Bayesian inference.},
doi = {10.1016/j.ijar.2011.06.003},
publisher = {Elsevier},
url = {http://dx.doi.org/10.1016/j.ijar.2011.06.003},
}