Multi-Sensor Fusion and Active Perception for Autonomous Deep Space Navigation
by David Nakath, Joachim Clemens, Kerstin Schill
Abstract:
Keeping track of the current state, a task referred to as state estimation, is crucial for mobile autonomous systems. To solve this task, information from all available sensors needs to be fused, including relative measurements as well as observations of the surroundings. In a dynamic 3D environment, the pose of an agent has to be chosen such that the most relevant information can be observed. We propose an approach for multi-sensor fusion and active perception within an autonomous deep space navigation scenario and describe the probabilistic modeling of observables and sensors for that particular domain. For state estimation, we present an Extended Kalman Filter, an Unscented Kalman Filter, and a Particle Filter, all of which operate on a manifold state space. Additionally, an approach for active perception is proposed, which selects the desired attitude of the spacecraft based on knowledge about the dynamics of celestial objects, the kind of information they provide, and the current uncertainty of the filters. We evaluated the localization performance of the algorithms within a simulation environment. The filters are compared to each other, and we show that our active perception strategy outperforms two other information intake approaches.
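Illustrative Code Sketch:
The two core ideas in the abstract can be made concrete with short Python sketches. Both are hypothetical illustrations under simplifying assumptions (SciPy's rotation utilities, a linear-Gaussian measurement model); the function names, candidate observables, and numbers are invented for illustration and are not the paper's implementation.

A manifold state cannot be updated by plain vector addition; instead, a "boxplus" operator applies a small correction vector to the manifold element. A minimal sketch for the attitude part of the state, assuming SciPy is available:

# Hypothetical boxplus update for a unit-quaternion attitude (illustration only).
import numpy as np
from scipy.spatial.transform import Rotation as R

def boxplus(q, delta):
    """Apply a 3-D axis-angle increment delta to a unit quaternion q --
    the manifold analogue of vector addition used by on-manifold filters."""
    return (R.from_quat(q) * R.from_rotvec(delta)).as_quat()

q = R.identity().as_quat()
q = boxplus(q, np.array([0.0, 0.0, 0.1]))  # small yaw correction; result stays on the manifold

For active perception, one common strategy (assumed here for illustration, not taken from the paper) is to score each candidate attitude by the trace of the covariance an EKF-style update would leave behind, and to point the spacecraft at the candidate with the largest expected uncertainty reduction:

# Hypothetical uncertainty-driven attitude selection (illustration only).
def expected_posterior_trace(P, H, Rm):
    """Trace of the posterior covariance if a measurement with Jacobian H
    and noise covariance Rm were fused (prediction only, no real update)."""
    S = H @ P @ H.T + Rm                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    return np.trace((np.eye(P.shape[0]) - K @ H) @ P)

def select_attitude(P, candidates):
    """Pick the candidate whose predicted observation shrinks the trace most."""
    return min(candidates, key=lambda c: expected_posterior_trace(P, c["H"], c["R"]))["attitude"]

# Toy usage: 6-D state (position + velocity), two candidate pointings.
P = np.diag([4.0, 4.0, 4.0, 0.1, 0.1, 0.1])     # current state covariance
candidates = [
    {"attitude": "star_field", "H": np.hstack([np.zeros((3, 3)), np.eye(3)]), "R": 0.01 * np.eye(3)},
    {"attitude": "planet_cam", "H": np.hstack([np.eye(3), np.zeros((3, 3))]), "R": 0.50 * np.eye(3)},
]
print(select_attitude(P, candidates))           # -> "planet_cam": position is far more uncertain

In this toy setting, the position-observing candidate wins despite its noisier sensor, because the prior position uncertainty dominates the covariance trace; this mirrors the idea of weighing the kind of information an observable provides against the filter's current uncertainty.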
Reference:
Multi-Sensor Fusion and Active Perception for Autonomous Deep Space Navigation (David Nakath, Joachim Clemens, Kerstin Schill), In 21st International Conference on Information Fusion (FUSION), IEEE, 2018.
BibTeX Entry:
@inproceedings{nakath2018multi,
	author={Nakath, David and Clemens, Joachim and Schill, Kerstin},
	title={Multi-Sensor Fusion and Active Perception for Autonomous Deep Space Navigation},
	booktitle={21st International Conference on Information Fusion (FUSION)},
	year={2018},
	month=jul,
	pages={2596--2605},
	publisher={IEEE},
	url={https://ieeexplore.ieee.org/document/8455788},
	doi={10.23919/ICIF.2018.8455788}
}