Visual-Multi-Sensor Odometry with Application in Autonomous Driving
by Andreas Serov, Joachim Clemens, Kerstin Schill
Abstract:
Relative localization in GNSS-denied environments is an essential task in advanced driver assistance systems and autonomous driving. In this paper, we present a loosely coupled visual-multi-sensor odometry algorithm for relative localization. Inertial Measurement Unit (IMU), vehicle speed, and steering angle measurements are fused in an Unscented Kalman Filter (UKF) to estimate the vehicle’s pose and velocity as well as the IMU biases. Relative pose estimates between two camera frames are used to update the UKF and further increase the precision of the estimated localization. Our system is able to localize a vehicle in real-time from arbitrary initial states, such as an already moving car, which is a challenging scenario. We evaluate our visual-multi-sensor algorithm on real-world datasets recorded in inner-city and rural areas and compare it to two state-of-the-art Visual-Inertial Odometry (VIO) algorithms. We report a lower relative odometry error, in particular at the start of motion estimation, at lower computational cost.
Reference:
Visual-Multi-Sensor Odometry with Application in Autonomous Driving (Andreas Serov, Joachim Clemens, Kerstin Schill), In 93rd IEEE Vehicular Technology Conference (VTC2021-Spring), IEEE, 2021.
Bibtex Entry:
@inproceedings{serov2021visual,
  author={Serov, Andreas and Clemens, Joachim and Schill, Kerstin},
  booktitle={93rd IEEE Vehicular Technology Conference (VTC2021-Spring)}, 
  title={Visual-Multi-Sensor Odometry with Application in Autonomous Driving}, 
  year={2021},
  pages={1-7},
  organization={IEEE},
  publisher={IEEE},
  abstract={Relative localization in GNSS-denied environments is an essential task in advanced driver assistance systems and autonomous driving. In this paper, we present a loosely coupled visual-multi-sensor odometry algorithm for relative localization. Inertial Measurement Unit (IMU), vehicle speed, and steering angle measurements are fused in an Unscented Kalman Filter (UKF) to estimate the vehicle’s pose and velocity as well as the IMU biases. Relative pose estimates between two camera frames are used to update the UKF and further increase the precision of the estimated localization. Our system is able to localize a vehicle in real-time from arbitrary initial states, such as an already moving car, which is a challenging scenario. We evaluate our visual-multi-sensor algorithm on real-world datasets recorded in inner-city and rural areas and compare it to two state-of-the-art Visual-Inertial Odometry (VIO) algorithms. We report a lower relative odometry error, in particular at the start of motion estimation, at lower computational cost.},
  url={https://ieeexplore.ieee.org/document/9448847},
  doi={10.1109/VTC2021-Spring51267.2021.9448847},
  keywords={atcity}
}