Journal article, IEEE Transactions on Intelligent Vehicles, 2024

DH-PTAM: A Deep Hybrid Stereo Events-Frames Parallel Tracking And Mapping System

Abstract

This paper presents a robust approach for a visual parallel tracking and mapping (PTAM) system that excels in challenging environments. Our proposed method combines the strengths of heterogeneous multi-modal visual sensors, including stereo event-based and frame-based sensors, in a unified reference frame through a novel spatio-temporal synchronization approach. To further enhance robustness, we employ deep learning-based feature extraction and description for estimation. We also introduce an end-to-end parallel tracking and mapping optimization layer complemented by a simple loop-closure algorithm for efficient SLAM behavior. Through comprehensive experiments on both small-scale and large-scale real-world sequences of the VECtor and TUM-VIE benchmarks, our proposed method (DH-PTAM) demonstrates superior robustness and accuracy in adverse conditions, especially in large-scale HDR scenarios. Our implementation's research-based Python API is publicly available on GitHub for further research and development: https://github.com/AbanobSoliman/DH-PTAM.
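As a rough illustration of the kind of event-to-frame spatio-temporal alignment the abstract refers to, the sketch below accumulates the events falling inside a time window centred on a camera frame's timestamp into a frame-aligned event image that can be paired with that frame for feature extraction. The array layout, window parameter, and function name are assumptions made for illustration only; they are not taken from the DH-PTAM codebase.

```python
import numpy as np

def accumulate_events(events, t_frame, window, height, width):
    """Accumulate polarity events close in time to a camera frame into a
    2-D event image aligned with that frame (hypothetical helper, not the
    DH-PTAM API).

    events  : (N, 4) array of [t, x, y, polarity], polarity in {-1, +1}
    t_frame : timestamp of the reference camera frame (same clock as events)
    window  : half-width of the temporal window, in seconds
    """
    t = events[:, 0]
    x = events[:, 1].astype(int)
    y = events[:, 2].astype(int)
    p = events[:, 3].astype(np.float32)

    # Keep only events temporally close to the frame timestamp.
    mask = np.abs(t - t_frame) <= window

    # Signed per-pixel accumulation of event polarities.
    img = np.zeros((height, width), dtype=np.float32)
    np.add.at(img, (y[mask], x[mask]), p[mask])
    return img

# Example usage: build one event image per stereo frame timestamp, then feed
# the (frame, event image) pair to a shared feature extractor.
```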
Main file: DH_PTAM__A_Deep_Hybrid_Stereo_Events_Frames_Parallel_Tracking_And_Mapping_System.pdf (12.99 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04620377 , version 1 (21-06-2024)

Identifiers

HAL Id: hal-04620377
DOI: 10.1109/TIV.2024.3412595

Cite

Abanob Soliman, Fabien Bonardi, Désiré Sidibé, Samia Bouchafa. DH-PTAM: A Deep Hybrid Stereo Events-Frames Parallel Tracking And Mapping System. IEEE Transactions on Intelligent Vehicles, In press, pp.1-10. ⟨10.1109/TIV.2024.3412595⟩. ⟨hal-04620377⟩
149 views
18 downloads
