Journal article, Journal of Communications, 2018

An Embedded Multi-Sensor Data Fusion Design for Vehicle Perception Tasks

Abstract

Nowadays, multi-sensor architectures are popular for improving environment perception in intelligent vehicles, and using multiple sensors to handle perception tasks in a rich environment is a natural solution. Most research has focused on PC-based implementations of perception tasks, while customized embedded designs have received little attention. In this paper, we propose a Multi-Sensor Data Fusion (MSDF) embedded design for vehicle perception tasks using stereo camera and Light Detection and Ranging (LIDAR) sensors. A modular and scalable architecture based on the Zynq-7000 SoC was designed.
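
The abstract does not detail the fusion scheme itself. As a rough, illustrative sketch of what combining the two sensors' measurements can look like, the C++ snippet below fuses a stereo depth estimate with a LIDAR range measurement of the same obstacle using inverse-variance weighting; the names, noise values, and fusion rule are assumptions for illustration and are not taken from the paper.

// Minimal sketch of one possible multi-sensor fusion step: combining a
// depth estimate from a stereo camera with a range measurement from a
// LIDAR using inverse-variance weighting. Names and noise values are
// illustrative assumptions, not the paper's actual design.
#include <cstdio>

struct Measurement {
    double value;     // estimated distance to an obstacle, in metres
    double variance;  // sensor noise variance for this measurement, in m^2
};

// Fuse two independent estimates of the same quantity.
// Each sensor is weighted by the inverse of its variance, so the more
// precise sensor dominates the fused value.
static Measurement fuse(const Measurement& stereo, const Measurement& lidar) {
    const double wS = 1.0 / stereo.variance;
    const double wL = 1.0 / lidar.variance;
    Measurement fused;
    fused.value = (wS * stereo.value + wL * lidar.value) / (wS + wL);
    fused.variance = 1.0 / (wS + wL);  // fused estimate is more certain than either input
    return fused;
}

int main() {
    // Hypothetical readings for the same obstacle.
    Measurement stereoDepth = {12.4, 0.25};  // stereo depth is noisier at range
    Measurement lidarRange  = {12.1, 0.01};  // LIDAR range is typically more precise
    Measurement fused = fuse(stereoDepth, lidarRange);
    std::printf("fused distance: %.2f m (variance %.4f)\n", fused.value, fused.variance);
    return 0;
}

Inverse-variance weighting is the simplest case of a Kalman-style measurement update, a common building block in MSDF pipelines; the paper's actual embedded architecture may use a different fusion strategy.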
No file deposited

Dates and versions

hal-03400988, version 1 (25-10-2021)

Identifiers

Cite

Mokhtar Bouain, Karim Mohamed Abedallah Ali, Denis Berdjag, Nizar Fakhfakh, Rabie Ben Atitallah. An Embedded Multi-Sensor Data Fusion Design for Vehicle Perception Tasks. Journal of Communications, 2018, 13 (1), pp.8-14. ⟨10.12720/jcm.13.1.8-14⟩. ⟨hal-03400988⟩