Sensor data fusion for lateral safe applications
13th World Congress & Exhibition on Intelligent Transport Systems and Services
ExCel London, United Kingdom
Thursday 12 October 2006
Technical Session 128: Intelligent Vehicle - LDW
Authors: Angelos Amditis, Nikolaos Floudas, Aris Polychronopoulos (ICCS), Dirk Bank (DaimlerChrysler AG), Bas van den Broek (TNO Defense, Security and Safety), Fred Oechsle (Robert Bosch GmbH)
This paper describes the algorithms being developed for the perception layer of the PReVENT subproject LATERAL SAFE. These algorithms aim to provide a reliable representation of the objects present in the lateral and rear field of the ego vehicle, together with their kinematics. The work presented in this paper falls within the fields of radar tracking, sensor network processing, image and stereo-vision processing, and the integration and fusion of sensor-level processed data. The perception layer of LATERAL SAFE is a distributed sensor-level fusion system that processes, at a central level, the tracks of four tracking systems: a rear-looking long-range radar, two lateral short-range radar networks, and a system of lateral- and rear-looking cameras.
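To illustrate the central-level track fusion idea described above, the following is a minimal sketch of combining track estimates reported by several sensor systems. It uses simple inverse-variance weighting of independent scalar estimates; this is an illustrative simplification, not the actual LATERAL SAFE algorithm, and the function and variable names are hypothetical.

```python
def fuse_tracks(estimates):
    """Fuse (value, variance) pairs from different sensor systems,
    assuming the tracks refer to the same object and have
    independent errors (inverse-variance weighting)."""
    if not estimates:
        raise ValueError("no estimates to fuse")
    inv_var_sum = sum(1.0 / var for _, var in estimates)
    fused_value = sum(x / var for x, var in estimates) / inv_var_sum
    fused_var = 1.0 / inv_var_sum
    return fused_value, fused_var


# Example: lateral position (m) of one object as seen by a
# rear-looking radar, a short-range radar network, and a camera
# system (numbers are illustrative only).
tracks = [(2.1, 0.25), (1.9, 0.16), (2.0, 0.09)]
pos, var = fuse_tracks(tracks)
```

The fused variance is always smaller than the smallest input variance, which is the basic benefit of fusing redundant lateral and rear sensor coverage. A real system would additionally need track-to-track association to decide which sensor tracks belong to the same object before fusing them.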