Visualizing Sensor Data and Prototyping ADAS/AD Algorithms (Forward Collision Warning Example)
17 Sep 2020 | 15:00-16:00 (GMT+8)
Online via Webex


About The Event

More Autonomy = More Sensors = More Raw Data to Visualize and Process

Full autonomy requires more sensors and more raw data to process.

Several sensor types are available to AV manufacturers, with radar, LiDAR, camera, audio, and thermal being the common options. As you’d expect, each sensor type offers its own strengths and weaknesses, as well as applicability to certain autonomous functions.

Forward collision warning (FCW) is an important feature in driver assistance and automated driving systems, where the goal is to provide correct, timely, and reliable warnings to the driver before an impending collision with the vehicle in front. To achieve this goal, vehicles are equipped with forward-facing vision and radar sensors. Sensor fusion is required to increase the probability of accurate warnings and to minimize the probability of false warnings.
For the purposes of this example, a test car (the ego vehicle) was equipped with various sensors and their outputs were recorded. The sensors used for this example were:
  1. Vision sensor, which provided lists of observed objects with their classification and information about lane boundaries. The object lists were reported 10 times per second. Lane boundaries were reported 20 times per second.
  2. Radar sensor with medium and long range modes, which provided lists of unclassified observed objects. The object lists were reported 20 times per second.
  3. IMU, which reported the speed and turn rate of the ego vehicle 20 times per second.
  4. Video camera, which recorded a video clip of the scene in front of the car. Note: This video is not used by the tracker; it serves only to display the tracking results for verification.
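The fusion-and-warning idea described above can be sketched in a few lines. This is a minimal illustrative sketch, not the example's actual implementation: the function names, noise variances, and the 2.5 s time-to-collision threshold are all assumptions chosen for illustration. It fuses a radar and a vision range estimate by inverse-variance weighting, then raises a warning when the time-to-collision to the lead vehicle falls below the threshold.

```python
def fuse_ranges(radar_range_m, radar_var, vision_range_m, vision_var):
    """Inverse-variance weighted fusion of two independent range measurements.

    The less noisy sensor (smaller variance) gets the larger weight,
    which is one simple way fusion reduces false warnings.
    """
    w_radar = 1.0 / radar_var
    w_vision = 1.0 / vision_var
    return (w_radar * radar_range_m + w_vision * vision_range_m) / (w_radar + w_vision)


def fcw_triggered(range_m, closing_speed_mps, ttc_threshold_s=2.5):
    """Warn when the ego vehicle is closing on the lead vehicle
    and the time-to-collision (TTC) drops below the threshold."""
    if closing_speed_mps <= 0:
        return False  # not closing in on the vehicle in front
    ttc_s = range_m / closing_speed_mps
    return ttc_s < ttc_threshold_s


# Example: radar reports 40 m (variance 4.0), vision reports 42 m (variance 1.0),
# and the ego vehicle is closing at 20 m/s.
fused_m = fuse_ranges(40.0, 4.0, 42.0, 1.0)
warn = fcw_triggered(fused_m, 20.0)
```

In practice the example's tracker maintains full object tracks over the asynchronous 10 Hz and 20 Hz sensor updates rather than fusing single measurements, but the weighting-then-threshold structure is the same basic idea.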


The Speakers

Supamith S
Application Engineer
Minh Cong
Application Engineer

Register Now
