Most modern autonomous or semi-autonomous vehicles are equipped with a suite of multiple sensors.
Rotational and translational transformations are required to calibrate and fuse data from these sensors.
Fusing lidar data with corresponding camera data is particularly useful in the perception pipeline.
Lidar Toolbox™ provides the lidar and camera calibration (LCC) workflow for this purpose. The workflow uses a checkerboard calibration pattern to estimate the rigid transformation between the two sensors, an essential step in combining their data in a system.
Cameras provide rich color information, while lidar sensors provide accurate 3-D structural and positional information about objects. Fusing these complementary data sources enhances the performance of perception and mapping algorithms for autonomous driving and robotics applications.