How do autonomous machines see the world around them? How do their different sensor systems work together to create a single view of the world? How can a machine combine information from multiple sources, such as laser scanners and cameras? The glue for all of this consists of two key concepts of robotics: sensor fusion and calibration.
Introduction
Automation and autonomy of various mobile machines are becoming more and more common these days. Cars drive without a human driver, street sweepers clean the streets autonomously and packages get delivered without human intervention.
Automation of machines relies heavily on various types of sensors, such as LiDARs (Light Detection and Ranging), radars, cameras, IMUs (Inertial Measurement Units) and GNSS (Global Navigation Satellite System) receivers. With additional sensors, one can estimate variables that are not observable with a single sensor. Multiple sensors can also be used to reduce noise, which increases accuracy and robustness. This is called sensor fusion.
How exactly does this work? For example, how can we combine GNSS and LiDAR measurements, when GNSS gives us our position as latitude, longitude and height, while LiDAR gives us a set of points (a point cloud) that lie within some range of the LiDAR itself? The answer has two parts: point cloud registration and calibration. In short, the first one means determining the movement of the LiDAR by comparing features between the previous and current point clouds. The second one is the actual topic of this blog post.
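As a rough illustration of the registration idea, the sketch below estimates the rigid motion (rotation and translation) that aligns two point clouds whose point correspondences are assumed to be known. Real registration methods such as ICP also have to find the correspondences iteratively; the function name and the example data here are purely illustrative.

```python
import numpy as np

def estimate_rigid_motion(prev_pts, curr_pts):
    """Estimate rotation R and translation t so that R @ prev + t ~= curr.

    Both inputs are (N, 3) arrays, with row i of prev_pts corresponding to
    row i of curr_pts (correspondences assumed known for this sketch).
    """
    prev_mean = prev_pts.mean(axis=0)
    curr_mean = curr_pts.mean(axis=0)
    # Cross-covariance of the centred point sets
    H = (prev_pts - prev_mean).T @ (curr_pts - curr_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    # Guard against a reflection (determinant -1) from the SVD
    if np.linalg.det(R) < 0:
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = curr_mean - R @ prev_mean
    return R, t

# Toy example: the "current" cloud is the "previous" one moved 1 m forward
prev_cloud = np.random.rand(100, 3)
curr_cloud = prev_cloud + np.array([1.0, 0.0, 0.0])
R, t = estimate_rigid_motion(prev_cloud, curr_cloud)
print(R)  # ~identity rotation
print(t)  # ~[1, 0, 0], i.e. the LiDAR moved one metre forward
```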
Intrinsic and extrinsic
Calibration consists of two main concepts: intrinsic and extrinsic calibration. Intrinsic calibration essentially means compensating for internal manufacturing flaws or irregularities within a single sensor. Examples of these are camera lens distortions, the positions and orientations of a LiDAR's individual laser beams, and the acceleration sensor errors of an IMU.
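As a concrete illustration of intrinsic calibration, the snippet below undistorts a camera image using OpenCV, assuming the camera matrix and distortion coefficients have already been estimated (for example from checkerboard images with cv2.calibrateCamera). The numeric values and the synthetic image are made up for the example.

```python
import cv2
import numpy as np

# Intrinsic parameters, typically estimated once with cv2.calibrateCamera
# using images of a known calibration target (values below are illustrative).
camera_matrix = np.array([[800.0,   0.0, 640.0],
                          [  0.0, 800.0, 360.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.07, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

# In practice this would be a frame from the camera; here a blank image
# stands in so the sketch runs on its own.
image = np.zeros((720, 1280, 3), dtype=np.uint8)

# Remove the lens distortion so that straight lines in the world
# appear straight in the image.
undistorted = cv2.undistort(image, camera_matrix, dist_coeffs)
cv2.imwrite("frame_undistorted.png", undistorted)
```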
Extrinsic calibration refers to estimating the relative positions, orientations and synchronisation offsets of the sensors with respect to each other or to some well-known position on the mobile machine. Once the relative positions, orientations and synchronisation offsets between the sensors and the machine are known, we can transform the individual sensor measurements into a single frame of reference and time instant, and thus fuse them together.
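To make the frame transformation concrete, here is a minimal sketch of how a point measured in a LiDAR's frame could be expressed in the machine's frame using a 4x4 homogeneous transformation, as in Figure 2. The extrinsic values (mounting position and yaw angle) are invented for the example.

```python
import numpy as np

def make_transform(yaw_deg, translation):
    """Build a 4x4 homogeneous transform (rotation about z plus translation)."""
    yaw = np.deg2rad(yaw_deg)
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0],
                 [s,  c, 0],
                 [0,  0, 1]]
    T[:3, 3] = translation
    return T

# Extrinsic calibration result (illustrative): LiDAR1 is mounted 1.2 m forward,
# 0.5 m up and rotated 90 degrees relative to the machine's frame.
machine_T_lidar1 = make_transform(90.0, [1.2, 0.0, 0.5])

# A point detected by LiDAR1, expressed in its own frame (homogeneous coordinates).
point_lidar1 = np.array([2.0, 0.0, 0.0, 1.0])

# The same point expressed in the machine's frame of reference.
point_machine = machine_T_lidar1 @ point_lidar1
print(point_machine[:3])
```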
Case example
For example, if the machine has two LiDARs that each detect only about 50 percent of a human walking towards the machine, there is a risk that the machine cannot identify the human based on the individual LiDAR measurements. But if the LiDAR measurements are transformed into a single frame of reference, the machine is able to identify the human. See Figure 2 for an illustration.
In our example of movement estimation using GNSS and LiDAR, once we know the relative positions and offsets of the sensors, we can combine their movement estimates into a single estimate, similarly to how the LiDAR measurements are combined in Figure 2. As a side note, the combining of position estimates can be done, for example, with a Kalman filter.
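As a rough sketch of that last idea, the snippet below fuses two noisy one-dimensional position estimates (say, one from GNSS and one from LiDAR-based registration) with simple Kalman-style measurement updates. A real system would track a full pose and velocity state; all numbers here are illustrative.

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update.

    x, P : current state estimate and its variance
    z, R : measurement and its variance
    """
    K = P / (P + R)          # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)      # corrected state estimate
    P = (1.0 - K) * P        # reduced uncertainty after the update
    return x, P

# Prior position estimate (e.g. from the motion model), with variance
x, P = 10.0, 4.0
# Fuse a GNSS position measurement, then a LiDAR-based one
x, P = kalman_update(x, P, z=10.8, R=2.0)   # GNSS, noisier
x, P = kalman_update(x, P, z=10.3, R=0.5)   # LiDAR, more precise
print(x, P)   # fused estimate lies between the measurements, variance shrinks
```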
Figure 2: Illustration of fusing two LiDAR measurements together in the machine’s frame of reference. T refers to a homogeneous transformation from one frame to another; for example, LiDAR1 T Machine denotes the transformation from the Machine’s frame of reference to LiDAR1’s frame of reference.
Calibration is a must
As discussed above, calibration plays an important role in the automation of mobile machines. Without it, the sensor data will contain conflicting information and become corrupted. Without proper data, the machine’s view of the world around it becomes fragmented and distorted, which makes it harder for the machine to detect the most important pieces of information in it. These pieces of information often play a crucial role, for example in the operational safety of the machine. Therefore, it is always worth the trouble to spend time on properly calibrating the sensors.
By M.Sc.(Tech.) Henri Varjotie, Senior Robotics Engineer and M.Sc.(Tech.) Janne Paanajärvi, Specialist/Lead Robotics Engineer