CALIBRATION

THE ART OF SENSOR CALIBRATION AND INTEGRATION

By M.Sc.(Tech.) Henri Varjotie and M.Sc.(Tech.) Janne Paanajärvi

How do autonomous machines see the world around them? How do their different sensor systems work together to create a view of the world? How can a machine combine information from multiple sources, such as laser scanners and cameras? The glue that holds all of this together is two key concepts of robotics: sensor fusion and calibration.

INTRODUCTION

Automation and autonomy of mobile machines are becoming more and more common. Cars drive without a human driver, street sweepers clean the streets autonomously and parcels are delivered without human intervention.

Automation of machines relies heavily on various types of sensors, such as LiDARs (Light Detection and Ranging), radars, cameras, IMUs (Inertial Measurement Units) and GNSS (Global Navigation Satellite System) receivers. With additional sensors one can estimate variables that are not observable with a single sensor. Multiple sensors can also be used to reduce noise for increased accuracy or additional robustness. This is called sensor fusion.

How does this work in practice? For example, how can we combine GNSS and LiDAR measurements, when GNSS gives us our position as latitude, longitude and height, while a LiDAR gives us a set of points (a point cloud) that lie within some range of the LiDAR itself? The answer has two parts: point cloud registration and calibration. The first, in very short, means determining the movement of the LiDAR by comparing features in the previous and current point clouds. The second is the actual topic of this blog post.
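To give a feel for the registration part, below is a minimal Python sketch (with illustrative names of our own) of its core sub-problem: assuming the correspondences between points in two scans are already known, the LiDAR's rotation and translation between the scans can be solved in closed form with the SVD-based Kabsch method. Real registration methods such as ICP also have to find the correspondences themselves.

import numpy as np

def estimate_rigid_transform(prev_pts, curr_pts):
    """Solve R and t such that R @ prev + t ~= curr for matched (N, 3) point sets."""
    prev_c = prev_pts - prev_pts.mean(axis=0)
    curr_c = curr_pts - curr_pts.mean(axis=0)
    # Cross-covariance of the two centred clouds.
    H = prev_c.T @ curr_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = curr_pts.mean(axis=0) - R @ prev_pts.mean(axis=0)
    return R, t  # the LiDAR's motion between the two scans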

INTRINSIC AND EXTRINSIC

Calibration consists of two main concepts: intrinsic and extrinsic calibration. Intrinsic calibration essentially means compensating for internal manufacturing flaws and irregularities within a sensor. Examples include camera lens distortions, the positions and orientations of a LiDAR's individual laser beams, and the errors of an IMU's acceleration sensors.
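As an illustration of what an intrinsic camera calibration describes, the small sketch below applies the widely used Brown-Conrady distortion model to ideal image coordinates. The coefficients k1, k2, p1 and p2 are exactly the kind of internal parameters that intrinsic calibration estimates (here they are just placeholders), so that the distortion can later be undone.

def apply_lens_distortion(x, y, k1, k2, p1, p2):
    """Brown-Conrady model: map ideal normalised image coordinates (x, y)
    to the distorted coordinates a real lens produces."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2          # radial distortion term
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d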

Extrinsic calibration refers to estimating the relative positions, orientations and synchronisation offsets of the sensors with respect to each other or to some well-known position on the mobile machine. Once the relative positions, orientations and synchronisation offsets between the sensors and the machine are known, we can transform the individual sensor measurements into a single frame of reference and time instant and thus fuse them together.
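As a minimal sketch of how such an extrinsic calibration result is used (all names and numbers below are made up for illustration), assume the calibration has produced a 4x4 homogeneous transform and a clock offset for one LiDAR:

import numpy as np

# Assumed extrinsic calibration result for one LiDAR: a pure yaw rotation,
# a translation and a clock offset with respect to the machine.
yaw = np.deg2rad(90.0)
machine_T_lidar = np.eye(4)
machine_T_lidar[:3, :3] = np.array([
    [np.cos(yaw), -np.sin(yaw), 0.0],
    [np.sin(yaw),  np.cos(yaw), 0.0],
    [0.0,          0.0,         1.0],
])
machine_T_lidar[:3, 3] = [1.2, 0.0, 1.8]   # metres: LiDAR mounted ahead of and above the reference point
time_offset_s = 0.013                      # LiDAR clock runs 13 ms ahead of the machine clock

def to_machine_frame(points_lidar, stamps_lidar):
    """Express LiDAR points (N, 3) and their timestamps in the machine's frame and clock."""
    homog = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    points_machine = (machine_T_lidar @ homog.T).T[:, :3]
    stamps_machine = stamps_lidar - time_offset_s
    return points_machine, stamps_machine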

Figure 1: Calibration in progress.

CASE EXAMPLE

For example, if the machine has two LiDARs and each of them sees only about half of a human walking towards the machine, there is a risk that the machine cannot identify the human from either LiDAR's measurements alone. But if the LiDAR measurements are transformed into a single frame of reference, the machine is able to identify the human. See Figure 2 for an illustration.

For our example of movement estimation using GNSS and LiDAR, once we know the relative positions and offsets of the sensors, we can combine their movement estimates into one, similarly to how the LiDAR measurements are combined in Figure 2. As a side note, combining the position estimates can be done, for example, with a Kalman filter (see the sketch below).
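As a rough, one-dimensional illustration of that idea (not our actual implementation; all noise values are invented), a Kalman filter update that fuses a LiDAR-based motion prediction with a GNSS position measurement could look like this:

def kalman_update(x, P, z, R):
    """Fuse a predicted state x (variance P) with a measurement z (variance R)."""
    K = P / (P + R)            # Kalman gain: trust the less uncertain source more
    return x + K * (z - x), (1.0 - K) * P

# Position along one axis: predict with LiDAR-derived motion, correct with GNSS.
x, P = 10.0, 1.0               # previous estimate and its variance
x, P = x + 0.8, P + 0.2        # prediction: point cloud registration says we moved 0.8 m
x, P = kalman_update(x, P, z=10.9, R=0.5)   # correction: GNSS measurement of the same axis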

 

Figure 2: Illustration of fusing two LiDAR measurements together in the machine's frame of reference. T refers to the homogeneous transformation from one frame to another; for example, LiDAR1 T Machine means the transformation from the Machine's frame of reference to LiDAR1's frame of reference.
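Using the figure's notation, LiDAR1 T Machine maps machine-frame coordinates into LiDAR1's frame, so fusing the two scans in the machine frame means applying the inverses of the calibrated transforms and stacking the results. A minimal numpy sketch (illustrative only):

import numpy as np

def fuse_scans(lidar1_T_machine, lidar2_T_machine, cloud1, cloud2):
    """Merge two LiDAR scans (N, 3) into the machine frame; notation as in Figure 2."""
    def to_machine(lidar_T_machine, cloud):
        machine_T_lidar = np.linalg.inv(lidar_T_machine)  # invert: machine <- LiDAR
        homog = np.hstack([cloud, np.ones((len(cloud), 1))])
        return (machine_T_lidar @ homog.T).T[:, :3]

    return np.vstack([to_machine(lidar1_T_machine, cloud1),
                      to_machine(lidar2_T_machine, cloud2)])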

CALIBRATION IS A MUST

As discussed above, calibration plays an important role in the automation of mobile machines. Without it, the sensor data will contain conflicting information and effectively become corrupted. Without proper data, the sensed world around the machine becomes fragmented and distorted, which makes it harder for the machine to detect the most important pieces of information in it. These pieces of information usually play a crucial role, for example, in the operational safety of the machine. For that reason, it is always worth the trouble to spend time on properly calibrating the sensors.

M.Sc. HENRI VARJOTIE

M.Sc.(Tech.), Robotics & Software Engineer, Project Manager and robotics enthusiast since 2017. Master of Science (Robotics and Automation), Aalto University, Finland, 2019.

M.Sc. JANNE PAANAJÄRVI

M.Sc.(Tech.), D.Sc.(Tech.) candidate, Specialist, Robotics & Software Engineer. Master of Science (Robotics and Automation), Helsinki University of Technology (later Aalto University), Finland, 2002.

