
THE ART OF SENSOR CALIBRATION AND INTEGRATION

By M.Sc.(Tech.) Henri Varjotie and M.Sc.(Tech.) Janne Paanajärvi

How do autonomous machines see the world around them? How do their different sensor systems work together to create a view of the world? How can a machine combine information from multiple sources such as laser scanners and cameras? The glue that holds all of this together is two key concepts of robotics: sensor fusion and calibration.

INTRODUCTION

Automation and autonomy of mobile machines are becoming more and more common these days. Cars drive without a human driver, street sweepers clean the streets autonomously and parcels get delivered without human intervention.

Automation of machines relies heavily on various types of sensors, such as LiDARs (Light Detection and Ranging), radars, cameras, IMUs (Inertial Measurement Units) and GNSS (Global Navigation Satellite System) receivers. With additional sensors one can estimate variables that are not observable with a single sensor. Multiple sensors can also be used to reduce noise for increased accuracy or added robustness. This is called sensor fusion.

How exactly does this work, then? For example, how can we combine GNSS and LiDAR measurements, when GNSS gives us our position as latitude, longitude and height, while the LiDAR gives us a set of points (a point cloud) that lie within some range of the LiDAR itself? The answer has two parts: point cloud registration and calibration. The first one, in very short, works as follows: based on features in the point cloud, determine the movement of the LiDAR by comparing the previous and current point clouds. The second one is the actual topic of this blog post.
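To give a rough feel for the registration idea, here is a minimal Python/NumPy sketch. It is not a full registration algorithm (a real one, such as ICP, also has to find which points correspond to each other); it only shows the classic SVD-based solution for the rigid motion between two clouds once the correspondences are assumed known.

```python
import numpy as np

def estimate_rigid_motion(prev_pts, curr_pts):
    """Estimate rotation R and translation t such that curr ≈ R @ prev + t.

    Assumes prev_pts and curr_pts are (N, 3) arrays of *corresponding*
    points; real registration (e.g. ICP) also has to establish these
    correspondences before this step.
    """
    prev_mean = prev_pts.mean(axis=0)
    curr_mean = curr_pts.mean(axis=0)

    # Cross-covariance of the centred point sets.
    H = (prev_pts - prev_mean).T @ (curr_pts - curr_mean)

    # Optimal rotation from the singular value decomposition (Kabsch solution).
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T

    t = curr_mean - R @ prev_mean
    return R, t
```

The recovered rotation and translation describe how the LiDAR moved between the two scans, which is exactly the movement estimate that gets fused with the GNSS data.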

INTRINSIC AND EXTRINSIC

Calibration consists of two main concepts: intrinsic and extrinsic calibration. Intrinsic calibration essentially means compensating for internal manufacturing flaws or variations within a sensor. Examples of these are camera lens distortions, the positions and orientations of a LiDAR's individual laser beams, and the errors of an IMU's acceleration sensors.
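As a concrete illustration of the camera case, the sketch below removes lens distortion with OpenCV. The camera matrix and distortion coefficients are placeholder values; in practice they come from an intrinsic calibration procedure (for example cv2.calibrateCamera run on checkerboard images), and the file names are only examples.

```python
import cv2
import numpy as np

# Placeholder intrinsics; in practice these are estimated with an
# intrinsic calibration routine such as cv2.calibrateCamera().
camera_matrix = np.array([[800.0,   0.0, 640.0],
                          [  0.0, 800.0, 360.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

img = cv2.imread("frame.png")
# Undo the lens distortion so that straight lines in the world
# appear straight in the image.
undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
cv2.imwrite("frame_undistorted.png", undistorted)
```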

Extrinsic calibration refers to estimating the relative positions, orientations and synchronisation offsets of the sensors with respect to each other or to some well-known position on the mobile machine. Once the relative positions, orientations and synchronisation offsets between the sensors and the machine are known, we can transform the individual sensor measurements into a single frame of reference and time instant, and thus fuse them together.
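A minimal sketch of this idea in Python/NumPy, assuming the extrinsic calibration has already produced a 4x4 homogeneous transformation machine_T_lidar from the LiDAR frame to the machine frame (the numbers below are placeholder extrinsics, not values from a real machine):

```python
import numpy as np

def to_machine_frame(points_lidar, machine_T_lidar):
    """Transform an (N, 3) LiDAR point cloud into the machine's frame.

    machine_T_lidar is the 4x4 homogeneous transformation obtained from
    extrinsic calibration (rotation + translation of the LiDAR with
    respect to the machine frame).
    """
    n = points_lidar.shape[0]
    # Append a 1 to each point to work in homogeneous coordinates.
    homogeneous = np.hstack([points_lidar, np.ones((n, 1))])
    transformed = (machine_T_lidar @ homogeneous.T).T
    return transformed[:, :3]

# Example: LiDAR mounted 1.2 m above the machine origin, rotated 90° about z.
machine_T_lidar = np.array([[0.0, -1.0, 0.0, 0.0],
                            [1.0,  0.0, 0.0, 0.0],
                            [0.0,  0.0, 1.0, 1.2],
                            [0.0,  0.0, 0.0, 1.0]])
cloud_machine = to_machine_frame(np.random.rand(100, 3), machine_T_lidar)
```

Once the clouds of both LiDARs are expressed in the machine frame like this, fusing them is simply a matter of combining the transformed points, as illustrated in Figure 2 below.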

Figure 1: Calibration in progress.

CASE EXAMPLE

For example, if the machine has two LiDARs that each detect only about 50 percent of a human walking towards the machine, there is a risk that the machine cannot identify the human from the individual LiDAR measurements. But if the LiDAR measurements are transformed into a single frame of reference, the machine is able to identify the human. See Figure 2 for an illustration.

For our example of movement estimation using GNSS and LiDAR, once we know the relative positions and offsets of the sensors, we can combine their movement estimates into one, similarly to what is done with the LiDAR measurements in Figure 2. As a side note, the combining of position estimates can be done, for example, using a Kalman filter.
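As a very small illustration of that last point, here is a single Kalman-filter update step that fuses a predicted position with a GNSS measurement. It is only a one-dimensional toy example with made-up noise values, not a description of any particular estimator used in practice.

```python
def kalman_update(x_pred, P_pred, z_gnss, R_gnss):
    """Fuse a predicted position with a GNSS measurement (1-D example).

    x_pred, P_pred : predicted position and its variance
                     (e.g. from LiDAR-based registration)
    z_gnss, R_gnss : GNSS position measurement and its variance
    """
    K = P_pred / (P_pred + R_gnss)          # Kalman gain
    x_new = x_pred + K * (z_gnss - x_pred)  # fused position estimate
    P_new = (1.0 - K) * P_pred              # fused (reduced) uncertainty
    return x_new, P_new

# Placeholder numbers: LiDAR-based prediction 10.0 m (variance 0.5),
# GNSS measurement 10.6 m (variance 1.0).
x, P = kalman_update(10.0, 0.5, 10.6, 1.0)
print(x, P)  # fused estimate lies between the two, closer to the more certain one
```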


Figure 2: Illustration of fusing two LiDAR measurements together in the machine’s frame of reference. T refers to the homogeneous transformation from one frame to another; for example, LiDAR1 T Machine means the transformation from the Machine’s frame of reference to LiDAR1’s frame of reference.

CALIBRATION IS A MUST

As discussed above, calibration plays an important role in the automation of mobile machines. Without it, the sensor data will contain conflicting information and become corrupted. Without proper data, the sensed world around the machine becomes fragmented and distorted, which makes it harder for the machine to detect the most important pieces of information in it. These pieces of information usually play a crucial role, for example, in the operational safety of the machine. That being said, it is always worth the trouble to spend time properly calibrating the sensors.

M.Sc. HENRI VARJOTIE

M.Sc.(Tech.), Robotics & Software Engineer, Project Manager and robotics enthusiast since 2017. Master of Science (Robotics and Automation), Aalto University, Finland, 2019.

M.Sc. JANNE PAANAJÄRVI

M.Sc.(Tech.), D.Sc.(Tech.) candidate, Specialist, Robotics & Software Engineer. Master of Science (Robotics and Automation), Helsinki University of Technology (later Aalto University), Finland, 2002.

