LOCALISATION AND ENVIRONMENTAL MODELING

MOBILE ROBOTS – WHERE ARE THEY?

By Juhana Ahtiainen, D.Sc.(Tech.), Lead Robotics Engineer, Team Leader, Positioning and Mapping

We at GIM Robotics specialize in developing mobile robots. A mobile robot can be any moving machine that is controlled by software based on sensor data. Our proprietary software stack, combined with a supported set of sensors, can make any mobile machine more intelligent, autonomous, and safe. One of the key elements of our software stack is the positioning and mapping module, which takes care of localisation and environmental modeling. Without it, proper autonomous operation would be impossible.

LOCALISATION AND ENVIRONMENTAL MODELING

All mobile robots must know where they are. Autonomous operation relies on constantly estimating the location and orientation of the robot, that is, its pose. This process is called positioning, and it is one of the fundamental problems in robotics. Without positioning, robots could not navigate from point to point and autonomous driving would be impossible. Positioning can be described as the process of determining the pose of a robot relative to a given map of the environment.

Mapping refers to the process of creating a consistent world model of the robot’s operating environment. Mapping is tightly coupled with positioning, since creating a spatial model requires knowing the robot’s pose. A common approach for creating a world model is simultaneous localization and mapping (SLAM), in which the robot constructs a map of an unknown environment while localizing itself with respect to that map. However, once a map of the environment is available, it can be reused for positioning when operating in the same environment.

WHY IS THIS DIFFICULT?

Satellite positioning systems are affordable and widely used for estimating position. Unfortunately, this technology alone is not sufficient for mobile robots, which continuously require reliable and accurate pose information in real time. Furthermore, the robots must be able to operate in real environments that are often complex and may be exposed to adverse weather conditions. The robots might also need to operate indoors, where satellite positioning is not possible.

Fulfilling these requirements calls for sensor fusion, that is, combining data from multiple different sensor modalities. The main benefit of sensor fusion is that different measurements complement each other. For example, point cloud data from an open field is not very informative, but satellite positioning works well on open fields. On the other hand, satellite positioning does not work indoors, but indoor point cloud data are typically rich in information.
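To make the complementarity idea concrete, here is a minimal sketch (not our actual fusion pipeline) of inverse-variance weighting: each position source is weighted by how certain it currently is, so the more informative modality dominates the result. The variances in the example are made-up illustrative numbers.

import numpy as np

def fuse_positions(pos_a, var_a, pos_b, var_b):
    """Inverse-variance weighted fusion of two independent position estimates."""
    pos_a, pos_b = np.asarray(pos_a, float), np.asarray(pos_b, float)
    w_a = 1.0 / np.asarray(var_a, float)
    w_b = 1.0 / np.asarray(var_b, float)
    fused_var = 1.0 / (w_a + w_b)
    fused_pos = fused_var * (w_a * pos_a + w_b * pos_b)
    return fused_pos, fused_var

# Open field: GNSS is precise, the point-cloud-based estimate is vague -> GNSS dominates.
print(fuse_positions([10.0, 5.0], [0.01, 0.01], [10.4, 5.3], [4.0, 4.0]))
# Indoors: GNSS is unusable (huge variance) -> the point-cloud-based estimate dominates.
print(fuse_positions([10.0, 5.0], [100.0, 100.0], [10.4, 5.3], [0.02, 0.02]))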

HOW DO WE DO IT?

Our positioning and mapping module fuses multiple sensor modalities to ensure the reliability of the pose estimate in all situations. Typically we use point cloud data, satellite positioning, accelerations, angular velocities, and wheel revolutions. Laser scanners are normally used to gather the point cloud data, but any sensor that can produce point clouds can be used with our system.
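As a simple illustration of what the non-point-cloud modalities alone can already provide, the sketch below dead-reckons a planar pose from wheel speed and gyro yaw rate using a basic unicycle model. The model and parameter names are assumptions made for the example, not our production motion model.

import math

def dead_reckoning_step(x, y, heading, wheel_speed, yaw_rate, dt):
    """Propagate a planar pose one time step with a simple unicycle model.

    wheel_speed : forward speed [m/s] derived from wheel revolutions
    yaw_rate    : angular velocity [rad/s] from the gyroscope
    """
    heading_new = heading + yaw_rate * dt
    # Integrate with the mid-point heading for slightly better accuracy.
    mid = heading + 0.5 * yaw_rate * dt
    x_new = x + wheel_speed * dt * math.cos(mid)
    y_new = y + wheel_speed * dt * math.sin(mid)
    return x_new, y_new, heading_new

# Example: drive 1 s at 2 m/s while turning at 0.1 rad/s, updated at 100 Hz.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckoning_step(*pose, wheel_speed=2.0, yaw_rate=0.1, dt=0.01)
print(pose)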

The positioning and mapping module consists of five main components: motion estimation, point cloud rectification, mapping, map validation, and map-based positioning. The motion estimation component estimates vehicle ego-motion based on all available sensor data, excluding the point cloud data. The ego-motion estimate is used to compensate for the distortion in point cloud data caused by the sensor movement during data acquisition.
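The sketch below illustrates the rectification idea in a simplified planar, constant-velocity setting: each point is transformed into the sensor frame at the end of the scan according to how much the sensor still moves after that point was captured. It is an illustrative approximation under these assumptions, not our actual rectification code.

import numpy as np

def deskew_scan(points, timestamps, velocity, yaw_rate, scan_end_time):
    """Compensate a planar point cloud for sensor motion during acquisition.

    points        : (N, 2) points in the sensor frame at their capture time
    timestamps    : (N,) capture time of each point [s]
    velocity      : (2,) assumed constant linear velocity, expressed in the
                    end-of-scan sensor frame [m/s]
    yaw_rate      : assumed constant angular velocity [rad/s]
    scan_end_time : reference time the whole scan is expressed in

    Returns the points expressed in the sensor frame at scan_end_time.
    """
    points = np.asarray(points, float)
    dt = scan_end_time - np.asarray(timestamps, float)  # time left until scan end
    # Relative to the end-of-scan frame, the sensor at capture time still has to
    # rotate yaw_rate*dt and translate velocity*dt (constant-velocity model).
    ang = -yaw_rate * dt
    cos_a, sin_a = np.cos(ang), np.sin(ang)
    x = cos_a * points[:, 0] - sin_a * points[:, 1] - velocity[0] * dt
    y = sin_a * points[:, 0] + cos_a * points[:, 1] - velocity[1] * dt
    return np.stack([x, y], axis=1)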

We use a SLAM approach to create a map of the operating environment from the rectified point cloud data. Our mapping component is capable of running in real time, but typically we create the map prior to the deployment of the mobile robots to ensure high-quality maps. The resulting probabilistic, compact, and accurate environment representation is globally consistent, robust to outliers, and tolerant of dynamic objects and noisy sensor data.
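As one common example of a probabilistic map representation (an assumption made for illustration, not necessarily the representation we use), the sketch below updates an occupancy-grid cell in log-odds form. Repeated observations let static structure converge while noise and passing dynamic objects are averaged out, which is what makes such maps tolerant of outliers and dynamics.

import math

L_HIT  = math.log(0.7 / 0.3)    # log-odds increment when a beam endpoint hits a cell
L_MISS = math.log(0.35 / 0.65)  # log-odds decrement when a beam passes through a cell
L_MIN, L_MAX = -4.0, 4.0        # clamping keeps the map able to adapt to changes

def update_cell(log_odds, hit):
    """Update one grid cell's log-odds occupancy with a single observation."""
    log_odds += L_HIT if hit else L_MISS
    return max(L_MIN, min(L_MAX, log_odds))

def occupancy_probability(log_odds):
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

# A cell observed as occupied 8 times and free twice ends up confidently occupied.
l = 0.0
for hit in [True] * 8 + [False] * 2:
    l = update_cell(l, hit)
print(round(occupancy_probability(l), 3))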

To validate the map quality, we use a semi-automatic map validation tool. It automatically identifies potential problem areas and visualizes them to the user on a graphical user interface alongside the created maps. After visual inspection, the user can either confirm the map quality or edit the problematic map areas.
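One illustrative heuristic, not necessarily one of the checks our tool performs, is to flag map cells whose point density is suspiciously low as candidate problem areas for visual inspection:

import numpy as np

def flag_sparse_regions(points, cell_size=1.0, min_points=20):
    """Flag map cells whose point density is suspiciously low.

    points     : (N, >=2) array of map points; only x and y are used here
    cell_size  : edge length of a grid cell [m]
    min_points : cells with some data but fewer points than this are flagged
    Returns the corner coordinates of candidate problem cells.
    """
    points = np.asarray(points, float)
    cells = np.floor(points[:, :2] / cell_size).astype(int)
    uniq, counts = np.unique(cells, axis=0, return_counts=True)
    suspicious = uniq[counts < min_points]
    return [tuple(c * cell_size) for c in suspicious]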

After the map quality has been validated, we are ready to reliably estimate the mobile robot’s pose with our map-based positioning component, which uses the rectified point cloud data and the maps created prior to deployment. This probabilistic and efficient map-based positioning approach is based on a particle filter and provides smooth, accurate, and global 6D pose estimates in real time. The probabilistic approach is inherently robust to outliers and enables operation even in adverse weather conditions.
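A minimal sketch of one particle filter cycle is shown below, simplified to a planar 3-DoF pose for readability (the real estimate is 6D). The prediction step applies the ego-motion estimate with added noise, the update step weights each pose hypothesis by how well the rectified point cloud matches the map at that pose, and resampling concentrates particles on the likely poses. The function and parameter names are illustrative assumptions, not our actual interfaces.

import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, ego_motion, measure_likelihood,
                         motion_noise=(0.05, 0.05, 0.01)):
    """One predict/update/resample cycle of a particle filter for pose estimation.

    particles          : (N, 3) array of [x, y, heading] hypotheses
    weights            : (N,) importance weights, summing to one
    ego_motion         : (dx, dy, dheading) motion increment in the robot frame
    measure_likelihood : callable(particles) -> (N,) likelihood of the current
                         rectified point cloud given each hypothesised pose and the map
    """
    n = len(particles)
    # Predict: apply the ego-motion in each particle's own frame, plus noise.
    dx, dy, dth = ego_motion
    c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles = particles + np.stack(
        [c * dx - s * dy, s * dx + c * dy, np.full(n, dth)], axis=1)
    particles = particles + rng.normal(0.0, motion_noise, size=(n, 3))
    # Update: weight each particle by how well the scan matches the map at that pose.
    weights = weights * measure_likelihood(particles)
    weights = weights / weights.sum()
    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        positions = (np.arange(n) + rng.random()) / n
        idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights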

WHERE ARE THE MOBILE ROBOTS?

Our probabilistic positioning approach enables all-weather operations. Even though it is based on probabilities, it is accurate down to just a few centimeters. Moreover, this accuracy is repeatable, even in adverse weather conditions. With our technology and expertise, any mobile machine can be turned into a mobile robot. With our positioning solution, these robots can operate anywhere.

DR. JUHANA AHTIAINEN

D.Sc.(Tech.), Lead Robotics Engineer, Team Leader, Positioning and Mapping

Juhana Ahtiainen has 20+ years of robotics experience. He has been working with heavy mobile machinery since his early days at university. His master’s and doctoral theses focused on the practical use of our core algorithms in field and service robots, and he has put his magic into machines and robots for organisations ranging from the Finnish Defence Forces to the Australian Centre for Field Robotics.

Juhana leads our team of 10+ extremely talented robotics engineers who are solely dedicated to the research, development, and maintenance of our high-quality mapping and localization products. The team also supports our customer projects and partnership development programs.

