Simultaneous localization and mapping (SLAM) is one of the fundamental problems in autonomous robotics. The problem arises when robots navigate in unknown environments: in order to build a map, the robot must be localized accurately, yet accurate localization in turn requires a detailed map. Localization and mapping must therefore be carried out simultaneously.
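The coupling between localization and mapping can be illustrated with a toy 1D example (a hypothetical sketch, not the group's actual system): odometry alone drifts due to a systematic bias, but re-observing a previously mapped landmark lets the robot correct its pose estimate.

```python
# Toy 1D illustration of the SLAM chicken-and-egg problem (hypothetical
# example). Odometry alone drifts; a range measurement to a previously
# mapped landmark corrects the pose estimate.

def dead_reckoning(increments):
    """Integrate (biased) odometry increments into a pose estimate."""
    x = 0.0
    for u in increments:
        x += u
    return x

def correct_with_landmark(pose_est, landmark_pos, observed_range, weight=0.8):
    """Blend the odometry pose with the pose implied by a range
    measurement to a known landmark (simple weighted average)."""
    pose_from_landmark = landmark_pos - observed_range
    return weight * pose_from_landmark + (1.0 - weight) * pose_est

# Robot truly moves 1.0 m per step, but the odometer over-reads by 10 %.
odometry = [1.1] * 5                   # dead reckoning yields 5.5 m
true_pose = 5.0
landmark = 10.0                        # mapped earlier, assumed accurate
measured_range = landmark - true_pose  # exact range observation: 5.0 m

drifted = dead_reckoning(odometry)
corrected = correct_with_landmark(drifted, landmark, measured_range)
```

A real system estimates robot pose and landmark positions jointly (e.g. with a filter or pose graph); the weighted average here only stands in for that update.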
In many of our projects, we have rich data from 3D laser scanners available that provides precise geometric measurements. We develop SLAM systems for autonomous robots in both structured and unstructured environments and generate large, geo-referenced 3D maps that combine GPS, IMU, and other sensors. The map is created and maintained in real-time during online operation of our autonomous robot, so that it can be used by other algorithms such as path planning. If the robot approaches a previously visited region, it is able to detect changes in the environment, e.g. a parked car in a parking lot that was empty on the first visit. We also incorporate prior knowledge into our mapping procedures in order to create semantic maps of the environment.
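One simple way such revisit change detection could work is a grid-based comparison (an illustrative sketch; the cell size and representation are assumptions, not the group's implementation): cells occupied on the revisit but free on the first visit are flagged as changes, such as a newly parked car.

```python
# Sketch of grid-based change detection on revisits (illustrative).
# Each map is a set of occupied (ix, iy) grid cells.

CELL = 0.5  # grid resolution in metres (illustrative)

def to_cell(x, y):
    """Discretize a metric position into a grid cell index."""
    return (int(x // CELL), int(y // CELL))

def detect_changes(first_visit, revisit):
    """Return cells occupied in the revisit but free in the first map."""
    return revisit - first_visit

# First visit: empty parking lot except a wall along y = 0.
first = {to_cell(x * 0.5, 0.0) for x in range(10)}
# Revisit: the wall plus a parked car around (2.0, 3.0).
car = {to_cell(2.0 + dx * 0.5, 3.0) for dx in range(4)}
revisit = first | car

changes = detect_changes(first, revisit)
```

Symmetrically, `first_visit - revisit` would flag objects that disappeared since the first pass.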
We developed software to visualize the sensor data. Furthermore, algorithms have been developed to determine the drivability of the environment. Environment perception plays an important role for autonomous systems. In order to classify the environment, data from multiple sensors is fused. For proper data fusion, the sensors are first calibrated extrinsically with respect to the coordinate system of the 3D laser range finder. The surrounding terrain needs to be analyzed thoroughly in order to estimate the drivability of the terrain surface. We use a Markov Random Field to classify the environment in real-time based on the fused camera-lidar data.
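Applying such an extrinsic calibration can be sketched as follows (the rotation, translation, and intrinsics below are illustrative values, not calibrated ones): a lidar point is transformed into the camera frame and then projected with the pinhole model, so its pixel can be looked up for camera-lidar fusion.

```python
# Sketch of projecting a lidar point into an image using extrinsics
# (R, t: lidar frame -> camera frame) and pinhole intrinsics.

def project_point(p_lidar, R, t, fx, fy, cx, cy):
    """Transform a lidar point into the camera frame, then apply the
    pinhole model to obtain pixel coordinates (u, v)."""
    x = sum(R[0][j] * p_lidar[j] for j in range(3)) + t[0]
    y = sum(R[1][j] * p_lidar[j] for j in range(3)) + t[1]
    z = sum(R[2][j] * p_lidar[j] for j in range(3)) + t[2]
    if z <= 0:
        return None  # point behind the camera, not visible
    return (fx * x / z + cx, fy * y / z + cy)

# Identity extrinsics and a typical VGA pinhole camera (illustrative).
R_ident = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t_zero = (0.0, 0.0, 0.0)
uv = project_point((0.0, 0.0, 2.0), R_ident, t_zero, 500.0, 500.0, 320.0, 240.0)
```

A point on the optical axis lands at the principal point; points with non-positive depth are discarded before fusion.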
Navigation

Path planning in unstructured environments is a challenging task for autonomous systems. Robots have to react quickly to changes and deal with uncertainties and errors while respecting the terrain drivability in real-time. We developed a new kind of outdoor navigation, called spline templates, to solve this task for our robots. This also includes the creation of a global path planning map, to which the current sensor data can be compared and which can be used for long-term navigation.
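The template idea can be sketched under illustrative assumptions (quadratic candidates stand in for the actual splines, and the clearance cost is hypothetical): a small set of candidate paths with different lateral end offsets is sampled, each is scored against the terrain, and the candidate with the largest obstacle clearance is chosen.

```python
import math

# Sketch of template-based candidate selection (illustrative, not the
# group's spline-template method): sample candidate paths, score each
# against an obstacle, pick the one with the largest clearance.

def template(offset, length=4.0, n=9):
    """Sample a quadratic candidate path y(x) = offset * (x/length)^2."""
    pts = []
    for i in range(n):
        x = length * i / (n - 1)
        pts.append((x, offset * (x / length) ** 2))
    return pts

def clearance(path, obstacle):
    """Minimum distance from the sampled path to an obstacle point."""
    ox, oy = obstacle
    return min(math.hypot(x - ox, y - oy) for x, y in path)

def best_template(offsets, obstacle):
    """Pick the candidate with the largest clearance."""
    return max(offsets, key=lambda d: clearance(template(d), obstacle))

obstacle = (2.0, 0.3)  # slightly left of the straight-ahead path
chosen = best_template([-1.0, 0.0, 1.0], obstacle)
```

In a real planner the score would combine drivability from the terrain classification, path curvature, and distance to the goal rather than a single obstacle point.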
Another objective of our research group is the recognition and processing of dynamic changes in the environment of autonomous vehicles. Dynamic objects, such as vehicles or pedestrians, require an enhanced approach: their movement needs to be detected early and reliably, and tracked over time. This way, collisions with the robot's own trajectory can be detected and avoided.
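A minimal sketch of this idea, assuming a constant-velocity motion model (a real tracker would typically use e.g. a Kalman filter; all names and thresholds here are illustrative): the object's future positions are extrapolated from two detections and checked against the ego vehicle's planned trajectory step by step.

```python
# Sketch: constant-velocity prediction of a tracked object and a
# step-wise collision check against the ego trajectory (illustrative).

def predict_track(p0, p1, dt, horizon_steps):
    """Extrapolate from two detections dt apart, constant velocity."""
    vx = (p1[0] - p0[0]) / dt
    vy = (p1[1] - p0[1]) / dt
    return [(p1[0] + vx * k * dt, p1[1] + vy * k * dt)
            for k in range(1, horizon_steps + 1)]

def collides(ego_traj, obj_traj, radius=0.5):
    """Flag a collision if ego and object come too close at the same step."""
    for (ex, ey), (ox, oy) in zip(ego_traj, obj_traj):
        if (ex - ox) ** 2 + (ey - oy) ** 2 <= radius ** 2:
            return True
    return False

dt = 0.5
# Ego drives diagonally; the object approaches head-on.
ego = [(0.5 + k * 0.5, 0.5 + k * 0.5) for k in range(10)]
obj = predict_track((5.5, 5.5), (5.0, 5.0), dt, 10)
hit = collides(ego, obj)
```

If a predicted collision is flagged, the planner can re-score its candidate paths and pick an evasive one before the conflict occurs.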