Robot Localization and Mapping

A Simultaneous Localization and Mapping (SLAM) program for a human-sized robot.

Client
Michigan Robotics Lab
Project Type
Software Engineering
Date
May 2017
 - 
Aug 2017
Services
Programming, Mathematics

Simultaneous Localization and Mapping with a FETCH Robot

My assignment for my independent summer research internship in the Michigan Robotics Lab was to create software that could map the environment around a human-sized robot and track the robot's position within that map using data from its wheels and a spinning infrared rangefinder. This goal, commonly known as Simultaneous Localization And Mapping (SLAM), is a challenging problem in robotics.

Two person-sized rolling robots with one long arm each.

SLAM is difficult because there is no easy way to establish a locational 'anchor' for either the robot or the map of nearby walls. New infrared scan data can only be placed on the map relative to where the robot currently is. However, the robot's current location cannot be determined solely from the odometry (the sensors in the wheels that measure how far it has rolled), because wheel slips and bumps introduce error. Therefore, to solve the SLAM problem, the robot must correct its odometry by aligning each new scan with the map it has already built.

In short, the map is created relative to the robot's position, and the robot's position is verified relative to the map. This feedback loop keeps both the map and the position estimate accurate over time.
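To see why raw odometry alone is not enough, here is a small illustrative sketch (in Python, not the project's actual C# code) of how a tiny, unmodeled heading error in each wheel-odometry step compounds into a large position error over distance. The pose representation and the 0.02 rad slip value are assumptions chosen for the example:

```python
import math

def compose(pose, delta):
    """Apply an odometry increment (dx, dy, dtheta), given in the
    robot's own frame, to a world-frame pose (x, y, theta)."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Drive "straight" for 10 one-meter steps, but each step carries a
# small unmodeled heading error (e.g. from a wheel slip).
ideal = noisy = (0.0, 0.0, 0.0)
for _ in range(10):
    ideal = compose(ideal, (1.0, 0.0, 0.0))
    noisy = compose(noisy, (1.0, 0.0, 0.02))  # 0.02 rad of slip per step

print(ideal[:2])  # the intended endpoint: (10.0, 0.0)
print(noisy[:2])  # the drifted estimate: the error grows with distance
```

The lateral error here is already close to a meter after ten meters of travel, which is why the map itself must be used to correct the pose.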

My Utility

My solution to this problem was a map visualization program written in C# using the Unity game engine. The program connected directly to the torrent of data from the FETCH robot through a Linux data-streaming server, and used linear algebra to perform 3D transformations of the incoming infrared rangefinder readings from the rapidly spinning sensor. After processing, the data became a series of points outlining nearby walls and surfaces, which was rendered as a map.

Screenshot of a blue sphere moving down a hallway map made out of tiny red dots

The odometry of the robot was adjusted using this map to ensure that a small wheel slip would not make the next map scan inaccurate. To achieve this, I implemented a well-established algorithm called Iterative Closest Point (ICP), which compared the data points of the latest scan to the existing map and adjusted the robot's estimated position and angle accordingly.
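A minimal 2D version of ICP can be sketched as follows, again in Python rather than the project's C#: repeatedly match each scan point to its nearest map point, then solve the closed-form least-squares rotation and translation for those pairs. This sketch uses brute-force nearest-neighbor search and omits the outlier rejection a real implementation needs:

```python
import math

def icp_2d(source, target, iterations=20):
    """Align `source` points to `target` points by iterating:
    1) match each source point to its nearest target point,
    2) apply the least-squares rigid transform for those pairs.
    Returns the transformed source points."""
    pts = list(source)
    for _ in range(iterations):
        pairs = [(p, min(target, key=lambda q: (q[0]-p[0])**2 + (q[1]-p[1])**2))
                 for p in pts]
        n = len(pairs)
        # Centroids of the matched source and target sets.
        mx = sum(p[0] for p, _ in pairs) / n
        my = sum(p[1] for p, _ in pairs) / n
        nx = sum(q[0] for _, q in pairs) / n
        ny = sum(q[1] for _, q in pairs) / n
        # Closed-form 2D rotation from the cross-covariance terms.
        sxx = sum((p[0]-mx)*(q[0]-nx) + (p[1]-my)*(q[1]-ny) for p, q in pairs)
        sxy = sum((p[0]-mx)*(q[1]-ny) - (p[1]-my)*(q[0]-nx) for p, q in pairs)
        th = math.atan2(sxy, sxx)
        c, s = math.cos(th), math.sin(th)
        # Rotate about the source centroid, then move onto the target centroid.
        pts = [(c*(px-mx) - s*(py-my) + nx, s*(px-mx) + c*(py-my) + ny)
               for px, py in pts]
    return pts

# A scan that drifted by (0.3, -0.2) relative to the stored map points.
target = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (0.0, 1.0)]
source = [(x + 0.3, y - 0.2) for x, y in target]
aligned = icp_2d(source, target)
```

The recovered transform is exactly the correction applied to the robot's pose: the same rotation and translation that aligns the scan with the map also fixes the odometry estimate.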

A scientific diagram showing the gradual alignment of two curves using the Iterative Closest Point algorithm.
The alignment process of the Iterative Closest Point algorithm (Diagram: Smistad et al.)

Lessons Learned From Robotics

Working on SLAM was a valuable experience for me because there is no perfect solution. This project gave me the chance to continually tune the algorithm to achieve better results over the course of the summer. Along the way, I solved a wide range of programming challenges, including but not limited to:

  • Connecting to a high-volume, high-frequency JSON data stream from the robot
  • Keeping odometry and LIDAR data in perfect synchronization
  • Rendering thousands of data points without destroying performance
  • Recovering from network lag spikes
  • Preventing the gradual buildup of accuracy drift
  • Handling scans of reflective and translucent surfaces
  • Optimizing the computation loop to process more data and improve fidelity
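As one example, the odometry/LIDAR synchronization challenge comes down to the two sensors reporting at different times: each scan must be paired with the pose the robot actually had at the scan's timestamp. A hedged Python sketch of one common approach, linear interpolation between the surrounding odometry samples (heading omitted for brevity; the sample values are made up):

```python
def interpolate_pose(odom, t):
    """Estimate the robot position at scan timestamp t by linearly
    interpolating between the two surrounding odometry samples.

    odom: list of (timestamp, x, y) tuples sorted by timestamp.
    (A real implementation also interpolates heading, taking the
    shortest arc between the two angles.)"""
    for (t0, x0, y0), (t1, x1, y1) in zip(odom, odom[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return (x0 + u * (x1 - x0), y0 + u * (y1 - y0))
    raise ValueError("timestamp outside the recorded odometry window")

samples = [(0.00, 0.0, 0.0), (0.10, 0.2, 0.0), (0.20, 0.4, 0.1)]
pose = interpolate_pose(samples, 0.15)
print(pose)  # → (0.3, 0.05), halfway between the last two samples
```

Without this pairing, a scan taken mid-motion gets projected from a stale pose, smearing the resulting map.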

This experience with solving a difficult engineering and optimization problem has equipped me to handle a host of subsequent challenges I have encountered as a designer and programmer.

Screenshot of a partially-constructed map of a cluttered room. The robot navigates through smoothly.
Exploring the clutter of the University of Michigan Robotics Lab