
Lab 11: Localization (Real)

Lab Objective

In Lab 10, I demonstrated the effectiveness of the Bayes filter in simulation. In this lab, I will perform localization with the Bayes filter on the actual robot. Only the update step of the filter will be used: the robot's motion is very noisy, so the prediction step would not be helpful.

Bayes Filter Simulation

Three files were provided to us to help accomplish this lab: lab11_sim.ipynb, lab11_real.ipynb, and localization_extras.py.

Our first task was to test localization in simulation again, this time using the provided code rather than our own implementation. I ran the simulation notebook, and the result is shown below:

Simulation result

Bayes Filter (Real)

We were given an optimized Bayes filter implementation in Python to handle the localization itself. The primary task on the robot side was to rotate precisely on its axis while gathering ToF readings. This is similar to Lab 9, where the focus was mapping the environment, and since my PID controller from that lab was already tuned for rotating and gathering readings, I reused that code.

We had to implement this within the perform_observation_loop() function, which is part of the RealRobot() class. When this function is called, the robot rotates a full 360° and returns NumPy arrays of the ToF distance values and their corresponding angular positions. I modified my notification handler from previous labs to interpret the Bluetooth values. Note that I also used asyncio so that the script waits for the robot to finish its spin before proceeding.

Code implementation
Notification handler
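
Sketched below is roughly how these pieces fit together. This is a minimal sketch, not the provided API: the CMD.PERFORM_OBSERVATION command name, the ble object from the course Bluetooth library, and the lists my notification handler appends to are stand-ins for my actual setup.

```python
import asyncio
import numpy as np

class RealRobot:
    # ... other provided members omitted ...

    async def perform_observation_loop(self, rot_vel=120):
        """Spin 360 degrees in 20-degree steps and return the 18 ToF readings."""
        self.tof_readings = []   # mm, appended by my notification handler
        self.angles = []         # deg, appended alongside each reading

        # Kick off the on-robot PID spin-and-measure routine (Lab 9 code);
        # CMD.PERFORM_OBSERVATION is a placeholder for my own command name
        ble.send_command(CMD.PERFORM_OBSERVATION, "")

        # Wait until the handler has received all 18 (distance, angle) pairs
        while len(self.tof_readings) < 18:
            await asyncio.sleep(2)

        # The localization code expects column vectors, distances in meters
        sensor_ranges = np.array(self.tof_readings)[np.newaxis].T / 1000.0
        sensor_bearings = np.array(self.angles)[np.newaxis].T
        return sensor_ranges, sensor_bearings
```

Awaiting inside the loop rather than blocking lets the Bluetooth notifications keep arriving while the notebook waits for the spin to finish.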

This collects 18 ToF readings, starting with the robot at 0° and rotating in 20° increments up to 340°. In Lab 9 I collected as many data points as possible, but the world.yaml helper file expects exactly 18 data points, so I modified that part for this task. Rotating the robot exactly 20° and stopping to take each distance measurement before rotating again made the data collection very precise, which is important for accurate localization. Running a localization pass from the notebook is then a single update step, as sketched below.
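
A sketch of one localization pass, following the structure of the provided lab11_real.ipynb; the feet-to-meters conversion for plotting ground truth is my own addition.

```python
# loc (localization) and cmdr (plotter) come from the provided setup cells
loc.init_grid_beliefs()        # reset to a uniform prior over the grid

# Spin in place and collect the 18 ToF readings; awaited because
# perform_observation_loop() above is a coroutine (drop the await if
# your version is synchronous)
await loc.get_observation_data()

# Bayes update with the new readings, then report/plot the belief
loc.update_step()
loc.plot_update_step_data(plot_data=True)

# Plot the known pose for comparison: the marked poses are in feet,
# but the plotter works in meters
FT_TO_M = 0.3048
cmdr.plot_gt(-3 * FT_TO_M, -2 * FT_TO_M)
```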

Results

My results indicate that the robot has very high confidence in its updated position: the belief probability from the update step is nearly 1 at every test location. The beliefs from the update step also match the ground truth well. Every run recovered the x coordinate exactly, though some locations were off by one foot in the y direction. In the plots below, the blue point is the robot's belief and the red point is the ground truth. Note also that the robot believes it is offset by around 10°, which is accurate, since my robot still rotates about 10° too far. I intend to fix this before Lab 12 in order to navigate through the map successfully.

Location: (-3, -2)

Location -3,-2 photo
Location -3,-2 data

Location: (5, -3)

Location 5,-3 photo
Location 5,-3 data

Location: (5, 3)

Location 5,3 photo
Location 5,3 data

Location: (0, 3)

Location 0,3 photo
Location 0,3 data

Location: (0, 0)

Location 0,0 photo
Location 0,0 data