For the final project in Intro to Robotics & Mechatronics, we were tasked with building pairs of robots which "waltzed" by moving in circles around each other while also following a circle of tape on the floor and avoiding other pairs of dancers. The robots were to begin waltzing when they recognized that music was playing. For this project, my partner, Ronan, and I used three different microcontrollers: the Arduino-based Wio Terminal, the LEGO Spike Prime, and the Raspberry Pi 4. View it on GitHub here.
This project required implementation of several complex features, including music recognition, partner detection, navigation, and obstacle avoidance. While my partner took the lead on music recognition and specifying the driving radius, I developed algorithms to detect the tape on the floor, navigate along it, and keep the partners together without drifting apart.
To navigate along the line of tape, I came up with an algorithm in which the robots varied their driving speeds and radii to shift the center of their rotation about one another. This method relied on recognizing when each robot crossed the line of tape. To detect the tape, I used a light sensor and wrote code that used the K-Nearest-Neighbors algorithm, then trained this model to distinguish between the tape and the floor.
To keep the robots continually spinning around each other without drifting apart, I used computer vision to track a feature on one of the robots and wrote a proportional controller to keep my robot's partner in the center of its field of view. For obstacle avoidance, Ronan and I added ultrasonic sensors to our robots so they could change course if an object was in the way. Ultimately, our pair of "dancers" successfully completed all of the objectives.
In the first week of the project, our main goals were to develop the music recognition capability and get the LEGO-based robot to drive in circles while avoiding obstacles. Since the Wio has a built-in microphone, we used it for audio processing. With the Wio's microphone we were able to differentiate between music and silence, but we ran into a problem: when the LEGO motors were spinning, the PWM signal created a sound which the Wio interpreted as music. Since we could only analyze the overall amplitude of the sound, not its frequency content, we couldn't distinguish music from motor noise, which made this a hard problem to work around. To get more processing power for this problem, we decided to move our music recognition to the Raspberry Pi.
During this phase of the project, I also wrote code which received music recognition signals over serial from the Wio, and began driving if these signals indicated that music was playing. To add obstacle avoidance, I used an ultrasonic sensor and wrote a proportional controller to slow down the robot based on how close an obstacle was. This method worked fairly well for large obstacles, but the LEGO ultrasonic sensor was too noisy for it to be reliable.
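The slowdown logic itself was simple. Here is a minimal sketch of that kind of proportional throttle, assuming a hypothetical distance reading in centimeters and placeholder speeds and thresholds rather than our exact values:

```python
# Illustrative proportional slow-down based on obstacle distance (names and
# numbers are placeholders, not our exact Spike Prime code).
BASE_SPEED = 40       # nominal drive speed (percent)
STOP_DIST_CM = 10     # distance at which the robot should be fully stopped
SLOW_DIST_CM = 50     # distance at which slowing begins
KP = BASE_SPEED / (SLOW_DIST_CM - STOP_DIST_CM)

def throttled_speed(distance_cm):
    """Scale the drive speed down proportionally as an obstacle gets closer."""
    if distance_cm >= SLOW_DIST_CM:
        return BASE_SPEED
    if distance_cm <= STOP_DIST_CM:
        return 0
    return KP * (distance_cm - STOP_DIST_CM)
```

With a clean distance signal this gives a smooth stop in front of an obstacle, but the noisy readings from the LEGO ultrasonic sensor made the output speed jumpy in practice.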
A prototype of the robot using the Wio Terminal to recognize music and begin driving.
In the second week of the project, we moved audio processing onto the Raspberry Pi using a USB microphone. Ronan wrote a script based on this tutorial which recorded a chunk of sound pressure levels and computed a fast Fourier transform (FFT) using numpy. From the FFT data, we could find peak frequencies and compare them to the main frequencies generated by the motors. If there were loud frequencies that couldn’t be attributed to motor noise, we sent a ‘1’ over a serial connection to the Spike Prime; otherwise, we sent a ‘0’. On the LEGO Spike Prime, I wrote code which used a queue to hold the last 10 binary values received from the Pi. If any of these were a ‘1’, the robot would begin to drive.
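As a rough illustration of that pipeline, here is a simplified sketch of the Pi-side check. The audio library, serial port, motor frequencies, and thresholds shown are placeholders rather than our exact script:

```python
# Simplified sketch of the Pi-side music detection (library choices, port name,
# frequencies, and thresholds are illustrative).
import numpy as np
import sounddevice as sd
import serial

FS = 44100                    # sample rate (Hz)
CHUNK_SECONDS = 0.5
MOTOR_FREQS = [1000.0]        # approximate motor PWM frequencies (placeholder)
TOLERANCE_HZ = 50.0
LOUDNESS_THRESHOLD = 1e3      # tuned empirically

link = serial.Serial('/dev/ttyACM0', baudrate=115200)  # placeholder port

while True:
    # Record a short chunk of audio and compute its FFT magnitude spectrum.
    chunk = sd.rec(int(FS * CHUNK_SECONDS), samplerate=FS, channels=1)
    sd.wait()
    spectrum = np.abs(np.fft.rfft(chunk[:, 0]))
    freqs = np.fft.rfftfreq(len(chunk), d=1.0 / FS)

    # Mask out bins near the known motor frequencies, then look for loud peaks.
    motor_mask = np.zeros_like(freqs, dtype=bool)
    for f in MOTOR_FREQS:
        motor_mask |= np.abs(freqs - f) < TOLERANCE_HZ
    music_detected = spectrum[~motor_mask].max() > LOUDNESS_THRESHOLD

    link.write(b'1' if music_detected else b'0')
```

On the Spike Prime side, keeping a queue of the last 10 received characters meant that a brief quiet stretch or a dropped chunk didn't immediately stop the dance.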
Next, we tackled the problem of making the robots “waltz”. The robots needed to spin around each other while following a line of tape around the classroom floor. To solve this, I came up with a navigation algorithm. If the robots drove around each other at equal radius and velocity, the center of the circle they followed would be directly between them. However, if they both maintained the same angular velocity but one of them had a larger radius (and thus a higher linear velocity), the center would be offset towards the slower robot. This way, we could move the center of the circle in whatever direction we chose. To follow the line of tape on the floor, one robot would start fast while the other started slow; each would drive in a semicircle until it detected that it had run over the line of tape, at which point the two would switch speeds. This continually offset the center of the circle in the direction we wanted, causing the pair to slowly drift along the line of tape while revolving around each other.
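The per-robot logic boils down to swapping between an "outer" role (faster, larger radius) and an "inner" role (slower, smaller radius) every time the tape is crossed. A minimal sketch of that idea, with hypothetical helper names and placeholder numbers chosen so both roles share the same angular velocity:

```python
# Illustrative sketch of the role-swapping waltz logic (helper names and
# numbers are hypothetical, not our exact Spike Prime code).
OUTER = {"speed": 60, "radius_cm": 30}   # faster robot on the larger arc
INNER = {"speed": 30, "radius_cm": 15}   # slower robot on the smaller arc

def waltz_step(robot, role):
    """Drive one arc segment; swap roles whenever the tape is crossed."""
    robot.drive_arc(role["speed"], role["radius_cm"])   # hypothetical drive helper
    if robot.sees_tape():                               # tape check (sketched below)
        role = INNER if role is OUTER else OUTER        # the partner swaps the opposite way
    return role
```

Because the two robots always swap in opposite directions, the midpoint of the pair keeps nudging toward the side of the tape crossing, which is what walks the whole circle along the line.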
The first step in implementing this algorithm was for the robot to consistently recognize when it drove over the tape. To do this, I used a light sensor which measured how much light was reflected off the surface below it. Since the tape was more reflective than the floor, the sensor could detect a difference. I used the K-Nearest-Neighbors machine learning algorithm to train the robot on the two cases and had it switch speeds when it recognized tape. This technique worked very consistently. To hold a constant radius between the robots, we decided to physically connect them. This ultimately caused more problems than it fixed, because the robots would occasionally pull each other over.
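Since the sensor only produces a single reflectance value, the classifier can stay very small. A minimal K-Nearest-Neighbors sketch along those lines, with made-up training readings rather than our recorded data:

```python
# Minimal 1-D k-nearest-neighbors sketch for tape vs. floor (training values
# are illustrative, not our actual sensor readings).
TRAINING_DATA = [
    (92, "tape"), (88, "tape"), (95, "tape"),     # reflected-light readings over tape
    (34, "floor"), (41, "floor"), (37, "floor"),  # readings over the bare floor
]
K = 3

def classify(reading):
    """Label a new reflectance reading by majority vote of its K nearest neighbors."""
    neighbors = sorted(TRAINING_DATA, key=lambda sample: abs(sample[0] - reading))[:K]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)
```

Training amounted to driving the robot over each surface a few times and recording the readings, after which the classifier was reliable enough to trigger the speed swap every lap.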
The robot recognizing a line of tape on the floor and adjusting its speed.
Physically connecting the robots caused them to lose traction with the floor.
The first decision we made in the third week of the project was to disconnect the robots from one another. This meant we would have to find a new way to ensure the robots stayed a constant distance from each other as they rotated. We began by fine-tuning a "fixed radius" algorithm Ronan had written early on. We ran motor tests to characterize how the PWM signals corresponded to wheel velocity, and used this to hardcode the robots' driving radius. Using this method, we were able to get the robots to drive at a set radius, but over time they would drift apart or crash into each other.
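The relationship we were trying to pin down is straightforward for a differential-drive layout: for a target turning radius measured at the robot's center, the inner wheel travels a proportionally shorter arc and the outer wheel a longer one. A small sketch of that calculation, with a placeholder wheel separation; the motor characterization then mapped these speeds back to PWM values:

```python
# Illustrative wheel-speed calculation for a fixed driving radius
# (dimensions are placeholders, not our measured robot geometry).
WHEEL_SEPARATION_CM = 12.0

def wheel_speeds(radius_cm, center_speed):
    """Return (inner, outer) wheel speeds to drive a circle of the given radius."""
    half = WHEEL_SEPARATION_CM / 2.0
    inner = center_speed * (radius_cm - half) / radius_cm
    outer = center_speed * (radius_cm + half) / radius_cm
    return inner, outer
```

The trouble was that any small mismatch between the commanded and actual wheel velocities accumulated over many revolutions, which is why the hardcoded radius drifted.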
The robots drifting out of alignment with a hardcoded driving radius.
Using computer vision to track a green sticker.
To fix the drifting problem, I used a Raspberry Pi camera and computer vision to keep Ronan’s robot in the center of my robot’s field of view. This way, my robot would always be moving tangent to the circle, so the two would not drift towards or away from each other. To do this, I wrote code based on this blog which used OpenCV to track a green sticker on Ronan’s robot. I then wrote a proportional controller which changed my robot’s wheel speeds based on how far the sticker was from the center of the frame. After some tuning of the control gains, this method worked extremely well and let our robots spin around each other indefinitely without ever crashing.
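A condensed sketch of that tracking loop is below. The HSV bounds, gain, and the `send_speeds` helper are placeholders standing in for our tuned values and the code that forwarded speed commands to the Spike Prime:

```python
# Condensed sketch of green-sticker tracking plus a proportional controller
# (HSV bounds, gain, and send_speeds are placeholders).
import cv2
import numpy as np

LOWER_GREEN = np.array([40, 80, 80])     # HSV lower bound for the sticker
UPPER_GREEN = np.array([80, 255, 255])   # HSV upper bound
KP = 0.1                                 # proportional gain (pixels -> speed offset)
BASE_SPEED = 40

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Track the largest green blob and steer to keep it horizontally centered.
        c = max(contours, key=cv2.contourArea)
        (x, _), _ = cv2.minEnclosingCircle(c)
        error = x - frame.shape[1] / 2          # horizontal offset from frame center
        correction = KP * error
        send_speeds(BASE_SPEED + correction, BASE_SPEED - correction)  # hypothetical helper
```

Keeping the partner centered in the frame is what guarantees the tangent heading: any inward or outward drift shows up immediately as a horizontal pixel error, and the controller steers it back out.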
At the final waltz, all of our subsystems integrated quite well. Our robots successfully recognized music and followed the line of tape while rotating around each other at a constant radius. If we had had more time, we would have liked to keep working on ways to avoid collisions with other robot pairs. Ultimately, we were really happy with how consistently our dancers performed.
The robots waltzing at the final demonstration.