Here's a sped-up video of our robot dancing to some music.
(Look out for familiar dance moves like the ~wave~ and the whip)



Objective

This project uses a Raspberry Pi to control a robot that dances in response to music played in its environment. The audio is captured through a microphone and processed in real time to detect the BPM (beats per minute) and the times of the beats of the song. These are then used to perform pre-programmed dance moves on the beat of the music. For entertainment value, the robot is programmed to perform robotic imitations of popular dance moves.


Introduction

For Snowball, we created a robot with the Raspberry Pi that dances to the beat of music. We implemented the beat detection by importing a Python library called librosa, but everything else was written by us in Python. We used the Python threading module to implement basic threading, which allowed us to capture audio from the microphone while simultaneously calculating the BPM and performing dance moves. The main code threads between microphone pickup, beat detection, and dance move calculations. The robot ‘dances’ by moving either or both of its two arms, each of which consists of 3 servos. The servos were connected by 3D printed servo mounts, which gave the assembly the visual appearance of an ‘arm’. The robot was able to manipulate these servos and do the robot, the whip, the nae nae, and the wave!
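At a high level, the threading structure looked something like the sketch below. This is a simplified illustration only: the function names and sleep intervals are placeholders, not our actual code (the full listing is linked in the code appendix).

    import threading
    import time

    def capture_audio():
        """Continuously read samples from the USB microphone into a shared buffer."""
        while True:
            time.sleep(0.01)   # placeholder for a non-blocking microphone read

    def detect_beats():
        """Periodically run beat detection over the most recent audio."""
        while True:
            time.sleep(1.0)    # placeholder for a librosa beat-tracking pass

    def dance():
        """Wait for the next beat and execute the next dance move."""
        while True:
            time.sleep(0.5)    # placeholder for the beat-synchronized servo moves

    # Each task runs in its own daemon thread so all three execute concurrently.
    for task in (capture_audio, detect_beats, dance):
        threading.Thread(target=task, daemon=True).start()

    while True:                # keep the main thread alive
        time.sleep(1)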

Our completed robot!


Design and Testing

To accomplish this much larger project, we decided the best approach was to tackle smaller projects individually and combine them at the end. The high-level tasks, which are expanded upon below, are as follows:

  1. Microphone Audio Processing
  2. Dance Implementation
  3. Physical Construction

Microphone Audio Processing

In order to do real-time audio acquisition and processing, we needed to run non-blocking calls for microphone sample reading in a thread. To do this, we used alsaaudio’s pulse code modulation (PCM) interface and read data from our USB microphone 1000 samples at a time. At first, we used pyaudio’s acquisition code, but we ended up switching over to alsaaudio due to its better non-blocking calls. We also bought a Blue Snowball iCE microphone, which was able to record with much higher quality than the cheaper microphone that we found in lab. Beat estimation is a bit complex to do in real time; there are many algorithms, and most use the spectrogram of the sound and analyze it to detect patterns in specific frequencies. Librosa’s librosa.beat.beat_track uses an algorithm from the following paper:
Ellis, Daniel PW. “Beat tracking by dynamic programming.” Journal of New Music Research 36.1 (2007): 51-60. http://labrosa.ee.columbia.edu/projects/beattrack/
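As a rough illustration of what the non-blocking capture loop looks like, here is a simplified sketch using pyalsaaudio. The sample rate, buffer sizes, and default capture device here are assumptions, and depending on the pyalsaaudio version the parameters may need to be passed to the PCM constructor instead of set with the set* methods.

    import time
    import alsaaudio
    import numpy as np

    RATE = 44100      # assumed sample rate for this sketch
    PERIOD = 1000     # read roughly 1000 samples at a time, as described above

    # Open the USB microphone for non-blocking capture so reads never stall the thread.
    mic = alsaaudio.PCM(alsaaudio.PCM_CAPTURE, alsaaudio.PCM_NONBLOCK)
    mic.setchannels(1)
    mic.setrate(RATE)
    mic.setformat(alsaaudio.PCM_FORMAT_S16_LE)
    mic.setperiodsize(PERIOD)

    samples = []
    while len(samples) < RATE * 5:       # collect about 5 seconds of audio
        length, data = mic.read()        # returns immediately; length may be 0
        if length > 0:
            samples.extend(np.frombuffer(data, dtype=np.int16))
        else:
            time.sleep(0.001)            # nothing available yet; yield briefly

    # Normalize to floats so the array can be handed to librosa for beat tracking.
    audio = np.array(samples, dtype=np.float32) / 32768.0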

This graph displays how unreliable the lab's mic was when we played audio with a clear, distinct, regular drum beat.
This graph displays how much clearer the input from the iCE microphone was in silence and how drastic the change was when we tapped on the microphone.

The newer microphone was less choppy, more sensitive, and had less noise than the lab’s microphone.
Librosa’s librosa.beat.beat_track returns two things: a beats-per-minute estimate and an estimate of the times in the sound array at which it thinks the beats occur. We store both and use both to help the robot dance. Each new BPM estimate is stored along with the last 10 estimates, and the mode of all of them is used to produce the most accurate estimate. If two BPMs are tied for the mode, then the mean of the two is used. The times of the beats are used to make the robot dance on beat, which we explain in the next section.
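A condensed sketch of how beat_track can be combined with that mode-based smoothing is shown below (simplified from our code; the variable names are illustrative):

    from collections import Counter, deque
    import librosa
    import numpy as np

    RATE = 44100                      # assumed sample rate of the captured audio
    recent_bpms = deque(maxlen=10)    # the last 10 BPM estimates

    def estimate_beats(audio):
        """Return a smoothed BPM and the beat times (in seconds) within `audio`."""
        tempo, beat_frames = librosa.beat.beat_track(y=audio, sr=RATE)
        beat_times = librosa.frames_to_time(beat_frames, sr=RATE)

        # Smooth the BPM: take the mode of the last 10 estimates,
        # and if several values are tied for the mode, use their mean.
        recent_bpms.append(int(round(float(tempo))))
        counts = Counter(recent_bpms)
        best = max(counts.values())
        tied = [bpm for bpm, count in counts.items() if count == best]
        return float(np.mean(tied)), beat_times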


Dance Implementation

The dance moves were implemented with code that set the position of our standard servos. It helped that they were not continuous-rotation servos, so the orientation of each servo was effectively tracked automatically and each servo could move to the same location consistently. We put together a variety of functions to implement the dance moves. First, there were low-level functions to move individual servos. Then there was a function to wait for the next beat: given the time at which the beats were computed and the time of the last beat, it computes the time of the next beat closest to the current time. Another function takes a list of moves and executes one move on each beat. And lastly, a function generates the list of moves given an input array. The easiest way to demonstrate how the amount of time to wait is computed is to show the code:
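Below is a simplified sketch of that computation (the parameter names are illustrative placeholders and the drift threshold is an assumption; the full listing is in the code appendix):

    import time

    def seconds_to_wait(space, bpm, beat_times, last_read_time, array_duration):
        """How long to sleep so the next move lands on a beat.

        space          -- how many beats ahead we want to move (e.g. 1)
        bpm            -- current smoothed BPM estimate
        beat_times     -- beat offsets (seconds) within the last analyzed audio array
        last_read_time -- time.time() when the last microphone read finished
        array_duration -- length (seconds) of the analyzed audio array
        """
        beat_period = 60.0 / bpm
        now = time.time()

        # Absolute time of the last beat we know about: the end of the last read,
        # pulled back to the start of the array, then forward to the last beat in it.
        last_beat = last_read_time - array_duration + beat_times[-1]

        # The time we would wait if we trusted only the BPM estimate.
        time_to_wait = now + space * beat_period

        # Snap that target to an integer number of beats after the last known beat.
        n_beats = round((time_to_wait - last_beat) / beat_period)
        target = last_beat + n_beats * beat_period

        # If the snapped target drifts too far from the plain BPM estimate,
        # throw it out and just wait `space` beats from now instead.
        if abs(target - time_to_wait) > 0.5 * beat_period:   # illustrative threshold
            target = time_to_wait

        return max(0.0, target - now)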


To explain this further: time.time() returns an absolute time, which is some large number, so the only way to make sense of it is to compare different calls to time.time(). To start, we compute the time at which the last beat we know about occurred: the time of the last read from the microphone, minus the duration of the array (to get us back to its beginning), plus the offset of the last beat within the array. We then get the distance between that last beat and the time we were originally going to wait until, “timeToWait”, which is the current time plus “space” number of beats; this is the time we would have waited if we only used the BPM estimate. Then we compute a time that is an integer number of beats after the last beat and is the closest such time to “space” beats away from the current time. Later on in the code, if this time estimate is too far from “space” beats away from the current time, we throw out the estimate and go with “space” beats away from the current time instead.
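For completeness, here is a sketch of how the higher-level pieces fit together: moving an individual servo and stepping through a list of moves one beat at a time. The GPIO pins and the duty-cycle mapping are assumptions for a typical 9g hobby servo driven by software PWM through RPi.GPIO, not necessarily what our driver code does, and seconds_to_wait refers to the sketch above.

    import time
    import RPi.GPIO as GPIO

    SERVO_PINS = [5, 6, 13]        # illustrative BCM pins for one arm's three servos
    GPIO.setmode(GPIO.BCM)

    pwms = []
    for pin in SERVO_PINS:
        GPIO.setup(pin, GPIO.OUT)
        pwm = GPIO.PWM(pin, 50)    # hobby servos expect a 50 Hz control signal
        pwm.start(0)
        pwms.append(pwm)

    def set_servo(index, angle):
        """Move one servo to an absolute angle between 0 and 180 degrees."""
        duty = 2.5 + (angle / 180.0) * 10.0   # map angle to roughly 2.5-12.5% duty
        pwms[index].ChangeDutyCycle(duty)

    def dance_on_beats(moves, space, bpm, beat_times, last_read_time, array_duration):
        """Execute one move (a list of (servo, angle) pairs) per beat."""
        for move in moves:
            time.sleep(seconds_to_wait(space, bpm, beat_times,
                                       last_read_time, array_duration))
            for servo_index, angle in move:
                set_servo(servo_index, angle)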


Physical Construction

For the physical construction of our robot, we had originally wanted it to be more of a cylindrical shape; however, since we decided to laser cut the pieces of the body, a rectangular prism was an easier shape to build. The robot body’s walls were constructed out of laser-cut acrylic, ⅛ inch thick. It was red acrylic, for style. We designed the pieces in Adobe Illustrator and used a laser cutter in one of the ECE labs, which also pushed us to learn how to use a laser cutter, which was very cool. The top of the rectangular prism was made with a hole so that the stand for the microphone could fit through and the microphone could rest on top of our robot. The robot body was assembled using fast-drying super glue and tape. The arms of Snowball were 3D printed from CAD drawings modified in Autodesk Inventor. Since the servos we used were basic 9g micro servos, a very common hobby servo, there were many resources available to us online. We found a 9g micro servo arm design on Thingiverse (see references), which we then modified a bit for our needs: the design was thickened overall to suit our 3D printer, extraneous parts like the gripper and the spring mechanism were removed, and everything was scaled to our specifications.
This was the result of one of our initial arm 3D print tests:

Once the arms were constructed and securely mounted on the body, we completed our wiring. A small breadboard was located inside the body to route the wires from the servos to the Raspberry Pi. The microphone sat on top, as previously mentioned, because it looked like a head and because this position allowed it to listen to sound from any direction without being obstructed by the robot's own body.
Here's a close up of one of our constructed servo arms:

The servos were all connected as in the following visual schematic:


Results

Our robot performed fantastically. Our live demo was great; the wave was a fan favorite. We did not achieve the stretch goals of putting a camera on the robot and having it interact with its surroundings. This was mainly because we did not put wheels on the bottom of the robot, a choice made for stability: the robot’s arms swinging rapidly caused the robot to shake, and this would have been a high risk if the robot were moving laterally as well. Check out our demo if you haven't already!


Conclusion

At the conclusion of our project, we were definitely able to move the robot arms on the beat, which was the biggest achievement. It was great that we could align all the individual components of our project so that this happened. We decided that we needed to be in sync with the beat to within 0.1 seconds, which was how close it needed to be to look aligned to the human eye. While the robot arms moved to the beat, it still wasn’t perfect. Even with the many code workarounds and hacks that got the robot to dance in a much more appealing way, the robot sometimes misses a beat, or guesses the beat wrong and moves a couple of times offbeat.


Future Work

In the near future, we are considering turning our robot and code into a demo on display in the ECE building, so that it is always on and dances periodically. This would not require much additional work to implement: the robot could have a song preloaded and play it while it danced. As for some of our initial hopes for the robot that were not met, the wheels probably could have been added without too much more effort; however, having a camera and being able to interact with the environment would require a lot more work. We would probably recognize objects using OpenCV and, in another thread, have logic to touch objects that were gauged to be close by and interactable. Another problem with this idea, though, is that it would be hard to make it real time, as computer vision is pretty computationally intensive.


Work Distribution

Most of the time we spent on the project was programming and brainstorming for programming. In terms of specializations, Radhika handled all of the hardware wiring and servo setup, and Raymond handled some high-level code design for organization. The project’s main code was pair programmed. We brainstormed the implementation of the project together!

Code Appendix

Commented Program Listing

The code file is available at https://github.com/ryx2/ECE-5725-Embedded-OS-Final-Project

Bill Of Materials

References and Acknowledgments

References

3D Arm Model for 9g Servos: https://www.thingiverse.com/thing:34829
Librosa library: https://librosa.github.io/librosa/
Alsaaudio library: https://larsimmisch.github.io/pyalsaaudio/libalsaaudio.html

Acknowledgments

We would like to thank our course instructor, Joseph Skovira, and the TAs for their support during this course. We especially thank the TAs for our section, Adam Halverson and Brendon Jackson.