This project uses a Raspberry Pi to control a robot that dances in response to music played in its environment. Audio is captured through a microphone and processed in real time to detect the tempo (beats per minute, BPM) and the times of the beats in the song. These are then used to trigger pre-programmed dance moves on the beat of the music. For entertainment value, the robot performs robotic imitations of popular dance moves.
For Snowball, we created a robot with the Raspberry Pi that could dance to the beat of music. We implemented beat detection by importing a Python library called librosa; everything else we wrote ourselves in Python. We used the Python threading module to implement basic threading, which allowed us to capture audio from the microphone while simultaneously calculating the BPM and performing dance moves. The main code threads between microphone pickup, beat detection, and dance move calculations. The robot ‘dances’ by moving either or both of its two arms, each consisting of 3 servos. The servos were connected by 3D printed servo mounts, giving each assembly the visual appearance of an ‘arm’. The robot was able to manipulate these servos to do the robot, the whip, the nae nae, and the wave!
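The overall structure can be sketched with Python's threading module. This is a minimal illustration, not our exact code: the function names, the queue, and the placeholder chunks are all assumptions standing in for the real microphone capture and beat-detection logic.

```python
import threading
import queue
import time

# Shared queue: the capture thread pushes audio chunks, the
# beat-detection thread consumes them (names are illustrative).
audio_chunks = queue.Queue()

def capture_loop(stop_event):
    """Stand-in for the microphone thread: pushes fake chunks."""
    while not stop_event.is_set():
        audio_chunks.put([0.0] * 1000)   # placeholder for 1000 samples
        time.sleep(0.01)

def detect_loop(stop_event, results):
    """Stand-in for the beat-detection thread: consumes chunks."""
    while not stop_event.is_set():
        try:
            chunk = audio_chunks.get(timeout=0.1)
        except queue.Empty:
            continue
        results.append(len(chunk))       # real code would estimate BPM here

stop = threading.Event()
results = []
threads = [threading.Thread(target=capture_loop, args=(stop,)),
           threading.Thread(target=detect_loop, args=(stop, results))]
for t in threads:
    t.start()
time.sleep(0.2)
stop.set()
for t in threads:
    t.join()
print(len(results) > 0)  # both threads ran concurrently
```

The key point is that capture never blocks on analysis: the queue decouples the two loops, so audio keeps arriving while BPM calculation (and, in the real robot, servo movement) runs in parallel.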
To accomplish this much larger project, we decided the best approach was to tackle smaller sub-projects individually and combine them at the end. The high-level tasks, which are further expanded upon below, are as follows:
In order to do real-time audio acquisition and processing, we needed to run non-blocking calls for microphone sample reading in a thread. To do this, we used alsaaudio's pulse code modulation (PCM) tool and read data from our USB microphone 1000 samples at a time. At first we used pyaudio's acquisition code, but we switched over to alsaaudio because of its better non-blocking calls. We also bought a Blue Snowball iCE microphone, which recorded with much higher quality than the cheaper microphone we found in lab. Beat estimation is a bit complex in real time; there are many algorithms, and most use the spectrogram of the sound, analyzing it to detect patterns in specific frequencies. Librosa's librosa.beat.beat_track uses the algorithm from the following paper: Ellis, Daniel P. W. "Beat tracking by dynamic programming." Journal of New Music Research 36.1 (2007): 51-60. http://labrosa.ee.columbia.edu/projects/beattrack/
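A minimal sketch of the capture side, reading 1000 samples at a time into a rolling analysis window. The alsaaudio call is replaced by a stub here so the sketch is self-contained; the sample rate, window length, and names are our assumptions, not the project's exact values.

```python
import collections

RATE = 44100          # assumed sample rate
CHUNK = 1000          # samples per non-blocking read (as in the text)
WINDOW_SECONDS = 5    # assumed length of audio kept for beat analysis

# Rolling window of the most recent samples; the deque drops old data.
window = collections.deque(maxlen=RATE * WINDOW_SECONDS)

def read_chunk():
    """Stub standing in for a non-blocking alsaaudio PCM read.

    With pyalsaaudio this would be roughly:
        pcm = alsaaudio.PCM(alsaaudio.PCM_CAPTURE, alsaaudio.PCM_NONBLOCK)
        length, data = pcm.read()
    A non-blocking read returns immediately; a length <= 0 means no
    new samples were available yet.
    """
    return [0.0] * CHUNK

for _ in range(300):              # capture loop iterations
    samples = read_chunk()
    if samples:                   # skip empty non-blocking reads
        window.extend(samples)

print(len(window))  # → 220500 (capped at RATE * WINDOW_SECONDS)
```

Because the reads are non-blocking, the loop can check for new data and move on immediately, which is what lets capture share a thread schedule with beat detection and dancing.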
The newer microphone was less choppy, more sensitive, and had less noise than the lab's microphone. Librosa's librosa.beat.beat_track returns two things: a beats-per-minute estimate and an estimate of the times in the sound array at which it thinks the beats occur. We store both and use both to help the robot dance. Each BPM estimate is stored along with the last 10 estimates, and the mode of all of them is used to produce the most accurate estimate. If two BPMs are tied for the mode, then the mean of the two is used. The times of the beats are used to make the robot dance on beat, which I will explain in the next section.
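The mode-with-tie-breaking smoothing described above can be sketched as follows. The function name is our illustration, not the project's actual code, but the logic (last 10 estimates, mode, mean on a tie) follows the text.

```python
from collections import Counter, deque

bpm_history = deque(maxlen=10)   # last 10 BPM estimates, as described

def smoothed_bpm(new_estimate):
    """Add a new BPM estimate and return the smoothed value:
    the mode of the history, or the mean of the tied modes."""
    bpm_history.append(new_estimate)
    counts = Counter(bpm_history)
    ranked = counts.most_common()
    best_count = ranked[0][1]
    tied = [bpm for bpm, c in ranked if c == best_count]
    if len(tied) == 1:
        return tied[0]
    return sum(tied) / len(tied)   # tie-break: mean of the tied BPMs

for est in [120, 120, 118, 122, 120]:
    result = smoothed_bpm(est)
print(result)  # → 120 (the mode of the history)
```

Smoothing over a short history this way keeps one noisy librosa estimate from suddenly changing the tempo the robot is dancing to.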
The dance moves were implemented with code that sets the position of our standard servos. It helped that they were not continuous-rotation servos: each one effectively tracks its own orientation automatically and can move to the same position consistently. We put together a variety of functions to implement the dance moves. First, there are low-level functions to move individual servos. Then there is a function to wait for the next beat: given the time at which the beats were computed and the time of the last beat, it computes the time of the next beat closest to the current time. Another function takes a list of moves and performs one on each beat. And lastly, a function generates the list of moves given an input array. The easiest way to demonstrate how the wait time is computed is to show the code:
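The following is a minimal sketch of that wait computation. The function name and signature are our illustration, assuming a BPM estimate and the wall-clock time of one known beat are available; the idea is simply to project the known beat forward by whole beat periods until it lands in the future.

```python
import time

def time_until_next_beat(bpm, last_beat_time, now=None):
    """Return seconds to wait until the next beat, given a BPM
    estimate and the wall-clock time of a known beat.

    Beats repeat every 60/bpm seconds, so we count how many full
    beats have elapsed since the known beat and project one more.
    """
    if now is None:
        now = time.time()
    period = 60.0 / bpm                  # seconds per beat
    elapsed = now - last_beat_time       # time since the known beat
    beats_passed = int(elapsed // period)
    next_beat = last_beat_time + (beats_passed + 1) * period
    return next_beat - now

# Example: 120 BPM (0.5 s per beat), last beat at t=10.0 s, now t=11.3 s.
# Beats fall at 10.0, 10.5, 11.0, 11.5, ... so we wait until 11.5.
print(round(time_until_next_beat(120, 10.0, now=11.3), 3))  # → 0.2
```

A move-on-each-beat loop can then just sleep for this duration before firing the next servo move, which is how the arm motion stays aligned with the detected beat times.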
For the physical construction of our robot, we had originally wanted it to be more of a cylindrical shape; however, since we decided to laser cut the pieces of the body, a rectangular prism was an easier shape to build. The body's walls were constructed out of laser-cut acrylic, ⅛ inch thick. It was red acrylic, for style. We designed the pieces in Adobe Illustrator and used a laser cutter in one of the ECE labs. This also encouraged us to learn how to use a laser cutter, which was very cool. The top of the rectangular prism was made with a hole so that the stand for the microphone could fit through and the microphone could rest on top of our robot. The body was assembled using fast-drying super glue and tape. The arms of Snowball were 3D printed using CAD drawings modified in Autodesk Inventor. Since the servos we used were basic 9g micro servos, a very common hobby servo, there were many resources available to us online. We found a 9g micro servo arm design on Thingiverse (see references), which we then modified for our needs: the design was thickened overall to suit our 3D printer, extraneous parts like the gripper and the spring mechanism were removed, and everything was scaled to our specifications. This was the result of one of our initial arm 3D print tests:
Our robot performed fantastically. Our live demo was great; the wave was a fan favorite. We did not achieve the stretch goals of putting a camera on the robot and having it interact with its surroundings. This was mainly because we did not put wheels on the bottom of the robot, for stability: the robot's arms swinging rapidly caused the robot to shake, which would be a real risk if the robot were also moving laterally. Check out our demo if you haven't already!
At the conclusion of our project, we were definitely able to move the robot's arms on the beat, which was the biggest achievement. It was great that we could align all the individual components of our project to make this happen. We decided that we needed to be in sync with the beat to within 0.1 seconds, which is how close it needs to be to look aligned to the human eye. While the robot's arms moved to the beat, it still wasn't perfect. Many code workarounds and hacks got the robot to dance in a much more appealing way; even so, sometimes the robot misses a beat, or guesses the beat wrong and moves a couple of times offbeat.
In the near future, we are considering turning our robot and code into a demo on display in the ECE building, always on and dancing periodically. This would require little extra work to implement: the robot could have a song preloaded and play it while it danced. As for some of our initial hopes for the robot which were not met, the wheels probably could have been implemented without too much more effort; however, adding a camera and having the robot interact with its environment would require a lot more work. We would probably recognize objects using OpenCV, running in another thread, with logic to touch objects gauged to be close by and interactable. Another problem with this idea, though, is that it would be hard to make it real-time, as computer vision is quite computationally intensive.
Most of the time we spent on the project was programming and brainstorming for programming. In terms of specializations, Radhika handled all of the hardware wiring and servo setup, and Raymond handled some high-level code design for organization. The project's main code was pair programmed. We brainstormed the implementation of the project together!
3D Arm Model for 9g Servos: https://www.thingiverse.com/thing:34829
Librosa library: https://librosa.github.io/librosa/
Alsaaudio library: https://larsimmisch.github.io/pyalsaaudio/libalsaaudio.html
We would like to thank our course instructor Joseph Skovira and the TAs for their support during this course. We especially thank the TAs for our section, Adam Halverson and Brendon Jackson.