PiDog can follow your voice commands, be trained to recognize your hand gestures, and even learn new voice commands, just like a real dog! Our goal with this project was to create an intelligent robot that simulates training a pet using speech recognition, computer vision, and machine learning. PiDog follows direct voice commands to perform actions such as moving forward, rotating, and turning. Every time it receives a voice command, it also looks for a hand gesture to associate with that command, and after a few tries it learns to follow the hand gesture directly! Once it knows a few hand gestures, you can teach it entirely new commands by saying a new word and showing a series of known gestures, so it can perform complex actions like forward --> turn right --> forward --> rotate. Throughout this process, PiDog interacts with you through its cute expressions and barks!
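The training loop described above — counting how often a gesture co-occurs with a spoken command, promoting the gesture to a standalone trigger after a few repetitions, and then letting new commands be defined as sequences of learned gestures — can be sketched as follows. This is a minimal illustration of the idea, not PiDog's actual implementation; the class name, the `LEARN_THRESHOLD` value, and the method names are all hypothetical.

```python
from collections import defaultdict

# Hypothetical threshold: a gesture is "learned" after co-occurring
# with the same voice command this many times.
LEARN_THRESHOLD = 3


class GestureTrainer:
    """Minimal sketch of PiDog-style gesture/command association."""

    def __init__(self):
        self.counts = defaultdict(int)  # (gesture, command) -> co-occurrence count
        self.learned = {}               # gesture -> command it now triggers
        self.macros = {}                # new spoken command -> list of actions

    def observe(self, command, gesture=None):
        """Called each time a voice command is heard.

        If a hand gesture is visible at the same time, strengthen the
        association; once it crosses the threshold, the gesture alone
        will trigger the command from then on.
        """
        if gesture is None or gesture in self.learned:
            return
        self.counts[(gesture, command)] += 1
        if self.counts[(gesture, command)] >= LEARN_THRESHOLD:
            self.learned[gesture] = command

    def command_for(self, gesture):
        """Return the learned command for a gesture, or None if untrained."""
        return self.learned.get(gesture)

    def teach_macro(self, new_command, gestures):
        """Map a new spoken word to a sequence of already-learned actions."""
        self.macros[new_command] = [
            self.learned[g] for g in gestures if g in self.learned
        ]
        return self.macros[new_command]
```

For example, after showing a "palm up" gesture alongside the spoken command "forward" three times, `command_for("palm_up")` would return `"forward"`, and a new word like "patrol" could then be taught as a sequence of such gestures.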
PiDog was developed on a Raspberry Pi 4B as the final project for the ECE 5725 Design with Embedded OS class at Cornell University.