CONCLUSION


Unexpected Changes

Through extensive testing, we found that the PocketSphinx library was not well suited for our application. Although it allowed speech recognition to run locally, it took over 30 seconds to interpret even short phrases, and we wanted the project to be robust and responsive, with as little delay as possible. We therefore switched to the Google Speech Recognition API because of its speed and high prediction accuracy. Even then, the SpeechRecognition library proved difficult to get working: while some parts behaved as expected, others, such as the microphone input, did not. Fortunately, we were able to come up with workarounds for these issues. Instead of using the microphone interface provided by the SpeechRecognition library, we saved the user's speech as a WAV file, which the library could then read to determine what the user was saying. Although getting speech recognition started was difficult, it was simple to implement in Python once the proper tools were installed.
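As a rough sketch of this workaround (not our exact code; the filename and error handling are placeholders), the SpeechRecognition library can transcribe a pre-recorded WAV file through the Google Speech Recognition API along these lines:

# Minimal sketch: transcribe a saved WAV file instead of reading the
# microphone directly through the SpeechRecognition library.
import speech_recognition as sr

recognizer = sr.Recognizer()

# "command.wav" is a placeholder name for the recorded utterance.
with sr.AudioFile("command.wav") as source:
    audio = recognizer.record(source)  # read the entire audio file

try:
    # Send the audio to the Google Speech Recognition API for transcription.
    text = recognizer.recognize_google(audio)
    print("You said:", text)
except sr.UnknownValueError:
    print("Could not understand the audio")
except sr.RequestError as err:
    print("API request failed:", err)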

Future Work

With more time, we are confident that we could add much more functionality to our robot. We would put more effort into finding a GPIO library that is more robust and enables the robot to travel more smoothly and quickly toward the object. We had initially planned to build a robot that could not only move toward an object of a specified color but also latch onto it and bring it back to its starting position. We were not able to implement this due to time constraints, but we would love to tackle the problem, since most of the underlying functionality is already in place. Currently, the robot stops about a foot and a half away from the object, so we would need to modify the approach behavior, likely by adding a short-range distance sensor, so that the robot can drive right up to the object. We would also need some type of plow attached to the front of the robot so that it can push a small object forward. A laser emitter and receiver pair could detect when the beam is interrupted; once the object breaks the laser, the robot knows it has the object in its plow and can begin returning to its initial position.
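To illustrate the break-beam idea, a hypothetical sketch using the RPi.GPIO library is shown below; the pin number and the receiver's active level are assumptions and would depend on the actual sensor used:

# Hypothetical sketch of a laser break-beam check for the plow.
import time
import RPi.GPIO as GPIO

BEAM_PIN = 17  # assumed BCM pin wired to the laser receiver output

GPIO.setmode(GPIO.BCM)
# Assume the receiver drives the pin HIGH when the beam is blocked
# (the actual polarity depends on the sensor chosen).
GPIO.setup(BEAM_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def wait_for_object():
    # Poll until the beam is interrupted, i.e. an object sits in the plow.
    while GPIO.input(BEAM_PIN) == GPIO.LOW:
        time.sleep(0.05)
    print("Object captured; begin returning to start position")

try:
    wait_for_object()
finally:
    GPIO.cleanup()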

Our robot currently detects only blue and green objects, but it could be extended to detect more colors, and its detection range could be improved so that it can locate smaller objects at a greater distance. We could also add more supported commands; some, such as "Go To Blue Object" or "Go To Green Object," would require only small modifications to the code. Given the boundless applications of the Raspberry Pi and the extensive Pi Camera and OpenCV libraries, the room for growth and improvement is limitless.
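As an example of how support for another color might be added (a hypothetical sketch, not our project code; the HSV bounds are rough placeholder values that would need tuning under real lighting), OpenCV can threshold each frame in HSV space and report whether enough pixels match the new color:

# Sketch: check whether a frame contains an object of a given HSV color range.
import cv2
import numpy as np

def contains_color(frame_bgr, lower_hsv, upper_hsv, min_pixels=500):
    # Convert to HSV and keep only pixels inside the target range.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    # Treat the color as "detected" if enough pixels survive the threshold.
    return cv2.countNonZero(mask) > min_pixels

# Example: rough, illustrative HSV bounds for a red object.
lower_red = np.array([0, 120, 70])
upper_red = np.array([10, 255, 255])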

Reflection

We spent a significant amount of time setting up the speech recognition and computer vision libraries. This required endless hours of painstaking debugging and grueling confusion. Even after the installations and configurations seemed to be working properly, we still spent countless hours simply trying to become comfortable with these libraries and familiarize ourselves with their inner workings so that we could use them effectively. Working through this project therefore broadened our skill set in ways that extend beyond an understanding of a few more software systems. We learned to collaborate productively as a team and to split work in a way that was efficient and sensible. We learned to work through hurdles, no matter how difficult or strenuous they seemed. We learned to ask for help, from our instructors, from our peers, and from each other. We stressed and worried that things wouldn't be done in time for our final demo, and our accomplishments reflect not only the knowledge and skills we have developed over our past seven semesters as Cornell ECE undergraduates but also our drive and sincere motivation to build something meaningful.

Acknowledgements

We would like to thank our Professor, Dr. Joe Skovira, for his teaching and guidance throughout the semester. We would also like to thank our Teaching Assistants for all the help they provided. We thoroughly enjoyed this class and appreciate the skills and knowledge we gained from it.

Copyright © Judy Stephen and Cameron Boroumand
