ECE 5725 Final Project

Felicia: An Animatronic Face
Annabel Lian (ayl48) and Becky Lee (bl478)



Introduction

For our final embedded operating systems project, we made a cardboard animatronic face that makes different facial expressions, moves its mouth to music, and vomits confetti. Our crafty project combined mechanical and electrical concepts as we integrated the cardboard facial components with positional servo motors and controlled them with the Raspberry Pi. The animatronic eyes are an interesting mechanical feature of our project as we used gimbals, ping pong balls, and wire to enable horizontal and vertical rotational motion to mimic real eye movements.



Project Objective:

  • Play audio and control facial expressions from the Raspberry Pi
  • Control facial expressions using servo motors and the RPi.GPIO library
  • Render buttons on the piTFT to allow users to change facial expressions
  • Use a fan to control the confetti coming out of the mouth

Design



Figure 1: Hardware Schematic


Cardboard Facial Structure

For the mechanical design, we drew inspiration from the animatronic face on the PyroElectro website. All of the face components were created by drawing and cutting shapes from cardboard. The face alone consists of four pieces. We cut out two parenthesis-shaped pieces for the eyebrows and colored them with black marker. An oval was cut out for the head and then cut in half to create the mouth. Two eye holes, each slightly larger than the diameter of a ping pong ball, were cut out of the top half. Lastly, two 1.5 cm squares were punctured above the eyes so that we could fit the output shafts of the eyebrow servos through them.


Figure 2: Cardboard Face


For the supporting structure holding up our face, we used a rectangular U-shaped frame to hold the upper piece of the head and each of the horizontal levels shown below, improvising and adding supports where necessary to keep the levels up. The top level holds the motors that control the eyes' horizontal motion. The middle level holds the fan, a paper tube containing DIY confetti (white paper towel cut into small bits), and the motor controlling the mouth.



Figure 3: Support Structure of Face


Eyeballs

The eyeballs are the most mechanically interesting component of our project. We 3D printed the black ring of our makeshift gimbal; the ring has four holes located on its "north, east, south, and west" sides. We drew pupils onto ping pong balls to create the eyes, cut a large 1 inch hole in the back of each ball, and punctured small holes from north to south. Hard, thick pink wire was cut and shaped to act as a support rod for each eye. We placed the ping pong ball inside the black ring and aligned the holes so that we could stick the support rod through both simultaneously. Then, we glued the pink rod to the black ring to hold it in place. We did not glue the ping pong ball to the rod because the rod is the eye's axis of rotation. Next, we secured the ring and ping pong ball onto the cardboard structure by inserting small bolts through the east and west holes of the black ring and the cardboard. We used nuts to loosely keep the bolts in place so that the eyeball would not fall out of the cardboard structure but the ring could still rotate on the bolts.


Figure 4: Eyeball Mechanism


After creating the basic eyeball structures, we added "tendons" to the eyes using breadboard wire. We cut short lengths of wire and stripped both ends of each piece. We did not use any specific measurements and instead determined the optimal tendon lengths experimentally. For the vertical eye motion, the servos were mounted on the sides of the cardboard U structure and the wire was wrapped around the servo horn and the top of the black ring, then secured in place with hot glue and tape. For the horizontal eye motion, the servos were mounted on the top level of the cardboard structure and the wire tendons were used to puncture small holes through the ping pong balls, then wrapped in the fashion shown below. (The wire tendon first goes through the small hole and then around the large hole for wrapping.) The other end of the tendon is wrapped around the servo horn. The wires had to be twisted and shaped to ensure that the eyeballs moved in a semi-realistic fashion.



Figure 5: Completed Eyeball Mechanism


Mouth

We glued a rectangular piece orthogonal to the bottom mouthpiece to act as a lever. This rectangular piece was then glued onto the servo horn on the middle level of the cardboard structure. When the servo rotates, the mouth opens and closes.


Figure 6: Front View of Mouth



Figure 7: Back View of Mouth


Fan

We built a drive circuit for the fan consisting of an NPN transistor, a diode, and a 1 kΩ resistor. A PWM signal was output on GPIO 16 and passed through the 1 kΩ resistor to the base of the NPN transistor. The fan was connected to the diode and to the collector of the transistor, and the 5 V external power supply was also connected to the collector to provide the fan with the voltage it needs to operate.



Figure 8: Fan Circuit Schematic


Software Design

For our software design, we decided to use RPi.GPIO to send a PWM (pulse-width modulated) signal to each servo motor. We used GPIOs 5 and 26 for the left and right eyebrows respectively, GPIO 19 for the mouth, GPIOs 6 and 20 for the left and right vertical eye movement, and GPIOs 13 and 12 for the left and right horizontal eye movement. Per the datasheet, we used a frequency of 50 Hz for our PWM signal. To map a desired angle to a duty cycle, we first found the pulse width corresponding to that angle and then approximated the duty cycle as (pulse width / period) × 100, adjusting the result as needed. Further details on calibrating the duty cycle for each motor can be found in the Testing section.
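As a rough illustration of this mapping, the sketch below converts a target angle to an RPi.GPIO duty-cycle percentage. The 0.5–2.5 ms pulse-width endpoints are assumptions for a typical 9 g micro servo, not measured values; as noted above, each of our servos ultimately needed hand calibration.

```python
# Sketch of the angle -> duty cycle mapping described above.
# Assumes a 50 Hz PWM signal (20 ms period) and a servo that sweeps
# 0-180 degrees over a 0.5-2.5 ms pulse width; the exact endpoints
# vary per servo, so these constants are illustrative only.

PERIOD_MS = 20.0      # 50 Hz PWM -> 20 ms period
MIN_PULSE_MS = 0.5    # assumed pulse width at 0 degrees
MAX_PULSE_MS = 2.5    # assumed pulse width at 180 degrees

def angle_to_duty_cycle(angle_deg):
    """Map a servo angle (0-180 degrees) to a duty cycle percentage."""
    pulse_ms = MIN_PULSE_MS + (angle_deg / 180.0) * (MAX_PULSE_MS - MIN_PULSE_MS)
    return (pulse_ms / PERIOD_MS) * 100.0
```

For example, a 90 degree (center) position corresponds to a 1.5 ms pulse, or a duty cycle of about 7.5, which is in the same ballpark as the values we settled on after calibration.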

For each motion that we wanted (e.g., look up, look down, look forward), we created a function that changes the position of the relevant part of the face (e.g., pwm6.ChangeDutyCycle(2) and pwm20.ChangeDutyCycle(12) to make the eyes look down). We then integrated these functions into the main while loop so that we could set the face to the desired expression. We also wrote individual functions for the different expressions, such as angry, sad, excited, and happy, for testing purposes.

For our fan, we also generated a PWM signal with the RPi.GPIO library. We set up GPIO 16 to output a PWM signal with a frequency of 102 Hz, per the datasheet, and a duty cycle of 99 so that the fan would turn fully on when we wanted to blow the confetti out of the mouth.

For our buttons, we used the pygame and pigame libraries in order to render buttons onto the piTFT. Inside our while loop, we first render the circles and text for each individual button by specifying the coordinates that the buttons should be drawn on the piTFT screen, the size of the buttons, and the size, color, and position of our text.

Then, to recognize touch events from the user, we use a for loop over the event queue to get the coordinates of a button press. We check which button was pressed by using if statements to test whether the recorded x and y positions fall within the coordinate range of a specific button, such as happy. Depending on which button was pressed, a corresponding sequence of motions runs sequentially.
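The hit test reduces to checking which button's bounding box contains the touch coordinates. The sketch below uses the coordinate ranges from our button layout; the dictionary and function names are illustrative, not the exact code we wrote.

```python
# Sketch of the piTFT button hit test described above. Each button is an
# 80x80 region around its drawn center; the ranges match our layout.

BUTTONS = {
    "happy":   (10, 90, 10, 90),      # x_min, x_max, y_min, y_max
    "sad":     (170, 250, 10, 90),
    "excited": (10, 90, 130, 210),
    "angry":   (170, 250, 130, 210),
}

def pressed_button(x, y):
    """Return the name of the button containing (x, y), or None."""
    for name, (x_min, x_max, y_min, y_max) in BUTTONS.items():
        if x_min < x < x_max and y_min < y < y_max:
            return name
    return None
```

Each MOUSEBUTTONUP event's coordinates are run through this check, and the matching button's motion sequence and audio are started.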



Figure 9: Button Display on piTFT



To play the audio, we used the pygame.mixer library. We first initialized the mixer with pygame.mixer.init(), and then, inside each button press event, we loaded the .wav file into the mixer. We used a while loop to keep playing until the audio file finished.

Together, we integrated the individual functions that control the facial expressions and the button display into the main while loop. In the main while loop, we first render all of the buttons and blit them. Then, we initialize the pygame mixer. Afterwards, we wait for touch events and check conditional statements to see which button was pressed. In each button press conditional, we both wrote code that manually moves each micro servo to the correct position and called the functions that change parts of the face. Once the face is in the correct position for the desired expression, we start playing the audio file. At the end of the while loop, we display all of the buttons on the piTFT screen by calling pygame.display.flip().


Figure 10: Overall Software Schematic


Testing

Software Design
To calibrate the duty cycle for each motor so that we could reach the desired angle, we first used the equation described in the design section. However, we found that, due to variations between individual micro servo motors, we had to calibrate each one separately by adjusting the duty cycle until we reached the desired angle.

Also, we ran into issues regarding the integration of the button display on the piTFT and the motor control. We tried to use multithreading where we would put our motor control for each facial expression into a thread. Then, when a button was pressed, the thread corresponding to the correct facial expression would start and change the motors to the correct position. When we tested it, we discovered that we were not correctly joining the threads for our motor control to the main thread which contained our while loop. We tried to debug this by looking for similar issues online and asking the TAs for help but ultimately, we were unable to resolve this issue. Instead, we settled for directly adding the functions for our motor control into the conditionals for the button press detections.
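For reference, the pattern we were attempting looks roughly like the sketch below, where `run_expression` stands in for one of our motor-control functions. The key step we were missing is making sure the worker thread is joined (or marked as a daemon) so it does not dangle past the main loop.

```python
import threading

def run_expression(name, log):
    """Stand-in for a motor-control routine; records the expression it 'performs'."""
    log.append(name)

log = []

# Run the expression in a worker thread so the main loop could keep
# polling the piTFT for new button presses in the meantime.
worker = threading.Thread(target=run_expression, args=("sad", log), daemon=True)
worker.start()

# ... the main loop's rendering and event polling would happen here ...

# join() blocks until the worker finishes and cleans it up; this is the
# step we never got working correctly in the integrated program.
worker.join()
```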

A similar issue arose when we were trying to integrate the audio playback into the button press detection code. We tried adding the audio playback code into the button press detection code. Then, we added the motor control code inside the while loop that plays our audio file. However, while the audio file was playing, the motor control functions were not working properly. We resolved this issue by placing the motor control functions outside of the while loop that plays the audio file.


Results

On a visual level, our animatronic face achieved the intended idea of the project: the face makes distinct expressions, moves its mouth to music, and vomits confetti. However, the software structure and functionality did not meet our original expectations. We had intended for all the components of the face to move simultaneously and much more smoothly, which would have required threads since, unlike DC motors, positional servos are position-based rather than time-based. We also wanted the mouth movements to sync with the words of each song, but that would have required audio parsing, which would have taken more time than was allotted to us, given that we had never done it before. In the end, we settled for pre-programmed mouth movements that roughly match up with each song.


Conclusion

Overall, we were able to incorporate the skills we learned in the lab such as using pygame and pigame libraries to render buttons onto the piTFT and detect button presses. Also, we used the RPi.GPIO library to control our blue micro servo 9g motors by using PWM signals and changing the duty cycle. Our project was ultimately successful since we were able to take the button presses and then play the audio and change the facial expressions accordingly. However, we did run into some difficulties. We were unable to get multithreading to work since we could not figure out how to properly join the thread we created for the motor control back to our main thread. Therefore, we could not smoothly transition from one facial expression to another. We had to sequentially change the eyebrow, eye, and mouth movements.


Future Work

Our next steps would be to incorporate threads so that we can simultaneously play the audio and control the motors for our face. Then, it would allow us to synchronize the facial movements with the audio better. We would also be able to resolve the issue with our motors moving sequentially instead of in parallel (all at once). Furthermore, we would want to explore audio parsing (from speech to text) so that we could automatically change facial expressions based on key words such as “happy” or “angry”. Then, we would not have to rely on the buttons on the piTFT to manually change the facial expressions.


Work Distribution

Group Photo

Annabel Lian (ayl48) and Becky Lee (bl478)

We worked on the entire project together. Annabel designed the mechanical part and worked on the actuation code. Becky tested the mechanical part and made the buttons and audio. We integrated and tested everything together.

We did not really split up any work besides the buttons.


Parts List

    Total: $4.62


References

[1] Animatronics
[2] Micro Servo Datasheet
[3] Fan Circuitry Schematic
[4] Fan Datasheet
[5] Playing Audio with the RPi
[6] RPi GPIO Document

Code Appendix


        // final_face.py
        import time
        import pygame, pigame
        from pygame.locals import*
        import os
        import RPi.GPIO as GPIO
        import subprocess
        from threading import Thread
        import pigpio


        os.putenv('SDL_VIDEODRIVER','fbcon')
        os.putenv('SDL_FBDEV','/dev/fb1')

        os.putenv('SDL_MOUSEDRV','dummy')
        os.putenv('SDL_MOUSEDEV','/dev/null')
        os.putenv('DISPLAY','')

        pygame.init()
        pitft = pigame.PiTft()

        pygame.mouse.set_visible(False)
        WHITE = 255, 255, 255
        BLACK = 0,0,0

        screen =pygame.display.set_mode((320,240))
        my_font = pygame.font.Font(None, 30)
        screen.fill(BLACK)

        #GPIO Quit Button
        GPIO.setmode(GPIO.BCM)

        GPIO.setup(27, GPIO.IN, pull_up_down = GPIO.PUD_UP)

        def GPIO27_callback(channel):
            global code_run
            code_run = False
            pygame.quit()

        GPIO.add_event_detect(27, GPIO.FALLING, callback=GPIO27_callback, bouncetime = 300)

        #################################################################################################
        #################################### Face and Face Functions ####################################
        #################################################################################################

        pi_hw = pigpio.pi()   # connect to pi gpio daemon

        #Set up GPIO pins (For motors using Rpi GPIO)

        #Eyebrows
        GPIO.setup(5, GPIO.OUT) #Left
        GPIO.setup(26, GPIO.OUT) #Right

        #Mouth
        GPIO.setup(19, GPIO.OUT)

        #Eyes Vertical
        GPIO.setup(6, GPIO.OUT) #Left
        GPIO.setup(20, GPIO.OUT) #Right

        #Eyes Horizontal
        GPIO.setup(13, GPIO.OUT) #Left
        GPIO.setup(12, GPIO.OUT) #Right

        #Initialize motors that are using RpiGPIO
        pwm5 = GPIO.PWM( 5, 50) #Second arg is frequency
        pwm5.start(0) #Duty cycle

        pwm26 = GPIO.PWM(26, 50) #Second arg is frequency
        pwm26.start(0) #Duty cycle

        pwm19 = GPIO.PWM(19, 50) #Second arg is frequency
        pwm19.start(0) #Duty cycle

        #Left vertical eye
        pwm6 = GPIO.PWM( 6, 50) #Second arg is frequency
        pwm6.start(0) #Duty cycle
        #Right vertical eye
        pwm20 = GPIO.PWM( 20, 50) #Second arg is frequency
        pwm20.start(0) #Duty cycle

        #Left Horizontal eye
        pwm13 = GPIO.PWM( 13, 50) #Second arg is frequency
        pwm13.start(0) #Duty cycle
        #Right horizontal eye
        pwm12 = GPIO.PWM( 12, 50) #Second arg is frequency
        pwm12.start(0) #Duty cycle

        #Set GPIO 16 as fan control
        GPIO.setup(16, GPIO.OUT)

        #initialize fan PWM signal
        pwm16 = GPIO.PWM(16, 102) #run at 102 Hz 
        pwm16.start(0) #start at 0 duty cycle

        #Movement functions
        def left_brow():
            pwm5.ChangeDutyCycle(5)
            time.sleep(0.5)
            pwm5.ChangeDutyCycle(10)
            time.sleep(0.5)

        def right_brow():
            pwm26.ChangeDutyCycle(5)
            time.sleep(0.5)
            pwm26.ChangeDutyCycle(8)
            time.sleep(0.5)

        def mouth():
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(7)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(7)
            time.sleep(0.5)

        def vomit():
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(8)
            time.sleep(0.5)
            pwm16.ChangeDutyCycle(99)
            time.sleep(8)
            pwm16.ChangeDutyCycle(0)
            time.sleep(1)

        #Eyes look down
        def look_down():
            pwm6.ChangeDutyCycle(2)
            time.sleep(0.25)
            pwm20.ChangeDutyCycle(12)
            time.sleep(0.25)

        #Eyes look up
        def look_up():
            pwm6.ChangeDutyCycle(12)
            time.sleep(0.25)
            pwm20.ChangeDutyCycle(2)
            time.sleep(0.25)

        #Eyes look left
        def look_left():
            pwm13.ChangeDutyCycle(10) #go to left
            time.sleep(0.25)
            pwm12.ChangeDutyCycle(2) #go to left
            time.sleep(0.25)

        def look_right():
            pwm13.ChangeDutyCycle(1) #go to right
            time.sleep(0.25)
            pwm12.ChangeDutyCycle(10) #go to right
            time.sleep(0.25)

        def look_forward():
            pwm13.ChangeDutyCycle(5) #center left eye
            time.sleep(0.25)
            pwm12.ChangeDutyCycle(5) #center right eye
            time.sleep(0.25)
            pwm6.ChangeDutyCycle(6)
            time.sleep(0.25)
            pwm20.ChangeDutyCycle(6)
            time.sleep(0.25)

        #facial expressions
        def neutral():
            neutral_run = True
            pwm5.ChangeDutyCycle(7)
            time.sleep(0.5)
            pwm26.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(6)
            neutral_run = False


        def angry():
            angry_run= True
            pwm13.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm12.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm5.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm26.ChangeDutyCycle(4)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(4)
            time.sleep(0.5)
            angry_run = False


        def excited():
            excited_run = True
            look_up()
            pwm13.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm12.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm5.ChangeDutyCycle(10)
            time.sleep(0.5)
            pwm26.ChangeDutyCycle(7)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(8)
            time.sleep(0.5)
            vomit()
            excited_run = False

        def sad():
            sad_run= True
            #Eyes look down
            look_down()
            pwm13.ChangeDutyCycle(1) #go to right
            time.sleep(0.5)
            pwm12.ChangeDutyCycle(2) #go to left
            time.sleep(0.5)
            pwm5.ChangeDutyCycle(6)
            time.sleep(0.5)
            pwm26.ChangeDutyCycle(7)
            time.sleep(0.5)
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.5)
            sad_run = False


        def sing():
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.25)
            pwm19.ChangeDutyCycle(7)
            time.sleep(0.25)
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.3)
            pwm19.ChangeDutyCycle(7)
            time.sleep(0.25)
            pwm19.ChangeDutyCycle(6)
            time.sleep(0.25)


        code_run = True

        #button press detection variables
        neutral_run= False
        angry_run = False
        excited_run = False
        sad_run = False

        new_press = False

        while (code_run):

            #render buttons and text

            happy = pygame.draw.circle(screen, (255,255,0), [50,50], 40,0)
            happy_surf = my_font.render("Happy", True, WHITE)
            happy_rect = happy_surf.get_rect(center =(50,50))
            screen.blit(happy_surf, happy_rect)

            sad = pygame.draw.circle(screen, (0,0,255), [210,50], 40,0)
            sad_surf = my_font.render("Sad", True, WHITE)
            sad_rect = sad_surf.get_rect(center =(210,50))
            screen.blit(sad_surf, sad_rect)

            excited = pygame.draw.circle(screen, (255,0,255), [50,170], 40,0)
            excited_surf = my_font.render("Excited", True, WHITE)
            excited_rect = excited_surf.get_rect(center =(50,170))
            screen.blit(excited_surf, excited_rect)

            angry = pygame.draw.circle(screen, (255,0,0), [210,170], 40,0)
            angry_surf = my_font.render("Angry", True, WHITE)
            angry_rect = angry_surf.get_rect(center =(210,170))
            screen.blit(angry_surf, angry_rect)

            #initialize audio here
            pygame.mixer.init()

            #get touch events
            pitft.update()
            pygame.display.update()
            for event in pygame.event.get():
                if (event.type == MOUSEBUTTONDOWN):
                    pos = pygame.mouse.get_pos()
                elif (event.type == MOUSEBUTTONUP):
                    pos = pygame.mouse.get_pos()
                    x,y = pos

                    #happy button
                    if x>10 and x<90 and y>10 and y<90:
                        if (neutral_run == False):
                            neutral_run=True
                            look_forward()
                            pwm5.ChangeDutyCycle(7)
                            time.sleep(0.5)
                            pwm26.ChangeDutyCycle(6)
                            time.sleep(0.5)
                            pwm19.ChangeDutyCycle(6)
                            pygame.mixer.music.load("sexy.wav") #put in audio file name
                            #add audio stuff w / motor control here
                            pygame.mixer.music.play()
                            pygame.mixer.music.set_volume(0.3)
                            while (not new_press and pygame.mixer.music.get_busy() == True):
                                look_up()
                                sing()
                                look_down()
                                continue
                            neutral_run=False

                    #sad button
                    if x>170 and x<250: 
                        if y>10 and y<90: #play other audio file
                            pygame.mixer.music.load("sad_violin.wav") #put in audio file name
                            pygame.mixer.music.play()
                            pygame.mixer.music.set_volume(0.1)
                            if (sad_run == False):
                                new_press=True
                                sad_run=True
                                #Eyes look down
                                look_down()
                                look_forward()
                                pwm13.ChangeDutyCycle(1) #go to right
                                time.sleep(0.5)
                                pwm12.ChangeDutyCycle(2) #go to left
                                time.sleep(0.5)
                                pwm5.ChangeDutyCycle(6)
                                time.sleep(0.5)
                                pwm26.ChangeDutyCycle(7)
                                time.sleep(0.5)
                                pwm19.ChangeDutyCycle(6)
                                time.sleep(0.5)
                                while (not new_press and pygame.mixer.music.get_busy() == True):
                                    continue
                            sad_run=False
                            new_press=False

                    #excited button
                    if x>10 and x<90 and y>130 and y<210: #play another audio file
                        pygame.mixer.music.load("gangnamstyle.wav") #put in audio file name
                        pygame.mixer.music.play()
                        if (excited_run == False):
                            new_press=True
                            excited_run=True
                            look_forward()
                            look_up()
                            pwm13.ChangeDutyCycle(10)
                            time.sleep(0.5)
                            pwm12.ChangeDutyCycle(10)
                            time.sleep(0.5)
                            pwm5.ChangeDutyCycle(10)
                            time.sleep(0.5)
                            pwm26.ChangeDutyCycle(7)
                            time.sleep(0.5)
                            pwm19.ChangeDutyCycle(8)
                            time.sleep(0.5)
                            for i in range(14):
                                sing()
                            look_left()
                            look_right()
                            look_forward()
                            vomit()
                            new_press=False
                            excited_run=False
                            while (not new_press and pygame.mixer.music.get_busy() == True):
                                continue 
                                                          
                    #angry button
                    if x>170 and x<250: 
                        if y>130 and y<210: #play one more audio file
                            pygame.mixer.music.load("angrykill.wav") #put in audio file name
                            pygame.mixer.music.play()
                            pygame.mixer.music.set_volume(1.0)
                            if (angry_run == False):
                                new_press=True
                                angry_run=True
                                look_forward()
                                look_left()
                                look_right()
                                pwm13.ChangeDutyCycle(10)
                                time.sleep(0.25)
                                pwm12.ChangeDutyCycle(10)
                                time.sleep(0.25)
                                pwm5.ChangeDutyCycle(10)
                                time.sleep(0.25)
                                pwm26.ChangeDutyCycle(4)
                                time.sleep(0.25)
                                pwm19.ChangeDutyCycle(6)
                                time.sleep(0.25)
                                sing()
                                sing()
                                sing()
                                time.sleep(0.25)
                                sing()
                                angry_run=False
                                new_press=False
                                while (not new_press and pygame.mixer.music.get_busy() == True):
                                    continue

            #flip everything at the end
            pygame.display.flip()