
Embodied Speech and Facial Expression Avatar Critical Design Review


Presentation Transcript


  1. Embodied Speech and Facial Expression Avatar Critical Design Review | Dan Harbin - Evan Zoss - Jaclyn Tech - Brent Sicking | March 10, 2004

  2. Problem Background/Needs Statement • There has been a great deal of research into understanding how a person interacts with a computer. • Facial expressions help illustrate verbal communication by revealing what the expresser is feeling or trying to convey. • The ability to generate animated facial expressions together with speech is important to many diverse application areas. • A deaf person could use an animated face as a lip-reading system. • An autistic child could benefit from a robotic face in terms of social interaction, language development, and learning through structure and repetition.

  3. Goals and Objectives • The overall goal of this project is to create a robotic face capable of displaying human emotion accompanied by speech.

  4. Goals and Objectives • Reverse engineer Yano’s motors and sensors so we are able to move them to any desired position. • Develop a GUI that allows the user to move each motor in both directions to a desired position. • Research the psychology behind the use of facial expressions to convey emotion and mimic these facial expressions with the Yano face. • Develop a GUI that allows the user to select and display real human facial expressions. • Add a microphone input and develop software to mimic speech based on a measure of intensity. • Incorporate facial expressions with speech input to complete the avatar.

  5. Yano Control System (block diagram)

  6. User Interface Flow Chart Note: In the future a speech dialog box will be added to the menu.

  7. User Interface Class Diagram • MenuDlg • ManualControlDlg • CYanoDlg • SpeechControlDlg • YanoEngine • Motor (3 instances) • CComm
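A minimal C++ sketch of how the classes in this diagram might fit together. Only the class names and the method signatures shown on the following slides come from the design; every other member here is an assumption.

// Sketch of the relationships in the class diagram above. The dialogs
// (MenuDlg, ManualControlDlg, CYanoDlg, SpeechControlDlg) each drive the
// single YanoEngine, which owns three Motor instances and a CComm serial
// link to the SV203. Members marked "assumed" are illustrative only.
class CComm {
public:
    void Send(const char* cmd);   // assumed: write an ASCII command to the serial port
    int  ReadValue();             // assumed: parse the board's reply into an integer
};

class Motor {
public:
    int position;                 // assumed: current calibrated position
};

class YanoEngine {
public:
    void CalibrateMotor(int motorId);
    void adjustMotors(int destX, int destY, int destZ);
    void conveyEmotion(int destX, int destY, int destZ);
private:
    Motor motors[3];              // 3 instances, per the diagram
    CComm comm;                   // serial connection to the SV203
};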

  8. ManualControlDlg (.cpp) Class • SetProgressValues() • CheckEndSwitches() • YanoEngine::CalibrateMotor(int) • YanoEngine::adjustMotors(int destX, int destY, int destZ) • Goes back to menu
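As a rough illustration of what the calibration path through this dialog might look like (calibration itself is still listed as in progress on slide 23), here is a hedged sketch that continues the class outline above; the motor index order, pulse length, and switch threshold are all assumptions.

#include <cstdio>

// Hedged sketch of a calibration pass: pulse a motor toward one limit,
// poll its end switch through the SV203, and define that point as position
// zero. Pin numbers follow slides 13 and 17; everything else is assumed.
void YanoEngine::CalibrateMotor(int motorId)
{
    static const int drivePin[3]  = { 1, 3, 5 };   // eyes, cheeks, mouth (SVx pairs, assumed order)
    static const int switchPin[3] = { 5, 1, 3 };   // matching ADy limit-switch inputs

    char cmd[64];
    while (true) {
        // Pulse toward the limit: pull one H-bridge pin low for 200 ms, then restore it.
        std::snprintf(cmd, sizeof(cmd), "PC%dD200PS%d\r", drivePin[motorId], drivePin[motorId]);
        comm.Send(cmd);

        // Ask the SV203 for the switch voltage (reported as 0-255 for 0V-6V).
        std::snprintf(cmd, sizeof(cmd), "AD%d\r", switchPin[motorId]);
        comm.Send(cmd);
        if (comm.ReadValue() < 20)     // near 0V: switch closed, limit reached (threshold assumed)
            break;
    }
    motors[motorId].position = 0;      // treat this limit as the zero position
}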

  9. CYanoDlg (.cpp) Class • YanoEngine::conveyEmotion(int destX, int destY, int destZ) • YanoEngine::adjustMotors(int destX, int destY, int destZ) • Goes back to menu
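To show how this dialog might feed conveyEmotion(), a small hedged sketch follows; the button handler name, the engine member, and the numeric motor targets are purely illustrative.

// Hedged sketch of an expression button in CYanoDlg. OnHappyClicked and the
// target positions are assumptions; only conveyEmotion's signature comes
// from the class diagram above.
void CYanoDlg::OnHappyClicked()
{
    const int mouthPos = 40;   // assumed: mouth slightly open
    const int cheekPos = 90;   // assumed: cheeks fully raised
    const int eyePos   = 70;   // assumed: eyes wide open
    engine.conveyEmotion(mouthPos, cheekPos, eyePos);   // which in turn calls adjustMotors()
}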

  10. Command Examples • Open Mouth: “PC5PS6D200PS5PS6\r” • Close Mouth: “PS5PC6D200PS5PS6\r” • Raise Cheeks: “PS3PC4D200PS3PS4\r” • Lower Cheeks: “PC3PS4D200PS3PS4\r” • Open Mouth and Lower Cheeks: “PC5PS6PC3PS4D200PS3PS4PS5PS6\r”
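These strings can be generated programmatically. Below is a minimal sketch of a helper that assembles such a pulse command; the helper itself is an assumption, only the command syntax comes from the slides.

#include <cstdio>
#include <string>

// Build a pulse command in the SV203 syntax from slide 13: pull `lowPin`
// low while `highPin` stays high, wait `ms` milliseconds, then return both
// pins to their default high state. With the pin pairing from slide 17
// (5/6 = mouth, 3/4 = cheeks, 1/2 = eyes), BuildPulseCommand(5, 6, 200)
// reproduces the "open mouth" example above.
std::string BuildPulseCommand(int lowPin, int highPin, int ms)
{
    char buf[64];
    std::snprintf(buf, sizeof(buf), "PC%dPS%dD%dPS%dPS%d\r",
                  lowPin, highPin, ms, lowPin, highPin);
    return std::string(buf);
}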

  11. SV203 Microcontroller Circuit Description (circuit diagram: input port AD1-AD5, power Gnd/Vcc, motor control port SV1-SV6, serial port)

  12. SV203 Microcontroller Functional Description • Receives commands through the serial port • Sets or clears the appropriate motor control pin(s) • Reads an analogue voltage off of the desired input pin(s) • Transmits a value representing the voltage back up the serial line

  13. SV203 Microcontroller Interface Description • Serial Port – ASCII text commands are sent to the board via the serial port to tell it what to do. Values from the input pins are also sent back to the computer via the serial port. • List of commands we use: • SVxM0 – initialize pin x to use digital logic • PSx – set pin x high • PCx – clear pin x to low • Dn – delay for n milliseconds before the next command • PC1PC3PC5D300PS1PS3PS5 – typical motor control command • ADy – read the voltage of input pin y and transmit it up the serial port • Motor Control Port – sends the logic controls for the motors to the Yano I/O Board. When a pin is set high with PSx, it is set to 6V; PCx sets it to 0V. We use six pins, SV1 through SV6. • A/D Input Port – receives the status of Yano’s switches from the Yano I/O Board. We use 5 pins, AD1 through AD5. Each pin will have 6V on it if its switch is open, and near 0V if it is closed. The SV203 converts these voltages to the numbers 0-255 for 0V to 6V.
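To make the A/D readings concrete, here is a short sketch of how one of Yano's switches could be read through the ADy command, using the assumed CComm wrapper from the class sketch after slide 7; the open/closed threshold is also an assumption.

#include <cstdio>

// Hedged sketch: read one of Yano's switches via the ADy command. The ADy
// syntax and the 0-255 scaling for 0V-6V come from this slide; the serial
// helpers and the 128 threshold are assumptions.
bool IsSwitchOpen(CComm& comm, int adPin)
{
    char cmd[16];
    std::snprintf(cmd, sizeof(cmd), "AD%d\r", adPin);
    comm.Send(cmd);
    return comm.ReadValue() > 128;   // near 255 (about 6V) when open, near 0 when closed
}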

  14. SV203 Microcontroller Logic Description

  15. Yano I/O Board Circuit Description (circuit diagram: connections to the SV203 Microcontroller and the Yano switch circuit)

  16. Yano I/O Board Functional Description • Receives logic controls for the motors from the SV203 • Converts them into powered control for Yano’s motors • Reads in the status of Yano’s switches, open or closed • Converts this to a voltage, 6V for open and 0V for closed, and sends it back to the SV203

  17. Yano I/O Board Interface Description • Motor Control Input – the logic input for the H-Bridges that determines motor direction and movement. The pins are paired off, 2 pins per H-Bridge, 1 bridge per motor: • Mouth: SV5 and SV6 • Cheeks: SV3 and SV4 • Eyes: SV1 and SV2 • Motor Outputs – 3 two-pin ports, one for each motor; each pin will have either Vcc or Gnd. If both pins are at Vcc (the default state), there is no potential between them and the motor will not turn. If one pin drops to Gnd, the motor will turn one way; vice versa for the other pin. • Sensor Inputs – these ports connect directly to Yano’s switches. Each motor has two limit switches to determine when it has run far enough in each direction. • Sensor Outputs – the interface back to the SV203 that has 5 pins, each of which is set to 6V for an open switch and 0V for a closed switch. They are paired off according to which motor they are the limit switches for: • Mouth: AD3 and AD4 • Cheeks: AD1 and AD2 • Eyes: AD5
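The H-bridge drive rule described above can be summarized in a few lines of code. This sketch maps a desired direction onto the PSx/PCx commands for one pin pair; which physical direction corresponds to FORWARD is an assumption.

#include <cstdio>
#include <string>

// Hedged sketch of the H-bridge rule: both pins high means no motion;
// pulling one pin of the pair low turns the motor one way, pulling the
// other pin low turns it the other way. pinA/pinB are one SVx pair
// (e.g. SV5/SV6 for the mouth).
enum Direction { STOP, FORWARD, REVERSE };

std::string DriveCommand(int pinA, int pinB, Direction dir)
{
    char buf[32];
    switch (dir) {
    case FORWARD: std::snprintf(buf, sizeof(buf), "PC%dPS%d\r", pinA, pinB); break;
    case REVERSE: std::snprintf(buf, sizeof(buf), "PS%dPC%d\r", pinA, pinB); break;
    default:      std::snprintf(buf, sizeof(buf), "PS%dPS%d\r", pinA, pinB); break;
    }
    return std::string(buf);
}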

  18. Yano I/O Board Logic Description

  19. Yano Circuit Description

  20. Yano Functional Description • Yano has 3 motors powered by the Yano I/O Board: one for the mouth, one for the cheeks, and one to control the eyelids, eyebrows, and ears. • When the mouth and cheek motors reach their endpoints (i.e., fully open or fully closed), they close a switch to indicate that the limit has been reached. • These switches are read by the Yano I/O Board.

  21. Yano Interface Description • Yano’s interfaces are the motor controls and the switch feedbacks. • The wires are coded as follows: • Motors: • Red/Black – Eyes – SV1/SV2 • Green/Black – Cheeks – SV3/SV4 • White/Black – Mouth – SV5/SV6 • Sensors: • Red/Green/Brown – Mouth – Gnd/AD4/AD3 • Gray/Yellow/Pink – Cheeks – Gnd/AD2/AD1 • Green/Yellow/Red/Brown – Eyes – Vcc/AD5/Gnd/Gnd
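The same wiring map, written out as a small lookup table for reference; the struct and field names are illustrative, while the SVx/ADy assignments come from this slide (the eyes have a single limit-switch input, AD5).

// Sketch of the wiring map above as a lookup table.
struct MotorWiring {
    const char* name;
    int controlPinA, controlPinB;   // SVx H-bridge pair
    int sensorPinA,  sensorPinB;    // ADy limit-switch inputs (-1 = none)
};

const MotorWiring kWiring[] = {
    { "Eyes",   1, 2, 5, -1 },   // Red/Black motor leads
    { "Cheeks", 3, 4, 1,  2 },   // Green/Black motor leads
    { "Mouth",  5, 6, 3,  4 },   // White/Black motor leads
};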

  22. Yano Logic Description

  23. Tasks and Accomplishments • Completed: • Disassemble Yano • Reverse engineer motors and end switches • Create control circuit • Simple computer interface for motor control • In Progress: • Motor calibration • Reverse engineer eye motor IR switch • To Do: • Facial expressions • Sound analysis software • Complete GUI

  24. Validation and Testing Procedures • Calibration Test - Calibrate the motors, then run them to their limits and back to see if they stay calibrated. • Expression Test - Change from any one expression to any other expression; the face should show the desired expression each time. • Speech Test - Using a sample sound file, make sure Yano produces the right mouth movements for the differences in sound volume consistently and accurately.
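For the speech test, a rough sketch of the intended idea: split the sample sound file into short windows, measure each window's loudness, and decide per window whether the mouth should be open. The sound-analysis software is still listed as to-do on slide 23, so the windowing, averaging, and threshold here are all assumptions.

#include <cstddef>
#include <cstdlib>
#include <vector>

// Hedged sketch: average the absolute amplitude in each fixed-size window
// and mark the mouth "open" when it crosses a threshold. Each decision
// would then drive the mouth commands from slide 10.
std::vector<bool> MouthStatesFromAudio(const std::vector<short>& samples,
                                       std::size_t windowSize, int threshold)
{
    std::vector<bool> states;
    for (std::size_t start = 0; start + windowSize <= samples.size(); start += windowSize) {
        long sum = 0;
        for (std::size_t i = 0; i < windowSize; ++i)
            sum += std::abs(samples[start + i]);        // crude loudness measure
        states.push_back(sum / static_cast<long>(windowSize) > threshold);
    }
    return states;   // one open/closed decision per window
}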

  25. Validation and Testing Procedures • Complete Project Validation – When we are able to calibrate Yano’s motors, change between various facial expressions, and produce mouth movements that mimic human speech intensity, we will know we have accomplished our goal.

  26. Itemized Budget

  27. Schedule of Tasks

  28. Questions?
