Active Capture and Folk Computing

Presentation Transcript

  1. Active Capture and Folk Computing Ana Ramírez and Marc Davis ICME 2004 – Taipei, Taiwan 29 June 2004 UC Berkeley - Garage Cinema Research - Group for User Interface Research

  2. Smart Multimedia Acquisition Systems • First two papers – automatic camera calibration • Image • Audio • Third paper – understands the structure of what is being captured in order to edit in real time • Active Capture – smart cameras that interactively guide and capture human action

  3. Outline • Sample applications • Active Capture • Designing Active Capture algorithms • Future work

  4. Sample Applications – Automatic Movie Trailers • Video of capture process (Play Video)

  5. Sample Applications – Automatic Movie Trailers • Video of automatically created movie trailer (Play Video)

  6. Sample Applications – Sports Instruction

  7. Sample Applications – Telemedicine [Diagram: rural town and large city] • leishmaniasis

  8. Sample Applications – Automated Health Screening [Diagram: rural town and large city] • leishmaniasis

  9.–13. Active Capture [Diagram: Active Capture at the intersection of Capture, Interaction, and Processing, drawing on Direction/Cinematography, Human-Computer Interaction, and Computer Vision/Audition]

  14. Active Capture • Traditionally, signal processing algorithms avoid interacting with the user • Signal processing + interaction => more sophisticated recognizers • How do we design hybrid algorithms that combine capture, interaction, and processing?
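
A minimal sketch (not the authors' code) of the kind of hybrid algorithm this slide argues for: capture, processing, and interaction interleaved in a single loop, so that directing the person lets a weak recognizer stand in for a more sophisticated one. It is written in Python; the function and parameter names (direct, capture_frame, recognize) are illustrative assumptions.

    import time
    from typing import Callable, Optional

    Frame = bytes  # stand-in for an image frame; a real system would use e.g. numpy arrays

    def active_capture(
        direct: Callable[[str], None],       # interaction: deliver a direction to the person
        capture_frame: Callable[[], Frame],  # capture: grab the next frame from the camera
        recognize: Callable[[Frame], bool],  # processing: a simple recognizer, not a full model
        prompt: str,
        timeout_s: float = 5.0,
        attempts: int = 3,
    ) -> Optional[Frame]:
        """Interleave direction, capture, and recognition until the desired action is seen."""
        for attempt in range(attempts):
            direct(prompt if attempt == 0 else "Let's try that again. " + prompt)
            deadline = time.monotonic() + timeout_s
            while time.monotonic() < deadline:
                frame = capture_frame()
                if recognize(frame):   # a weak recognizer can suffice because the person
                    return frame       # was just told exactly what to do
        return None                    # hand off to mediation (slide 24)

With direct=print, a webcam frame grabber, and a motion detector like the one sketched under slide 15, this loop would either return a frame of the requested action or fall through to mediation.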

  15. Components of Active Capture Algorithms • Simple computer vision and audition recognizers / sensors • Motion • Eyes • Sound • Desired action in terms of recognizers • Interaction script
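
A sketch of how the simple recognizers in this list might look in code (Python, assumed, not from the paper). The frame-differencing motion detector is a standard technique; the eye-contact and sound recognizers are left as stubs because the slides do not say how they are implemented, and the class names and thresholds are assumptions.

    import numpy as np

    class MotionRecognizer:
        """Motion via frame differencing on grayscale frames (H x W uint8 arrays)."""
        def __init__(self, threshold: float = 12.0):
            self.threshold = threshold  # mean absolute pixel difference that counts as motion
            self._last = None

        def detect(self, frame: np.ndarray) -> bool:
            moved = False
            if self._last is not None:
                diff = np.abs(frame.astype(np.int16) - self._last.astype(np.int16))
                moved = float(diff.mean()) > self.threshold
            self._last = frame
            return moved

    class EyeContactRecognizer:
        """Stub: a real detector might use a Haar-cascade eye detector or a gaze estimator."""
        def detect(self, frame: np.ndarray) -> bool:
            raise NotImplementedError

    class SoundRecognizer:
        """Stub: short-term energy threshold over a window of microphone samples in [-1, 1]."""
        def __init__(self, energy_threshold: float = 0.01):
            self.energy_threshold = energy_threshold

        def detect(self, samples: np.ndarray) -> bool:
            return float(np.mean(samples.astype(np.float64) ** 2)) > self.energy_threshold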

  16. Design Process • Input: • Desired action = head turn • Recognizers = motion, eyes [Timeline diagram: motion and eye-contact signals over time]

  17. Design Process • Input: • Desired action = head turn • Recognizers = motion, eyes • Step 1: • Express desired action in terms of recognizers [Timeline diagram: the head turn segmented into motion and eye-contact recognizer states over time]

  18. Design Process • Input: • Desired action = head turn • Recognizers = motion, eyes • Step 1: • Express desired action in terms of recognizers • Step 2: • Design interaction script
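
Step 1 expresses the desired action in terms of the recognizers. A minimal sketch of one way to do that in Python: the head turn becomes an ordered sequence of (motion, eyes) states that must appear in the recognizer stream. The exact segmentation shown on the slide's timeline is not recoverable from this transcript, so the three stages below are illustrative.

    # Each stage is a (motion, eyes) observation that must occur, in order (strict ordering).
    # The stages are assumptions; the slide's timeline defines the real segmentation.
    HEAD_TURN = [
        (False, True),   # at rest, eye contact with the camera
        (True,  False),  # head moving, eyes leave the camera
        (False, True),   # motion stops, eye contact re-established
    ]

    def action_completed(observations, action=HEAD_TURN) -> bool:
        """True if the action's stages appear, in order, somewhere in the observation stream."""
        stage = 0
        for obs in observations:
            if stage < len(action) and obs == action[stage]:
                stage += 1
        return stage == len(action)

    # A stream of (motion, eyes) readings sampled over time:
    assert action_completed([(False, True), (True, False), (True, False), (False, True)])

Step 2 then wraps this matcher in an interaction script that prompts, waits, and mediates, as in the loop sketched under slide 14.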

  19.–23. Design Process – Step II • Play Video

  24. Design Challenges • Step I – Description of action • Approximate timing • Strict and non-strict ordering • Step II – Interaction script • What to do if something goes wrong – mediation

  25. Step I – Action Description

  26.–32. Step I – Action Description: Visual Language • Observations • Commands • Capture • Time constraints • Strict ordering • Non-strict ordering
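
The visual language itself is shown graphically in the slides, so nothing below is the authors' notation; it is a Python sketch of how the listed elements (observations, commands, capture, time constraints, strict and non-strict ordering) might be represented as plain data structures, with all names and fields assumed.

    from dataclasses import dataclass
    from typing import List, Optional, Union

    @dataclass
    class Observation:        # something a recognizer must report, e.g. eye contact present
        recognizer: str
        expected: bool = True

    @dataclass
    class Command:            # a direction spoken or shown to the person
        text: str

    @dataclass
    class Capture:            # start or stop recording the segment to keep
        start: bool = True

    @dataclass
    class TimeConstraint:     # approximate timing bounds on a sequence, in seconds
        min_s: float = 0.0
        max_s: Optional[float] = None

    Step = Union[Observation, Command, Capture]

    @dataclass
    class Sequence:           # ordered steps; strict=True means nothing else may intervene
        steps: List[Step]
        strict: bool = True
        timing: Optional[TimeConstraint] = None

    # Illustrative head-turn description built from the elements above
    head_turn = Sequence(
        steps=[
            Command("Please look at the camera"),
            Observation("eyes", True),
            Capture(start=True),
            Command("Now turn your head to the left"),
            Observation("motion", True),
            Observation("motion", False),
            Capture(start=False),
        ],
        strict=False,                       # non-strict: other events may occur in between
        timing=TimeConstraint(max_s=10.0),  # approximate timing, as raised on slide 24
    )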

  33. Step II – Interaction Script

  34. Step II – Interaction Script: Contextual Inquiries • Golf instructor • Aikido instructor • 911 emergency phone operator • Triage nurse • Children’s portrait photographer • Film and theatre directors [Jeffrey Heer, Nathaniel S. Good, Ana Ramirez, Marc Davis, and Jennifer Mankoff. “Presiding Over Accidents: System Direction of Human Action.” In: Proceedings of the Conference on Human Factors in Computing Systems (CHI 2004), Vienna, Austria. ACM Press, 463–470, 2004.]

  35. Step II – Interaction Script: Direction and Feedback Strategies • External aids (Play Video)

  36. Step II – Interaction Script: Direction and Feedback Strategies • Decomposition and “Show” (Play Video)

  37. Step II – Interaction Script: Direction and Feedback Strategies • Method shift from “Show” to “Tell” (Play Video)

  38. Step II – Interaction Script: Direction and Feedback Strategies
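
The strategies from the contextual inquiries (slides 35–37: external aids, decomposition with “show”, and a method shift from “show” to “tell”) read naturally as an escalation ladder that the mediation step raised on slide 24 can walk when the person does not produce the desired action. A Python sketch under that assumption; the ordering, the wording of the directions, and the deliver/succeeded callbacks are illustrative, not taken from the paper.

    from typing import Callable, List, Tuple

    # (strategy, direction) pairs tried in order until the action is recognized.
    # The order and wording are assumptions for illustration.
    STRATEGIES: List[Tuple[str, str]] = [
        ("plain direction",    "Please turn your head to the left."),
        ("external aid",       "Turn your head toward the red marker on the wall."),
        ("decompose + show",   "Watch the short demonstration clip, then copy it one step at a time."),
        ("shift show to tell", "Keep your shoulders still and rotate only your chin to the left."),
    ]

    def mediate(deliver: Callable[[str], None], succeeded: Callable[[], bool]) -> bool:
        """Escalate through direction and feedback strategies until the action is observed."""
        for name, direction in STRATEGIES:
            deliver(f"[{name}] {direction}")
            if succeeded():  # e.g. re-run the Step I recognizer sequence with a timeout
                return True
        return False         # give up and fall back, e.g. to a human director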

  39. Summary • Active Capture – smart cameras that interactively guide and capture human action • Sample applications • Automated health screening • Automated movie clips • Sports trainer • Design Challenges • Description of action • Interaction script

  40. Future Work • Support design and implementation of Active Capture applications • Evaluate the relative contribution of signal analysis and user interaction in these hybrid algorithms

  41. Questions Ana Ramírez anar@cs.berkeley.edu www.cs.berkeley.edu/~anar Garage Cinema Research http://garage.sims.berkeley.edu Group for User Interface Research http://guir.berkeley.edu