
Interactive Control of Avatars Animated with Human Motion Data

Jehee Lee (Carnegie Mellon University / Seoul National University), Jinxiang Chai (Carnegie Mellon University), Paul S. A. Reitsma (Brown University), Jessica K. Hodgins (Carnegie Mellon University), Nancy S. Pollard (Brown University)


Presentation Transcript


  1. Interactive Control of Avatars Animated with Human Motion Data. Jehee Lee (Carnegie Mellon University / Seoul National University), Jinxiang Chai (Carnegie Mellon University), Paul S. A. Reitsma (Brown University), Jessica K. Hodgins (Carnegie Mellon University), Nancy S. Pollard (Brown University)

  2. Avatars: Controllable, Responsive Animated Characters • Realistic behavior • Non-trivial environment • Intuitive user interface

  3. Interactive Avatar Control • How to create a rich set of behaviors? • How to direct avatars? • How to animate avatar motion? [System diagram: motion sensor data, preprocess, motion database, search, user interface, animate avatars]

  4. Related Work (Probabilistic/Statistical Models) Statistical models • Bradley & Stuart 97 • Brand & Hertzmann 00 • Pullen & Bregler 00 • Bowden 00 • Galata, Johnson & Hogg 01 • Li, Wang & Shum 02 Search and playback original motion data • Molina-Tanco & Hilton 00 • Pullen & Bregler 02 • Arikan & Forsyth 02 • Kovar, Gleicher & Pighin 02 • This work

  5. Motion Database In video games • Many short, carefully planned, labeled motion clips • Manual processing

  6. Walk Cycle [Diagram: transitions among walk cycle, start, stop, left turn, and right turn clips]

  7. Motion Database Our approach • Extended, unlabeled sequences • Automatic processing

  8. Motion Data Acquisition

  9. Maze - Sketch Interface

  10. Re-sequence [Figure: motion capture region and virtual environment, with the sketched path and obstacles]

  11. Re-sequence [Figure: motion capture region and virtual environment]

  12. Data Acquisition “Poles and Holes” rough terrain

  13. Terrain Navigation

  14. Unstructured Input Data A number of motion clips • Each clip contains many frames • Each frame represents a pose

  15. Unstructured Input Data Connecting transitions • Between similar frames

  16. Graph Construction
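The slides do not show how the graph is assembled, so the following is a minimal sketch under simple assumptions: each clip is a list of frame objects, nodes are (clip, frame) pairs, and frame_distance is the metric of the next slide. The names here are illustrative, not the authors' code.

# Minimal motion-graph construction sketch (illustrative only).
from collections import defaultdict

def build_motion_graph(clips, frame_distance, threshold):
    """Map each node (clip_index, frame_index) to its possible successor nodes."""
    graph = defaultdict(list)
    nodes = [(c, f) for c, clip in enumerate(clips) for f in range(len(clip))]

    # Edges that simply continue the original recording: frame f -> frame f + 1.
    for c, clip in enumerate(clips):
        for f in range(len(clip) - 1):
            graph[(c, f)].append((c, f + 1))

    # Transition edges between similar frames anywhere in the database.
    for c1, f1 in nodes:
        for c2, f2 in nodes:
            if (c1, f1) == (c2, f2) or f2 + 1 >= len(clips[c2]):
                continue
            if frame_distance(clips[c1][f1], clips[c2][f2]) < threshold:
                # Jump to the frame *after* the similar one so the motion keeps flowing.
                graph[(c1, f1)].append((c2, f2 + 1))
    return dict(graph)

The quadratic all-pairs comparison is exactly why the pruning of slides 18 and 19 matters.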

  17. Distance between Frames • Weighted differences of joint angles • Weighted differences of joint velocities (a sketch of one such metric follows below)
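A hedged sketch of such a metric, assuming each frame stores per-joint angle and angular-velocity vectors; the paper's actual formulation (its joint weights and orientation representation, for example) may differ.

import numpy as np

def frame_distance(fa, fb, joint_weights, velocity_weight=0.5):
    """Weighted joint-angle difference plus a weighted joint-velocity difference."""
    # fa.angles and fa.velocities are assumed to be (num_joints, 3) arrays.
    angle_term = sum(w * np.linalg.norm(a - b)
                     for w, a, b in zip(joint_weights, fa.angles, fb.angles))
    velocity_term = sum(w * np.linalg.norm(va - vb)
                        for w, va, vb in zip(joint_weights, fa.velocities, fb.velocities))
    return angle_term + velocity_weight * velocity_term

Joint weights would typically emphasize the joints that affect the pose most; the coordinate-frame question is revisited on slide 23.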

  18. Pruning Transitions Reduce storage space • O(n^2) transitions will be prohibitive Better quality • Pruning “bad” transitions Efficient search • Sparse graph

  19. Pruning Transitions • Contact state: Avoid transitions between dissimilar contact states • Likelihood: User-specified threshold • Similarity: Local maxima • Avoid dead-ends: Strongly connected components (a pruning sketch follows below)
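A sketch of how these pruning rules could be applied to the graph from the earlier sketch; contact_state() is a hypothetical helper, the local-maxima selection is omitted for brevity, and networkx is used only for the strongly-connected-component step.

import networkx as nx  # used only for the strongly connected component analysis

def prune_transitions(graph, clips, frame_distance, threshold, contact_state):
    g = nx.DiGraph((u, v) for u, succs in graph.items() for v in succs)

    for (c1, f1), (c2, f2) in list(g.edges):
        if c1 == c2 and f2 == f1 + 1:
            continue  # keep edges that follow the original recording
        # This transition exists because frame (c1, f1) resembled frame (c2, f2 - 1).
        src, similar = clips[c1][f1], clips[c2][f2 - 1]
        if contact_state(src) != contact_state(similar):
            g.remove_edge((c1, f1), (c2, f2))   # dissimilar contact states
        elif frame_distance(src, similar) > threshold:
            g.remove_edge((c1, f1), (c2, f2))   # fails the user-specified threshold

    # Avoid dead ends: keep only the largest strongly connected component.
    largest = max(nx.strongly_connected_components(g), key=len)
    kept = g.subgraph(largest)
    return {node: list(kept.successors(node)) for node in kept.nodes}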

  20. Graph Search Best-first graph traversal • Path length is bounded • A fixed number of frames is expanded at each frame (see the search sketch below) Comparison to global search • Intended for interactive control • Not for accurate global planning
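One way to realize this kind of bounded best-first traversal, sketched under the same assumptions as above (graph maps a node to its successors; cost() scores a partial path against the user's input, e.g. the sketched trajectory, and is purely illustrative):

import heapq

def best_first_search(graph, cost, start, horizon, branch_limit):
    """Expand the most promising partial paths first, up to a fixed horizon.

    Only `branch_limit` successors are considered at each frame, trading
    accurate global planning for interactive response.
    """
    best_path, best_cost = None, float('inf')
    frontier = [(0.0, [start])]
    while frontier:
        path_cost, path = heapq.heappop(frontier)
        if len(path) >= horizon:                 # path length is bounded
            if path_cost < best_cost:
                best_path, best_cost = path, path_cost
            continue
        successors = sorted(graph.get(path[-1], []),
                            key=lambda n: cost(path + [n]))[:branch_limit]
        for node in successors:
            heapq.heappush(frontier, (cost(path + [node]), path + [node]))
    return best_path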

  21. Comparison to Real Motion Environment with physical obstacles

  22. Comparison to Real Motion

  23. Global vs. Local Coordinates • Global, fixed, object-relative coordinates • Local, moving, body-relative coordinates
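For concreteness, a small sketch (an assumption of this transcript, not the authors' code) of expressing a world-space target in the avatar's moving, body-relative frame, given the root's ground-plane position and facing angle:

import math

def to_body_relative(target_xz, root_xz, root_yaw):
    """Rotate and translate a world-space ground-plane point into the body frame."""
    dx = target_xz[0] - root_xz[0]
    dz = target_xz[1] - root_xz[1]
    c, s = math.cos(-root_yaw), math.sin(-root_yaw)
    # 2D rotation by the inverse of the root's facing direction.
    return (c * dx - s * dz, s * dx + c * dz)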

  24. User Interface In maze and terrain environments • Sketch interface was effective

  25. User Interface In playground • A broader variety of motions is available

  26. Choice Interface What is available in the database? • Provide the user with several options • Select among available behaviors

  27. Choice Interface

  28. What to Show: Space and Time Windows

  29. How to Create Choices

  30. Clustering [Figure: a frame sequence labeled by cluster (A–E)]

  31. How to Capture Transitions [Figure: cluster-labeled frame sequence]

  32. How to Capture Transitions [Figure: cluster-labeled frame sequence]

  33. Cluster Tree Three possible actions: ABA, ABC, ABD [Figure: a cluster tree rooted at A, shown over the cluster-labeled frame sequence]

  34. Cluster Forest [Figure: cluster trees rooted at several clusters, built over the cluster-labeled frame sequence] (a sketch of enumerating such cluster paths follows below)
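The cluster tree and forest can be read as "which short sequences of clusters are reachable from here". A sketch of enumerating those cluster paths from the pruned graph, assuming `labels` maps each node to its cluster letter (the clustering itself, and the names used here, are assumptions of this transcript):

def enumerate_actions(graph, labels, start, depth, max_frames=300):
    """Collect distinct cluster-label sequences reachable from `start`,
    collapsing runs of the same cluster, e.g. {'ABA', 'ABC', 'ABD'}."""
    actions = set()

    def walk(node, path, frames_left):
        if len(path) == depth:
            actions.add(''.join(path))
            return
        if frames_left == 0:
            return
        for succ in graph.get(node, []):
            label = labels[succ]
            walk(succ, path if label == path[-1] else path + [label],
                 frames_left - 1)

    walk(start, [labels[start]], max_frames)
    return actions

Presenting one option per enumerated sequence is one way to realize the choice interface of slides 26 through 28.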

  35. Performance Interface [Diagram: vision-based interface, search, motion database]

  36. Vision Interface – Single Camera

  37. Search [Figure: a 3-second video buffer matched against the cluster-labeled frame sequence]

  38. Search [Figure: the 3-second buffer matched to a cluster path in the frame sequence]
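The matching step itself is not spelled out in the transcript; as a rough illustration only, a sliding-window comparison between the buffered observations and the database could look like the following, where feature() and window_distance() are hypothetical helpers.

def match_buffer(observed, clips, feature, window_distance):
    """Compare a ~3-second buffer of observed features against every database
    window of the same length; return the best-matching (clip, start_frame)."""
    n = len(observed)
    best, best_cost = None, float('inf')
    for c, clip in enumerate(clips):
        for start in range(len(clip) - n + 1):
            total = sum(window_distance(observed[k], feature(clip[start + k]))
                        for k in range(n))
            if total < best_cost:
                best, best_cost = (c, start), total
    return best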

  39. Summary Graph representation • Flexibility in motion Cluster forest • A map of the avatar’s behavior User interfaces

  40. Future Work Body-relative vs. object-relative • Assemble objects in new configurations • Interactions among avatars Evaluate user interface • User test for effectiveness Combine with existing techniques • Motion editing and style modifications

  41. Acknowledgements Thanks to • All of our motion capture subjects • Rory and Justin Macey Support • NSF Project web page: http://graphics.snu.ac.kr/~jehee/Avatar/avatar.htm

  42. Similarity between Frames

  43. Pruning Transitions

  44. Related Work (Character Animation)

  45. Related Work (User Interfaces)
