
Human Representation in Immersive Space (UCL)


Presentation Transcript


  1. Human Representation in Immersive Space UCL

  2. Overview: Body Chat, Sensing, Real-Time Animation [figure: coordinate axes X, Y, Z with rotations Xr, Yr, Zr]

  3. BODY CHAT. Aim: to optimise human behaviour in virtual communication. Environment: networked chat environments.

  4. BODY CHAT. Description of the system.

  5. BODY CHAT. Description of the system [figure: Client A and Client B connected through the system]

  6. BODY CHAT. Avatar behaviour. Presence and movement: an avatar is created dynamically when a user logs on and removed when the user logs off. The avatar identifies a particular user's presence in the virtual environment and pinpoints his or her location. The avatar and its shadow avatar react directly to keyboard input for forward, backward, left and right. Signs of life: automated breathing and eye blinking of the avatar, with some randomness added to prevent synchrony. Communication: typed text, conversational phenomena, communicative behaviours.
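The "signs of life" idea, automated blinking with randomness so avatars do not blink in lockstep, can be sketched as a randomized inter-blink schedule. All names, the mean interval and the jitter range below are illustrative assumptions, not values from BodyChat.

```python
import random

def next_blink_delay(rng, mean=4.0, jitter=1.5):
    """Seconds until the avatar's next eye blink.

    A random offset around the mean interval keeps avatars in the
    same scene from blinking in synchrony (the slide's point about
    adding randomness). Values are illustrative, not BodyChat's.
    """
    return mean + rng.uniform(-jitter, jitter)

def blink_schedule(n, seed=0):
    """Generate n successive blink times for one avatar."""
    rng = random.Random(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += next_blink_delay(rng)
        times.append(round(t, 2))
    return times
```

Seeding each avatar differently gives every figure its own blink rhythm while keeping the simulation reproducible.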

  7. BODY CHAT. Avatar behaviour: high-level user control. Communication is structured as conversational phenomena mapped to communicative behaviours. The salutation phenomenon, for example, is associated with: looking, head tossing, waving, smiling.
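The high-level control described above, where the user selects a conversational phenomenon and the system expands it into low-level animations, can be sketched as a lookup table. The table contents for "salutation" come from the slide; the "farewell" entry and all function names are hypothetical.

```python
# Mapping from a high-level conversational phenomenon to the
# low-level communicative behaviours triggered automatically.
# "salutation" follows the slide; "farewell" is an assumed example.
PHENOMENON_BEHAVIOURS = {
    "salutation": ["look_at", "head_toss", "wave", "smile"],
    "farewell": ["look_at", "wave"],
}

def behaviours_for(phenomenon):
    """Expand a conversational phenomenon into its animations.

    Unknown phenomena yield no behaviours rather than an error,
    so the avatar simply stays idle.
    """
    return PHENOMENON_BEHAVIOURS.get(phenomenon, [])
```

This is the "macromanaging" stance of slide 11: the user states an intent once, and the system manages the individual gestures.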

  8. BODY CHAT. Awareness of environment.

  9. BODY CHAT. Sample interaction.

  10. BODY CHAT. Sample of behaviour initiated through typing.

  11. BODY CHAT. Statements: the performance of human features is important, not the appearance of the features themselves; macromanage rather than micromanage avatar behaviour. Objective: to understand the avatar as an autonomous agent with semi-autonomous animations.

  12. Real-Time Control of a Virtual Human Using Minimal Sensors. Goal: to realistically recreate human postures while minimally encumbering the operator, using 6-DOF sensors ("Flock of Birds" from Ascension Technology, Inc.).

  13. Real-Time Control of a Virtual Human Using Minimal Sensors. Placement of four 6-DOF position sensors to track body position [figure: sensors S1 to S4 on the body, each with local X, Y, Z axes and rotations Xr, Yr, Zr]

  14. Real-Time Control of a Virtual Human Using Minimal Sensors. Sensors tied most closely to the view cone and effector (hand) positions [figure: sensors S1, S2 and S3 at the head and hands]

  15. Real-Time Control of a Virtual Human Using Minimal Sensors. Software processing: an inverse kinematics algorithm infers a human posture representation from the sensor position and orientation data. The software must make assumptions about parameters constraining the representation, to overcome the absence of information from the unsensed joints (e.g. elbows, knees and feet). The processing time taken to compute this turned out to be the largest portion of latency in each frame update.
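The idea of inferring unsensed joints from sensed end-effector data can be illustrated with the textbook analytic solution for a planar two-link arm: given only the hand position, recover shoulder and elbow angles. This is a toy sketch of the general principle, not the paper's algorithm; the link lengths are illustrative.

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.25):
    """Analytic IK for a planar two-link arm.

    From a hand (end-effector) position, recover the shoulder and
    elbow angles -- a minimal example of filling in an unsensed
    joint (the elbow) from one sensed position.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle; clamp for numerical safety.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))
    elbow = math.acos(c2)  # choose the "elbow down" solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1=0.3, l2=0.25):
    """Forward kinematics, used here to check the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Even in this toy case there are two valid elbow configurations; picking one is exactly the kind of constraining assumption the slide describes, and a full-body solver must make many more of them.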

  16. Real-Time Control of a Virtual Human Using Minimal Sensors. Application and limitations: no finger or grip sensing for manipulating objects, so non-intuitive protocols are needed, such as bringing the hands together at an intersection to represent "object selection". Calibration is required to suit the user; it was based on average proportions and might misrepresent non-average persons. Range is approximately 3.0 m in a hemisphere around the transmitter.

  17. Real-Time Animation of Realistic Virtual Humans.

  18. Real-Time Animation of Realistic Virtual Humans. Environment: games, interactive TV, CAVE, immersive environments. Claim: simulating humans in real time adds to the sense of presence. Objective: to develop a system capable of animating humans in real time, with high-realism graphics and movement. Example: CyberTennis, virtual tennis players animated by real-time motion capture.

  19. Approach: a three-part process of modelling the figure, deforming the figure in real time, and motion control. Modelling: the body is divided into head, hands and body, as each part has different modelling requirements. Sculptor software models the head and hands from prototype examples; the head is created from a template in Sculptor.

  20. BodyBuilder software models the body, taking a multi-layered approach: (1) an articulated skeleton, at which stage the body proportions are designed; (2) joints built from metaballs, ellipsoids or grouped volume primitives, which simulate muscle shape and behaviour, are attached to the skeleton and can be transformed interactively; (3) a body envelope, equivalent to human skin, using spline surfaces; (4) texture fitting or mapping, which adds detail such as skin or hair and requires correlation between model and image; new software was developed to do this.
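The metaball muscle layer works by summing smooth field contributions from overlapping primitives; the skin surface is then an iso-contour of that field. Below is a minimal, generic metaball field evaluation, not BodyBuilder's exact formulation; the inverse-square falloff and all names are assumptions.

```python
def metaball_field(point, balls):
    """Sum of inverse-square contributions from each metaball.

    `balls` is a list of ((cx, cy, cz), radius) pairs. The implied
    "skin" is the surface where the summed field crosses a chosen
    threshold; overlapping balls blend smoothly, which is why they
    suit muscle-like shapes.
    """
    total = 0.0
    for (cx, cy, cz), r in balls:
        dx, dy, dz = point[0] - cx, point[1] - cy, point[2] - cz
        d2 = dx * dx + dy * dy + dz * dz
        if d2 > 0.0:
            total += (r * r) / d2
    return total
```

Because contributions simply add, moving one ball with its skeleton joint deforms the blended shape continuously, with no seams between primitives.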

  21. Animation: animate the skeleton, and the top layers are automatically computed and move with it. Skeleton: each joint has defined degrees of freedom and rotation; animating the joint angles over time animates the skeleton. Skin deformation: a compromise between realism and computing speed. Constructing a body mesh: body data is output as cross-sectional contours; triangle meshes are produced for each part, which allows manipulation of the skin contours and transforms 3D coordinates into a 2D plane. Hands are modelled in a similar way. Facial animation: based on a pseudo-muscle design that treats the skin surface as a polygonal mesh. Basic motion parameters are defined as minimum perceptible actions (MPAs); these define both the facial expressions and the animated shape of the face. A control lattice controls the face geometry. MPAs result from three types of input: video, audio or speech, and pre-defined actions.
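"Animating the joint angles over time" is, at its simplest, interpolation between keyframed angle values. The sketch below uses plain linear interpolation between (time, angle) keys; the actual system's curve representation is not specified in the slides, so this is a generic illustration.

```python
def interpolate_angle(keys, t):
    """Linearly interpolate a joint angle from (time, angle) keyframes.

    `keys` must be sorted by time. Times before the first key or
    after the last clamp to the end values. Evaluating this per
    joint per frame drives the skeleton; the skin layers are then
    recomputed from the posed skeleton.
    """
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, a0), (t1, a1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            u = (t - t0) / (t1 - t0)
            return a0 + u * (a1 - a0)
    return keys[-1][1]
```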

  22. Skeleton motion control: possible approaches. (1) Skeleton motion captured in real time drives a pure avatar: movements exactly reflect those of the real person; information is collected via tracking sensors and used in combination with human animation knowledge. (2) Skeleton motion is selectively activated from a database of predefined motions: the avatar is controlled by the user in real time, but its movements do not correspond to those of the user; motion capture can be used to generate the animated sequences, which reduces design time. (3) Skeleton animation is dynamically calculated: an autonomous actor may act without the user's intervention; its behaviour relies on perception of the environment, and it should use visual, auditory and tactile senses, adapting behaviour according to the information received.
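The three control approaches above can be sketched as a single dispatch point that decides where each frame's pose comes from. All argument shapes and names here are illustrative assumptions, not an interface from the paper.

```python
def select_pose(mode, sensors=None, clip=None, frame=0, agent=None):
    """Pick a skeleton pose under one of the three control approaches.

    - "capture":    pose comes straight from tracked sensor data
                    (pure avatar, movements mirror the real person)
    - "playback":   pose comes from a predefined motion clip the
                    user triggers (movements need not match the user)
    - "autonomous": pose is computed by the agent itself from its
                    perception of the environment
    """
    if mode == "capture":
        return sensors                  # raw tracked joint data
    if mode == "playback":
        return clip[frame % len(clip)]  # loop the stored sequence
    if mode == "autonomous":
        return agent()                  # agent decides on its own
    raise ValueError("unknown control mode: " + mode)
```

In practice a system can mix these modes per body part, for example captured arms over a played-back walk cycle, but the slide presents them as alternatives.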

  23. Body Chat, Sensing, Real-Time Animation. Closing questions: How realistically does the graphical representation have to map onto the virtual embodiment for a given use context? To what extent does the virtual embodiment have to be an accurate mapping of the real person, as opposed to a synthetic caricature, for a given use?
