
Using Xbox Kinect for Delivering Experiential Learning Games



Presentation Transcript


  1. Using Xbox Kinect for Delivering Experiential Learning Games • Kinectify your Digital Learning Initiatives • Presented By – Manish Gupta, CEO, G-Cube

  2. Agenda • Motion Aware Learning Experiences • Motion and Learning in your Organization • Kinect based learning solutions • Core Aspects of building Kinect based Learning Experiences • Other Motion Tracking Options

  3. Motion Aware Learning Experiences

  4. Motion Awareness • Triggered via visuals, sounds, electrical and mechanical activities • Detecting motion • Motion detection for security and lighting • Directional Motion • Directional motion detection for whole objects such as detecting approaching/receding vehicles • Granular motion detection • Parts of an object – from facial expressions & finger movements to full body motions

  5. Motion Aware Learning Experiences • Inferring from Motion in learning • The ability to understand what motion represents when learning is in progress • Could be a mixture of many things • Eye tracking (focus of interest, blink velocity, tiredness etc.) • Facial Expressions (smiling, frowning, bored) • Head positions (looking at or away) • Inferences from other bodily positions • Leading to insights into concentration & focus, degree of engagement etc. that can be used to modify learning environments dynamically

  6. Motion Aware Learning Experiences • Inferring from Motion for learning • The ability to infer that the right motions are being learnt as part of the learning • Could be a mixture of many things • A set of physical movements to accomplish an outcome (e.g. dancing) • The degree to which speed, force or pressure is being employed in the action (e.g. learning a sport) • Leading to insights into how well physical acts embodied in the learning are being replicated by the learner

  7. Motion Aware Learning Experiences • Inferring from Motion through learning • The ability for learning environments to change based on the motions of the learner as she progresses through the learning • Could be a mixture of many things • Mimicking real world responses in simulated environments (e.g. aircraft control, safety drill) • Changing the context and difficulty of learning based on the quality of the learner's response (e.g. surgery) • Adapting to trainees who have disabilities (e.g. sign language interfaces) • Leading to learning environments that adapt to the learner's progress or special needs

  8. Activity • Have you experienced or used motion in any of the ways described so far? • In learning (motion when learning is in progress) • For learning (motion as an integral part of the learning) • Through learning (when learning environments adapt to your motion) • If yes, what has your experience been on the effectiveness of these motion aware learning experiences? • Note: Please include both technological and human observable motion

  9. Motion and Learning in your Organization

  10. Motion & Learning in Your Organization • Some obvious statements • Your e-learning could be made much more engaging and fulfilling if you knew how your users were experiencing it physically • A lot of learning that involves physical motion can now be digitally experienced and assessed to varying levels of detail • Learning environments can adapt to user physical actions, making them more immersive and thereby increasing learning effectiveness • Some obvious caveats • No technology (yet) can completely simulate real world touch and feel or real world complexity • Technology for implementing things such as eye tracking may be expensive to procure and implement in its current state • Some of these technologies may raise “big brother” privacy concerns

  11. Motion in Learning for the Sales Function Presentation Skills for salespeople - • Body language • Verbal Communication • Facial Expressions

  12. Motion in Learning for Operations Function Standard Operating Procedures - • Safety Operations • Operating Machines • On Ground Training (aircraft signaling, medical procedures)

  13. Motion in Learning for Recruitment Function • Psychometric Profiling • Present lifelike immersive scenarios • Full voice and body language analytics • Induction Program • Immersive virtual exploration of the company • Foreign location orientation

  14. Motion in Learning for People Function • Employee Wellness Initiative • Fitness training programs • Competitive and collaborative fitness based events • Gamification • Self Defense programs

  15. Motion in Learning for the Customer For Customer Facing Solutions - • Clothes Shopping • Spectacles or Hairstyles • Games for Discounts • Shopping Recommendations

  16. G-Cube - Case Studies • Sample from a Course on Two Mopping Solution • Video 1 – POC on Presentation Skills • Video 2 – POC on Hard Skills

  17. Activity Take the function(s) you are involved in supporting in your organization. Conjure up as many examples as you can to employ motion in learning for your organization. Critics: come up with as many “yeah, but…” statements as you can.

  18. Kinect based learning solutions

  19. Affordances • Gesture based computing • Facial Recognition • Speech Recognition using MS Speech engine • Object Reconstruction • Community (Xbox Live) • Integrated Real-time Video • Content (Kinect enabled experiences)

  20. Introducing Kinect • Color VGA video camera - This video camera aids in facial recognition and other detection features by detecting three color components: red, green and blue. • Depth sensor - An infrared projector and a monochrome sensor work together to "see" the room in 3-D regardless of the lighting conditions. • Multi-array microphone - This is an array of four microphones that can isolate the voices of the players from the noise in the room. • Detects and tracks 48 points on each player's body, mapping them to a digital reproduction of that player's body shape and skeletal structure, including facial details
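To make the hardware description concrete, here is a minimal sketch (not production code) that reads a single frame from the depth sensor using the native Kinect for Windows SDK 1.x; it assumes the SDK headers (NuiApi.h) are installed and Kinect10.lib is linked.

```cpp
// Minimal sketch: grab one depth frame with the native Kinect for Windows SDK 1.x.
#include <Windows.h>
#include <NuiApi.h>   // Kinect for Windows SDK; link against Kinect10.lib
#include <cstdio>

int main()
{
    // Initialize the runtime for depth data only.
    if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH))) return 1;

    HANDLE depthStream = NULL;
    NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH, NUI_IMAGE_RESOLUTION_640x480,
                       0, 2, NULL, &depthStream);

    const NUI_IMAGE_FRAME* frame = NULL;
    if (SUCCEEDED(NuiImageStreamGetNextFrame(depthStream, 1000, &frame)))
    {
        NUI_LOCKED_RECT locked;
        frame->pFrameTexture->LockRect(0, &locked, NULL, 0);

        // Each pixel is a packed 16-bit value; NuiDepthPixelToDepth extracts
        // the distance in millimetres measured by the infrared depth sensor.
        const USHORT* pixels = reinterpret_cast<const USHORT*>(locked.pBits);
        printf("Depth at centre pixel: %u mm\n",
               (unsigned)NuiDepthPixelToDepth(pixels[240 * 640 + 320]));

        frame->pFrameTexture->UnlockRect(0);
        NuiImageStreamReleaseFrame(depthStream, frame);
    }

    NuiShutdown();
    return 0;
}
```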

  21. Skeletal Tracking • Kinect can recognize six people and track two

  22. Skeletal Tracking • Kinect can recognize standing and seated modes
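As an illustration of the skeletal tracking described on the two slides above, the sketch below polls skeleton frames with the native Kinect for Windows SDK 1.x and prints the right-hand joint of every fully tracked player; it is a minimal sketch under the same SDK assumptions as the earlier snippet, not a complete application.

```cpp
// Minimal sketch: poll tracked skeletons and print the right-hand joint position.
#include <Windows.h>
#include <NuiApi.h>   // Kinect for Windows SDK; link against Kinect10.lib
#include <cstdio>

int main()
{
    if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON))) return 1;

    // Pass NUI_SKELETON_TRACKING_FLAG_ENABLE_SEATED_SUPPORT as the second
    // argument to switch to the seated (upper-body) mode mentioned above.
    NuiSkeletonTrackingEnable(NULL, 0);

    for (int i = 0; i < 300; ++i)               // poll for roughly ten seconds
    {
        NUI_SKELETON_FRAME frame = {0};
        if (FAILED(NuiSkeletonGetNextFrame(33, &frame))) continue;

        NuiTransformSmooth(&frame, NULL);       // default jitter smoothing

        for (int s = 0; s < NUI_SKELETON_COUNT; ++s)   // up to six detected players
        {
            const NUI_SKELETON_DATA& skel = frame.SkeletonData[s];
            if (skel.eTrackingState != NUI_SKELETON_TRACKED) continue;  // fully tracked (max two)

            const Vector4& hand = skel.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];
            printf("Player %d right hand: x=%.2f y=%.2f z=%.2f (metres)\n",
                   s, hand.x, hand.y, hand.z);
        }
    }

    NuiShutdown();
    return 0;
}
```

A gesture engine for a learning game would sit on top of a loop like this, comparing joint positions over time against the movement being taught.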

  23. Speech • Speech recognition is one of the key functionalities of the Kinect API. • The Kinect sensor’s microphone array is an excellent input device for speech recognition-based applications. • It provides better sound quality than a comparable single microphone and is much more convenient to use than a headset. • Managed applications can use the Kinect microphone with the Microsoft.Speech API, which supports the latest acoustical algorithms.

  24. Kinect Interaction • Identification and tracking of the primary interaction hand of up to two users • Detection services for the user's hand location and state • Grip and grip release detection • Press detection • Hover, select, wave, and other standard interactions to minimize the learning curve

  25. Kinect Fusion • Kinect Fusion provides 3D object scanning and model creation using a Kinect for Windows sensor • The user can paint a scene with the Kinect camera and simultaneously see, and interact with, a detailed 3D model of the scene • Kinect Fusion can be run at interactive rates on supported GPUs, and can run at non-interactive rates on a variety of hardware

  26. Face Tracking SDK • The Face Tracking SDK’s face tracking engine analyzes input from a Kinect camera, deduces the head pose and facial expressions, and makes that information available to an application in real time. • For example, this information can be used to render a tracked person’s head position and facial expression on an avatar in a game or a communication application or to drive a natural user interface (NUI).

  27. Latest SDK 1.8 • New background removal - An API removes the background behind the active user so that it can be replaced with an artificial background • Realistic color capture - A new Kinect Fusion API scans the color of the scene along with the depth information so that it can capture the color of the object along with its three-dimensional (3D) model • Improved tracking robustness - This algorithm makes it easier to scan a scene • HTML interactions - Allows developers to use HTML5 and JavaScript to implement Kinect-enabled user interfaces • Adaptive UI - Build an application that adapts itself depending on the distance between the user and the screen, from gesturing at a distance to touching a touchscreen

  28. Core design considerations for building Kinect based Learning Experiences

  29. Design Considerations Choose your use cases carefully • Embedding physical motion in learning can be a daunting task for most regular corporate learning because we are so used to traditional eLearning and classroom modes • Kinect's most obvious use cases are where physical motion is part of the training (at this point in time, gross limb movements are tracked better than fine grained movements such as fingers) • Building algorithms to do intelligent things (and not just skeletal tracking or facial recognition) can get very complex very quickly

  30. Design Considerations 3D Worlds, Gamification, Serious Games and Simulations (GSGS) • Typically these will involve some components from 3D sets, and a Serious Games / Simulation based component. • Using GSGS in motion based learning environments increases the effectiveness and engagement levels • Learners are more motivated in competitive settings and there are also avenues for rich collaboration between players • Bodily immersion into the gaming environment increases levels of engagement

  31. Design Considerations Leverage the Human Interface • Too often, the most used actions in an educational solution may be simple “point and press” actions • Kinect provides a substantially different interface for performing actions – use natural actions as far as possible • Read the Kinect Human Interface Guidelines for a detailed look at how the interface should be designed

  32. Deployment Considerations • Space – the sensor requires around 4 ft of distance from the user, and the user should not feel cramped executing the actions • Privacy – users may feel “odd” or “silly” performing physical actions in front of the screen • Personalization – you would need to manage quick identification of users on shared Xbox installations and map them to the corporate directory • LMS – you could still integrate with an LMS if it supports web services for communication, or by using the Tin Can API (see the sketch below)
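On the LMS point, the Tin Can (xAPI) route amounts to POSTing JSON “statements” to a Learning Record Store. The sketch below only assembles and prints one such statement; the learner, activity ID and score are made-up examples, and the actual POST to the LRS /statements endpoint (with authentication and an X-Experience-API-Version header) is left to whatever HTTP client the project already uses.

```cpp
// Sketch of a Tin Can (xAPI) statement a Kinect based activity could report to an LRS.
// The names and IDs below are illustrative only.
#include <cstdio>

int main()
{
    const char* statement =
        "{\n"
        "  \"actor\":  { \"name\": \"Jane Learner\", \"mbox\": \"mailto:jane@example.com\" },\n"
        "  \"verb\":   { \"id\": \"http://adlnet.gov/expapi/verbs/completed\",\n"
        "                \"display\": { \"en-US\": \"completed\" } },\n"
        "  \"object\": { \"id\": \"http://example.com/activities/kinect-safety-drill\",\n"
        "                \"definition\": { \"name\": { \"en-US\": \"Kinect Safety Drill\" } } },\n"
        "  \"result\": { \"score\": { \"scaled\": 0.85 }, \"completion\": true }\n"
        "}";

    // In a real deployment this JSON would be POSTed to <LRS endpoint>/statements.
    printf("%s\n", statement);
    return 0;
}
```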

  33. Other Motion Tracking Options

  34. Leap Motion • The Leap Motion Controller senses your hands and fingers and follows their every move • Useful for learning interactivities where minute hand movements are to be tracked (e.g. working with small instruments) • Mold, stretch, or bend 3D objects • Take things apart and put them together • Interact with content using hand movements
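For a feel of the Leap Motion API, here is a minimal sketch using the Leap C++ SDK ("Leap.h", v2-era API); it polls one tracking frame and prints palm positions and finger counts. It is illustrative only and assumes the Leap SDK and runtime are installed.

```cpp
// Minimal sketch: poll one Leap Motion frame and print hand data.
#include <iostream>
#include "Leap.h"   // Leap Motion C++ SDK

int main()
{
    Leap::Controller controller;

    // A real application would register a Leap::Listener for frame callbacks;
    // this simply waits until the device is connected.
    while (!controller.isConnected()) { /* busy-wait for the device */ }

    Leap::Frame frame = controller.frame();    // most recent tracking frame
    Leap::HandList hands = frame.hands();

    for (int i = 0; i < hands.count(); ++i)
    {
        Leap::Hand hand = hands[i];
        Leap::Vector palm = hand.palmPosition();   // millimetres above the device
        std::cout << "Hand " << hand.id()
                  << ": " << hand.fingers().count() << " fingers, palm at ("
                  << palm.x << ", " << palm.y << ", " << palm.z << ")\n";
    }
    return 0;
}
```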

  35. Tobii • Tobii is one of the leaders in eye tracking and gaze interaction. This technology makes it possible to know exactly where users are looking, which enables various applications - • From point and click to look and do • Interacting with content without distractions • Can be used as assistive technology • Can be used to evaluate user content interaction patterns in controlled groups

  36. Summary • 1. Kinect and other gesture friendly devices are exciting innovations and have the potential of augmenting our current learning solutions • 2. However, development of Kinect based learning solutions is not easy and the design needs to be extremely strong • 3. Top use cases will be in the areas where physical motion is demanded as part of the training or where serious games can be built that involve bodily motion

  37. Manish Gupta CEO, G-Cube Twitter: manish.g3 Email: manishg@gc-solutions.net
