
Compact, Configurable Inertial Gesture Recognition


Presentation Transcript


  1. Compact, Configurable Inertial Gesture Recognition
     Ari Y. Benbasat, Responsive Environments Group, MIT Media Laboratory

  2. Uses of Inertial Technology
     • Conventionally:
       • Position/orientation tracking (missiles, vehicles, boreholes)
       • Stabilization (platforms, sights, etc.)
     • More recently (with low-cost parts):
       • Human-computer interfaces
         • Joysticks (Microsoft) / mice (GyroMouse)
       • Dance / exercise
         • Shoes (MIT, Reebok Traxtar)
       • Toys

  3. In the Academic Literature…
     • Musical input
       • Sawada (Waseda): accelerometer-only system; connected 10 simple gestures to a MIDI controller
       • Digital Baton (MIT): accelerometers, pressure sensors and IR tracking linked to music
     • Palmtop interfaces
       • Itsy (Compaq): accelerometers as both static (orientation) and dynamic input (fanning)
       • Design concepts by Small (MIT), Fitzmaurice (U of T), etc.
     • GW:
       • Hoffman (TU-Berlin): similar high-level structure

  4. Impetus
     • Inertial gestural systems often:
       • Are built in an ad hoc fashion
       • Use limited sensors / axes
       • Rely on heavy a priori data analysis
     • This makes application building difficult
     • Therefore, we built:
       • A general IMU
       • A framework for application designers

  5. Compact Inertial Measurement Unit
     • Full sensor set for 3D motion detection in a compact wireless package
     • Implementation:
       • 3 (+1) accelerometers
       • 3 gyroscopes
       • 12-bit ADC / microcontroller
       • 900 MHz wireless link
       • Low power (75 mW)
       • 66 Hz state updates
     • Allows direct sensing of the quantities of interest

  6. First Application: (void*) (with Synthetic Characters)
     • Control 1 of 3 characters
     • Buns and forks select dance-style movements (video)
     • HMM-based gesture recognition
     • Effective controller
     • But recognition was slow and hard to threshold

  7. S.M. Thesis Work
     • Create an analysis and interpretation framework for such devices:
       • Analysis: activity detection
       • Gesture recognition: parameterized atomic gestures
       • Output scripting: links gestures to outputs
     (Figure: project organization)

  8. A Sample Data Stream
     • 1-8: straight-line gestures, various sizes
     • 9-15: there-and-back gestures, various sizes
     • Note the regularity and the number of peaks

  9. Human Physiology
     • Unconstrained arm motion minimizes mean squared jerk (Flash and Hogan, MIT)
     • Values scale with duration and magnitude
     • Therefore:
       • A strong general a priori model
       • A natural parameterization
     (Figure: normalized velocity vs. normalized time)
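For reference, the Flash and Hogan minimum-jerk model can be written in its standard textbook form (this restatement is mine, not taken from the slides), with x_0 and x_f the start and end positions and T the movement duration:

\[ C = \frac{1}{2} \int_0^T \left( \frac{d^3 x}{dt^3} \right)^2 dt \]

\[ x(t) = x_0 + (x_f - x_0)\left(10\tau^3 - 15\tau^4 + 6\tau^5\right), \qquad \tau = t/T \]

The resulting bell-shaped velocity profile scales with movement amplitude and inversely with duration, which is why magnitude and duration make natural gesture parameters.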

  10. Activity Detection
     • Simple scheme based on windowed variance
     • A piecewise model of a peak is used to analytically find the threshold
     • Finds areas of interest in the data stream to be analyzed by the gesture recognition system
     • Errs on the side of false positives
       • Stuttering gestures are OK
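To make the windowed-variance idea concrete, here is a minimal sketch (my own illustrative code, not the thesis implementation; the window length and threshold are placeholders rather than the analytically derived values):

```python
import numpy as np

def detect_activity(signal, window=8, threshold=0.02):
    """Return (start, end) index pairs of regions whose windowed
    variance exceeds a fixed threshold."""
    # Sliding-window variance centred on each sample.
    var = np.array([np.var(signal[max(0, i - window):i + window + 1])
                    for i in range(len(signal))])
    active = var > threshold

    # Collapse the boolean mask into contiguous regions of interest.
    regions, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            regions.append((start, i))
            start = None
    if start is not None:
        regions.append((start, len(signal)))
    return regions
```

Keeping the threshold low matches the slide's advice to err on the side of false positives, since spurious regions can be discarded later by the gesture recognizer.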

  11. Gesture Recognition
     • Parameterized
       • Magnitude and duration are properties of the detected gestures, not fundamental to the process
     • Atomic
       • Considered one axis at a time
       • Considered only in units of a number of peaks
     • Algorithm
       • Expects a net zero sum (accelerometers)
       • Expects a non-trivial size (gyroscopes)
       • Pieces together stuttering gestures by combining failed gestures
       • Breaks gestures if the polarity of adjacent peaks is identical
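The sketch below captures only part of the atomic-gesture idea described above: per-axis peak counting, the zero-sum check for accelerometers, and extraction of magnitude and duration as parameters. The stuttering-repair and polarity-break rules are omitted, and all names and tolerances are my own placeholders.

```python
import numpy as np

def extract_atom(segment, dt=1.0 / 66, is_accel=True, zero_sum_tol=0.15):
    """Describe one active region on one axis as an atomic gesture:
    a peak count plus magnitude and duration parameters."""
    # Accelerometer gestures should roughly integrate to zero,
    # since the limb starts and ends at rest.
    if is_accel and abs(np.sum(segment)) > zero_sum_tol * np.sum(np.abs(segment)):
        return None

    # Zero crossings delimit the lobes ("peaks") of the gesture.
    crossings = np.where(np.diff(np.sign(segment)) != 0)[0] + 1
    bounds = [0, *crossings.tolist(), len(segment)]
    lobes = [segment[a:b] for a, b in zip(bounds[:-1], bounds[1:]) if b > a]
    lobes = [l for l in lobes if np.max(np.abs(l)) > 0]  # ignore flat stretches

    return {
        "peaks": len(lobes),
        "magnitude": float(np.max(np.abs(segment))),
        "duration": len(segment) * dt,
    }
```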

  12. Scripting / Output
     • Simple scripts are used to link atomic gestures to output routines
       • Individual atoms can be matched to ranges
       • Atoms are combined with AND / OR operators
       • Such combinations can then be put in a temporal order to create full gestures
       • Full gestures are linked to output routines
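The thesis's scripting syntax is not shown on this slide, so the snippet below is only a hypothetical illustration of the ideas listed: atoms matched to ranges, combined with AND/OR, ordered in time, and tied to an output routine.

```python
# Hypothetical script structure (names and syntax are illustrative only).
script = {
    # Atomic gesture matchers: each names an axis, a peak count, and
    # allowed parameter ranges (None means unbounded).
    "sweep": {"axis": "acc_x", "peaks": 2, "magnitude": (0.5, None)},
    "twist": {"axis": "gyro_z", "peaks": 1, "magnitude": (0.8, None)},
    "small_twist": {"axis": "gyro_z", "peaks": 1, "magnitude": (0.2, 0.8)},

    # Combinations: AND requires all atoms in the same activity window,
    # OR accepts any one of them.
    "any_twist": {"or": ["twist", "small_twist"]},
    "sweep_and_twist": {"and": ["sweep", "any_twist"]},

    # Full gesture: combinations in temporal order, linked to an output routine.
    "sweeping_twist": {"sequence": ["sweep", "any_twist"], "output": "play_sound"},
}
```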

  13. Sample Analysis
     • Perform gestures and collect data
     • Find areas of activity
     • Run recognition
     • Recombine atomic output
     (Figure: x and y accelerometer traces (no rotation); recognized atoms of 2 and 3 peaks)
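As a purely illustrative tie-in to this walkthrough, the earlier sketches could be chained as follows (again my own placeholder code, using a synthetic stream rather than real sensor data):

```python
import numpy as np

# Hypothetical end-to-end pass over one synthetic accelerometer axis,
# reusing the detect_activity / extract_atom sketches from earlier slides.
stream = 0.005 * np.random.randn(400)                       # quiet baseline
stream[100:160] += np.sin(np.linspace(0, 2 * np.pi, 60))    # a two-lobe gesture

atoms = []
for start, end in detect_activity(stream):
    atom = extract_atom(stream[start:end], is_accel=True)
    if atom is not None:                                     # discard false positives
        atoms.append((start, atom))
# atoms now holds (time, {peaks, magnitude, duration}) descriptions,
# ready to be recombined across axes and matched against a script.
```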

  14. Sample Analysis (2)
     • Perform gesture
       • A sweeping twist
     • Find gestures in the stream
       • One axis at a time
       • Note the baseline subtraction
     • Recombine atoms
       • Can be tied to output
     (Figure: a 1-peak gyroscope atom and a 2-peak accelerometer atom recombine into the full gesture, which causes an output such as sound, light, etc.)

  15. Application overview
     • Value in applications:
       • Compact / low power → usable in a wide variety of situations
       • Low algorithmic complexity allows for stand-alone devices → combined perception and expertise in a single device
       • Not limited to human gesture
     • Limits:
       • No absolute reference frame
       • Gestures must be separable
       • Gestures must be deliberate

  16. Early Applications
     • Re-implementation of forks and buns
       • Similar accuracy in an order of magnitude less time
       • Less fragile and easier to reconfigure
     • Palm III implementation
       • Demonstrates the simplicity of the algorithms
       • Only 2 accelerometers (space restrictions)
       • 50 Hz update rate
       • Real-time combination of gestures

  17. Other Applications
     • Navigation: update / reinitialize filters
       • Gestural information can allow velocity assumptions
       • Structure seen in physiological data is also useful
         • E.g. zero jerk implies zero velocity
       • Draper “Smart Boot” work showed the value of this approach
     • Things That Think
       • Closes the feedback loop between action and correction
       • The framework allows for robustness and easy alteration
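As a hedged illustration of the "zero jerk implies zero velocity" heuristic mentioned above, a dead-reckoning integrator might reset its velocity estimate whenever the jerk stays near zero for a short time; the thresholds and names below are illustrative, not taken from the Draper work:

```python
import numpy as np

def integrate_with_zero_velocity_updates(accel, dt=1.0 / 66,
                                         jerk_eps=0.5, hold=5):
    """Integrate acceleration to velocity, zeroing the estimate whenever
    the finite-difference jerk has stayed below jerk_eps for `hold`
    consecutive samples (i.e. the limb is assumed to be at rest)."""
    vel = np.zeros_like(accel, dtype=float)
    quiet = 0
    for i in range(1, len(accel)):
        jerk = (accel[i] - accel[i - 1]) / dt
        quiet = quiet + 1 if abs(jerk) < jerk_eps else 0
        if quiet >= hold:
            vel[i] = 0.0                      # zero-velocity reinitialization
        else:
            vel[i] = vel[i - 1] + accel[i] * dt
    return vel
```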
