Inertial Gesture Recognition Ari Y. Benbasat Responsive Environments Group MIT Media Laboratory
Compact Inertial Measurement Unit • Full sensor set for 3-D motion detection in a compact wireless package. • Implementation • 3 (+1) Accelerometers • 3 Gyroscopes • 12-bit ADC/Microcontroller • 900 MHz wireless link • Low power (75 mW)
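The 12-bit ADC above implies raw samples in the range 0–4095. A minimal sketch of mapping counts to physical units, assuming an illustrative 3.3 V reference and a hypothetical accelerometer calibration (the actual scale factors and offsets of the Media Lab IMU are not given in the slides):

```python
# Hypothetical conversion of raw 12-bit ADC counts to physical units.
# V_REF, zero_g_v, and sens_v_per_g are illustrative assumptions, not
# the real calibration of this IMU.

ADC_MAX = 4095          # 12-bit ADC full scale
V_REF = 3.3             # assumed reference voltage (V)

def counts_to_volts(counts):
    """Map a raw ADC count (0..4095) to a voltage."""
    return counts * V_REF / ADC_MAX

def volts_to_accel(v, zero_g_v=1.65, sens_v_per_g=0.3):
    """Convert a sensor voltage to acceleration in g (assumed calibration)."""
    return (v - zero_g_v) / sens_v_per_g

# A mid-scale reading maps to roughly 0 g under these assumptions.
accel_g = volts_to_accel(counts_to_volts(2048))
```

The same count-to-volts step would apply to the gyroscope channels, with a different sensitivity constant.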
S.M. Thesis Work • Create an analysis and interpretation framework for such devices: • Analysis: Activity detection • Gesture Recognition: Parameterized atomic gestures • Output Scripting: Links gestures to outputs • Applications: • Current: Re-implementation of (void*) • Future: Gesture-based control and learning (Figure: project organization)
Activity Detection • Simple scheme based on windowed variance • Piecewise model of the signal used to analytically find the threshold • Finds areas of interest in data streams to be analyzed by the gesture recognition system • Errs on the side of false positives • Stuttering gestures are OK
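The windowed-variance scheme above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the window length and threshold here are arbitrary assumptions, whereas the thesis derives the threshold analytically from a piecewise signal model.

```python
def windowed_variance(signal, window):
    """Variance of each sliding window (simple O(n*w) version)."""
    out = []
    for i in range(len(signal) - window + 1):
        w = signal[i:i + window]
        mean = sum(w) / window
        out.append(sum((x - mean) ** 2 for x in w) / window)
    return out

def active_regions(signal, window, threshold):
    """Return (start, end) sample-index pairs where the windowed
    variance exceeds the threshold -- the 'areas of interest' handed
    to the gesture recognizer."""
    var = windowed_variance(signal, window)
    regions, start = [], None
    for i, v in enumerate(var):
        if v > threshold and start is None:
            start = i                       # activity begins
        elif v <= threshold and start is not None:
            regions.append((start, i + window - 1))  # activity ends
            start = None
    if start is not None:
        regions.append((start, len(signal) - 1))
    return regions
```

Because the detector only gates which spans reach the recognizer, a loose threshold (more false positives) is safe, consistent with the slide's note.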
Gesture Recognition • Parameterized • Magnitude and duration are properties of the detected gestures, not fundamental to the process • Atomic • Considered one axis at a time • Considered only in units of whole peaks • Algorithm • Expects net zero sum (accelerometers) • Expects non-trivial size (gyroscopes) • Pieces together stuttering gestures by combining failed gestures • Breaks gestures if the polarity of adjacent peaks is identical
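The peak-based atomic decomposition can be sketched as below. This is an illustrative reconstruction, not the thesis algorithm: it reduces a single-axis stream to signed peaks and applies the stated rule of breaking gestures where adjacent peaks share polarity; the zero-sum check and stutter recombination are omitted.

```python
def find_peaks(signal, threshold=0.5):
    """Reduce a 1-D stream to signed peaks: (index_of_extremum, +1/-1).
    A peak is a run of same-sign samples exceeding the threshold."""
    peaks = []
    cur_sign, best_i, best_v = 0, 0, 0.0
    for i, x in enumerate(signal):
        s = 1 if x > threshold else (-1 if x < -threshold else 0)
        if s != cur_sign:
            if cur_sign != 0:
                peaks.append((best_i, cur_sign))  # close previous peak
            cur_sign, best_i, best_v = s, i, abs(x)
        elif s != 0 and abs(x) > best_v:
            best_i, best_v = i, abs(x)            # track the extremum
    if cur_sign != 0:
        peaks.append((best_i, cur_sign))
    return peaks

def split_gestures(peaks):
    """Break a peak list into atomic gestures wherever two adjacent
    peaks have identical polarity (the splitting rule on the slide)."""
    gestures, cur = [], []
    for p in peaks:
        if cur and p[1] == cur[-1][1]:
            gestures.append(cur)
            cur = []
        cur.append(p)
    if cur:
        gestures.append(cur)
    return gestures
```

Note that magnitude and duration fall out as properties of each detected peak list rather than being baked into the matching, matching the "parameterized" design above.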
Output / Applications • Simple JPython script allows temporal and logical combinations of gestures to be linked to output routines • Value in Applications: • Allows direct, in situ sensing of quantities of interest • Compact / low power → usable in a wide variety of situations • Low complexity of algorithms allows for stand-alone devices → combined perception and expertise in a single device • Not limited to human gesture
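The scripting idea can be sketched in plain Python as below (the actual system used a JPython script; the gesture names, event fields, and actions here are hypothetical, not the (void*) bindings):

```python
# Hypothetical gesture-to-output router: logical combinations of
# gesture properties are linked to output routines via predicates.

class GestureRouter:
    def __init__(self):
        self.rules = []   # list of (predicate, action) pairs

    def when(self, predicate, action):
        """Link a logical condition on a gesture event to an output."""
        self.rules.append((predicate, action))

    def dispatch(self, event):
        """event: dict with assumed keys like 'name', 'magnitude',
        'duration'. Returns the outputs fired by matching rules."""
        return [action(event)
                for predicate, action in self.rules
                if predicate(event)]

router = GestureRouter()
# Example rule: a large twist triggers a (hypothetical) sound cue.
router.when(lambda e: e["name"] == "twist" and e["magnitude"] > 2.0,
            lambda e: "play_sound")

fired = router.dispatch({"name": "twist", "magnitude": 3.1, "duration": 0.4})
```

Temporal combinations (gesture A then gesture B within some interval) could be layered on by having predicates inspect a short event history; that is left out of this sketch.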
Sample Analysis • Perform gestures and collect data • Find areas of activity • Run recognition • Recombine atomic gestures into output (Figure: x- and y-axis accelerometer traces, no rotation; 2-peak atoms combine into a 3-peak gesture)
Sample Analysis (2) • Perform gesture • Sweeping twist • Find gestures in stream • One axis at a time • Note baseline subtraction • Recombine atoms • Can be tied to outputs (sound, light, etc.) (Figure: a 1-peak gyroscope atom plus a 2-peak accelerometer atom combine to cause an output)