Kinect H4x: Gesture Recognition and Playback Tools (+Inspiration)
SDK Version 1.0 - Out TODAY. What's New?
Gesture Recognition and Playback Tools (+Inspiration)
Ability to control which user(s) are under full skeletal tracking.
"Near mode" enables interaction as close as 40cm from the device. Includes "too far" and "too close"
Mother of All Kinect Demos - the Kinect Explorer sample app shows off all the SDK's features (camera tilt, audio beam angles, etc.).
Dynamic Time Warping (DTW): a sequence-matching algorithm that adapts to sequences that vary in speed and duration (think Levenshtein distance, generalized to matching any sort of input against stored data).
It measures the similarity between two sequences based on a cost function of how much it needs to "warp" the points forward/backward in time to have them line up.
In Kinect-land, this means an algorithm that can take streaming joint data and quickly find the closest match to a gesture 'on record'.
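The DTW recurrence itself is compact. Below is an illustrative Python sketch of the idea (the project's actual implementation is DtwGestureRecognizer.cs, in C#): each observation frame is a vector of doubles, and a cumulative cost table tracks the cheapest way to stretch or compress one sequence onto the other.

```python
def dtw_distance(seq_a, seq_b):
    """Minimum cumulative cost of warping seq_a onto seq_b.

    Each element of a sequence is one observation frame (a tuple of doubles,
    e.g. normalized 2D joint coordinates)."""
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    # cost[i][j] = cheapest way to align the first i frames of seq_a
    # with the first j frames of seq_b
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two observation frames
            d = sum((a - b) ** 2
                    for a, b in zip(seq_a[i - 1], seq_b[j - 1])) ** 0.5
            # Warp forward/backward in time: repeat a frame of either
            # sequence, or advance both together
            cost[i][j] = d + min(cost[i - 1][j],      # stretch seq_b
                                 cost[i][j - 1],      # stretch seq_a
                                 cost[i - 1][j - 1])  # match one-to-one
    return cost[n][m]
```

A gesture performed slowly scores nearly the same as the stored template, because the extra frames align to repeated template frames at little added cost; a genuinely different gesture accumulates a large distance.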
Skeleton2DDataExtract.cs -- Takes in Kinect SDK joint data, spits out normalized 2d skeleton points.
Skeleton2DdataCoordEventArgs.cs -- Defines the event args that get emitted from the Skeleton2DDataExtract event handler once processed.
DtwGestureRecognizer.cs - Parses the 2d skeleton data; call Recognize() to match against loaded gestures (see the code for a loading/saving example).
The DtwGestureRecognizer can flexibly match any vectorized data stream.
We happened to use skeleton data in our example, but it should be fairly simple to incorporate a depth stream or color stream instead. Just ensure that each of the sequence objects you pass into AddOrUpdate and Recognize is an array of doubles (e.g. double[] observationPoint).
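As a rough illustration of what "vectorizing" a frame means (the name flatten_frame is hypothetical, not from the project), you just flatten whatever you are tracking into one flat array of doubles per frame:

```python
def flatten_frame(points):
    """Turn a list of (x, y) points -- skeleton joints, depth blobs,
    color-stream centroids -- into one flat observation vector of doubles,
    the shape the recognizer's AddOrUpdate/Recognize calls expect."""
    obs = []
    for x, y in points:
        obs.extend((float(x), float(y)))
    return obs
```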
Since many teams share a single Kinect, it's useful to be able to record a sequence of skeleton data, write it to a file, and play it back through a dummy nui.SkeletonFrameReady handler.
Fortunately, the Kinect Toolbox (not to be confused with the Coding4Fun Kinect Toolkit), allows us to do just that.
Get it at: http://kinecttoolbox.codeplex.com/
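The Kinect Toolbox's own classes do the heavy lifting; see its documentation for the real API. The record/replay pattern itself is simple and can be sketched generically (this is an illustrative sketch, not the Toolbox's API): capture frames to a file, then push each saved frame through the same handler that live frame-ready events would reach.

```python
import json

def record(frames, path):
    """Write a captured sequence of frames (lists of doubles) to disk."""
    with open(path, "w") as f:
        json.dump(frames, f)

def replay(path, handler):
    """Read frames back and feed each one through a dummy frame-ready
    handler, standing in for the live sensor."""
    with open(path) as f:
        for frame in json.load(f):
            handler(frame)
```

With this shape, the rest of your pipeline never needs to know whether frames came from the sensor or from a file.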
The Cool News: Hand + Finger Tracking!
The Bad News: Behemoth, undocumented code library
Project available at http://candescentnui.codeplex.com
Kinect SDK only provides depth values from 800mm out. The finger tracking code only works in the range of 0.8-1m, so using this project in conjunction with the Kinect SDK will prove difficult.
UPDATE: NEW SDK 1.0 provides depth values from 400mm out!
Alternative (compatible with Candescent): OpenNI + NITE uses the raw point cloud to make best-guess tracking estimates < 800mm away. Good community; documentation at OpenNI.org / OpenKinect.org.
Anant - Shape Game Walkthrough Gestures
Deixis Application To Children's Education Games
Main Idea: Ask children to point and verbally identify ("this one!") a subset of objects (numbers, colors, animals).
Description + Video
Gesture Enabled Garden Hose
Main Idea: Use servos (simple motors) in conjunction with Netduinos (network-enabled microcontrollers) to drive the servo via Kinect gestures:
In Practice: http://channel9.msdn.com/coding4fun/kinect/A-Kinect--Netduino-controlled-squirt-gun
Gesture Based Electronic Music Performance
EDEN: Interactive Ecosystem Simulation Software
Main Idea: Create a topographical landscape on the iPad, fill it with (simulated) water, project it onto a sandscape via depth data with an overhead Kinect + projector. Play with the sand to change the climate and topography to terraform your own sandscape.
Kinect Telepresence - http://www.youtube.com/watch?v=ecMOX8_GeRY
Home Security Camera - http://www.youtube.com/watch?v=UfGOR1Eg_qQ
Living Paintings - http://www.youtube.com/watch?v=UjDaHMKwQl4
Visually Impaired Navigation Tool - http://www.youtube.com/watch?v=l6QY-eb6NoQ
ZigFu: a single bundle installing NITE, OpenNI, and the PrimeSense Sensor driver; everything you need to work outside of the official SDK: http://zigfu.com/devtools.html