
Toolkits for Supporting Gestures in Applications


Presentation Transcript


  1. Toolkits for Supporting Gestures in Applications Justin Weisz 05-830 UI Software Nov. 16, 2004

  2. What is a gesture? “A single stroke indicates the operation (move text), the operand (the text to be moved), and additional parameters (the new location of the text).” -- Rubine


  4. Uses of gestures: editing existing objects, creating new objects

  5. Uses of gestures: issuing commands (Back, Reload page, Menu > Copy)

  6. Applications of gesturing - Lightpens

  7. Applications of gesturing - Tablets

  8. Applications of gesturing - PDAs

  9. Applications of gesturing - Video games PowerGlove in “The Wizard”

  10. Applications of gesturing - Video games Black and White - 2001

  11. Rubine [1991] GRANDMA: Gesture Recognizers Automated (in a) Novel Direct Manipulation Architecture

  12. Rubine [1991]

  13. Rubine [1991]

  14. Rubine [1991] For the line gesture (recog is evaluated when the gesture is recognized, manip on each subsequent mouse point, and done when the interaction ends):
      recog = [Seq :[handler mousetool:LineCursor] :[[view createLine] setEndpoint:0 x:<start X> y:<start Y>]];
      manip = [recog setEndpoint:1 x:<current X> y:<current Y>];
      done = nil;

  15. Rubine [1991] BUT, how are gestures actually represented and recognized? Assumptions: gestures are 2D, single strokes; the start and end of a gesture are clearly defined. Representation: a set of P sample points, each with a position and timestamp, preprocessed to remove jitter (a sketch follows).
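
To make the representation concrete, here is a minimal Python sketch (mine, not Rubine's code): a gesture is a list of timestamped points, and a simple filter drops points that barely move. The 3-pixel threshold is an assumption for illustration.

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float
    t: float  # timestamp of the sample

def dejitter(points, min_dist=3.0):
    # Keep a point only if it is at least min_dist pixels from the last kept point.
    kept = points[:1]
    for p in points[1:]:
        last = kept[-1]
        if ((p.x - last.x) ** 2 + (p.y - last.y) ** 2) ** 0.5 >= min_dist:
            kept.append(p)
    return kept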

  16. Rubine [1991] A feature vector is extracted from gesture G. Example features: cosine of the initial angle, length of the bounding-box diagonal, the angle formed by each triple of consecutive points (used for the angle-based features), total angle traversed (a sketch follows).
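
A Python sketch of a few of these features, paraphrasing Rubine's definitions; treat the exact formulas here as my reconstruction rather than a copy from the paper (the full feature set has 13 entries, not shown).

import math

def feature_vector(points):
    # Assumes at least three dejittered points.
    xs = [p.x for p in points]
    ys = [p.y for p in points]

    # Cosine of the initial angle (direction from the first to the third point).
    dx0, dy0 = xs[2] - xs[0], ys[2] - ys[0]
    cos_initial = dx0 / (math.hypot(dx0, dy0) or 1.0)

    # Length of the bounding-box diagonal.
    bb_diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))

    # Total angle traversed: signed turn angle at each interior point,
    # computed from three consecutive points.
    total_angle = 0.0
    for i in range(1, len(points) - 1):
        ax, ay = xs[i] - xs[i - 1], ys[i] - ys[i - 1]
        bx, by = xs[i + 1] - xs[i], ys[i + 1] - ys[i]
        total_angle += math.atan2(ax * by - ay * bx, ax * bx + ay * by)

    return [cos_initial, bb_diag, total_angle]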

  17. Rubine [1991] BUT... “The aforementioned feature set was empirically determined by the author to work well on a number of different gesture sets” -- Rubine

  18. Rubine [1991] Classification. Each gesture class c is represented by a weight vector (w_c0, w_c1, ..., w_cF), where w_c0 is a bias term and w_ci is the weight of feature i. To classify gesture G with features f_1, ..., f_F, compute a score for each class, v_c = w_c0 + Σ_{i=1..F} w_ci f_i, and take the class with the highest score (a sketch follows).
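
A minimal sketch of that linear classifier (my illustration, not Rubine's code); class_weights is assumed to map each gesture class name to its weight list (w_c0, w_c1, ..., w_cF).

def score(weights, features):
    # v_c = w_c0 + sum over i of w_ci * f_i
    return weights[0] + sum(w * f for w, f in zip(weights[1:], features))

def classify(class_weights, features):
    # Take the class with the highest score.
    return max(class_weights, key=lambda c: score(class_weights[c], features))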

  19. Rubine [1991] Training Optimal classifier

  20. Rubine [1991] Rejection. After gesture G is classified as class i, the result is accepted only if the estimated probability Pr(G matches i) exceeds 0.95; otherwise G is rejected. (The slide also sketches example gestures g1, g2, g3 scattered around the class mean, mean(i): a gesture that lies far from its class mean can likewise be rejected as an outlier.)
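
A sketch of the rejection step, reusing score() from the classification sketch above. Estimating Pr(G matches i) with a softmax over the per-class scores is an assumption here (a standard way to turn linear scores into probabilities); the 0.95 threshold is the one on the slide.

import math

def classify_with_rejection(class_weights, features, threshold=0.95):
    scores = {c: score(w, features) for c, w in class_weights.items()}
    best = max(scores, key=scores.get)
    # Softmax-style estimate of the probability that `best` is the correct class.
    prob = 1.0 / sum(math.exp(v - scores[best]) for v in scores.values())
    return (best, prob) if prob > threshold else (None, prob)  # None means REJECT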

  21. Rubine [1991] Evaluation

  22. Rubine [1991] Evaluation

  23. Rubine [1991] Evaluation

  24. Aside: Agate - Landay, Myers [1993]

  25. gdt - Long et al. [1999]

  26. gdt - Long et al. [1999] Newton and Palm users reported: • Gestures are powerful, efficient and convenient • Want more commands to have gestures • Want to define new gestures • Recognition accuracy is not good enough

  27. gdt - Long et al. [1999] Oh Agate, I will make you beautiful!

  28. gdt - Long et al. [1999]

  29. gdt - Long et al. [1999] Distance matrix

  30. gdt - Long et al. [1999] Classification matrix

  31. gdt - Long et al. [1999] Experiment - Hypotheses • “Participants could use gdt to improve their gesture sets.” • “The tables gdt provided would aid designers.” • “PDA users and non-PDA users would perform differently.”

  32. gdt - Long et al. [1999] Experiment - Procedure (pay no attention to the man behind the curtain...)

  33. gdt - Long et al. [1999] Experiment - Results

  34. gdt - Long et al. [1999] Experiment - Problems with gdt: “Clustering” (figure: a new gesture class placed a distance d from the existing gesture classes; “Reverse direction”)

  35. gdt - Long et al. [1999] Experiment - Problems with gdt: “Sloppiness” and gesture overloading (figure: the Delete gesture)

  36. gdt - Long et al. [1999] Lessons learned (figure: the “rect” and “copy” gestures) • gdt was helpful, but participants still averaged only a 95.4% recognition rate • The tables were too confusing and didn’t help performance (better: tell the designer “gesture class A is too similar to gesture class B”) • Designers should be able to create a test set of gestures and run it against a different gesture class

  37. Break time! Muchas gracias to my officemate for the suggestion. Smiling babies make people happy. BE HAPPY!

  38. GT2k - Westeyn et al. [2003] The (real) problem: it is (still) cumbersome to design a system to perform gesture recognition

  39. GT2k - Westeyn et al. [2003] GT2k system components (figure): sensors (microphones, cameras, accelerometers), a data generator, the GT2k recognizer, and a results interpreter that triggers an <action>

  40. Aside: Hidden Markov Models An HMM is specified by its state-transition probabilities, its symbol output probabilities (the probability of emitting the kth symbol in the alphabet from a given state), and its initial state distribution

  41. Aside: Hidden Markov Models

  42. Aside: Hidden Markov Models (a forward-algorithm sketch follows this slide)
      • Evaluation problem: given an HMM and an observation sequence O = {o1, ..., oT}, compute Pr(O | HMM); solved by the forward algorithm
      • Decoding problem: given O, compute the most likely state sequence that produced O; solved by the Viterbi algorithm
      • Learning problem: given O, adjust the HMM's parameters (transition and output probabilities) to maximize the likelihood of observing O; solved by the Forward-Backward algorithm (a.k.a. Baum-Welch)
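
As a concrete illustration of the evaluation problem, here is a compact forward-algorithm sketch in Python/NumPy (the standard textbook formulation, not GT2k code). A is the state-transition matrix, B[j, k] the probability of state j emitting the kth symbol, pi the initial state distribution, and obs a sequence of symbol indices.

import numpy as np

def forward(A, B, pi, obs):
    # Returns Pr(obs | HMM), summing over all possible state sequences.
    alpha = pi * B[:, obs[0]]              # alpha_1(j) = pi_j * b_j(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]      # alpha_t(j) = (sum_i alpha_{t-1}(i) * a_ij) * b_j(o_t)
    return float(alpha.sum())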

  43. GT2k - Westeyn et al. [2003] Grammars (an illustration follows this slide):
      MoveForward = Advance Slow_Down Halt
      MoveBackward = Reverse Slow_Down Halt
      command = Attention <MoveForward | MoveBackward>
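
This is not GT2k's actual grammar file syntax (GT2k builds on HTK), just a small Python illustration of what the rules accept: a command is Attention followed by either sub-sequence.

MOVE_FORWARD = ["Advance", "Slow_Down", "Halt"]
MOVE_BACKWARD = ["Reverse", "Slow_Down", "Halt"]

def valid_commands():
    # Every gesture sequence the `command` rule accepts.
    return [["Attention"] + alt for alt in (MOVE_FORWARD, MOVE_BACKWARD)]

for seq in valid_commands():
    print(" ".join(seq))
# Attention Advance Slow_Down Halt
# Attention Reverse Slow_Down Halt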

  44. GT2k - Westeyn et al. [2003] Converting raw sensor data to feature vectors. Labeled segments of the sensor stream (start sample, end sample, gesture):
      1   56  Attention
      57  175 Advance
      176 235 Slow_Down
      236 250 Halt
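
A small, hypothetical parsing sketch for label lines like those on the slide (the exact label format GT2k/HTK uses may differ; this only illustrates slicing a feature-vector stream into labeled examples).

LABELS = """\
1 56 Attention
57 175 Advance
176 235 Slow_Down
236 250 Halt"""

def parse_segments(text):
    # Each line: start index, end index, gesture label.
    segments = []
    for line in text.splitlines():
        start, end, name = line.split()
        segments.append((int(start), int(end), name))
    return segments

def slice_examples(feature_vectors, segments):
    # Map each gesture label to its slice of per-sample feature vectors (1-based indices).
    return {name: feature_vectors[start - 1:end] for start, end, name in segments}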

  45. GT2k - Westeyn et al. [2003] Training: the training and validation procedure (figure: training vs. test performance curves, annotated “overfit!”)

  46. GT2k - Westeyn et al. [2003] Accuracy: A = (N - S - D - I) / N, where A = accuracy, N = number of examples, S = # substitution errors (misclassifications), D = # deletion errors (failed to recognize a gesture), I = # insertion errors (the system hallucinates a gesture; possible only during continuous recognition)
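
The formula is easy to check against the results on the following slides; for example, the Gesture Panel numbers on slide 47 (251 examples, 2 substitution errors) give (251 - 2) / 251 ≈ 99.2%. A one-line Python version:

def gt2k_accuracy(n, substitutions, deletions, insertions):
    # A = (N - S - D - I) / N
    return (n - substitutions - deletions - insertions) / n

print(f"{gt2k_accuracy(251, 2, 0, 0):.2%}")  # 99.20%, matching slide 47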

  47. GT2k - Westeyn et al. [2003] Applications - Gesture Panel gesture = up | down | left | right | up-left | up-right | down-left | down-right Result: 99.20% accuracy on 251 examples (2 substitution errors)

  48. GT2k - Westeyn et al. [2003] Applications - Prescott blinkprint = person_1 | person_2 | person_3 Result: 89.6% accuracy on 48 examples (5 substitution errors, not good!)

  49. GT2k - Westeyn et al. [2003] Applications - TeleSign word = my | computer | helps | me | talk sentence = ( calibrate word word word word word exit ) Result: 90.48% accuracy on 72 examples

  50. GT2k - Westeyn et al. [2003] Applications - Workshop Activity Recognition gesture = hammer | file | sand | saw | screw | vise | drill | clap | use_drawer | grind Result: 93.33% accuracy on 10 examples per activity
