
Machine Learning Methods for Human-Computer Interaction



Presentation Transcript


  1. Machine Learning Methods for Human-Computer Interaction Kerem Altun Postdoctoral Fellow Department of Computer Science University of British Columbia IEEE Haptics Symposium March 4, 2012 Vancouver, B.C., Canada

  2. Machine learning
  • machine learning encompasses pattern recognition and regression
  • approaches to pattern recognition: template matching, statistical pattern recognition, structural pattern recognition, neural networks
  • methods can be supervised or unsupervised
  IEEE Haptics Symposium 2012

  3. What is pattern recognition?
  • the question even appears as a title in the International Association for Pattern Recognition (IAPR) newsletter
  • many definitions exist
  • simply: the process of labeling observations (x) with predefined categories (w)

  4. Various applications of PR [Jain et al., 2000]

  5. Supervised learning
  • [Figure: three example images, each labeled “tufa”]
  • Can you identify other “tufa”s here?
  • lifted from lecture notes by Josh Tenenbaum

  6. Unsupervised learning
  • How many categories are there? Which image belongs to which category?
  • lifted from lecture notes by Josh Tenenbaum

  7. Pattern recognition in haptics/HCI [Altun et al., 2010a]
  • human activity recognition with body-worn inertial sensors (accelerometers and gyroscopes)
  • daily activities: sitting, standing, walking, stairs, etc.
  • sports activities: walking/running, cycling, rowing, basketball, etc.

  8. Pattern recognition in haptics/HCI [Altun et al., 2010a]
  [Figure: right-arm and left-arm accelerometer signals during walking and basketball]

  9. Pattern recognition in haptics/HCI [Flagg et al., 2012]
  • touch gesture recognition on a conductive fur patch

  10. Pattern recognition in haptics/HCI [Flagg et al., 2012]
  [Figure: example gestures on the fur patch: light touch, stroke, scratch]

  11. Other haptics/HCI applications?

  12. Pattern recognition example [Duda et al., 2000]
  • excellent example by Duda et al.: classifying incoming fish on a conveyor belt using a camera image
  • two classes: sea bass and salmon

  13. Pattern recognition example
  • how to classify? what kind of information can distinguish these two species?
  • length, width, weight, etc.
  • suppose a fisherman tells us that salmon are usually shorter
  • so, let's use length as a feature
  • how to classify? capture image – find the fish in the image – measure its length – make a decision
  • how do we make the decision? how do we find the threshold?
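The decision step above boils down to a one-line threshold rule. A minimal sketch in Python; the threshold value and the unit are made-up placeholders, not taken from the slides:

```python
# Hypothetical length-threshold classifier; 50 cm is an arbitrary cutoff.
def classify_by_length(length_cm, threshold_cm=50.0):
    """Label a fish 'salmon' if it is shorter than the threshold, else 'sea bass'."""
    return "salmon" if length_cm < threshold_cm else "sea bass"

print(classify_by_length(35.0))  # salmon
print(classify_by_length(80.0))  # sea bass
```

How to pick a good threshold from data is exactly the question the next slides address.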

  14. Pattern recognition example [Duda et al., 2000]

  15. Pattern recognition example
  • on average, salmon are shorter, but is length a good feature?
  • let's try classifying according to the lightness of the fish scales

  16. Pattern recognition example [Duda et al., 2000]

  17. Pattern recognition example
  • how to choose the threshold?

  18. Pattern recognition example
  • how to choose the threshold? minimize the probability of error
  • sometimes we should also consider the costs of different errors
  • salmon is more expensive
  • customers who order salmon but get sea bass instead will be angry
  • customers who order sea bass but occasionally get salmon instead will not be too unhappy
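One simple way to realize "minimize the probability of error" is to pick the threshold that misclassifies the fewest labeled training samples. A sketch under that assumption; the lightness values below are invented for illustration:

```python
# Made-up lightness measurements for labeled training fish.
salmon = [2.1, 2.8, 3.0, 3.5, 4.0]
sea_bass = [4.2, 4.8, 5.1, 5.5, 6.0]

def error(threshold):
    # count misclassifications: salmon at or above the threshold,
    # sea bass below it
    return (sum(x >= threshold for x in salmon)
            + sum(x < threshold for x in sea_bass))

# try every observed value as a candidate threshold
candidates = sorted(salmon + sea_bass)
best = min(candidates, key=error)
print(best, error(best))  # 4.2 0
```

Weighting the two error counts differently (e.g. penalizing "sea bass sold as salmon" more) would implement the cost-sensitive variant mentioned above.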

  19. Pattern recognition example
  • we don't have to use just one feature: let's use lightness and width
  • each point is a feature vector; the 2-D plane is the feature space
  [Duda et al., 2000]

  20. Pattern recognition example
  • we don't have to use just one feature: let's use lightness and width
  • each point is a feature vector; the 2-D plane is the feature space
  • the curve separating the two classes is the decision boundary
  [Duda et al., 2000]
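With two features, a decision boundary can be as simple as a line in the (lightness, width) plane. A sketch of a linear boundary; the weights w and offset b are arbitrary illustration values, not fitted to any data:

```python
# Hypothetical linear decision boundary: classify by which side of the
# line w[0]*lightness + w[1]*width + b = 0 the feature vector falls on.
def classify(feature_vector, w=(1.0, -0.5), b=-2.0):
    lightness, width = feature_vector
    score = w[0] * lightness + w[1] * width + b
    return "sea bass" if score > 0 else "salmon"

print(classify((5.0, 2.0)))  # sea bass
print(classify((1.0, 3.0)))  # salmon
```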

  21. Pattern recognition example
  • should we add as many features as we can?
  • do not use redundant features

  22. Pattern recognition example
  • should we add as many features as we can?
  • do not use redundant features
  • consider noise in the measurements

  23. Pattern recognition example
  • should we add as many features as we can?
  • do not use redundant features
  • consider noise in the measurements
  • moreover, avoid adding too many features
  • more features means higher-dimensional feature vectors, and it is difficult to work in high-dimensional spaces
  • this is called the curse of dimensionality – more on this later

  24. Pattern recognition example
  • how to choose the decision boundary? is this one better?
  [Duda et al., 2000]

  25. Pattern recognition example
  • how to choose the decision boundary? is this one better?
  [Duda et al., 2000]

  26. Probability theory review
  • a chance experiment, e.g., tossing a 6-sided die
  • 1, 2, 3, 4, 5, 6 are the possible outcomes
  • the set of all outcomes, Ω = {1, 2, 3, 4, 5, 6}, is the sample space
  • any subset of the sample space is an event, e.g., the event that the outcome is odd: A = {1, 3, 5}
  • each event is assigned a number P(A), called the probability of the event
  • the assigned probabilities can be selected freely, as long as the Kolmogorov axioms are not violated

  27. Probability axioms
  • for any event A: P(A) ≥ 0
  • for the sample space: P(Ω) = 1
  • for disjoint events A and B: P(A ∪ B) = P(A) + P(B)
  • the third axiom also covers countable unions of pairwise disjoint events
  • die tossing: if all outcomes are equally likely, the probability of getting outcome i is 1/6 for all i = 1, …, 6
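The fair-die assignment can be checked against the three axioms mechanically; a small sketch using exact rational arithmetic:

```python
from fractions import Fraction

# The fair-die probability assignment: P({i}) = 1/6 for each outcome.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}

assert all(p >= 0 for p in pmf.values())   # axiom 1: non-negativity
assert sum(pmf.values()) == 1              # axiom 2: P(sample space) = 1

A = {1, 3, 5}                              # the "outcome is odd" event
P_A = sum(pmf[i] for i in A)               # axiom 3: sum disjoint outcomes
print(P_A)  # 1/2
```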

  28. Conditional probability
  • sometimes events occur and change the probabilities of other events
  • example: ten coins in a bag
  • nine of them are fair coins – heads (H) and tails (T)
  • one of them is fake – both sides are heads (H)
  • I randomly draw one coin from the bag, but I don’t show it to you
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • which of these events would you bet on?

  29. Conditional probability
  • suppose I flip the coin five times, obtaining the outcome HHHHH (five heads in a row); call this event F
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • which of these events would you bet on now?

  30. Conditional probability
  • definition: the conditional probability of event A given that event B has occurred, read as "probability of A given B": P(A|B) = P(AB) / P(B)
  • P(AB) is the probability of events A and B occurring together
  • Bayes’ theorem: P(A|B) = P(B|A) P(A) / P(B)
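Bayes' theorem is a one-line computation. A sketch with exact fractions; the three input probabilities below are arbitrary illustration values:

```python
from fractions import Fraction

# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
def bayes(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

# arbitrary numbers, just to exercise the formula:
print(bayes(Fraction(1, 2), Fraction(1, 3), Fraction(1, 4)))  # 2/3
```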

  31. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • we know that F occurred, and we want to find P(H0|F) and P(H1|F)
  • computing these directly is difficult – use Bayes’ theorem

  32. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • Bayes’ theorem gives P(H0|F) = P(F|H0) P(H0) / P(F)

  33. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • in P(H0|F) = P(F|H0) P(H0) / P(F):
  • P(F|H0) is the probability of observing F if H0 was true
  • P(H0) is the prior probability (before the observation F)
  • P(H0|F) is the posterior probability
  • P(F) is the total probability of observing F

  34. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • the total probability of observing F: P(F) = P(F|H0) P(H0) + P(F|H1) P(H1)

  35. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • substituting P(F|H0) = 1 (the fake coin always shows heads): P(H0|F) = 1 · P(H0) / (1 · P(H0) + P(F|H1) P(H1))

  36. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • substituting P(F|H0) = 1 and P(H0) = 1/10: P(H0|F) = 1 · (1/10) / (1 · (1/10) + P(F|H1) P(H1))

  37. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • substituting also P(F|H1) = (1/2)⁵ = 1/32: P(H0|F) = 1 · (1/10) / (1 · (1/10) + (1/32) P(H1))

  38. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • substituting also P(H1) = 9/10: P(H0|F) = 1 · (1/10) / (1 · (1/10) + (1/32) · (9/10))

  39. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • P(H0|F) = 1 · (1/10) / (1 · (1/10) + (1/32) · (9/10)) = 32/41
  • which event would you bet on?

  40. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • P(H0|F) = 1 · (1/10) / (1 · (1/10) + (1/32) · (9/10)) = 32/41
  • this is very similar to a pattern recognition problem!

  41. Conditional probability
  • H0: the coin is fake, both sides H
  • H1: the coin is fair – one side H, the other side T
  • F: obtaining five heads in a row (HHHHH)
  • P(H0|F) = 1 · (1/10) / (1 · (1/10) + (1/32) · (9/10)) = 32/41
  • we can put a label on the coin as “fake” based on our observations!
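The whole coin computation from the slides above can be reproduced with exact arithmetic:

```python
from fractions import Fraction

# Priors and likelihoods from the coin example:
# P(H0) = 1/10, P(H1) = 9/10; P(F|H0) = 1, P(F|H1) = (1/2)^5 = 1/32.
p_h0, p_h1 = Fraction(1, 10), Fraction(9, 10)
p_f_h0, p_f_h1 = Fraction(1), Fraction(1, 2) ** 5

p_f = p_f_h0 * p_h0 + p_f_h1 * p_h1   # total probability of F: 41/320
p_h0_f = p_f_h0 * p_h0 / p_f          # Bayes' theorem
print(p_h0_f)  # 32/41
```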

  42. Bayesian inference
  • w0: the coin belongs to the “fake” class
  • w1: the coin belongs to the “fair” class
  • x: observation
  • decide wi if the posterior probability P(wi|x) is higher than all others
  • this is called the MAP (maximum a posteriori) decision rule
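Since the evidence P(x) is the same for every class, the MAP rule reduces to picking the class with the largest likelihood × prior. A minimal sketch, using the coin example (where the observation is the number of heads in a row) as the test case:

```python
# MAP decision: argmax over classes of likelihood(x) * prior.
def map_decide(priors, likelihoods, x):
    return max(priors, key=lambda c: likelihoods[c](x) * priors[c])

priors = {"fake": 0.1, "fair": 0.9}
likelihoods = {"fake": lambda n: 1.0,       # fake coin always shows heads
               "fair": lambda n: 0.5 ** n}  # n heads in a row, fair coin

print(map_decide(priors, likelihoods, 5))  # fake
print(map_decide(priors, likelihoods, 1))  # fair
```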

  43. Random variables
  • we model the observations with random variables
  • a random variable is a real number whose value depends on a chance experiment
  • discrete random variable: the possible values form a discrete set
  • continuous random variable: the possible values form a continuous set

  44. Random variables
  • a discrete random variable X is characterized by a probability mass function (pmf): p(x) = P(X = x)
  • a pmf has two properties: p(x) ≥ 0 for all x, and Σx p(x) = 1

  45. Random variables
  • a continuous random variable X is characterized by a probability density function (pdf), denoted by f(x), for all possible values x
  • probabilities are calculated for intervals: P(a ≤ X ≤ b) is the integral of f(x) from a to b

  46. Random variables
  • a pdf also has two properties: f(x) ≥ 0 for all x, and the integral of f(x) over the whole real line is 1
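Both properties, and the interval probabilities from the previous slide, can be checked numerically. A sketch using a uniform density on [0, 1] as a simple stand-in pdf:

```python
# Uniform pdf on [0, 1]: f(x) = 1 inside the interval, 0 outside.
def f(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# property 2: total area under the pdf is 1 (midpoint Riemann sum)
n = 200_000
dx = 2.0 / n
area = sum(f(-0.5 + (i + 0.5) * dx) * dx for i in range(n))
print(round(area, 3))  # 1.0

# probabilities live on intervals: P(0.2 <= X <= 0.5)
m = 10_000
dx2 = 0.3 / m
p = sum(f(0.2 + (i + 0.5) * dx2) * dx2 for i in range(m))
print(round(p, 3))  # 0.3
```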

  47. Expectation
  • definition: E[X] = Σx x p(x) for a discrete X, or the integral of x f(x) for a continuous X
  • the average of the possible values of X, weighted by their probabilities
  • also called the expected value, or mean
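The definition can be applied directly to the fair-die pmf from the earlier slides:

```python
from fractions import Fraction

# E[X] = sum over x of x * p(x), for a fair six-sided die.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 7/2
```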

  48. Variance and standard deviation
  • variance is the expected value of the squared deviation from the mean: Var(X) = E[(X − μ)²]
  • variance is always positive – or zero, which means X is not random
  • the standard deviation is the square root of the variance
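Continuing the fair-die example, the variance comes straight from the definition:

```python
from fractions import Fraction
import math

# Var(X) = E[(X - mean)^2]; standard deviation is its square root.
pmf = {i: Fraction(1, 6) for i in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())                # 7/2
var = sum((x - mean) ** 2 * p for x, p in pmf.items())   # 35/12
std = math.sqrt(var)
print(var)  # 35/12
```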

  49. Gaussian (normal) distribution
  • possibly the most ''natural'' distribution, encountered frequently in nature
  • central limit theorem: the (suitably normalized) sum of many i.i.d. random variables tends to a Gaussian
  • definition: the random variable with pdf f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²))
  • two parameters: the mean μ and the standard deviation σ
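The pdf above translates directly into code:

```python
import math

# Gaussian pdf: f(x) = 1/(sigma*sqrt(2*pi)) * exp(-(x-mu)^2 / (2*sigma^2))
def gaussian_pdf(x, mu=0.0, sigma=1.0):
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

print(round(gaussian_pdf(0.0), 4))  # 0.3989  (peak of the standard normal)
```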

  50. Gaussian distribution
  • it can be proved that about 68% of the probability lies within μ ± σ, about 95% within μ ± 2σ, and about 99.7% within μ ± 3σ
  • figure lifted from http://assets.allbusiness.com
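These standard figures can be verified numerically: for a Gaussian, the probability of landing within k standard deviations of the mean equals erf(k/√2).

```python
import math

# P(|X - mu| <= k*sigma) = erf(k / sqrt(2)) for a Gaussian X.
for k in (1, 2, 3):
    print(k, round(math.erf(k / math.sqrt(2.0)), 4))
# 1 0.6827
# 2 0.9545
# 3 0.9973
```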
