ELEC 303 – Random Signals

Presentation Transcript


  1. ELEC 303 – Random Signals Lecture 17 – Hypothesis testing 2 Dr. Farinaz Koushanfar ECE Dept., Rice University Nov 2, 2009

  2. Outline • Reading: 8.2, 9.3 • Bayesian hypothesis testing • Likelihood-based hypothesis testing

  3. Four versions of MAP rule • Θ discrete, X discrete • Θ discrete, X continuous • Θ continuous, X discrete • Θ continuous, X continuous
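
A minimal sketch (not from the slides) of the first two versions, those with discrete Θ, using made-up priors and distributions: the only change between them is whether a conditional PMF or a conditional PDF is evaluated at the observed x.

    import math

    # MAP rule with a discrete parameter Theta in {1, 2} (illustrative priors).
    prior = {1: 0.4, 2: 0.6}

    # Version 1: X discrete -- use the conditional PMF p_{X|Theta}(x|theta).
    pmf = {1: {0: 0.8, 1: 0.2}, 2: {0: 0.3, 1: 0.7}}
    x = 1
    print(max(prior, key=lambda t: prior[t] * pmf[t][x]))

    # Version 2: X continuous -- use the conditional PDF f_{X|Theta}(x|theta) instead.
    def normal_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    pdf = {1: lambda x: normal_pdf(x, 0.0, 1.0), 2: lambda x: normal_pdf(x, 2.0, 1.0)}
    x = 1.3
    print(max(prior, key=lambda t: prior[t] * pdf[t](x)))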

  4. Example – spam filter • Email may be spam or legitimate • Parameter Θ, taking values θ1, θ2, corresponding to spam/legitimate; prior probabilities pΘ(θ1), pΘ(θ2) given • Let w1,…,wn be a collection of special words whose appearance suggests a spam message • For each i, let Xi be the Bernoulli RV that denotes the appearance of wi in the message • Assume that the conditional probabilities of appearance are known • Use the MAP rule to decide whether a message is spam or not
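
A minimal sketch of this MAP spam decision. The word list, priors, and per-word probabilities below are made up for illustration, and the word indicators are assumed conditionally independent given Θ; none of these numbers come from the lecture.

    # MAP spam decision from Bernoulli word-appearance features (illustrative numbers).
    priors = {"spam": 0.3, "legit": 0.7}                 # assumed p_Theta(theta1), p_Theta(theta2)
    p_word = {                                           # assumed P(X_i = 1 | theta) per word
        "winner":  {"spam": 0.40, "legit": 0.02},
        "free":    {"spam": 0.55, "legit": 0.10},
        "meeting": {"spam": 0.05, "legit": 0.30},
    }

    def map_decision(x):
        """x maps each special word to 1 (appears) or 0 (absent)."""
        scores = {}
        for theta, prior in priors.items():
            score = prior
            for word, appeared in x.items():
                p1 = p_word[word][theta]
                score *= p1 if appeared else (1.0 - p1)
            scores[theta] = score                        # proportional to the posterior of theta
        return max(scores, key=scores.get), scores

    print(map_decision({"winner": 1, "free": 1, "meeting": 0}))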

  5. Bayesian Hypothesis testing • Binary hypothesis: two cases • Once the value x of X is observed, use the Bayes rule to calculate the posterior pΘ|X(θ|x) • Select the hypothesis with the larger posterior • If gMAP(x) is the selected hypothesis, the probability of a correct decision is P(Θ = gMAP(x) | X = x) • If Si is the set of all x for which the MAP rule selects hypothesis θi, the overall probability of a correct decision is P(Θ = gMAP(X)) = Σi P(Θ = θi, X ∈ Si) • The probability of error is Σi P(Θ ≠ θi, X ∈ Si)
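
A small numeric sketch of these formulas for a binary Θ and a binary X, with made-up priors and conditional PMFs: for each observed x it forms the posterior by Bayes' rule, picks the larger one, and accumulates P(Θ = θi, X ∈ Si) to get the probability of a correct decision.

    # Bayesian binary hypothesis testing by enumeration (illustrative numbers).
    p_theta = {1: 0.6, 2: 0.4}                           # assumed priors
    p_x_given = {1: {0: 0.7, 1: 0.3},                    # assumed p_{X|Theta}(x|theta)
                 2: {0: 0.2, 1: 0.8}}

    p_correct = 0.0
    for x in (0, 1):
        joint = {t: p_theta[t] * p_x_given[t][x] for t in p_theta}
        px = sum(joint.values())
        posterior = {t: joint[t] / px for t in joint}    # p_{Theta|X}(theta|x) by Bayes' rule
        g_map = max(posterior, key=posterior.get)        # hypothesis with the larger posterior
        p_correct += joint[g_map]                        # adds P(Theta = g_map(x), X = x)
        print(f"x={x}: posterior={posterior}, decide theta={g_map}")

    print("P(correct) =", p_correct, " P(error) =", 1 - p_correct)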

  6. Multiple hypothesis

  7. Example – biased coin, single toss • Two biased coins, with head probabilities p1 and p2 • Randomly select a coin and infer its identity based on a single toss • Θ=1 (Hypothesis 1), Θ=2 (Hypothesis 2) • X=0 (tail), X=1 (head) • MAP compares pΘ(1)pX|Θ(x|1) vs. pΘ(2)pX|Θ(x|2) • Compare pX|Θ(x|1) and pX|Θ(x|2) (WHY? Because the coin is picked at random, the priors are equal and cancel) • E.g., p1 = 0.46 and p2 = 0.52, and the outcome is a tail
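
Plugging in the numbers from the slide (equal priors, so only the likelihoods matter):

    # Single-toss MAP decision between the two coins from the slide.
    p1, p2 = 0.46, 0.52                  # head probabilities under hypotheses 1 and 2
    x = 0                                # observed outcome: 0 = tail, 1 = head

    lik1 = p1 if x == 1 else 1 - p1      # P(X=x | Theta=1) = 0.54 for a tail
    lik2 = p2 if x == 1 else 1 - p2      # P(X=x | Theta=2) = 0.48 for a tail
    # Priors are equal (coin picked at random), so MAP reduces to comparing likelihoods.
    print("decide coin", 1 if lik1 >= lik2 else 2)       # -> coin 1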

  8. Example – biased coin, multiple tosses • Assume that we toss the selected coin n times • Let X be the number of heads obtained • Which coin does the MAP rule pick now?
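
One way to answer this, a sketch assuming the same coins and equal priors as before: with k heads in n tosses, the MAP rule compares the two binomial likelihoods (the binomial coefficient is the same on both sides, so it cancels). The values of n and k below are illustrative.

    from math import comb

    # MAP decision after n tosses with k heads (same coins as the single-toss example).
    p1, p2 = 0.46, 0.52
    n, k = 50, 28                                        # illustrative observation

    lik1 = comb(n, k) * p1**k * (1 - p1)**(n - k)        # P(X=k | Theta=1)
    lik2 = comb(n, k) * p2**k * (1 - p2)**(n - k)        # P(X=k | Theta=2)
    print("decide coin", 1 if lik1 >= lik2 else 2)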

  9. Example – signal detection and matched filter • A transmitter sends one of two messages, Θ=1 or Θ=2 • Messages are expanded into signals: if Θ=1, S=(a1,a2,…,an); if Θ=2, S=(b1,b2,…,bn) • The receiver observes the signal corrupted by noise: Xi = Si + Wi, i=1,…,n • Assume the Wi are independent N(0,1)
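
A sketch of the resulting receiver, assuming equal priors: with independent N(0,1) noise, the MAP rule picks the candidate signal closest to the observation in Euclidean distance, which can be rewritten as comparing the matched-filter statistics Σ xi·ai − ||a||²/2 and Σ xi·bi − ||b||²/2. The signal vectors and the noise draw below are made up.

    import random

    # Matched-filter (minimum-distance) receiver for two known signals in N(0,1) noise.
    a = [1.0, 1.0, -1.0, 1.0]            # signal sent when Theta = 1 (illustrative)
    b = [-1.0, 1.0, 1.0, -1.0]           # signal sent when Theta = 2 (illustrative)

    truth = a                            # pretend Theta = 1 was sent
    x = [s + random.gauss(0.0, 1.0) for s in truth]      # X_i = S_i + W_i

    def score(x, s):
        # Correlation statistic: sum_i x_i s_i - ||s||^2 / 2, which equals
        # -||x - s||^2 / 2 up to a term that does not depend on s.
        return sum(xi * si for xi, si in zip(x, s)) - 0.5 * sum(si * si for si in s)

    print("decide Theta =", 1 if score(x, a) >= score(x, b) else 2)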

  10. Likelihood Approach to Binary Hypothesis Testing

  11. BHT and Associated Error

  12. Likelihood Approach to BHT (Cont’d)

  13. Likelihood Approach to BHT (Cont’d)

  14. Binary hypothesis testing • H0: null hypothesis, H1: alternative hypothesis • Observation vector X=(X1,…,Xn) • The distribution of the elements of X depends on the hypothesis • P(X ∈ A; Hj) denotes the probability that X belongs to a set A when Hj is true

  15. Rejection/acceptance • A decision rule: a partition of the set of all possible values of the observation vector into two subsets, the “rejection region” and the “acceptance region” • Two possible errors for a given rejection region: • Type I error (false rejection): reject H0, even though H0 is true • Type II error (false acceptance): accept H0, even though H0 is false

  16. Probability of regions • False rejection: happens with probability α(R) = P(X ∈ R; H0) • False acceptance: happens with probability β(R) = P(X ∉ R; H1)
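
A tiny numeric illustration of these two quantities, with made-up PMFs under the two hypotheses and a rejection region chosen by hand:

    # False-rejection and false-acceptance probabilities for a given rejection region R.
    p_H0 = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}            # assumed p_X(x; H0)
    p_H1 = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}              # assumed p_X(x; H1)
    R = {2, 3}                                           # rejection region (reject H0 here)

    alpha = sum(p for x, p in p_H0.items() if x in R)        # P(X in R; H0)  = false rejection
    beta  = sum(p for x, p in p_H1.items() if x not in R)    # P(X not in R; H1) = false acceptance
    print("alpha =", alpha, " beta =", beta)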

  17. Analogy with Bayesian • Assume that we have two hypotheses, Θ=θ0 and Θ=θ1, with priors pΘ(θ0) and pΘ(θ1) • The overall probability of error is minimized using the MAP rule: • Given the observed value x of X, decide Θ=θ1 if • pΘ(θ0)pX|Θ(x|θ0) < pΘ(θ1)pX|Θ(x|θ1) • Define: ξ = pΘ(θ0) / pΘ(θ1) • L(x) = pX|Θ(x|θ1) / pX|Θ(x|θ0) • Decide Θ=θ1 if the observed value x satisfies the inequality L(x) > ξ
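
A short sketch of this correspondence with made-up priors and conditional PMFs: the MAP comparison, rearranged, is exactly a likelihood ratio test with threshold ξ = pΘ(θ0)/pΘ(θ1).

    # The MAP rule as an LRT: decide theta1 when L(x) > xi with xi = p(theta0) / p(theta1).
    p0, p1 = 0.6, 0.4                                    # assumed priors
    xi = p0 / p1                                         # = 1.5

    p_x_theta0 = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}      # assumed p_{X|Theta}(x|theta0)
    p_x_theta1 = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}        # assumed p_{X|Theta}(x|theta1)

    for x in p_x_theta0:
        L = p_x_theta1[x] / p_x_theta0[x]
        # Same comparison as p0 * p(x|theta0) < p1 * p(x|theta1), just rearranged.
        print(f"x={x}: L(x)={L:.2f} -> decide", "theta1" if L > xi else "theta0")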

  18. More on testing • Motivated by the MAP rule, the rejection region has the form R = {x | L(x) > ξ} • The likelihood ratio test (LRT) • Discrete: L(x) = pX(x; H1) / pX(x; H0) • Continuous: L(x) = fX(x; H1) / fX(x; H0)
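
A minimal sketch of the discrete form of the test, reusing the same illustrative PMFs as above: compute L(x) from the two PMFs and reject H0 whenever it exceeds the threshold ξ (written xi below).

    # Discrete likelihood ratio test: reject H0 when L(x) = p_X(x; H1) / p_X(x; H0) > xi.
    def likelihood_ratio(x, p_H0, p_H1):
        return p_H1[x] / p_H0[x]

    def lrt_reject(x, p_H0, p_H1, xi):
        return likelihood_ratio(x, p_H0, p_H1) > xi

    p_H0 = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}            # assumed p_X(x; H0)
    p_H1 = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}              # assumed p_X(x; H1)
    print([x for x in p_H0 if lrt_reject(x, p_H0, p_H1, xi=1.0)])   # rejection region for xi = 1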

  19. Example • Six-sided die • Two hypotheses about its PMF • Find the likelihood ratio test (LRT) and the probability of error
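
A sketch of this kind of calculation; the two die PMFs below are invented for illustration and are not the ones used in class.

    from fractions import Fraction as F

    # LRT for a six-sided die with two hypothesized PMFs (illustrative).
    p_H0 = {k: F(1, 6) for k in range(1, 7)}                     # fair die under H0
    p_H1 = {1: F(1, 12), 2: F(1, 12), 3: F(1, 12),               # die loaded toward 5 and 6 under H1
            4: F(1, 12), 5: F(1, 3),  6: F(1, 3)}

    L = {x: p_H1[x] / p_H0[x] for x in p_H0}                     # likelihood ratio per face
    xi = 1                                                       # threshold (e.g., equal priors)
    R = {x for x in L if L[x] > xi}                              # rejection region

    alpha = sum(p_H0[x] for x in R)                              # false rejection probability
    beta  = sum(p_H1[x] for x in p_H1 if x not in R)             # false acceptance probability
    print("R =", sorted(R), " alpha =", alpha, " beta =", beta)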

  20. Error probabilities for LRT • Choosing ξ trades off between the two error types: as ξ increases, the rejection region becomes smaller • The false rejection probability α(R) decreases • The false acceptance probability β(R) increases
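
Sweeping ξ over the earlier illustrative PMFs makes the trade-off visible: α falls and β rises as ξ grows.

    # Sweep the LRT threshold xi and watch the two error probabilities trade off.
    p_H0 = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}
    p_H1 = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
    L = {x: p_H1[x] / p_H0[x] for x in p_H0}

    for xi in (0.1, 0.5, 1.0, 3.0, 10.0):
        R = {x for x in L if L[x] > xi}                  # rejection region shrinks as xi grows
        alpha = sum(p_H0[x] for x in R)                  # false rejection: decreases with xi
        beta = sum(p_H1[x] for x in p_H1 if x not in R)  # false acceptance: increases with xi
        print(f"xi={xi:>4}: alpha={alpha:.2f}, beta={beta:.2f}")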

  21. LRT • Start with a target value α for the false rejection probability • Choose a value ξ such that the false rejection probability is equal to α: P(L(X) > ξ; H0) = α • Once the value x of X is observed, reject H0 if L(x) > ξ • Typical choices for α are 0.1, 0.05, and 0.01
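
A sketch of this calibration for a discrete observation, reusing the illustrative PMFs: because X is discrete here, only finitely many false-rejection probabilities are achievable, so the code settles for the smallest ξ whose false-rejection probability does not exceed the target α.

    # Calibrate the LRT threshold xi to a target false-rejection probability alpha.
    p_H0 = {0: 0.5, 1: 0.3, 2: 0.15, 3: 0.05}
    p_H1 = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
    target_alpha = 0.05

    L = {x: p_H1[x] / p_H0[x] for x in p_H0}
    # Candidate thresholds: the attainable likelihood-ratio values themselves.
    for xi in sorted(set(L.values())):
        alpha = sum(p_H0[x] for x in L if L[x] > xi)     # P(L(X) > xi; H0)
        if alpha <= target_alpha:
            print("xi =", xi, " achieved alpha =", alpha)
            break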

  22. Requirements for LRT • Ability to compute L(x) for the observed value x of X • Compare L(x) with the critical value ξ • Either use a closed form for L(x) (or log L(x)), or use simulation to approximate its distribution under H0
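
When the distribution of L(X) under H0 is awkward to handle analytically, the false-rejection probability can be approximated by simulation, as the slide suggests. A minimal Monte Carlo sketch, with a pair of Gaussian hypotheses chosen only as a stand-in:

    import math, random

    # Monte Carlo estimate of P(L(X) > xi; H0) for H0: X ~ N(0,1) vs H1: X ~ N(1,1).
    def normal_pdf(x, mu):
        return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

    def L(x):
        return normal_pdf(x, 1.0) / normal_pdf(x, 0.0)   # likelihood ratio f_X(x;H1)/f_X(x;H0)

    xi = 2.0
    n = 200_000
    hits = sum(L(random.gauss(0.0, 1.0)) > xi for _ in range(n))   # draw X under H0
    print("estimated false-rejection probability:", hits / n)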

  23. Example • A camera checks a certain area and records a detection signal • X = W or X = 1 + W, depending on the absence or presence of an intruder (hypotheses H0 and H1, respectively) • Assume W ~ N(0, σ²) • Find the LRT and the acceptance/rejection regions
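
A sketch of working this example through, with σ and the target α chosen for illustration: the likelihood ratio is increasing in x, so the rejection region is a half-line x > t, and t can be set from the target false-rejection probability via the Gaussian tail.

    import math
    from statistics import NormalDist

    # Intruder detection: H0: X = W,  H1: X = 1 + W,  W ~ N(0, sigma^2).
    sigma = 0.5                                          # illustrative noise standard deviation
    alpha = 0.05                                         # target false-rejection probability

    # L(x) = exp((2x - 1) / (2 sigma^2)) is increasing in x, so L(x) > xi  <=>  x > t for some t.
    t = sigma * NormalDist().inv_cdf(1 - alpha)          # chosen so that P(X > t; H0) = alpha
    xi = math.exp((2 * t - 1) / (2 * sigma ** 2))        # equivalent likelihood-ratio threshold
    beta = NormalDist(mu=1, sigma=sigma).cdf(t)          # false acceptance: P(X <= t; H1)
    print(f"reject H0 when x > {t:.3f}  (xi = {xi:.3f}),  beta = {beta:.3f}")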

  24. Example

  25. Example

  26. Example

  27. Example (Cont’d)

  28. Example (Cont’d)

  29. Error Probabilities

  30. Example: Binary Channel

  31. Example: Binary Channel

  32. Example: Binary Channel

  33. Example: More on BHT

  34. Example: More on BHT

  35. Example: More on BHT

  36. Example: More on BHT
