
Presentation Transcript


  1. Opinionated Lessons in Statistics by Bill Press #4 The Jailer’s Tip Professor William H. Press, Department of Computer Science, the University of Texas at Austin

  2. Our next topic is Bayesian Estimation of Parameters. We’ll ease into it with an example that looks a lot like the Monty Hall Problem: The Jailer’s Tip: • Of 3 prisoners (A, B, C), 2 will be released tomorrow. • A, who thinks he has a 2/3 chance of being released, asks the jailer for the name of one of the lucky ones – but not his own. • The jailer says, truthfully, “B”. • “Darn,” thinks A, “now my chances are only 1/2 – it’s C or me.” Is this like Monty Hall? Did the data (“B”) change the probabilities?
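
A quick empirical check before the algebra: the sketch below is not from the slides – it is a minimal Python simulation, with a simulate() helper invented here, that plays the jail many times with an indifferent jailer (x = 1/2) and estimates A’s chances given that the jailer says “B”.

    import random

    def simulate(n_trials=200_000, x=0.5):
        """Estimate P(A released | jailer says "B"), where x is the
        probability the jailer answers "B" when both B and C go free."""
        says_b = 0
        a_free_and_says_b = 0
        for _ in range(n_trials):
            pair = random.choice([("A", "B"), ("B", "C"), ("C", "A")])
            if pair == ("A", "B"):
                answer = "B"                 # only B can be named
            elif pair == ("B", "C"):
                answer = "B" if random.random() < x else "C"
            else:                            # pair ("C", "A")
                answer = "C"                 # only C can be named
            if answer == "B":
                says_b += 1
                if "A" in pair:
                    a_free_and_says_b += 1
        return a_free_and_says_b / says_b

    print(simulate())   # ~0.667: A's chances are still 2/3, not 1/2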

  3. Further, suppose (unlike Monty Hall) the jailer is not indifferent about responding “B” versus “C”. Does that change your answer to the previous question? Let S_B be the event “jailer says B”, and let

P(S_B \mid BC) \equiv x, \qquad 0 \le x \le 1

Since P(S_B \mid CA) = 0 (with the pair CA the jailer must say “C”), only one term survives in

P(A \mid S_B) = P(AB \mid S_B) + P(AC \mid S_B)

and Bayes gives

P(AB \mid S_B) = \frac{P(S_B \mid AB)\,P(AB)}{P(S_B \mid AB)\,P(AB) + P(S_B \mid BC)\,P(BC) + P(S_B \mid CA)\,P(CA)} = \frac{1 \cdot \frac{1}{3}}{1 \cdot \frac{1}{3} + x \cdot \frac{1}{3} + 0 \cdot \frac{1}{3}} = \frac{1}{1+x}

So if A knows the value x, he can calculate his chances. If x = 1/2 (like Monty Hall), his chances are 2/3, same as before; so (unlike Monty Hall) he got no new information. If x ≠ 1/2, he does get new info – his chances change. But what if he doesn’t know x at all?
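
As a sanity check on the algebra, a sketch reusing the simulate() helper defined above reproduces 1/(1+x) across the whole range of x:

    for x in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"x={x:4.2f}  simulated={simulate(x=x):.3f}  analytic={1/(1+x):.3f}")
    # Only x = 1/2 leaves A at his prior 2/3; any other x is new information.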

  4. “Marginalization” (this is important!)
• When a model has unknown, or uninteresting, parameters, we “integrate them out”…
• …multiplying by any knowledge of their distribution
• At worst, just a prior informed by background information
• At best, a narrower distribution based on data
• This is not any new assumption about the world – it’s just the law of de-Anding (the x’s are EME: exhaustive and mutually exclusive!):

P(A \mid S_B I) = \int_x P(A \mid S_B x I)\, p(x \mid I)\, dx

e.g., the Jailer’s Tip:

P(A \mid S_B I) = \int \frac{1}{1+x}\, p(x \mid I)\, dx
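
A minimal numerical sketch of this marginalization, assuming – purely for illustration, since the slides have not yet specified p(x|I) – a Beta(2,2) prior that concentrates x near 1/2, and using SciPy’s quad integrator:

    from scipy.integrate import quad
    from scipy.stats import beta

    prior = beta(2, 2)   # hypothetical background knowledge: x probably near 1/2

    # P(A | S_B, I) = integral of 1/(1+x) * p(x|I) over x in [0, 1]
    posterior, _err = quad(lambda x: 1.0 / (1.0 + x) * prior.pdf(x), 0.0, 1.0)
    print(posterior)     # ~0.682: close to, but not exactly, 2/3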

  5. Repeating the previous equation:

P(A \mid S_B I) = \int_x P(A \mid S_B x I)\, p(x \mid I)\, dx = \int \frac{1}{1+x}\, p(x \mid I)\, dx

This is the first time we’ve seen a continuous probability distribution, but we’ll skip the obvious repetition of all the previous laws. The density p(x) plays the role of the discrete P_i, with

\sum_i P_i = 1 \quad \Longleftrightarrow \quad \int p(x)\, dx = 1

(Notice that p(x) is a probability of a probability! That is fairly common in Bayesian inference.)
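
The “probability of a probability” reading can be made concrete by sampling. A short sketch, continuing with the hypothetical Beta prior above: draw the jailer’s bias x from p(x|I) and average A’s chances, which is just a Monte Carlo version of the integral.

    import numpy as np

    # Draw x itself from its distribution, then average A's chances 1/(1+x).
    xs = prior.rvs(size=200_000, random_state=np.random.default_rng(0))
    print(np.mean(1.0 / (1.0 + xs)))     # agrees with the quad() result above
    print(quad(prior.pdf, 0.0, 1.0)[0])  # and p(x) normalizes to 1, like sum(P_i)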

  6. Repeating the previous equation:

P(A \mid S_B I) = \int \frac{1}{1+x}\, p(x \mid I)\, dx

What should Prisoner A take for p(x)? Maybe the “uniform prior”,

p(x) = 1, \qquad 0 \le x \le 1: \qquad P(A \mid S_B I) = \int_0^1 \frac{dx}{1+x} = \ln 2 \approx 0.693

Not the same as the “massed prior at x = 1/2”,

p(x) = \delta(x - \tfrac{1}{2})

(the “Dirac delta function”: substitute the value and remove the integral):

P(A \mid S_B I) = \frac{1}{1 + 1/2} = \frac{2}{3} \approx 0.667
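
The two numbers on the slide can be verified in a couple of lines (math module only; the names uniform and massed are just illustrative):

    import math

    uniform = math.log(2.0)      # uniform prior: integral of dx/(1+x) on [0,1]
    massed = 1.0 / (1.0 + 0.5)   # delta prior at x = 1/2: just substitute
    print(f"uniform: {uniform:.3f}   massed: {massed:.3f}")   # 0.693 vs 0.667

Different priors give different answers – which is exactly the point of the comparison.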
