
Opinionated Lessons in Statistics by Bill Press #2: Bayes

Professor William H. Press from the University of Texas at Austin explains Bayes' Theorem and its applications in statistics and inference. Learn how data changes probabilities and the importance of background information in hypothesis testing.


Presentation Transcript


  1. Opinionated Lessons in Statistics by Bill Press #2 Bayes Professor William H. Press, Department of Computer Science, the University of Texas at Austin

  2. Bayes' Theorem (Thomas Bayes, 1702–1761; same picture as before)
Law of And-ing: $P(H_i B) = P(H_i \mid B)\,P(B) = P(B \mid H_i)\,P(H_i)$
Law of de-Anding: $P(B) = \sum_j P(B \mid H_j)\,P(H_j)$
So $P(H_i \mid B) \propto P(B \mid H_i)\,P(H_i)$. We usually write this as
$$P(H_i \mid B) = \frac{P(B \mid H_i)\,P(H_i)}{P(B)} = \frac{P(B \mid H_i)\,P(H_i)}{\sum_j P(B \mid H_j)\,P(H_j)}$$
The sum in the denominator means, "compute the normalization by using the completeness of the $H_j$'s."
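As a quick sanity check of the formula, here is a minimal Python sketch (the function name and toy numbers are illustrative, not from the slides) that computes posteriors by normalizing over a complete set of hypotheses:

```python
# Bayes' theorem as code: posteriors from priors and likelihoods.
# Hypotheses H_i are list positions; P(B) is the normalization,
# computed "using the completeness of the H_i's" (sum over all i).

def posterior(priors, likelihoods):
    """Return P(H_i | B) given P(H_i) and P(B | H_i) for each i."""
    joint = [p * l for p, l in zip(priors, likelihoods)]   # P(B | H_i) P(H_i)
    norm = sum(joint)                                      # P(B) by completeness
    return [j / norm for j in joint]

# Toy check: two equally likely hypotheses, data twice as likely under H_1,
# so the posterior should split roughly 2/3 to 1/3.
print(posterior([0.5, 0.5], [0.2, 0.1]))
```

The posteriors always sum to one by construction, whatever the (unnormalized) priors and likelihoods.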

  3. As a theorem relating probabilities, Bayes is unassailable. • But we will also use it in inference, where the H's are hypotheses, while B is the data • "what is the probability of a hypothesis, given the data?" • some (defined as frequentists) consider this dodgy • others (Bayesians like us) consider this fantastically powerful and useful • in real life, the "war" between Bayesians and frequentists is long since over, and most statisticians adopt a mixture of techniques appropriate to the problem • for a view of the "war", see the Efron paper on the course web site • Note that you generally have to know a complete set of EME (exhaustive and mutually exclusive) hypotheses to use Bayes for inference • perhaps its principal weakness

  4. Let's work a couple of examples using Bayes' law. Example: Trolls Under the Bridge. Trolls are bad. Gnomes are benign. Every bridge has 5 creatures under it: 20% have TTGGG (H1), 20% have TGGGG (H2), 60% have GGGGG (benign) (H3). Before crossing a bridge, a knight captures one of the 5 creatures at random. It is a troll. "I now have an 80% chance of crossing safely," he reasons, "since only in the 20% of cases that had TTGGG (H1) does a troll (TGGG) remain to threaten me."

  5. Applying Bayes:
$$P(H_1 \mid T) = \frac{\tfrac{2}{5}\cdot\tfrac{1}{5}}{\tfrac{2}{5}\cdot\tfrac{1}{5} + \tfrac{1}{5}\cdot\tfrac{1}{5} + 0\cdot\tfrac{3}{5}} = \frac{2}{3}$$
so the knight's chance of crossing safely, $P(H_2 \mid T) = 1/3$, is actually only 33.3%. Before he captured a troll ("saw the data") it was 60%. Capturing a troll actually made things worse! (80% was never the right answer!) Data changes probabilities! Probabilities after assimilating data are called posterior probabilities.
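The same arithmetic in a short Python sketch, with the priors and likelihoods taken straight from the example:

```python
# Troll example: priors P(H1)=0.2 (TTGGG), P(H2)=0.2 (TGGGG),
# P(H3)=0.6 (GGGGG), and likelihoods P(T | H_i) of capturing a troll
# when one of the 5 creatures is drawn at random.
priors      = [0.2, 0.2, 0.6]
likelihoods = [2/5, 1/5, 0]

joint = [p * l for p, l in zip(priors, likelihoods)]   # P(T | H_i) P(H_i)
posteriors = [j / sum(joint) for j in joint]           # normalize by P(T)

print(posteriors)        # roughly [2/3, 1/3, 0]
print(posteriors[1])     # the knight's real chance of safety: about 0.333
```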

  6. Commutativity/Associativity of Evidence. What we want is $P(H_i \mid D_1 D_2)$. There are two ways to get there.
A. We see the data sequentially, updating as each datum arrives:
$P(H_i \mid D_1) \propto P(D_1 \mid H_i)\,P(H_i)$
$P(H_i \mid D_1 D_2) \propto P(D_2 \mid H_i D_1)\,P(H_i \mid D_1)$ (the factor $P(H_i \mid D_1)$ is now a prior! and $P(D_2 \mid H_i D_1)$ is conditioned on previously seen data)
B. We see all the data at once:
$P(H_i \mid D_1 D_2) \propto P(D_1 D_2 \mid H_i)\,P(H_i)$
These agree, since $P(D_1 D_2 \mid H_i) = P(D_2 \mid H_i D_1)\,P(D_1 \mid H_i)$; this being symmetrical shows that we would get the same answer regardless of the order of seeing the data. (Strictly, all of these probabilities are also conditioned on the background information $I$.)
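This order-independence is easy to check numerically. Below is a small Python sketch with made-up likelihood values, assuming for simplicity that $D_1$ and $D_2$ are independent given each hypothesis:

```python
# Order of evidence doesn't matter: updating on D1 then D2 (using the
# posterior after D1 as the new prior) equals updating on both at once,
# since P(D1 D2 | H_i) = P(D2 | H_i) P(D1 | H_i) under independence.

def update(prior, like):
    """One Bayes update: posterior over hypotheses, normalized."""
    joint = [p * l for p, l in zip(prior, like)]
    return [j / sum(joint) for j in joint]

prior   = [0.5, 0.5]
like_d1 = [0.8, 0.3]        # P(D1 | H_i), hypothetical numbers
like_d2 = [0.4, 0.9]        # P(D2 | H_i), hypothetical numbers

sequential     = update(update(prior, like_d1), like_d2)
batch          = update(prior, [a * b for a, b in zip(like_d1, like_d2)])
reversed_order = update(update(prior, like_d2), like_d1)

print(sequential)           # all three agree (up to roundoff)
print(batch)
print(reversed_order)
```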

  7. Bayes' law is a "calculus of inference", better (and certainly more self-consistent) than folk wisdom. Example: Hempel's Paradox. Folk wisdom: a case of a hypothesis adds support to that hypothesis. Example: "All crows are black" is supported by each new observation of a black crow. But "All crows are black" is logically equivalent (by contraposition) to "All non-black things are non-crows", and the latter is supported by the observation of a white shoe. So the observation of a white shoe is thus evidence that all crows are black!

  8. I.J. Good: "The White Shoe is a Red Herring" (1966). We observe one bird, and it is a black crow.
a) Which world are we in?
b) Are all crows black?
Important concept, the Bayes odds ratio:
$$\frac{P(H_1 \mid D)}{P(H_2 \mid D)} = \frac{P(D \mid H_1)\,P(H_1)}{P(D \mid H_2)\,P(H_2)} = \frac{0.0001}{0.001}\cdot\frac{P(H_1)}{P(H_2)} = 0.1\,\frac{P(H_1)}{P(H_2)}$$
So the observation strongly supports H2 and the existence of white crows. Hempel's folk-wisdom premise is not true. Data supports the hypotheses under which it is more likely, compared with other hypotheses. (This is Bayes!) We must have some kind of background information about the universe of hypotheses; otherwise data has no meaning at all.
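The odds-ratio bookkeeping takes only a few lines of Python. The two likelihood values here are illustrative placeholders for "how probable is observing a black crow in each world", with the observation assumed ten times more probable in the world H2 that also contains white crows:

```python
# Bayes odds ratio for Good's two-worlds setup. The likelihoods are
# assumed, illustrative numbers: a black crow is taken to be ten times
# more likely to be the observed bird under H2 than under H1.

p_d_h1 = 0.0001     # P(black crow observed | H1), assumed
p_d_h2 = 0.001      # P(black crow observed | H2), assumed

def posterior_odds(prior_odds, p_d_h1, p_d_h2):
    """Posterior odds P(H1|D)/P(H2|D) = likelihood ratio * prior odds."""
    return prior_odds * (p_d_h1 / p_d_h2)

print(posterior_odds(1.0, p_d_h1, p_d_h2))   # about 0.1: H2 favored ten to one
```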

  9. Congratulations! You are now a Bayesian. Bayesian viewpoint: probabilities are modified by data. This makes them intrinsically subjective, because different observers have access to different amounts of data (including their "background information" or "background knowledge"). Notice in particular that the connection of probability to "frequency of occurrence of repeated events" is now complicated! (We would have to "repeat" the exact state of knowledge of the observer.)
