Conditional Probability

When do you think the probability of snow is higher, in September or February? February, at least in Nebraska, right?

As we have seen, P(A) refers to the probability that event A will occur. P(A|B) refers to the probability that A will occur given that it is known that B has already occurred. We read this as "the probability of A given B"; the "given B" part means that it is known that B has occurred.

By definition

P(A|B) = P(A and B)/P(B).

Similarly

P(B|A) = P(A and B)/P(A).

Note P(A and B) = P(B and A)

Now we have by definition

P(A|B) = P(A and B)/P(B).

In this definition, B has already occurred. P(B) is the denominator of P(A|B) and is thus the base of the conditional probability; the intersection of A and B is in the numerator. Since B has occurred, the only way A can also have occurred is if there is an overlap of A and B. So we have the ratio

probability of overlap/probability of known event.
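As a minimal sketch of this ratio (the helper name cond_prob is mine, not from the slides, and the sample values are hypothetical):

```python
def cond_prob(p_joint, p_known):
    """P(A|B) = P(A and B) / P(B): probability of the overlap over the known event."""
    if p_known <= 0:
        raise ValueError("the conditioning event must have positive probability")
    return p_joint / p_known

# Hypothetical values for illustration:
print(round(cond_prob(0.2, 0.3), 2))  # 0.67
```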

Let’s turn to an example in which people were asked whether they planned to purchase a big TV, with a follow-up to see whether they actually made the purchase.

                        Actually purchased
Planned to purchase     B (Yes)   B' (No)   Total
A (Yes)                   0.20      0.05     0.25
A' (No)                   0.10      0.65     0.75
Total                     0.30      0.70     1.00

So, if we know B has occurred then we look at column B. The only way A could also have occurred is if we had the joint event A and B. Thus, P(A|B) = .2/.3 = .67

Similarly, if we know A has occurred then we look at row A. The only way B could also have occurred is if we had the joint event A and B. P(B|A) = .2/.25 = .8
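Both conditional probabilities can be checked with a short sketch, with the joint and marginal probabilities read straight off the table:

```python
# Probabilities read from the TV-purchase table.
p_a_and_b = 0.20  # planned to purchase AND actually purchased
p_a = 0.25        # marginal: planned to purchase
p_b = 0.30        # marginal: actually purchased

p_a_given_b = p_a_and_b / p_b  # P(A|B) = .20/.30
p_b_given_a = p_a_and_b / p_a  # P(B|A) = .20/.25
print(round(p_a_given_b, 2), round(p_b_given_a, 2))  # 0.67 0.8
```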

In the example on the last slide P(A) = .25, but P(A|B) = .67.

Each is dealing with the probability of A. But, in this case having information about B gives a different view about A. When P(A|B) ≠ P(A) we say events A and B are dependent events. Similarly, when P(B|A) ≠ P(B) events A and B are dependent.

In our example, .25 of the folks in general have what we called event A (planned to purchase). But, the conditional probability is indicating that if we know B occurred (a purchase was actually made) then the chance is even higher (in this case) that they planned to purchase. In this sense A and B are dependent.

Independent Events

Events A and B are said to be independent if

P(A|B) = P(A) or P(B|A) = P(B).

Does a coin have a memory? In other words, does a coin remember how many times it has come up heads, and will it thus come up tails if it came up heads a lot lately? Say A is heads on the third flip and B is heads on the first two flips. Is heads on the third flip influenced by the first two heads? No, coins have no memory! Thus A and B are independent. (Note I am not concerned here about the probability of getting three heads!)
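A quick simulation (my sketch, not from the slides) makes the point concrete: among runs whose first two flips are heads, the third flip still comes up heads about half the time.

```python
import random

random.seed(1)   # fixed seed for a reproducible illustrative run
trials = 100_000
qualifying = 0   # runs where B occurred: heads on the first two flips
third_heads = 0  # of those, runs where A occurred: heads on the third flip

for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(3)]
    if flips[0] and flips[1]:
        qualifying += 1
        if flips[2]:
            third_heads += 1

# The estimate of P(A|B) is close to P(A) = 0.5: the coin has no memory.
print(round(third_heads / qualifying, 2))
```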

Have you ever heard the saying, "Pink sky in the morning, sailors take warning; pink sky at night, sailors delight"? I just heard about it recently. Apparently it is a rule of thumb about rain: pink sky in the morning serves as a warning of rain that day. If A is rain during the day and B is pink sky in the morning, then it seems that P(A|B) ≠ P(A), and thus the probability of rain is influenced by morning sky color (color is really just an indicator of conditions).

Let’s think about one more example. If you watched a football team all year you could use the empirical approach to find the probability that it will throw a pass on a given play. Say P(pass)=0.4. This means the probability it will pass on a given play is 0.4.

But, if there are 5 minutes left in the game and the team is down 14 points, the team will want to pass more. So, P(pass | down 14 with 5 minutes left) = 0.75, for example. This means the probability of a pass depends on the score and time remaining!

I have used some examples to give you a feel about when events are independent and when they are dependent.

By simple equation manipulation we change the conditional probability definition into the rule called the multiplication law, or rule, for the intersection of events:

P(A and B) = P(B)P(A|B) or P(A and B) = P(A)P(B|A).

Note the given part shows up in the other term.

Now this rule simplifies if A and B are independent. The conditional probabilities revert to regular probabilities. We would then have

P(A and B) = P(B)P(A) = P(A)P(B).
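A sketch with the TV-purchase numbers shows both forms: the general rule recovers the joint probability from the table, and the independent-case shortcut visibly fails for these dependent events.

```python
# General multiplication rule, with the TV-purchase numbers:
p_b = 0.30
p_a_given_b = 0.20 / 0.30      # P(A|B) computed from the table
p_a_and_b = p_b * p_a_given_b  # P(A and B) = P(B) P(A|B)
print(round(p_a_and_b, 2))     # 0.2

# If A and B were independent, the joint would just be the product of
# the marginals; here .25 * .30 = .075, not .20, another sign of dependence.
p_a = 0.25
print(round(p_a * p_b, 3))  # 0.075
```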

Say, as a new example, we have A and B with P(A)=.5, P(B)=.6 and P(A and B) =.4

Then

a. P(A|B) = .4/.6 = .667

b. P(B|A) = .4/.5 = .8

c. A and B are not independent because we do NOT have P(A|B) = P(A), or P(B|A) = P(B).

Say, as another example, we have A and B with P(A)=.3 and P(B)=.4 and here we will say A and B are mutually exclusive. This means P(A and B) = 0 (in a Venn Diagram A and B have no overlap), then

a. P(A|B) = 0/.4 = 0. Here A and B are not independent, since P(A|B) = 0 ≠ P(A) = .3. (Mutually exclusive events with positive probabilities are never independent.)
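Both of these small examples can be verified with a sketch that compares the conditional probability against the marginal:

```python
# Example with P(A)=.5, P(B)=.6, P(A and B)=.4.
p_a, p_b, p_ab = 0.5, 0.6, 0.4
p_a_given_b = p_ab / p_b              # .667
print(abs(p_a_given_b - p_a) < 1e-9)  # False: A and B are dependent

# Mutually exclusive example with P(A)=.3, P(B)=.4, P(A and B)=0.
p_a2, p_b2, p_ab2 = 0.3, 0.4, 0.0
p_a2_given_b2 = p_ab2 / p_b2
print(p_a2_given_b2 == p_a2)  # False: dependent, since 0 != .3
```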

                         Y1              Y2           Totals
X   X1             P(X1 and Y1)    P(X1 and Y2)       P(X1)
    X2             P(X2 and Y1)    P(X2 and Y2)       P(X2)
Totals                P(Y1)           P(Y2)            1.00

Here I put the joint probability table again in general terms. Question X has mutually exclusive and collectively exhaustive events X1 and X2. For Y we have a similar set-up. Note here each has only two responses, but what we will see below would apply if there are more than 2 responses.

Let’s review some of the probability rules we just went through and then we will add one more rule.

Inside the joint probability table we find joint probabilities (like P(X1 and Y1)) and in the margins we find the marginal probabilities (like P(X1)).

Marginal Probability Rule

P(X1) = P(X1 and Y1) + P(X1 and Y2)

General Addition Rule

P(X1 or Y1) = P(X1) + P(Y1) – P(X1 and Y1)

Conditional Probability

P(X1|Y1) = P(X1 and Y1)/P(Y1)

Multiplication Rule

P(X1 and Y1) = P(X1|Y1)P(Y1)

The new part is to view the marginal probability rule as taking each joint probability and rewriting it with the multiplication rule. So,

Marginal Probability Rule

P(X1) = P(X1 and Y1) + P(X1 and Y2)

= P(X1|Y1)P(Y1) + P(X1|Y2)P(Y2),

where Y1 and Y2 are mutually exclusive and collectively exhaustive.
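The expanded marginal rule can be checked against the TV-purchase table, where it should recover P(A) = .25:

```python
# Marginal probability rule with the TV-purchase table:
# P(A) = P(A|B)P(B) + P(A|B')P(B').
p_b, p_b_prime = 0.30, 0.70
p_a_given_b = 0.20 / 0.30        # from the B column
p_a_given_b_prime = 0.05 / 0.70  # from the B' column

p_a = p_a_given_b * p_b + p_a_given_b_prime * p_b_prime
print(round(p_a, 2))  # 0.25
```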

The joint probability table associated with the problem is

           B      B'     Total
A         .11    .22      .33
A'        .22    .44      .66
Total     .33    .67     1.00   (some totals off due to rounding)

P(A|B) = .11/.33 = .33, P(A|B') = .22/.67 = .33

P(A'|B) = .22/.33 = .67

Since P(A) = P(A|B) = .33, events A and B are independent. To see if you have independent events you just compare the marginal (like P(A)) with the conditional (like P(A|B)). If equal, you have independent events.

The joint probability table associated with the example is

           B      B'     Total
A         .71    .03      .74
A'        .18    .08      .26
Total     .89    .11     1.00

a. P(B'|A') = .08/.26 = .31, b. P(B'|A) = .03/.74 = .04

c. P(B|A) = .71/.74 = .96

Since P(B) = .89 ≠ P(B|A), the events A and B are NOT independent. To see if you have independent events you just compare the marginal (like P(B)) with the conditional (like P(B|A)). If equal, you have independent events.
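As a last sketch, comparing the marginal P(B) with the conditional P(B|A) from this table confirms the dependence:

```python
# Values read from the last joint probability table.
p_a = 0.74
p_b = 0.89
p_a_and_b = 0.71

p_b_given_a = p_a_and_b / p_a  # .71/.74
print(round(p_b_given_a, 2))   # 0.96
print(abs(p_b_given_a - p_b) < 1e-9)  # False: A and B are NOT independent
```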