Discrete Distributions

What is the binomial distribution? The binomial distribution is a discrete probability distribution. It governs the random variable X, the number of successes that occur in n trials.
The binomial probability distribution gives us the probability that a success will occur x times in the n trials, for x = 0, 1, 2, …, n.
Thus, there are only two possible outcomes. It is conventional to apply the generic labels "success" and "failure" to the two possible outcomes.
1. A coin flip can be either heads or tails.
2. A product is either good or defective.
Binomial experiments of interest usually involve several repetitions or trials of the same basic experiment. These trials must satisfy the conditions outlined below:
Conditions for use:
Each repetition of the experiment (trial) can result in only one of two possible outcomes, a success or failure. See example BD1.
The probability of a success, p, and of a failure, (1-p), is constant from trial to trial.
All trials are statistically independent; i.e., no trial outcome has any effect on any other trial outcome.
The number of trials, n, is a specified constant (stated before the experiment begins).
A coin flip results in a heads or tails
A product is defective or not
A customer is male or female
Example 4: binomial distribution:
Say we perform an experiment – flip a coin 10 times and observe the result. A successful flip is designated as heads.
Assuming the coin is fair, the probability of success is .5 for each of the 10 trials, and the trials are independent.
We want to know the number of successes (heads) in 10 trials.
The random variable that records the number of successes is called the binomial random variable.
Random variable X is the number of successes that occur in the n = 10 trials.
With the binomial we are not concerned with sequence: we could have several successes or failures in a row, and since the trials are independent, the order of outcomes does not matter.
The binomial random variable counts the number of successes in n trials of the binomial experiment.
By definition, this is a discrete random variable.
Calculating the Binomial Probability:

p(x) = (nCx) p^x (1-p)^(n-x),  where x = 0, 1, 2, …, n
Each pair of values (n, p) determines a distinct binomial distribution.
Two parameters: n and p, where a parameter is any symbol defined in the function's basic mathematical form whose value may be specified by the user of that function.
Since the outcome of each trial is independent of the previous outcomes, we can replace the conditional probabilities with the marginal probabilities.
Consider, for example, x successes in three trials. Then:

P(X = 3) = p^3
P(X = 2) = 3p^2(1-p)
P(X = 1) = 3p(1-p)^2
P(X = 0) = (1-p)^3

The multiplier counts the number of orderings and is calculated with the following formula: nCx = n!/(x!(n-x)!).
Example: The quality control department of a manufacturer tested the most recent batch of 1000 catalytic converters produced and found 50 of them to be defective. Subsequently, an employee unwittingly mixed the defective converters in with the nondefective ones. If a sample of 3 converters is randomly selected from the mixed batch, what is the probability distribution of the number of defective converters in the sample?

Define a "success" as "a converter is found to be defective". Checking the conditions:
A converter can be either defective or good.
There is a fixed, finite number of trials (n = 3).
We assume the converters' states are independent of one another.
The probability of a converter being defective does not change from converter to converter (p = .05).
The conditions required for the binomial experiment are met.
Does this situation satisfy the requirements of a binomial experiment?
n = 3 trials with 2 possible outcomes (defective or nondefective).
Does the probability remain the same for each trial? Why or why not?
The probability p of selecting a defective converter does not remain constant for each trial because the probability depends on the results of the previous trial. Thus the trials are not independent.
The probability of selecting a defective converter on the first trial is 50/1000 = .05.
If a defective converter is selected on the first trial, then the probability changes to 49/999 = .049.
In practical terms, this violation of the conditions of a binomial experiment is often considered negligible. The difference would be more noticeable if we considered 5 defectives out of a batch of 100.
If we assume the conditions for a binomial experiment hold, then we take p = .05 for each trial.
Let X be the binomial random variable indicating the number of defective converters in the sample of 3.
P(X = 0) = p(0) = [3!/(0!3!)](.05)^0(.95)^3 = .8574
P(X = 1) = p(1) = [3!/(1!2!)](.05)^1(.95)^2 = .1354
P(X = 2) = p(2) = [3!/(2!1!)](.05)^2(.95)^1 = .0071
P(X = 3) = p(3) = [3!/(3!0!)](.05)^3(.95)^0 = .0001
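These four probabilities can be verified with a short computation. A minimal sketch (the helper name `binom_pmf` is ours, not from the text):

```python
from math import comb

def binom_pmf(x, n, p):
    # Binomial probability: C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Converter example: n = 3 draws, p = .05 chance each converter is defective.
pmf = [binom_pmf(x, 3, 0.05) for x in range(4)]
for x, v in enumerate(pmf):
    print(f"P(X = {x}) = {v:.4f}")
```

The four values agree with the table calculations above and sum to 1, as any probability distribution must.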
The resulting probability distribution of the number of defective converters in the sample of 3, is as follows:
Another way to look at things: cumulative probabilities.

F(x) = P(X <= x) = Σ (k = 0 to x) (nCk) p^k q^(n-k),  where q = 1-p

For example, a binomial with n = 3 and p = .05 can be written in this cumulative form.

What is the advantage of the cumulative form? It allows us to find the probability that X will assume some value within a range of values.
Example 1 (cumulative): For the binomial with n = 3 and p = .05,
p(2) = P(X <= 2) - P(X <= 1) = .9999 - .9928 = .0071
Example 2: Cumulative:
Find the probability of at most 3 successes in n=5 trials of a binomial experiment with p = .2.
We locate the entry corresponding to k = 3 and p = .2
P(X <= 3) = Σ p(x) = p(0) + p(1) + p(2) + p(3) = .993
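The table lookup can be checked by summing the pmf directly. A minimal sketch (the helper name `binom_cdf` is ours):

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for a binomial(n, p) random variable
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k + 1))

# At most 3 successes in n = 5 trials with p = .2:
print(round(binom_cdf(3, 5, 0.2), 3))  # 0.993
```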
V(X) = σ^2 = np(1-p)
Records show that 30% of the customers in a shoe store make their payments using a credit card.
This morning 20 customers purchased shoes.
Use the Cummulative Binomial Distribution Table (A.1 of Appendix) to answer some questions stated in the next slide.
Mean and Variance of Binomial Random Variable
This is a binomial experiment with n=20 and p=.30.
P(at least 12 used credit card) = P(X >= 12) = 1 - P(X <= 11) = 1 - .995 = .005
P(X = 3 or 4 or 5 or 6) = P(X <= 6) - P(X <= 2) = .608 - .035 = .573
E(X) = np = 20(.30) = 6;  V(X) = np(1-p) = 20(.30)(.70) = 4.2
Find the probability that exactly 14 customers did not use a credit card.
Let Y be the number of customers who did not use a credit card.
P(Y = 14) = P(X = 6) = P(X <= 6) - P(X <= 5) = .608 - .416 = .192
Find the probability that at least 9 customers did not use a credit card.
Let Y be the number of customers who did not use a credit card.
P(Y >= 9) = P(X <= 11) = .995
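All four shoe-store answers can be reproduced without the table. A hedged sketch (the names `binom_pmf` and `cdf` are ours):

```python
from math import comb

def binom_pmf(x, n, p):
    # Binomial probability: C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 20, 0.30  # 20 customers, 30% use a credit card

def cdf(k):
    # P(X <= k)
    return sum(binom_pmf(x, n, p) for x in range(k + 1))

p_at_least_12  = 1 - cdf(11)         # ≈ .005
p_3_to_6       = cdf(6) - cdf(2)     # ≈ .573
p_y_equals_14  = binom_pmf(6, n, p)  # Y = 14 non-users <=> X = 6 users, ≈ .192
p_y_at_least_9 = cdf(11)             # Y >= 9 <=> X <= 11, ≈ .995
```

Note how the complement trick works: if Y counts non-users, then Y = y is the same event as X = 20 - y.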
Probability Distribution of the Poisson Random Variable:

p(x) = e^(-μ) μ^x / x!,   x = 0, 1, 2, …
[Figure: Poisson probability distributions with μ = 2, μ = 5, and μ = 7. Note: the X axis in Excel starts with x = 1.]
What is the probability that only two cars will arrive during a specified one-minute period? (Use the formula)
The probability distribution of arriving cars for any one-minute period is Poisson with µ = 360/60 = 6 cars per minute. Let X denote the number of arrivals during a one-minute period.
Using the formula, P(X = 2) = e^(-6) 6^2/2! = .0446. Equivalently, from the table, P(X = 2) = P(X <= 2) - P(X <= 1) = .062 - .017 = .045.
P(X>=4) = 1 - P(X<=3) = 1 - .151 = .849
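Both answers can be checked against the Poisson formula directly. A minimal sketch (the helper name `poisson_pmf` is ours):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    # p(x) = e^(-mu) * mu^x / x!
    return exp(-mu) * mu**x / factorial(x)

mu = 6.0  # 360 cars per hour = 6 cars per one-minute period

p_exactly_2  = poisson_pmf(2, mu)                              # ≈ .0446
p_at_least_4 = 1 - sum(poisson_pmf(x, mu) for x in range(4))   # ≈ .849
```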
If p is very small (p< .05), we can approximate the binomial probabilities using the Poisson distribution.
Poisson Approximation of the Binomial: approximate the binomial distribution (with parameters n and p) by the Poisson distribution with μ = np.
Example: Poisson Approximation of the Binomial
A warehouse engages in acceptance sampling to determine if it will accept or reject incoming lots of designer sunglasses, some of which invariably are defective. Specifically, the warehouse has a policy of examining a sample of 50 sunglasses from each lot and accepting the lot only if the sample contains no more than 2 defective pairs. What is the probability of a lot's being accepted if, in fact, 2% of the sunglasses in the lot are defective?
This is a binomial experiment with n = 50 and p = .02. Our binomial tables include n values up to 25, but since p < .05 and the expected number of defective sunglasses in the sample is np = 50(.02) = 1, the required probability can be approximated by using the Poisson distribution with μ = 1.
From Table A.1, we find that the probability that a sample contains at most 2 defective pairs of sunglasses is .920.
This is a binomial experiment with n = 50, p = .02.
Tables for n = 50 are not available; p < .05; thus, a Poisson approximation is appropriate [μ = (50)(.02) = 1].
P(X_Poisson <= 2) = .920 (true binomial probability = .922)
So how well does the Poisson approximate the Binomial? Consider the following table:
x    Binomial (n = 50, p = .02)    Poisson (μ = np = 1)
0              .364                        .368
1              .372                        .368
2              .186                        .184
3              .061                        .061
4              .014                        .015
5              .003                        .003
6              .000                        .001
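The comparison table can be regenerated by computing both pmfs side by side. A minimal sketch:

```python
from math import comb, exp, factorial

n, p = 50, 0.02
mu = n * p  # = 1

for x in range(7):
    b  = comb(n, x) * p**x * (1 - p)**(n - x)  # exact binomial
    ps = exp(-mu) * mu**x / factorial(x)       # Poisson approximation
    print(f"{x}  binomial={b:.3f}  poisson={ps:.3f}")
```

The two columns never differ by more than a few thousandths here, which is why the approximation is acceptable.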
What if we wanted to know the probability of a small number of occurrences in a large number of trials and a very small probability of success?
We use Poisson as a good approximation of the answer.
When trying to decide between the binomial and the Poisson, use the following guidelines: the Poisson approximation is appropriate when n > 20 and p < .05, or when n > 100 and np < 10.
What about sampling without replacement?
What is likely to happen to the probability of success?
Probability of success is not constant from trial to trial.
We have a finite set of N things, and "a" of them possess a property of interest. Thus, there are "a" successes in N things.
Let X be the number of successes that occur in a sample, without replacement of "n" things from a total of N things.
This is the hypergeometric distribution:
P(x) = (aCx)(N-aCn-x) / (NCn),   x = 0, 1, 2, …
If N is large, then the probability of success will remain approximately constant from one trial to another.
We can use the binomial distribution as an approximation of the hypergeometric distribution when n < N/10, i.e., when the sample is less than 10% of the population.
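The converter example illustrates how close the two distributions are when n < N/10. A hedged sketch (the helper name `hypergeom_pmf` is ours):

```python
from math import comb

def hypergeom_pmf(x, N, a, n):
    # P(x successes) drawing n without replacement from N items, a of which are successes
    return comb(a, x) * comb(N - a, n - x) / comb(N, n)

# Converter example: N = 1000 converters, a = 50 defectives, sample of n = 3.
p0_hyper = hypergeom_pmf(0, 1000, 50, 3)  # exact, without replacement
p0_binom = comb(3, 0) * 0.95**3           # binomial approximation (p = .05)
print(p0_hyper, p0_binom)  # both ≈ .857
```

With n = 3 and N = 1000 the two answers agree to three decimal places, so treating the draws as independent is harmless here.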
We use a special case of the binomial distribution where n=1:
P(x) = (1Cx) p^x (1-p)^(1-x),  x = 0, 1
     = p^x (1-p)^(1-x)

which yields p(0) = 1-p and p(1) = p. In this form, the binomial is referred to as the Bernoulli distribution.
If instead we count the number of failures before the 1st success, we use the geometric distribution: X is the random variable representing the number of failures before the 1st success.
Mathematical form of geometric distribution:
P(x) = p(1-p)^x,   x = 0, 1, 2, …
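The geometric pmf is short enough to compute by hand, but a sketch helps confirm it sums to 1 (the helper name `geometric_pmf` is ours):

```python
def geometric_pmf(x, p):
    # P(x failures before the first success)
    return p * (1 - p) ** x

# e.g. with p = .05, the probability that the first success is preceded
# by exactly 2 failures: .05 * .95^2 ≈ .0451
print(geometric_pmf(2, 0.05))
```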
If we are interested in the number of failures before the r-th success, we use the negative binomial distribution.
The number of statistically independent trials that will be performed before the r-th success is X + r.
The first r - 1 successes and the X failures can occur in any order during the first X + r - 1 trials.
Negative binomial distribution – mathematical form:
P(x) = (r+x-1Cx) p^r (1-p)^x,   x = 0, 1, 2, …
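A sketch of this pmf also shows that setting r = 1 recovers the geometric distribution (the helper name `neg_binom_pmf` is ours):

```python
from math import comb

def neg_binom_pmf(x, r, p):
    # P(x failures before the r-th success); total trials performed = x + r
    return comb(r + x - 1, x) * p**r * (1 - p)**x

# Example: 2 failures before the 3rd success with p = .5:
# C(4, 2) * .5^3 * .5^2 = 6 * .03125 = 0.1875
print(neg_binom_pmf(2, 3, 0.5))  # 0.1875
```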
Bivariate probability distribution
The probability that X assumes the value x, and Y assumes the value y is denoted
p(x,y) = P(X = x, Y = y)
Consider the following real estate data:
We want to know how the size of the house varies with the cost.
What is the next step?
Construct frequency distributions:
Frequency distribution of house size:
Frequency distribution of selling price:
Construct a bivariate frequency distribution of X and Y:
A Bivariate (or joint) probability distribution of X and Y is a table that gives the joint probabilities p(x,y) for all pairs of values (x,y).
The cumulative bivariate probability distribution function is:

F(x1, x2) = P(X1 <= x1, X2 <= x2)
The marginal probability distribution function of X, p(x), is a univariate probability distribution function:

p_X1(x1) = Σ over x2 of p(x1, x2)
p_X2(x2) = Σ over x1 of p(x1, x2)
The nine probabilities in the interior of the table are the joint probabilities p(x,y).
p(0,0) = P(X = 0 and Y = 0) = .12
p(0,1) = P(X = 0 and Y = 1) = .21
p(0,2) = P(X = 0 and Y = 2) = .07
Summing the probabilities in each of the other columns and rows, we obtain the other marginal probabilities.
Thus the marginal probability distributions of X and Y are:
y \ x     0     1     2    p(y)
0        .12   .42   .06   .60
1        .21   .06   .03   .30
2        .07   .02   .01   .10
p(x)     .40   .50   .10  1.00
P(Y = 1), for example, is a marginal probability: the probability that Y = 1 regardless of the value of X.
x    p(x)        y    p(y)
0     .4         0     .6
1     .5         1     .3
2     .1         2     .1

E(X) = .7,  V(X) = .41        E(Y) = .5,  V(Y) = .45
Conditions for independence: X and Y are independent if
P(X=x|Y=y) = P(X=x) or P(Y=y|X=x) = P(Y=y).
This leads to the following relationship for independent variables:
P(X=x and Y=y) = P(X=x)P(Y=y)

Example 6.7 - continued: Since P(X=0|Y=1) = .21/.30 = .7 but P(X=0) = .4, the variables X and Y are not independent.
The table below represents the joint probability distribution of the variables X and Y. Are the variables X and Y independent?
y \ x     1     2
1        .28   .42
2        .12   .18
P(X=1)P(Y=1) = .40(.70) = .28
P(X=1 and Y=1) = .28
P(X=1)P(Y=2) = .40(.30) = .12
P(X=1 and Y=2) = .12
Find the marginal probabilities of X and Y: p(x=1) = .40, p(x=2) = .60; p(y=1) = .70, p(y=2) = .30. Then apply the multiplication rule.
Comparing the other two pairs gives the same result, so yes, the two variables are independent.
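This cell-by-cell check can be automated. A minimal sketch, using the table above (the dictionary layout is our own convention):

```python
# Joint table from the example: key (x, y), columns x = 1,2, rows y = 1,2.
joint = {(1, 1): 0.28, (2, 1): 0.42,
         (1, 2): 0.12, (2, 2): 0.18}

p_x = {x: sum(v for (xi, yi), v in joint.items() if xi == x) for x in (1, 2)}
p_y = {y: sum(v for (xi, yi), v in joint.items() if yi == y) for y in (1, 2)}

# X and Y are independent iff p(x, y) = p(x)p(y) for every cell.
independent = all(abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-9
                  for x in (1, 2) for y in (1, 2))
print(independent)  # True
```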
Example 6.7 - continued
Find the probability distribution of the total number of houses sold per week by Xavier and Yvette.
X+Y is the total number of houses sold. X+Y can have the values 0, 1, 2, 3, 4.
We find the distribution of X+Y as demonstrated next.
x + y       0     1     2     3     4
p(x+y)    .12   .63   .19   .05   .01
y \ x     0     1     2    p(y)
0        .12   .42   .06   .60
1        .21   .06   .03   .30
2        .07   .02   .01   .10
p(x)     .40   .50   .10  1.00
P(X+Y = 0) = P(X=0 and Y=0) = .12
P(X+Y = 1) = P(X=0 and Y=1) + P(X=1 and Y=0) = .21 + .42 = .63
P(X+Y = 2) = P(X=0 and Y=2) + P(X=1 and Y=1) + P(X=2 and Y=0) = .07 + .06 + .06 = .19
The probabilities P(X+Y = 3) and P(X+Y = 4) are calculated the same way. The distribution follows.
An alternative is to use the relationships
E(aX+bY) =aE(X) + bE(Y);
V(aX+bY) = a2V(X) + b2V(Y) if X and Y are independent.
When X and Y are not independent, (see the previous example) we need to incorporate the covariance in the calculations of the variance V(aX+bY).
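For the house-sales example, where X and Y are not independent, the covariance term can be computed directly from the joint table. A minimal sketch (the layout `joint[y][x]` is our own convention):

```python
# Joint distribution of X and Y (joint[y][x]); X and Y are not independent.
joint = [
    [0.12, 0.42, 0.06],  # y = 0
    [0.21, 0.06, 0.03],  # y = 1
    [0.07, 0.02, 0.01],  # y = 2
]

e_x  = sum(x * joint[y][x] for y in range(3) for x in range(3))      # .7
e_y  = sum(y * joint[y][x] for y in range(3) for x in range(3))      # .5
e_xy = sum(x * y * joint[y][x] for y in range(3) for x in range(3))  # .20
cov  = e_xy - e_x * e_y                                              # -.15

# V(X) = .41 and V(Y) = .45 from the marginal distributions, so:
v_sum = 0.41 + 0.45 + 2 * cov  # V(X+Y) = .56
```

This matches a direct computation from the distribution of X+Y given earlier: E(X+Y) = 1.2 and E((X+Y)^2) = 2.00, so V(X+Y) = 2.00 - 1.44 = .56.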
Expected value and variance of X+Y