
Binomial Random Variables


- A sequence of n trials (called Bernoulli trials), each of which results in either a “success” or a “failure”.
- The trials are independent, so the probability of success, p, remains the same for each trial.
- Define a random variable Y as the number of successes observed during the n trials.
- What is the probability p(y), for y = 0, 1, …, n?
- How many successes may we expect? E(Y) = ?

- Suppose the retention rate for a school indicates the probability a freshman returns for their sophomore year is 0.65. Among 12 randomly selected freshmen, what is the probability that 8 of them return to school next year?

Each student either returns or doesn’t. Think of each selected student as a trial, so n = 12.

If we consider “student returns” to be a success, then p = 0.65.

- To find the probability of this event, consider the probability for just one sample point in the event.
- For example, consider the probability that the first 8 students return and the last 4 don’t.
- Since the trials are independent, we just multiply the probabilities: (0.65)^8 (0.35)^4.

- For the probability of this event, we sum the probabilities for each sample point in the event.
- How many sample points are in this event?
- How many ways can 8 successes and 4 failures occur? There are C(12, 8) = 495 ways to place the 8 successes among the 12 trials.

- Each of these sample points has the same probability, (0.65)^8 (0.35)^4.
- Hence, summing these probabilities yields P(Y = 8) = 495 (0.65)^8 (0.35)^4 ≈ 0.2367.

- A random variable Y has a binomial distribution with parameters n and p if its probability function is given by p(y) = C(n, y) p^y (1 − p)^(n − y), for y = 0, 1, …, n.
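As a quick sketch, this pmf can be evaluated with nothing but the standard library (the function name `binom_pmf` is my own); plugging in the retention example from above gives P(Y = 8):

```python
from math import comb

def binom_pmf(y, n, p):
    """P(Y = y) = C(n, y) * p**y * (1 - p)**(n - y) for Y ~ Binomial(n, p)."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

# Retention example: n = 12 freshmen, p = 0.65, y = 8 returners.
print(round(binom_pmf(8, 12, 0.65), 4))   # 0.2367
```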

- In a research study, rats are injected with a drug. The probability that a rat will die from the drug before the experiment is over is 0.16. Ten rats are injected with the drug.

What is the probability that at least 8 will survive?

Would you be surprised if at least 5 died during the experiment?
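Both rat questions reduce to summing binomial probabilities; here is a sketch (the helper `binom_pmf` is my own name). Note that Y counts survivors, so its success probability is 1 − 0.16 = 0.84:

```python
from math import comb

def binom_pmf(y, n, p):
    return comb(n, y) * p**y * (1 - p)**(n - y)

n = 10
# P(at least 8 survive), where each rat survives with probability 0.84.
p_at_least_8_survive = sum(binom_pmf(y, n, 0.84) for y in (8, 9, 10))
print(round(p_at_least_8_survive, 4))   # 0.7936

# P(at least 5 die), counting deaths with p = 0.16.
p_at_least_5_die = sum(binom_pmf(y, n, 0.16) for y in range(5, n + 1))
print(round(p_at_least_5_die, 4))   # 0.0130 -- so yes, quite surprising
```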

- For parts machined by a particular lathe, on average, 95% of the parts are within the acceptable tolerance.
- If 20 parts are checked, what is the probability that at least 18 are acceptable?
- If 20 parts are checked, what is the probability that at most 18 are acceptable?
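The lathe questions follow the same pattern; a sketch, again with a hand-rolled pmf. Note that “at most 18” is the complement of “19 or 20”:

```python
from math import comb

def binom_pmf(y, n, p):
    return comb(n, y) * p**y * (1 - p)**(n - y)

n, p = 20, 0.95   # Y = number of acceptable parts out of 20
p_at_least_18 = sum(binom_pmf(y, n, p) for y in (18, 19, 20))
p_at_most_18 = 1 - binom_pmf(19, n, p) - binom_pmf(20, n, p)
print(round(p_at_least_18, 4))   # 0.9245
print(round(p_at_most_18, 4))    # 0.2642
```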

- As we saw in our Discrete class, the Binomial Theorem allows us to expand (p + q)^n as the sum of C(n, y) p^y q^(n−y) over y = 0, 1, …, n.
- As a result, summing the binomial probabilities, where q = 1 − p is the probability of a failure, gives (p + q)^n = 1^n = 1, so the probabilities sum to 1 as required.

- If Y is a binomial random variable with parameters n and p, the expected value and variance for Y are given by E(Y) = np and V(Y) = npq, where q = 1 − p.
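These two formulas are easy to sanity-check numerically against the pmf definition (a check, not a derivation; the choice n = 12, p = 0.65 is just the earlier example):

```python
from math import comb

n, p = 12, 0.65
q = 1 - p
pmf = [comb(n, y) * p**y * q**(n - y) for y in range(n + 1)]

# Mean and variance computed directly from the definition of E and V.
mean = sum(y * pmf[y] for y in range(n + 1))
var = sum(y**2 * pmf[y] for y in range(n + 1)) - mean**2

assert abs(mean - n * p) < 1e-9      # E(Y) = np = 7.8
assert abs(var - n * p * q) < 1e-9   # V(Y) = npq = 2.73
```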

When y = 0, the summand is zero, so we may just as well start the sum at y = 1:

E(Y) = Σ y C(n, y) p^y q^(n−y) = np Σ C(n − 1, y − 1) p^(y−1) q^(n−y) = np,

using the identity y C(n, y) = n C(n − 1, y − 1); the remaining sum adds up all the Binomial(n − 1, p) probabilities and therefore equals 1.

Just the highlights (see page 104 for details).

It is a “fairly common trick” to use E[Y(Y − 1)] = E(Y^2) − E(Y) to find E(Y^2); then V(Y) = E(Y^2) − [E(Y)]^2 = npq.

- In a research study, rats are injected with a drug. The probability that a rat will die from the drug before the experiment is over is 0.16. Ten rats are injected with the drug.

- How many of the rats are expected to survive? E(Y) = np = 10(0.84) = 8.4.
- Find the variance for the number of survivors. V(Y) = npq = 10(0.84)(0.16) = 1.344.

Geometric Random Variables

- Similar to the binomial experiment, we consider:
- A sequence of independent Bernoulli trials.
- The probability of “success” equals p on each trial.
- Define a random variable Y as the number of the trial on which the 1st success occurs. (Stop the trials after the first success occurs.)
- What is the probability p(y), for y = 1, 2, …?
- On which trial is the first success expected?

(Tree diagram: each trial branches into S or F; the paths ending at the first success are (S), (F, S), (F, F, S), (F, F, F, S), ….)

- Consider the values of Y:
  y = 1: (S)
  y = 2: (F, S)
  y = 3: (F, F, S)
  y = 4: (F, F, F, S)
  and so on…

p(1) = p
p(2) = q p
p(3) = q^2 p
p(4) = q^3 p

- A random variable Y has a geometric distribution with parameter p if its probability function is given by p(y) = q^(y−1) p, for y = 1, 2, …, where q = 1 − p.
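A one-line sketch of this pmf (the name `geom_pmf` is my own); for example, the first success landing on trial 3 requires the sequence F, F, S:

```python
def geom_pmf(y, p):
    """P(first success occurs on trial y) = (1 - p)**(y - 1) * p."""
    return (1 - p)**(y - 1) * p

# With p = 0.65: q^2 * p = (0.35)(0.35)(0.65)
print(round(geom_pmf(3, 0.65), 6))   # 0.079625
```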

(Tree diagram: each trial branches into D (defective) or G (good); the paths ending at the first defective are (D), (G, D), (G, G, D), ….)

- Of course, you need to be clear on what you consider a “success”.
- For example, the 1st success might mean finding the 1st defective item!

- If Y is a geometric random variable with parameter p, the expected value and variance for Y are given by E(Y) = 1/p and V(Y) = q/p^2, where q = 1 − p.
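Both formulas can be checked numerically by truncating the infinite sums (a sanity-check sketch; the choice p = 0.2 and the cutoff of 2000 terms are arbitrary, since q^y vanishes rapidly):

```python
p = 0.2
q = 1 - p

# Truncate the infinite series at 2000 terms.
ys = range(1, 2000)
mean = sum(y * q**(y - 1) * p for y in ys)
second_moment = sum(y**2 * q**(y - 1) * p for y in ys)
var = second_moment - mean**2

assert abs(mean - 1 / p) < 1e-9      # E(Y) = 1/p = 5
assert abs(var - q / p**2) < 1e-9    # V(Y) = q/p^2 = 20
```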

Using the “trick” of finding E[Y(Y − 1)] to get E(Y^2): differentiating the geometric series twice gives E[Y(Y − 1)] = 2q/p^2.

Now, forming the second moment: E(Y^2) = E[Y(Y − 1)] + E(Y) = 2q/p^2 + 1/p.

And so, we find the variance: V(Y) = E(Y^2) − [E(Y)]^2 = 2q/p^2 + 1/p − 1/p^2 = q/p^2.

- For a geometric random variable and a > 0, show P(Y > a) = q^a.

- Consider P(Y > a) = 1 − P(Y ≤ a)
  = 1 − p(1 + q + q^2 + … + q^(a−1))
  = 1 − p(1 − q^a)/(1 − q)
  = 1 − (1 − q^a)
  = q^a, based on the sum of a finite geometric series.

- Based on this result, it follows that P(Y > a + b) = q^(a+b).
- Also, the conditional probability P(Y > a + b | Y > a) = P(Y > a + b)/P(Y > a) = q^(a+b)/q^a = q^b = P(Y > b):
  “the memoryless property”

- For the geometric distribution, P(Y > a + b | Y > a) = q^b = P(Y > b).
- This implies P(Y > 7 | Y > 2) = q^5 = P(Y > 5): “knowing the first two trials were failures, the probability a success won’t occur on the next 5 trials” is the same as “just starting the trials and a success won’t occur on the first 5 trials”. The same probability?!
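The memoryless property is easy to verify numerically (a sketch; the value p = 0.2 is an arbitrary choice for illustration):

```python
p = 0.2   # arbitrary success probability for illustration
q = 1 - p

def tail(a):
    """P(Y > a): the first a trials are all failures."""
    return q**a

# P(Y > 7 | Y > 2) = P(Y > 7) / P(Y > 2) should equal P(Y > 5) = q^5.
cond = tail(7) / tail(2)
assert abs(cond - tail(5)) < 1e-12
assert abs(cond - q**5) < 1e-12
```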

- Suppose we are considering implementing a new policy in a large company, so we ask employees whether or not they favor the new policy.
- Suppose the first four reject the new policy, but the 5th individual is in favor of the policy.

What does this tell us about the percentage of employees we might expect to favor the policy? Can we estimate the probability p of getting a favorable vote on any given trial?

- We wish to find the value of p which would make it highly probable that the 5th individual turns out to be the first “success”.
- That is, let’s maximize the probability of finding the first success on trial 5, where p(5) = (1 − p)^4 p.
- For what value of p is this probability a maximum?

Using the derivative to locate the maximum:

d/dp [(1 − p)^4 p] = (1 − p)^4 − 4(1 − p)^3 p = (1 − p)^3 (1 − 5p).

For 0 < p < 1, the derivative is zero and the probability is at its maximum when 1 − 5p = 0, i.e. when p = 0.2.

“the method of maximum likelihood”
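The same maximizer can be found by brute force, without calculus: evaluate the likelihood on a fine grid and take the best point (a sketch; the grid resolution of 1/10000 is an arbitrary choice):

```python
# Likelihood of observing the first "success" on trial 5.
def likelihood(p):
    return (1 - p)**4 * p

# Search a fine grid of candidate p values in (0, 1).
grid = [i / 10000 for i in range(1, 10000)]
p_hat = max(grid, key=likelihood)
print(p_hat)   # 0.2, agreeing with the derivative calculation
```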