Statistics - PowerPoint PPT Presentation
Uploaded by DoraAna
Chapter 4: Probability and Distributions
  • Randomness
  • General Probability
  • Probability Models
  • Random Variables
  • Moments of Random Variables
Randomness
  • The language of probability
  • Thinking about randomness
  • The uses of probability
Chapter Goals

After completing this chapter, you should be able to:

  • Explain three approaches to assessing probabilities
  • Apply common rules of probability
  • Use Bayes’ Theorem for conditional probabilities
  • Distinguish between discrete and continuous probability distributions
  • Compute the expected value and standard deviation for a probability distribution
Important Terms
  • Probability – the chance that an uncertain event will occur (always between 0 and 1)
  • Experiment – a process of obtaining outcomes for uncertain events
  • Elementary Event – the most basic outcome possible from a simple experiment
  • Randomness –
    • Does not mean haphazard
    • Description of the kind of order that emerges only in the long run
Important Terms (CONT’D)
  • Sample Space – the collection of all possible elementary outcomes
  • Probability Distribution Function
    • Maps events to intervals on the real line
    • Discrete probability mass
    • Continuous probability density
Sample Space

The Sample Space is the collection of all possible outcomes of a probabilistic experiment

e.g., all 6 faces of a die: {1, 2, 3, 4, 5, 6}

e.g., all 52 cards of a bridge deck

Events
  • Elementary event – An outcome from a sample space with one characteristic
    • Example: A red card from a deck of cards
  • Event – May involve two or more outcomes simultaneously
    • Example: An ace that is also red from a deck of cards

Elementary Events
  • An automobile consultant records fuel type and vehicle type for a sample of vehicles

2 Fuel types: Gasoline, Diesel

3 Vehicle types: Truck, Car, SUV

6 possible elementary events:

e1 Gasoline, Truck

e2 Gasoline, Car

e3 Gasoline, SUV

e4 Diesel, Truck

e5 Diesel, Car

e6 Diesel, SUV

Independent vs. Dependent Events
  • Independent Events

E1 = heads on one flip of fair coin

E2 = heads on second flip of same coin

Result of second flip does not depend on the result of the first flip.

  • Dependent Events

E1 = rain forecasted on the news

E2 = take umbrella to work

Probability of the second event is affected by the occurrence of the first event

Probability Concepts
  • Mutually Exclusive Events
    • If E1 occurs, then E2 cannot occur
    • E1 and E2 have no common elements
    • Example: E1 = red cards, E2 = black cards – a card cannot be black and red at the same time.

Coming up with Probability
  • Empirically
    • From the data! Based on observation, not theory
    • Probability describes what happens in very many trials, so we must actually observe many trials to pin down a probability
  • Based on belief (Bayesian technique)
Assigning Probability
  • Classical Probability Assessment

P(Ei) = (Number of ways Ei can occur) / (Total number of elementary events)

  • Relative Frequency of Occurrence

Relative Freq. of Ei = (Number of times Ei occurs) / N

  • Subjective Probability Assessment

An opinion or judgment by a decision maker about the likelihood of an event
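The relative-frequency assessment can be illustrated with a quick simulation – a minimal Python sketch (the fair-coin experiment and the trial count are illustrative assumptions, not from the slides):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

n_trials = 100_000
# Count how many simulated flips land "heads" (probability 0.5 each)
heads = sum(random.random() < 0.5 for _ in range(n_trials))

# The relative frequency approaches the classical probability 1/2
rel_freq = heads / n_trials
print(rel_freq)
```

With many trials the printed relative frequency settles near .5, which is exactly the "order that emerges only in the long run" described under Randomness.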

Calculating Probability
  • Counting Outcomes: P(Ei) = (Number of ways Ei can occur) / (Total number of elementary events)
  • Observing Outcomes in Trials

Counting
  • Permutations (“N take n”) – ordered arrangements of n items chosen from N: N!/(N - n)!
  • Combinations (“N take k”) – order not important, so never more than the permutations: N!/(k!(N - k)!)
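The two counting rules are available directly in Python's standard library; a small sketch with assumed example values N = 5, n = k = 3:

```python
from math import comb, perm

N, n = 5, 3

# "N take n" ordered arrangements (permutations): N!/(N - n)!
print(perm(N, n))  # 60

# "N take k" with order not important (combinations): N!/(k!(N - k)!)
print(comb(N, n))  # 10
```

Note that the combination count is never larger than the permutation count, since each combination collapses n! orderings into one.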
Rules of Probability
  • S is sample space
  • Pr(S) = 1
  • Events measured in numbers result in a Probability Distribution
Rules of Probability

Rules for Possible Values and Sum:

Individual Values: 0 ≤ P(ei) ≤ 1 for any event ei

Sum of All Values: Σ P(ei) = 1, summed over i = 1, ..., k

where:

k = Number of elementary events in the sample space

ei = ith elementary event
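Both rules are easy to check mechanically; a sketch for a fair six-sided die (the pmf here is an assumed example, not from the slides):

```python
# Elementary events of a fair die and their probabilities
pmf = {face: 1 / 6 for face in range(1, 7)}

# Rule 1: each individual probability lies in [0, 1]
assert all(0 <= p <= 1 for p in pmf.values())

# Rule 2: probabilities over the whole sample space sum to 1
total = sum(pmf.values())
print(round(total, 12))  # 1.0
```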

Addition Rule for Elementary Events
  • The probability of an event Ei is equal to the sum of the probabilities of the elementary events forming Ei.
  • That is, if:

Ei = {e1, e2, e3}

then:

P(Ei) = P(e1) + P(e2) + P(e3)

Complement Rule
  • The complement of an event E is the collection of all possible elementary events not contained in event E. The complement of event E is represented by Ē.
  • Complement Rule:

P(Ē) = 1 - P(E)

Or, P(E) + P(Ē) = 1

Addition Rule for Two Events
  • Addition Rule:

P(E1 or E2) = P(E1) + P(E2) - P(E1 and E2)

Don’t count common elements twice!

Addition Rule Example

P(Red or Ace) = P(Red) + P(Ace) - P(Red and Ace)

= 26/52 + 4/52 - 2/52 = 28/52

Don’t count the two red aces twice!

Type      Red   Black   Total
Ace        2      2       4
Non-Ace   24     24      48
Total     26     26      52
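The card example can be verified with exact fractions, a short sketch:

```python
from fractions import Fraction

p_red = Fraction(26, 52)
p_ace = Fraction(4, 52)
p_red_and_ace = Fraction(2, 52)  # the two red aces

# Addition rule: subtract the overlap so it is not counted twice
p_red_or_ace = p_red + p_ace - p_red_and_ace
print(p_red_or_ace)  # 7/13, i.e. 28/52
```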

Addition Rule for Mutually Exclusive Events
  • If E1 and E2 are mutually exclusive, then P(E1 and E2) = 0

So

P(E1 or E2) = P(E1) + P(E2) - P(E1 and E2) = P(E1) + P(E2)

Conditional Probability
  • Conditional probability for any two events E1, E2:

P(E1 | E2) = P(E1 and E2) / P(E2), provided P(E2) > 0

Conditional Probability Example
  • Of the cars on a used car lot, 70% have air conditioning (AC) and 40% have a CD player (CD). 20% of the cars have both.
  • What is the probability that a car has a CD player, given that it has AC?

i.e., we want to find P(CD | AC)
Conditional Probability Example

(continued)

  • Of the cars on a used car lot, 70% have air conditioning (AC) and 40% have a CD player (CD). 20% of the cars have both.

         CD    No CD   Total
AC       .2     .5      .7
No AC    .2     .1      .3
Total    .4     .6     1.0

  • Given AC, we only consider the top row (70% of the cars). Of these, 20% have a CD player: P(CD | AC) = .2 / .7 ≈ .2857.
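The same conditional probability, computed directly from the definition (a minimal sketch):

```python
p_ac = 0.7          # P(AC): share of cars with air conditioning
p_ac_and_cd = 0.2   # P(AC and CD): share with both features

# Definition of conditional probability: P(CD | AC) = P(AC and CD) / P(AC)
p_cd_given_ac = p_ac_and_cd / p_ac
print(round(p_cd_given_ac, 4))  # 0.2857
```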

For Independent Events:
  • Conditional probability for independent events E1, E2:

P(E1 | E2) = P(E1) and P(E2 | E1) = P(E2)
Multiplication Rules
  • Multiplication rule for two events E1 and E2:

P(E1 and E2) = P(E1) P(E2 | E1)

Note: If E1 and E2 are independent, then P(E2 | E1) = P(E2), and the multiplication rule simplifies to

P(E1 and E2) = P(E1) P(E2)

Tree Diagram Example

Gasoline: P(E1) = 0.8
  Truck: P(E3|E1) = 0.2 → P(E1 and E3) = 0.8 x 0.2 = 0.16
  Car: P(E4|E1) = 0.5 → P(E1 and E4) = 0.8 x 0.5 = 0.40
  SUV: P(E5|E1) = 0.3 → P(E1 and E5) = 0.8 x 0.3 = 0.24

Diesel: P(E2) = 0.2
  Truck: P(E3|E2) = 0.6 → P(E2 and E3) = 0.2 x 0.6 = 0.12
  Car: P(E4|E2) = 0.1 → P(E2 and E4) = 0.2 x 0.1 = 0.02
  SUV: P(E5|E2) = 0.3 → P(E2 and E5) = 0.2 x 0.3 = 0.06
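The tree's branch products can be generated with the multiplication rule; a sketch using the slide's fuel/vehicle numbers:

```python
# Marginal fuel probabilities and conditional vehicle-type probabilities
p_fuel = {"Gasoline": 0.8, "Diesel": 0.2}
p_type_given_fuel = {
    "Gasoline": {"Truck": 0.2, "Car": 0.5, "SUV": 0.3},
    "Diesel":   {"Truck": 0.6, "Car": 0.1, "SUV": 0.3},
}

# Multiplication rule along each branch: P(fuel and type) = P(fuel) * P(type | fuel)
joint = {(fuel, vtype): p_fuel[fuel] * p
         for fuel, branches in p_type_given_fuel.items()
         for vtype, p in branches.items()}

print(round(joint[("Gasoline", "Truck")], 2))  # 0.16
```

The six leaf probabilities partition the sample space, so they sum to 1.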

Get Ready….
  • More Probability Examples
  • Random Variables
  • Probability Distributions
Introduction to Probability Distributions
  • Random Variable – “X”
    • Is a function from the sample space to another space, usually the real line
    • Represents a possible numerical value from a random event
  • Each r.v. has a Distribution Function – FX(x), fX(x) – induced by the probability on the sample space
    • Assigns probability to the (numerical) outcomes (discrete values or intervals)
Random Variables
  • Not Easy to Describe
Introduction to Probability Distributions
  • Random Variable
    • Represents a possible numerical value from a random event
  • Random variables are either Discrete or Continuous

Discrete Probability Distribution
  • A list of all possible [ xi , P(xi) ] pairs

xi = Value of Random Variable (Outcome)

P(xi) = Probability Associated with Value

  • xi’s are mutually exclusive (no overlap)
  • xi’s are collectively exhaustive (nothing left out)
  • 0 ≤ P(xi) ≤ 1 for each xi
  • Σ P(xi) = 1
Discrete Random Variables
  • Can only assume a countable number of values

Examples:

    • Roll a die twice

Let x be the number of times 4 comes up

(then x could be 0, 1, or 2 times)

    • Toss a coin 5 times.

Let x be the number of heads

(then x = 0, 1, 2, 3, 4, or 5)

Discrete Probability Distribution

Experiment: Toss 2 Coins. Let x = # heads.

4 possible outcomes: TT, TH, HT, HH

Probability Distribution:

x Value   Probability
0         1/4 = .25
1         2/4 = .50
2         1/4 = .25

(Plot: probability .25 at x = 0, .50 at x = 1, .25 at x = 2)
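The four outcomes and the resulting pmf can be enumerated directly; a minimal sketch:

```python
from itertools import product
from collections import Counter

# Enumerate the 4 equally likely outcomes of tossing 2 coins
outcomes = list(product("HT", repeat=2))
counts = Counter(outcome.count("H") for outcome in outcomes)

# Probability of each head-count = favorable outcomes / total outcomes
pmf = {x: c / len(outcomes) for x, c in sorted(counts.items())}
print(pmf)  # {0: 0.25, 1: 0.5, 2: 0.25}
```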

The Distribution Function
  • Assigns Probability to Outcomes
    • F is non-decreasing and right-continuous; f ≥ 0
    • Σ P(xi) = 1 (discrete) and ∫ fX(x) dx = 1 (continuous)
    • P(X ≤ a) = FX(a)
  • Discrete Random Variable
  • Continuous Random Variable
Discrete Random Variable Summary Measures - Moments
  • Expected Value of a discrete distribution (Weighted Average)

E(x) = Σ xi P(xi)

    • Example: Toss 2 coins, x = # of heads, compute expected value of x:

x    P(x)
0    .25
1    .50
2    .25

E(x) = (0 x .25) + (1 x .50) + (2 x .25) = 1.0
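The weighted average is a one-line computation; a sketch over the coin-toss pmf:

```python
pmf = {0: 0.25, 1: 0.50, 2: 0.25}  # x = # heads in 2 coin tosses

# Expected value = probability-weighted average of the outcomes
expected = sum(x * p for x, p in pmf.items())
print(expected)  # 1.0
```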

Discrete Random Variable Summary Measures

(continued)

  • Standard Deviation of a discrete distribution

σx = √( Σ [xi - E(x)]² P(xi) )

where:

E(x) = Expected value of the random variable

xi = Values of the random variable

P(xi) = Probability of the random variable having the value xi

Discrete Random Variable Summary Measures

(continued)

  • Example: Toss 2 coins, x = # heads, compute standard deviation (recall E(x) = 1)

Possible number of heads = 0, 1, or 2

σx = √( (0 - 1)²(.25) + (1 - 1)²(.50) + (2 - 1)²(.25) ) = √.50 = .707
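The same calculation as a short sketch:

```python
from math import sqrt

pmf = {0: 0.25, 1: 0.50, 2: 0.25}
ev = sum(x * p for x, p in pmf.items())  # E(x) = 1.0

# Variance = probability-weighted squared deviations from E(x)
variance = sum((x - ev) ** 2 * p for x, p in pmf.items())  # 0.5
sd = sqrt(variance)
print(round(sd, 3))  # 0.707
```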

Two Discrete Random Variables
  • Expected value of the sum of two discrete random variables:

E(x + y) = E(x) + E(y) = Σ x P(x) + Σ y P(y)

(The expected value of the sum of two random variables is the sum of the two expected values)

Sums of Random Variables
  • Usually we discuss sums of INDEPENDENT random variables, Xi i.i.d.
  • Due to linearity of the expectation operator, E(ΣXi) = ΣE(Xi) always; only for independent Xi is Var(ΣXi) = ΣVar(Xi) guaranteed
  • CLT: Let Sn = ΣXi. Then (Sn - E(Sn)) ~ N(0, Var(Sn)), approximately, for large n
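A small simulation sketch of these facts (the uniform summands and sample sizes are illustrative assumptions): each Sn is a sum of n i.i.d. Uniform(0,1) variables, so E(Sn) = n/2 by linearity.

```python
import random

random.seed(0)  # reproducible

n = 1_000    # summands per sum; E(Xi) = 1/2, so E(Sn) = 500
reps = 2_000  # number of simulated sums
sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]

# The average of the simulated Sn values sits very close to E(Sn) = n/2
mean_sn = sum(sums) / reps
print(round(mean_sn))
```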
Covariance
  • Covariance between two discrete random variables:

σxy = Σi Σj [xi - E(x)][yj - E(y)] P(xi, yj)

where:

xi = possible values of the x discrete random variable

yj = possible values of the y discrete random variable

P(xi, yj) = joint probability of the values xi and yj occurring

Interpreting Covariance
  • Covariance between two discrete random variables:

σxy > 0 → x and y tend to move in the same direction

σxy < 0 → x and y tend to move in opposite directions

σxy = 0 → x and y do not move closely together

Correlation Coefficient
  • The Correlation Coefficient shows the strength of the linear association between two variables

ρ = σxy / (σx σy)

where:

ρ = correlation coefficient (“rho”)

σxy = covariance between x and y

σx = standard deviation of variable x

σy = standard deviation of variable y

interpreting the correlation coefficient
Interpreting the Correlation Coefficient
  • The Correlation Coefficient always falls between -1 and +1

 = 0 x and y are not linearly related.

The farther  is from zero, the stronger the linear relationship:

 = +1 x and y have a perfect positive linear relationship

 = -1 x and y have a perfect negative linear relationship
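Covariance and the correlation coefficient computed from a small joint pmf (the joint probabilities below are made-up illustrative numbers, not from the slides):

```python
from math import sqrt

# Hypothetical joint pmf P(x, y) over two binary variables
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

ex = sum(x * p for (x, _), p in joint.items())
ey = sum(y * p for (_, y), p in joint.items())

# Covariance: sum over all (x, y) pairs of [x - E(x)][y - E(y)] P(x, y)
cov = sum((x - ex) * (y - ey) * p for (x, y), p in joint.items())
sx = sqrt(sum((x - ex) ** 2 * p for (x, _), p in joint.items()))
sy = sqrt(sum((y - ey) ** 2 * p for (_, y), p in joint.items()))

rho = cov / (sx * sy)
print(round(cov, 2), round(rho, 2))  # 0.15 0.6
```

Since most mass sits on (0,0) and (1,1), the variables tend to move together: σxy > 0 and ρ is well above zero.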

Useful Discrete Distributions
  • Discrete Uniform
  • Binary – Success/Fail (Bernoulli)
  • Binomial
  • Poisson
  • Empirical
    • Piano Keys
    • Other “stuff that happens” in life
Useful Continuous Distributions
  • Finite Support
    • Uniform fX(x) = c
    • Beta
  • Infinite Support
    • Gaussian (Normal) N(μ, σ)
    • Log-normal
    • Gamma
    • Exponential
Section Summary
  • Described approaches to assessing probabilities
  • Developed common rules of probability
  • Distinguished between discrete and continuous probability distributions
  • Examined discrete and continuous probability distributions and their moments (summary measures)
Probability Distributions
  • Discrete Probability Distributions: Binomial, Poisson, etc.
  • Continuous Probability Distributions: Normal, Uniform, etc.

Binomial Distribution Formula

P(x) = [ n! / ( x! (n - x)! ) ] p^x q^(n - x)

P(x) = probability of x successes in n trials, with probability of success p on each trial

x = number of ‘successes’ in sample, (x = 0, 1, 2, ..., n)

p = probability of “success” per trial

q = probability of “failure” = (1 - p)

n = number of trials (sample size)

Example: Flip a coin four times, let x = # heads:

n = 4

p = 0.5

q = (1 - .5) = .5

x = 0, 1, 2, 3, 4
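The formula translates directly into code; a minimal sketch covering both the coin example and one of the binomial-table values:

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(x) = n!/(x!(n-x)!) * p^x * q^(n-x), with q = 1 - p."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Coin example: n = 4, p = 0.5
print([round(binomial_pmf(x, 4, 0.5), 4) for x in range(5)])
# [0.0625, 0.25, 0.375, 0.25, 0.0625]

# Table check: P(x = 3 | n = 10, p = .35)
print(round(binomial_pmf(3, 10, 0.35), 4))  # 0.2522
```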

Binomial Characteristics
  • Mean: μ = E(x) = np
  • Standard deviation: σ = √(npq)

Examples:

n = 5, p = 0.1: mean np = 0.5; the distribution is concentrated near 0 and right-skewed

n = 5, p = 0.5: mean np = 2.5; the distribution is symmetric about its mean

Using Binomial Tables

Examples:

n = 10, p = .35, x = 3: P(x = 3|n =10, p = .35) = .2522

n = 10, p = .75, x = 2: P(x = 2|n =10, p = .75) = .0004

The Poisson Distribution
  • Characteristics of the Poisson Distribution:
    • The outcomes of interest are rare relative to the possible outcomes
    • The average number of outcomes of interest per time or space interval is λ
    • The number of outcomes of interest is random, and the occurrence of one outcome does not influence the chances of another outcome of interest
    • The probability that an outcome of interest occurs in a given segment is the same for all segments
Poisson Distribution Formula

P(x) = (λt)^x e^(-λt) / x!

where:

t = size of the segment of interest

x = number of successes in segment of interest

λ = expected number of successes in a segment of unit size

e = base of the natural logarithm system (2.71828...)
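A direct sketch of the formula, using λt = 0.50 as in the shape example on the next slide:

```python
from math import exp, factorial

def poisson_pmf(x, lam_t):
    """P(x) = (lambda*t)^x * e^(-lambda*t) / x!, with lam_t = lambda * t."""
    return lam_t ** x * exp(-lam_t) / factorial(x)

print(round(poisson_pmf(2, 0.5), 4))  # 0.0758
```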

Poisson Distribution Shape
  • The shape of the Poisson Distribution depends on the parameters λ and t:

(Plots shown for λt = 0.50 and λt = 3.0)

Poisson Distribution Characteristics
  • Mean: λt
  • Variance: σ² = λt; Standard Deviation: σ = √(λt)
  • http://www.math.csusb.edu/faculty/stanton/m262/poisson_distribution/Poisson_old.html

where λ = expected number of successes in a segment of unit size

t = the size of the segment of interest

Graph of Poisson Probabilities

Graphically: λ = .05 and t = 10, so λt = 0.50

P(x = 2) = .0758

The Uniform Distribution

The Continuous Uniform Distribution:

f(x) = 1/(b - a) for a ≤ x ≤ b, and 0 otherwise

where

f(x) = value of the density function at any x value

a = lower limit of the interval

b = upper limit of the interval

Uniform Distribution

Example: Uniform Probability Distribution

Over the range 2 ≤ x ≤ 6:

f(x) = 1/(6 - 2) = .25 for 2 ≤ x ≤ 6

(Plot: density flat at .25 between x = 2 and x = 6)
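A minimal sketch of the uniform density; since the density is flat, probabilities are just rectangle areas (the sub-interval [3, 5] is an assumed example):

```python
def uniform_pdf(x, a, b):
    """Density 1/(b - a) on [a, b], zero outside."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

print(uniform_pdf(4, 2, 6))   # 0.25
print(uniform_pdf(7, 2, 6))   # 0.0

# P(3 <= x <= 5) = width * height of the rectangle under f(x)
print((5 - 3) * uniform_pdf(4, 2, 6))  # 0.5
```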

By varying the parameters μ and σ, we obtain different normal distributions
  • μ ± 1σ encloses about 68% of x’s; μ ± 2σ covers about 95% of x’s; μ ± 3σ covers about 99.7% of x’s
  • A value that lies much farther than that from the mean is therefore highly unlikely, given that particular mean and standard deviation

Probability as Area Under the Curve

The total area under the curve is 1.0, and the curve is symmetric, so half is above the mean, half is below

(Plot: area 0.5 on either side of μ)

The Standard Normal Distribution

Also known as the “z” distribution: z = (x - μ)/σ

Mean is by definition 0

Standard Deviation is by definition 1

Values above the mean have positive z-values, values below the mean have negative z-values

Comparing x and z units

With μ = 100 and σ = 50, the value x = 250 corresponds to z = (250 - 100)/50 = 3.0

Note that the distribution is the same, only the scale has changed. We can express the problem in original units (x) or in standardized units (z)

Finding Normal Probabilities

(continued)

  • Suppose x is normal with mean 8.0 and standard deviation 5.0.
  • Now find P(x < 8.6)

z = (8.6 - 8.0)/5.0 = 0.12

P(x < 8.6) = P(z < 0.12) = P(z < 0) + P(0 < z < 0.12) = .5 + .0478 = .5478
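The table lookup can be replicated with the error function from the standard library; a minimal sketch:

```python
from math import erf, sqrt

def normal_cdf(x, mu, sigma):
    """P(X <= x) for X ~ N(mu, sigma), via the standard error function."""
    z = (x - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Slide example: mean 8.0, standard deviation 5.0
print(round(normal_cdf(8.6, 8.0, 5.0), 4))  # 0.5478
```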

Section Summary
  • Reviewed discrete distributions
    • binomial, poisson, etc.
  • Reviewed some continuous distributions
    • normal, uniform, exponential
  • Found probabilities using formulas and tables
  • Recognized when to apply different distributions
  • Applied distributions to decision problems