
STAT 552 PROBABILITY AND STATISTICS II



Presentation Transcript


  1. STAT 552 PROBABILITY AND STATISTICS II. INTRODUCTION: Short review of STAT 551

  2. WHAT IS STATISTICS? • Statistics is the science of collecting, organizing, and describing data, and of drawing conclusions from it. That is, statistics is a way to get information from data. It is the science of uncertainty.

  3. BASIC DEFINITIONS • POPULATION: The collection of all items of interest in a particular study. • SAMPLE: A set of data drawn from the population; a subset of the population available for observation • PARAMETER: A descriptive measure of the population, e.g., mean • STATISTIC: A descriptive measure of a sample • VARIABLE: A characteristic of interest about each element of a population or sample.

  4. STATISTIC • A statistic (or estimator) is any function of the random variables in a random sample that does not contain any unknown quantity. • E.g., the sample mean X̄ and the sample variance S² are statistics. • Quantities such as (X̄ - μ)/σ, which involve the unknown parameters μ and σ, are NOT. • Any observed or particular value of an estimator is an estimate.

  5. Sample Space • The set of all possible outcomes of an experiment is called a sample space and denoted by S. • Determining the outcomes: • Build an exhaustive list of all possible outcomes. • Make sure the listed outcomes are mutually exclusive.

  6. RANDOM VARIABLES • Variables whose observed value is determined by chance • A r.v. is a function defined on the sample space S that associates a real number with each outcome in S. • Rvs are denoted by uppercase letters, and their observed values by lowercase letters.

  7. DESCRIPTIVE STATISTICS • Descriptive statistics involves the arrangement, summary, and presentation of data, to enable meaningful interpretation, and to support decision making. • Descriptive statistics methods make use of • graphical techniques • numerical descriptive measures.

  8. Types of data – examples

  9. [Diagram: PROBABILITY leads from the POPULATION to the SAMPLE; STATISTICAL INFERENCE leads from the SAMPLE back to the POPULATION]

  10. PROBABILITY: A numerical value expressing the degree of uncertainty regarding the occurrence of an event; a measure of uncertainty. • STATISTICAL INFERENCE: The science of drawing inferences about the population based on only a part of the population, the sample.

  11. Probability • Probability is a function P : S → [0,1]; its domain is the sample space S and its range is the interval [0,1].

  12. THE CALCULUS OF PROBABILITIES • If P is a probability function and A is any set, then a. P(∅) = 0 b. P(A) ≤ 1 c. P(A^c) = 1 - P(A)

  13. ODDS • The odds of an event A are defined by odds(A) = P(A) / (1 - P(A)) = P(A) / P(A^c). • It tells us how much more likely the occurrence of event A is than its nonoccurrence.

  14. ODDS RATIO • OR is the ratio of two odds. • Useful for comparing the odds under two different conditions or for two different groups, e.g. odds for males versus females.
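
A small numeric sketch of the two definitions above (the 0.6 and 0.3 probabilities are hypothetical, chosen only for illustration):

```python
# Odds and odds ratio, straight from the definitions on slides 13-14.

def odds(p: float) -> float:
    """Odds of an event with probability p: P(A) / (1 - P(A))."""
    return p / (1.0 - p)

def odds_ratio(p1: float, p2: float) -> float:
    """Ratio of the odds under two conditions/groups."""
    return odds(p1) / odds(p2)

# Hypothetical numbers: P(event) = 0.6 for males, 0.3 for females.
print(odds(0.6))             # 1.5: the event is 1.5 times as likely to occur as not
print(odds_ratio(0.6, 0.3))  # 3.5: the odds for males are 3.5 times the odds for females
```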

  15. CONDITIONAL PROBABILITY • (Marginal) Probability: P(A): How likely is it that an event A will occur when an experiment is performed? • Conditional Probability: P(A|B): How will the probability of event A be affected by the knowledge of the occurrence or nonoccurrence of event B? • If two events are independent, then P(A|B)=P(A)

  16. CONDITIONAL PROBABILITY • P(A|B) = P(AB) / P(B), provided P(B) > 0.

  17. BAYES THEOREM • Suppose you have P(B|A), but need P(A|B): P(A|B) = P(B|A) P(A) / P(B), where P(B) = P(B|A) P(A) + P(B|A^c) P(A^c) by the law of total probability.
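
A worked instance of Bayes' theorem in this form, using hypothetical disease-screening numbers (the prevalence, sensitivity, and false-positive rate below are made up for illustration):

```python
# Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B), with P(B) expanded by
# the law of total probability.

p_disease = 0.01              # P(A): prior prevalence (hypothetical)
p_pos_given_disease = 0.95    # P(B|A): test sensitivity (hypothetical)
p_pos_given_healthy = 0.05    # P(B|A^c): false-positive rate (hypothetical)

# P(B) = P(B|A) P(A) + P(B|A^c) P(A^c)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# P(A|B): probability of disease given a positive test
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # ~0.161: still low despite the positive test
```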

  18. Independence • A and B are independent iff • P(A|B)=P(A) or P(B|A)=P(B) • P(AB)=P(A)P(B) • A1, A2, …, An are mutually independent iff P(Ai1 Ai2 … Aik) = P(Ai1) P(Ai2) … P(Aik) for every subset {i1, i2, …, ik} of {1,2,…,n}. E.g. for n=3, A1, A2, A3 are mutually independent iff P(A1A2A3)=P(A1)P(A2)P(A3) and P(A1A2)=P(A1)P(A2) and P(A1A3)=P(A1)P(A3) and P(A2A3)=P(A2)P(A3). Pairwise independence alone is not enough; see the counterexample sketched below.
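
The classic two-dice counterexample, sketched below, shows why the pairwise conditions alone are not enough (the events A, B, C here are illustrative choices, not from the slides):

```python
# Enumerate two fair dice and check pairwise vs. mutual independence.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))  # 36 equally likely rolls

def prob(event):
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

A = lambda w: w[0] % 2 == 0           # first die even
B = lambda w: w[1] % 2 == 0           # second die even
C = lambda w: (w[0] + w[1]) % 2 == 0  # sum even

# Pairwise independence: each product rule below holds (1/4 == 1/2 * 1/2)
print(prob(lambda w: A(w) and B(w)) == prob(A) * prob(B))  # True
print(prob(lambda w: A(w) and C(w)) == prob(A) * prob(C))  # True
print(prob(lambda w: B(w) and C(w)) == prob(B) * prob(C))  # True

# Triple condition fails: 1/4 != 1/8, so A, B, C are NOT mutually independent
print(prob(lambda w: A(w) and B(w) and C(w)), prob(A) * prob(B) * prob(C))
```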

  19. DISCRETE RANDOM VARIABLES • If the set of all possible values of a r.v. X is a countable set, then X is called a discrete r.v. • The function f(x) = P(X=x) for x = x1, x2, … that assigns the probability to each value x is called the probability density function (p.d.f.) or probability mass function (p.m.f.)

  20. Example • Discrete Uniform distribution: f(x) = 1/N for x = 1, 2, …, N. • Example: throw a fair die. P(X=1)=…=P(X=6)=1/6
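
A quick check of the fair-die pmf with scipy (assuming scipy is available; scipy.stats.randint(low, high) puts equal mass on low, …, high-1):

```python
# Discrete Uniform on {1, ..., 6}: the fair-die example above.
from scipy.stats import randint

die = randint(1, 7)                       # uniform on the integers 1..6
print([die.pmf(k) for k in range(1, 7)])  # six values, each 1/6
print(die.cdf(3))                         # P(X <= 3) = 0.5
```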

  21. CONTINUOUS RANDOM VARIABLES • When the sample space is uncountable (continuous). • Example: Continuous Uniform(a,b), with f(x) = 1/(b - a) for a ≤ x ≤ b (and 0 otherwise).

  22. CUMULATIVE DISTRIBUTION FUNCTION (C.D.F.) • The CDF of a r.v. X is defined as F(x) = P(X ≤ x).
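
A short sketch connecting the Uniform(a,b) density of slide 21 to its cdf, using scipy's uniform(loc=a, scale=b-a) parameterization (a = 2, b = 5 are arbitrary choices):

```python
# Uniform(a, b): pdf f(x) = 1/(b-a) on [a, b], cdf F(x) = (x-a)/(b-a).
from scipy.stats import uniform

a, b = 2.0, 5.0
X = uniform(loc=a, scale=b - a)

print(X.pdf(3.0))            # 1/(b - a) = 1/3 on [a, b]
print(X.cdf(3.0))            # F(3) = (3 - a)/(b - a) = 1/3
print(X.cdf(b) - X.cdf(a))   # total probability on [a, b] = 1.0
```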

  23. JOINT DISCRETE DISTRIBUTIONS • A function f(x1, x2,…, xk) is the joint pmf for some vector valued rv X=(X1, X2,…,Xk) iff the following properties are satisfied: f(x1, x2,…, xk) ≥ 0 for all (x1, x2,…, xk), and Σx1 … Σxk f(x1, x2,…, xk) = 1.

  24. MARGINAL DISCRETE DISTRIBUTIONS • If the pair (X1,X2) of discrete random variables has the joint pmf f(x1,x2), then the marginal pmfs of X1 and X2 are f1(x1) = Σx2 f(x1,x2) and f2(x2) = Σx1 f(x1,x2).

  25. CONDITIONAL DISTRIBUTIONS • If X1 and X2 are discrete or continuous random variables with joint pdf f(x1,x2), then the conditional pdf of X2 given X1=x1 is defined by f(x2|x1) = f(x1,x2) / f1(x1), provided f1(x1) > 0. • For independent rvs, f(x2|x1) = f2(x2).
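
A numeric sketch of the marginal and conditional formulas above; the 2×3 joint pmf table is hypothetical, made up only for illustration:

```python
# Marginal and conditional pmfs from a joint pmf table.
import numpy as np

# rows index x1, columns index x2; entries sum to 1
joint = np.array([[0.10, 0.20, 0.10],
                  [0.25, 0.15, 0.20]])

f1 = joint.sum(axis=1)   # marginal pmf of X1: sum over x2
f2 = joint.sum(axis=0)   # marginal pmf of X2: sum over x1
print(f1, f2)            # [0.4 0.6] and [0.35 0.35 0.30]

# conditional pmf of X2 given X1 = its first value: f(x2|x1) = f(x1,x2)/f1(x1)
cond = joint[0] / f1[0]
print(cond, cond.sum())  # a proper pmf: entries sum to 1
```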

  26. EXPECTED VALUES • Let X be a rv with pdf fX(x) and g(X) be a function of X. Then the expected value (or the mean or the mathematical expectation) of g(X) is E[g(X)] = Σx g(x) fX(x) in the discrete case and E[g(X)] = ∫ g(x) fX(x) dx in the continuous case, provided the sum or the integral exists, i.e., -∞ < E[g(X)] < ∞.

  27. EXPECTED VALUES • E[g(X)] is finite if E[|g(X)|] is finite.

  28. Laws of Expected Value and Variance • Let X be a rv and c be a constant. • Laws of Expected Value: E(c) = c; E(X + c) = E(X) + c; E(cX) = cE(X). • Laws of Variance: V(c) = 0; V(X + c) = V(X); V(cX) = c²V(X).
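
A Monte Carlo spot-check of these laws (a sketch; the Exponential sample and the constant c = 3 are arbitrary choices):

```python
# Each pair of printed numbers should agree up to simulation error.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)  # E(X) = 2, V(X) = 4
c = 3.0

print(np.mean(x + c), np.mean(x) + c)   # E(X + c) = E(X) + c
print(np.mean(c * x), c * np.mean(x))   # E(cX)    = c E(X)
print(np.var(x + c), np.var(x))         # V(X + c) = V(X)
print(np.var(c * x), c**2 * np.var(x))  # V(cX)    = c^2 V(X)
```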

  29. EXPECTED VALUE • If X and Y are independent, E(XY) = E(X)E(Y). • The covariance of X and Y is defined as Cov(X,Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y).

  30. EXPECTED VALUE • If X and Y are independent, Cov(X,Y) = 0. • The reverse is usually not correct! It holds only under the normal distribution: if (X,Y)~Normal, then X and Y are independent iff Cov(X,Y)=0.
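
A simulation of the warning above: with X ~ N(0,1) and Y = X², Cov(X,Y) = E(X³) = 0, yet Y is a deterministic function of X, so the pair is uncorrelated but far from independent (the example is a standard one, not from the slides):

```python
# Uncorrelated does not imply independent outside the normal case.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(1_000_000)
y = x**2                     # Y is completely determined by X

print(np.cov(x, y)[0, 1])    # ~0: Cov(X, Y) = E(X^3) - E(X) E(X^2) = 0
# Dependence shows up anyway: conditioning on |X| > 2 shifts the mean of Y
print(y[np.abs(x) > 2].mean(), y.mean())  # ~5.7 versus E(Y) = 1
```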

  31. EXPECTED VALUE • If X1 and X2 are independent, Var(X1 ± X2) = Var(X1) + Var(X2).

  32. CONDITIONAL EXPECTATION AND VARIANCE • E(X2|x1) = Σx2 x2 f(x2|x1) in the discrete case (an integral in the continuous case). • Var(X2|x1) = E(X2²|x1) - [E(X2|x1)]².

  33. CONDITIONAL EXPECTATION AND VARIANCE • E(Y) = E[E(Y|X)] • Var(Y) = E[Var(Y|X)] + Var[E(Y|X)] (the EVVE rule) • Proofs available in Casella & Berger (1990), pp. 154 & 158.
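
A Monte Carlo sketch of the EVVE rule for a simple hierarchy (the Poisson/Normal model below is an illustrative choice, not from the slides):

```python
# Check Var(Y) = E[Var(Y|X)] + Var[E(Y|X)] for X ~ Poisson(4),
# Y | X = x ~ Normal(mean = x, sd = 1).
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
x = rng.poisson(lam=4.0, size=n)
y = rng.normal(loc=x, scale=1.0)

# Here E(Y|X) = X and Var(Y|X) = 1, so the rule predicts
# Var(Y) = E[1] + Var(X) = 1 + 4 = 5.
print(np.var(y))           # ~5, direct estimate
print(1.0 + np.var(x))     # E[Var(Y|X)] + Var[E(Y|X)], also ~5
```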

  34. SOME MATHEMATICAL EXPECTATIONS • Population Mean: μ = E(X) • Population Variance: σ² = Var(X) = E[(X - μ)²] = E(X²) - μ² (a measure of the deviation from the population mean) • Population Standard Deviation: σ = √σ² • Moments: the k-th moment about the origin is μk' = E(X^k).

  35. The Variance • This measure reflects the dispersion of all the observations. • The variance of a population of size N, x1, x2,…,xN, whose mean is μ is defined as σ² = Σ (xi - μ)² / N. • The variance of a sample of n observations x1, x2, …,xn whose mean is x̄ is defined as s² = Σ (xi - x̄)² / (n - 1).

  36. MOMENT GENERATING FUNCTION • The m.g.f. of random variable X is defined as MX(t) = E[e^(tX)], provided the expectation exists for t Є (-h,h) for some h>0.

  37. Properties of m.g.f. • M(0)=E[1]=1 • If a r.v. X has m.g.f. MX(t), then Y=aX+b has m.g.f. MY(t) = e^(bt) MX(at). • The m.g.f. does not always exist (e.g. Cauchy distribution).
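
A numeric sketch of the location-scale rule MY(t) = e^(bt) MX(at), estimating E[e^(tX)] by Monte Carlo for an Exponential(1) rv (all parameter values below are arbitrary choices):

```python
# For X ~ Exp(1), M_X(t) = 1/(1 - t) for t < 1; the last two printed
# numbers should agree up to simulation error.
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=1.0, size=2_000_000)
a, b, t = 0.5, 2.0, 0.6

def mgf_estimate(sample, t):
    """Monte Carlo estimate of M(t) = E[exp(t * sample)]."""
    return np.mean(np.exp(t * sample))

print(mgf_estimate(x, 0.0))                    # M(0) = 1 exactly
print(mgf_estimate(a * x + b, t))              # M_{aX+b}(t), direct estimate
print(np.exp(b * t) * mgf_estimate(x, a * t))  # e^{bt} M_X(at): should match
```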

  38. CHARACTERISTIC FUNCTION • The c.h.f. of random variable X is defined as φX(t) = E[e^(itX)] for all real numbers t. • The c.h.f. always exists.

  39. Uniqueness Theorem: • If two r.v.s have m.g.f.s that exist and are equal, then they have the same distribution. • If two r.v.s have the same distribution, then they have the same m.g.f. (if it exists). • Similar statements are true for the c.h.f.

  40. SOME DISCRETE PROBABILITY DISTRIBUTIONS • Please review: Degenerate, Uniform, Bernoulli, Binomial, Poisson, Negative Binomial, Geometric, Hypergeometric, Extended Hypergeometric, Multinomial

  41. SOME CONTINUOUS PROBABILITY DISTRIBUTIONS • Please review: Uniform, Normal (Gaussian), Exponential, Gamma, Chi-Square, Beta, Weibull, Cauchy, Log-Normal, t, F Distributions

  42. TRANSFORMATION OF RANDOM VARIABLES • If X is an rv with pdf f(x), then Y=g(X) is also an rv. What is the pdf of Y? • If X is a discrete rv, replace x wherever you see it in the pdf f(x) by using the relation x = g^(-1)(y). • If X is a continuous rv, do the same thing, but multiply by the Jacobian |d g^(-1)(y) / dy|. • If the transformation is not one-to-one, divide the region into sub-regions on which it is one-to-one.

  43. CDF method • Example: let X have a known cdf FX(x) and consider Y = g(X). What is the p.d.f. of Y? • Solution: compute FY(y) = P(Y ≤ y) = P(g(X) ≤ y) in terms of FX, then differentiate to obtain fY(y).
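
A concrete instance of the cdf method, sketched with X ~ N(0,1) and Y = X² (chosen for illustration; not necessarily the slide's original example). Here FY(y) = P(X² ≤ y) = FX(√y) - FX(-√y), which is the chi-square(1) cdf:

```python
# Verify numerically that the cdf-method answer for Y = X^2, X ~ N(0,1),
# coincides with the chi-square(1) cdf.
import numpy as np
from scipy.stats import norm, chi2

y = np.linspace(0.1, 9.0, 5)
cdf_via_method = norm.cdf(np.sqrt(y)) - norm.cdf(-np.sqrt(y))
print(cdf_via_method)
print(chi2.cdf(y, df=1))   # identical values
```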

  44. M.G.F. Method • If X1,X2,…,Xn are independent random variables with MGFs MXi(t), then the MGF of Y = X1 + X2 + … + Xn is MY(t) = MX1(t) MX2(t) … MXn(t).

  45. THE PROBABILITY INTEGRAL TRANSFORMATION • Let X have continuous cdf FX(x) and define the rv Y as Y = FX(X). Then Y ~ Uniform(0,1), that is, P(Y ≤ y) = y, 0<y<1. • This is very commonly used, especially in random number generation procedures.
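
A sketch of the transform in both directions, using Exponential(1) for illustration: pushing X through its own cdf gives a Uniform(0,1) rv, and pushing a uniform through the inverse cdf generates Exponential variates:

```python
# Probability integral transform and inverse-cdf random number generation.
import numpy as np
from scipy.stats import expon, kstest

rng = np.random.default_rng(4)

# Direction 1: Y = F_X(X) should be Uniform(0,1)
x = rng.exponential(size=100_000)
u = expon.cdf(x)
print(kstest(u, "uniform").pvalue)   # large p-value: consistent with U(0,1)

# Direction 2: F_X^{-1}(U) should be Exponential(1);
# for Exp(1), F^{-1}(u) = -log(1 - u)
u2 = rng.uniform(size=100_000)
x2 = -np.log(1.0 - u2)
print(kstest(x2, "expon").pvalue)    # consistent with Exp(1)
```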

  46. SAMPLING DISTRIBUTION • A statistic is also a random variable. Its distribution depends on the distribution of the random sample and the form of the function Y=T(X1, X2,…,Xn). The probability distribution of a statistic Y is called the sampling distribution of Y.

  47. SAMPLING FROM THE NORMAL DISTRIBUTION • Properties of the Sample Mean and Sample Variance • Let X1, X2,…,Xn be a r.s. of size n from a N(μ,σ²) distribution. Then, • X̄ ~ N(μ, σ²/n), • X̄ and S² are independent rvs, • (n-1)S²/σ² ~ χ²(n-1).

  48. SAMPLING FROM THE NORMAL DISTRIBUTION • If the population variance is unknown, we use the sample variance: T = (X̄ - μ) / (S/√n) ~ t(n-1).
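
A simulation sketch of this fact: studentized means of small normal samples should follow the t distribution with n-1 degrees of freedom (the sample size and parameters below are arbitrary choices):

```python
# Simulate many studentized means and compare to t_{n-1}.
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(5)
mu, sigma, n, reps = 10.0, 3.0, 5, 50_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
s = samples.std(axis=1, ddof=1)           # sample standard deviation
t_stats = (xbar - mu) / (s / np.sqrt(n))

print(kstest(t_stats, "t", args=(n - 1,)).pvalue)  # consistent with t_4
```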

  49. SAMPLING FROM THE NORMAL DISTRIBUTION • The F distribution allows us to compare the variances by giving the distribution of F = (S1²/σ1²) / (S2²/σ2²) ~ F(n1-1, n2-1). • If X~Fp,q, then 1/X~Fq,p. • If X~tq, then X²~F1,q.

  50. CENTRAL LIMIT THEOREM • If a random sample is drawn from any population, the sampling distribution of the sample mean is approximately normal for a sufficiently large sample size. • The larger the sample size, the more closely the sampling distribution of X̄ will resemble a normal distribution. • [Diagram: distribution of the population rv X alongside the distribution of the sample mean of a random sample (X1, X2, X3, …, Xn)]
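
A quick CLT demonstration (a sketch; the skewed Exponential(1) population is an illustrative choice): as n grows, standardized sample means match the normal distribution more and more closely.

```python
# Standardized means of a skewed population approach N(0,1).
import numpy as np

rng = np.random.default_rng(6)
mu, sigma = 1.0, 1.0   # mean and sd of the Exponential(1) population

for n in (2, 10, 100):
    # 20,000 sample means, each from a sample of size n
    means = rng.exponential(size=(20_000, n)).mean(axis=1)
    z = (means - mu) / (sigma / np.sqrt(n))   # standardized sample means
    # under normality P(Z <= 1.645) = 0.95; the estimate approaches it as n grows
    print(n, np.mean(z <= 1.645))
```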
