
Presentation Transcript


  1. Probability & Statistics for Engineers & Scientists, by Walpole, Myers, Myers & Ye ~ Chapter 3 Notes
  Class notes for ISE 201, San Jose State University, Industrial & Systems Engineering Dept. Steve Kennedy

  2. A Random Variable
  • So far, we have seen how to calculate the probability of occurrence of various outcomes in the sample space of an experiment, and how to calculate the probability that various events (subsets of S) occur.
  • A random variable is a function that associates a real number with each element in a sample space.
  • Example: 3 components can each be either defective (D) or not (N).
  • What is the sample space for this situation?
  • S = {NNN, NND, NDN, NDD, DNN, DND, DDN, DDD}
  • Let random variable X denote the number of defective components in each sample point.
  • Describe P(X ≥ 2) in words.
  • What are the values of P(X = x), for x = 0, 1, 2, 3 (assuming all eight sample points are equally likely)?
  • .125, .375, .375, .125.
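As a quick check (an addition, not part of the original slides), here is a minimal Python sketch that enumerates this sample space and tabulates P(X = x) by counting, assuming all eight outcomes are equally likely as the slide's answers imply:

```python
from itertools import product
from collections import Counter

# Enumerate S = all length-3 strings of N/D; assume (as the slide's
# answers imply) that all eight sample points are equally likely.
sample_space = [''.join(outcome) for outcome in product('ND', repeat=3)]

# The random variable X maps each sample point to its count of D's.
counts = Counter(point.count('D') for point in sample_space)

for x in sorted(counts):
    print(f"P(X = {x}) = {counts[x] / len(sample_space):.3f}")
# P(X = 0) = 0.125, P(X = 1) = 0.375, P(X = 2) = 0.375, P(X = 3) = 0.125
```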

  3. Discrete and Continuous Sample Spaces
  • A discrete sample space has a finite or countably infinite number of points (outcomes).
  • Example of countably infinite: the experiment consists of flipping a coin until a heads occurs.
  • S = ?
  • S = {H, TH, TTH, TTTH, TTTTH, TTTTTH, …}
  • S has a countably infinite number of sample points.
  • A continuous sample space has an infinite number of points, equivalent to the number of points on a line segment.
  • For discrete sample spaces, we know that the sum of the probabilities of all of the points in the sample space equals 1.
  • Addition won't work for continuous sample spaces. Here, P(X = x) = 0 for any given value of x.
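A short numerical sketch (an addition, assuming a fair coin, which the slide does not state) showing that the probabilities over this countably infinite sample space sum to 1:

```python
# Partial sums of P(first head on flip k) = (1/2)**k, k = 1, 2, 3, ...
# (assumes a fair coin with independent flips).
total = 0.0
for k in range(1, 31):
    total += 0.5 ** k
print(total)  # ~0.9999999991 -- the geometric series converges to 1
```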

  4. Discrete Distributions
  • For a discrete random variable X, we generally look at the probability P(X = x) of X taking on each value x.
  • Often, the probability can be expressed in a formula, f(x) = P(X = x).
  • The set of ordered pairs (x, f(x)) is called the probability distribution or probability function of X.
  • Note that f(x) ≥ 0, and Σ_x f(x) = 1.

  5. Cumulative Distribution & Plotting
  • The cumulative distribution, denoted F(x), of a discrete random variable X with distribution f(x), is F(x) = P(X ≤ x).
  • How is F(x) calculated?
  • F(x) = Σ_{t ≤ x} f(t).
  • It is useful to plot both a probability distribution and the corresponding cumulative distribution.
  • Typically, the values of f(x) versus x are plotted using a probability histogram.
  • Cumulative distributions are also plotted using a similar type of histogram/step function.

  6. Continuous Distributions
  • Continuous distributions have an infinite number of points in the sample space, so for a given value of x, what is P(X = x)?
  • P(X = x) = 0.
  • Otherwise the probabilities couldn't sum to 1.
  • What we can calculate is the probability that X lies in a given interval, such as P(a < X < b) or P(X < C).
  • Since the probability of any individual point is 0, P(a < X < b) = P(a ≤ X ≤ b): the endpoints can be included or not.
  • For continuous distributions, f(x) is called a probability density function.

  7. Probability Density Functions
  • If f(x) is a continuous probability density function,
  • f(x) ≥ 0, as before.
  • What corresponds to Σ_x f(x) = 1 for discrete distributions?
  • ∫_{-∞}^{∞} f(x) dx = 1.
  • And P(a ≤ X ≤ b) = ?
  • P(a ≤ X ≤ b) = ∫_a^b f(x) dx.
  • The cumulative distribution of a continuous random variable X is?
  • F(x) = P(X ≤ x) = ?
  • F(x) = ∫_{-∞}^{x} f(t) dt.
  • What is P(a < X < b) in terms of F(x)?
  • P(a < X ≤ b) = F(b) − F(a).
  • If discrete, must use "a < X", and not "a ≤ X", above.

  8. Joint Probability Distributions
  • Given a pair of discrete random variables X and Y on the same sample space, the joint probability distribution of X and Y is f(x, y) = P(X = x, Y = y); f(x, y) equals the probability that both X = x and Y = y occur.
  • The usual rules hold for joint probability distributions:
  • f(x, y) ≥ 0
  • Σ_x Σ_y f(x, y) = 1
  • For any region A in the xy plane, P[(X, Y) ∈ A] = Σ_A f(x, y).
  • For continuous joint probability distributions, the sums above are replaced with integrals.

  9. Marginal Distributions
  • The marginal distribution of X alone or Y alone can be calculated from the joint distribution function as follows:
  • g(x) = Σ_y f(x, y) and h(y) = Σ_x f(x, y) if discrete
  • g(x) = ∫ f(x, y) dy and h(y) = ∫ f(x, y) dx if continuous
  • In other words, for example, g(x) = P(X = x) is the sum (or integral) of f(x, y) over all values of y.

  10. Conditional Distributions
  • For either discrete or continuous random variables X and Y, the conditional distribution of Y, given that X = x, is f(y | x) = f(x, y) / g(x) if g(x) > 0; similarly, the conditional distribution of X, given that Y = y, is f(x | y) = f(x, y) / h(y) if h(y) > 0.
  • X and Y are statistically independent if f(x, y) = g(x) h(y) for all x and y within their range.
  • A similar equation holds for n mutually statistically independent jointly distributed random variables.

  11. Statistical Independence
  • The definition of independence is as before:
  • Previously, P(A | B) = P(A) and P(B | A) = P(B).
  • How about in terms of the conditional distribution?
  • f(x | y) = g(x) and f(y | x) = h(y).
  • The other way to demonstrate independence?
  • f(x, y) = g(x) h(y) ∀ x, y in range.
  • Similar formulas also apply to more than two mutually independent random variables.
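Finally, a sketch of the independence check f(x, y) = g(x) h(y) for every (x, y), again on the hypothetical table from the earlier sketches:

```python
joint = {(0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
         (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20}
g = {0: 0.40, 1: 0.60}             # marginals from the earlier sketch
h = {0: 0.25, 1: 0.45, 2: 0.30}

# X and Y are independent only if the product rule holds at every point.
independent = all(abs(p - g[x] * h[y]) < 1e-9 for (x, y), p in joint.items())
print(independent)  # False: f(0, 1) = 0.20 but g(0) * h(1) = 0.40 * 0.45 = 0.18
```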
