
ELEC 303 – Random Signals



  1. ELEC 303 – Random Signals Lecture 7 – Discrete Random Variables: Conditioning and Independence Farinaz Koushanfar ECE Dept., Rice University Sept 15, 2009

  2. Lecture outline • Reading: finish Chapter 2 • Review • Joint PMFs • Conditioning • Independence

  3. Random Variables • A random variable is a real-valued function of an experiment outcome • A function of a random variable defines another random variable • We associate with each RV some averages of interest, such as the mean and variance • A random variable can be conditioned on an event or on another random variable • There is a notion of independence of a random variable from an event or from another random variable

  4. Discrete random variables • A discrete random variable is a real-valued function of the outcome of the experiment • It can take a finite or countably infinite number of values • A discrete random variable has an associated probability mass function (PMF) • The PMF gives the probability of each numerical value that the random variable can take • A function of a discrete random variable defines another discrete random variable (RV) • Its PMF can be found from the PMF of the original RV

  5. Probability mass function (PMF) • Notation • Random variable: X • Experimental value: x • PX(x) = P({X=x}) • The PMF mathematically defines a probability law • Probability axiom: Σx PX(x) = 1 • Example: coin toss • Define X(H) = 1, X(T) = 0 (indicator RV)
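
To make the PMF concrete, here is a minimal Python sketch (not from the slides; the fair-coin probability 0.5 is an assumption) that represents the indicator RV's PMF as a dictionary:

```python
# PMF of the coin-toss indicator RV: X(H) = 1, X(T) = 0.
# A fair coin (P(H) = 0.5) is assumed purely for illustration.
p_X = {1: 0.5, 0: 0.5}

# Probability axiom: the PMF values must sum to 1.
assert abs(sum(p_X.values()) - 1.0) < 1e-12

print(p_X[1])  # P(X = 1) = 0.5
```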

  6. Review: discrete random variable PMF, expectation, variance • Probability mass function (PMF): PX(x) = P(X=x), with Σx PX(x) = 1 • Expectation: E[X] = Σx x PX(x) • Variance: Var(X) = E[(X-E[X])^2]

  7. Expected value for functions of RV • Let X be a random variable with PMF pX, and let g(X) be a function of X. Then the expected value of the random variable g(X) is given by • E[g(X)] = Σx g(x) pX(x) • Var(X) = E[(X-E[X])^2] = Σx (x-E[X])^2 pX(x) • Similarly, the nth moment is given by • E[X^n] = Σx x^n pX(x)
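
A short sketch of the expected value rule in code, using a made-up three-point PMF (the numbers are assumptions, purely for illustration):

```python
# A generic PMF represented as a dict {value: probability}.
p_X = {1: 0.2, 2: 0.5, 3: 0.3}

def expect(g, pmf):
    """Expected value rule: E[g(X)] = sum_x g(x) * pX(x)."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, p_X)                # E[X] = 2.1
var = expect(lambda x: (x - mean) ** 2, p_X)   # Var(X) = E[(X-E[X])^2] = 0.49
m2 = expect(lambda x: x ** 2, p_X)             # second moment E[X^2] = 4.9

# The last two printed values agree: Var(X) = E[X^2] - (E[X])^2.
print(mean, var, m2 - mean ** 2)
```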

  8. Properties of variance • Var(X) = E[X^2] - (E[X])^2 • For a linear function of X: Var(aX + b) = a^2 Var(X)

  9. Joint PMFs of multiple random variables • Joint PMF of two random variables: pX,Y • PX,Y(x,y) = P(X=x, Y=y) • Calculate the PMFs of X and Y by the formulas • PX(x) = Σy PX,Y(x,y) • PY(y) = Σx PX,Y(x,y) • We refer to PX and PY as the marginal PMFs

  10. Tabular method for computing marginal PMFs • Assume Z = X + 2Y • Find E[Z] (a worked sketch follows below)
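
Below is a sketch of the tabular method with a hypothetical 2x3 joint PMF (the numbers are assumptions): marginals come from summing rows and columns of the table, and E[Z] follows from the expected value rule.

```python
# Hypothetical joint PMF P(X=x, Y=y), laid out as a 2x3 table:
#            y=0    y=1    y=2
#   x=1     0.10   0.20   0.10
#   x=2     0.20   0.25   0.15
p_XY = {
    (1, 0): 0.10, (1, 1): 0.20, (1, 2): 0.10,
    (2, 0): 0.20, (2, 1): 0.25, (2, 2): 0.15,
}
xs = sorted({x for x, _ in p_XY})
ys = sorted({y for _, y in p_XY})

# Marginal PMFs: sum each row (for X) or each column (for Y).
p_X = {x: sum(p_XY[x, y] for y in ys) for x in xs}
p_Y = {y: sum(p_XY[x, y] for x in xs) for y in ys}

# E[Z] for Z = X + 2Y via the expected value rule on the joint PMF.
E_Z = sum((x + 2 * y) * p for (x, y), p in p_XY.items())

# Consistency check using linearity: E[Z] = E[X] + 2 E[Y].
E_X = sum(x * p for x, p in p_X.items())
E_Y = sum(y * p for y, p in p_Y.items())
assert abs(E_Z - (E_X + 2 * E_Y)) < 1e-12
print(E_X, E_Y, E_Z)  # 1.6, 0.95, 3.5
```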

  11. Expectation

  12. Variances

  13. Example: Binomial mean and variance • If X ~ Binomial(n, p), then E[X] = np and Var(X) = np(1-p)
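
As a numerical sanity check (n = 10 and p = 0.3 are arbitrary illustration values), the mean and variance can be computed directly from the binomial PMF and compared with np and np(1-p):

```python
from math import comb

n, p = 10, 0.3
# Binomial PMF: P(X=k) = C(n,k) p^k (1-p)^(n-k), k = 0..n.
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())
var = sum((k - mean)**2 * pk for k, pk in pmf.items())

print(mean, n * p)           # both ~= 3.0:  E[X] = np
print(var, n * p * (1 - p))  # both ~= 2.1:  Var(X) = np(1-p)
```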

  14. More than two variables • PX,Y,Z(x,y,z) = P(X=x, Y=y, Z=z) • PX,Y(x,y) = Σz PX,Y,Z(x,y,z) • PX(x) = Σy Σz PX,Y,Z(x,y,z) • The expected value rule: E[g(X,Y,Z)] = Σx Σy Σz g(x,y,z) PX,Y,Z(x,y,z)

  15. Conditioning • Conditional PMF of a RV X on an event A: • PX|A(x) = P(X=x|A) = P({X=x} ∩ A) / P(A) • Since P(A) = Σx P({X=x} ∩ A), it follows that Σx PX|A(x) = 1

  16. Example • A student will take a certain test up to a maximum of n times, each time with a probability p of passing, independent of the number of previous attempts • Find the PMF of the number of attempts, given that the student passes the test • X is a geometric RV with parameter p, and A = {the student passes} = {X ≤ n} • P(A) = Σ{m=1 to n} (1-p)^(m-1) p • PX|A(k) = (1-p)^(k-1) p / P(A) for k = 1, ..., n
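
A sketch of this conditional PMF in code (p = 0.3 and n = 5 are illustration values, not from the slides):

```python
p, n = 0.3, 5

geo = lambda k: (1 - p)**(k - 1) * p        # geometric PMF pX(k), k = 1, 2, ...
P_A = sum(geo(m) for m in range(1, n + 1))  # P(A) = P(X <= n)

# Conditional PMF: pX|A(k) = pX(k) / P(A) for k = 1..n, and 0 otherwise.
p_cond = {k: geo(k) / P_A for k in range(1, n + 1)}

assert abs(sum(p_cond.values()) - 1.0) < 1e-12  # a conditional PMF sums to 1
print(p_cond)
```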

  17. Conditioning a RV on another • PX|Y(x|y) = P(X=x|Y=y) • PX|Y(x|y) = P(X=x, Y=y) / P(Y=y) = PX,Y(x,y) / PY(y) • The conditional PMF is often used to compute the joint PMF by a sequential approach: • PX,Y(x,y) = PY(y) PX|Y(x|y)

  18. Conditional expectation • Conditional expectation of X given A (P(A)>0): • E[X|A] = Σx x PX|A(x) • E[g(X)|A] = Σx g(x) PX|A(x) • If A1, ..., An are disjoint events partitioning the sample space, then • E[X] = Σi P(Ai) E[X|Ai] • For any event B with P(Ai ∩ B) > 0 for all i: • E[X|B] = Σi P(Ai|B) E[X|Ai ∩ B] • In particular, E[X] = Σy pY(y) E[X|Y=y], as in the sketch below
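
A sketch of the total expectation theorem in the form E[X] = Σy pY(y) E[X|Y=y], reusing a hypothetical joint PMF (the numbers are assumptions):

```python
# Hypothetical joint PMF P(X=x, Y=y) (illustrative numbers only).
p_XY = {
    (1, 0): 0.10, (1, 1): 0.20, (1, 2): 0.10,
    (2, 0): 0.20, (2, 1): 0.25, (2, 2): 0.15,
}
ys = sorted({y for _, y in p_XY})
p_Y = {y: sum(p for (_, y2), p in p_XY.items() if y2 == y) for y in ys}

def E_X_given_Y(y):
    """E[X | Y = y] = sum_x x * pX|Y(x|y), where pX|Y = pX,Y / pY."""
    return sum(x * p / p_Y[y] for (x, y2), p in p_XY.items() if y2 == y)

E_X_direct = sum(x * p for (x, _), p in p_XY.items())
E_X_total = sum(p_Y[y] * E_X_given_Y(y) for y in ys)

assert abs(E_X_direct - E_X_total) < 1e-12
print(E_X_direct, E_X_total)  # both 1.6
```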

  19. Mean and variance of Geometric • Assume there is a probability p that your program works correctly (independent of how many times you write it). Find the mean and variance of X, the number of tries until it works correctly. • pX(k) = (1-p)^(k-1) p, k = 1, 2, ... • E[X] = Σk k (1-p)^(k-1) p • Var(X) = Σk (k-E[X])^2 (1-p)^(k-1) p

  20. Mean and variance of Geometric • E[X|X=1] = 1, E[X|X>1] = 1 + E[X] • ⇒ E[X] = p·1 + (1-p)(1+E[X]) ⇒ E[X] = 1/p • E[X^2|X=1] = 1, E[X^2|X>1] = E[(1+X)^2] = 1 + 2E[X] + E[X^2] • ⇒ E[X^2] = (1 + 2(1-p)E[X])/p, so Var(X) = E[X^2] - (E[X])^2 = (1-p)/p^2
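
A quick numerical check of E[X] = 1/p and Var(X) = (1-p)/p^2 by summing a long truncation of the geometric PMF (p = 0.3 and the truncation point K are illustration choices):

```python
p = 0.3
K = 5000  # truncation point; the neglected tail mass (1-p)^K is negligible

pmf = [(1 - p)**(k - 1) * p for k in range(1, K + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf, start=1))
var = sum((k - mean)**2 * pk for k, pk in enumerate(pmf, start=1))

print(mean, 1 / p)          # both ~= 3.3333
print(var, (1 - p) / p**2)  # both ~= 7.7778
```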

  21. Independence • Independence from an event A: P(X=x, A) = P(X=x) P(A) = PX(x) P(A), for all x • Since P(X=x, A) = PX|A(x) P(A), this is the same as requiring PX|A(x) = PX(x), for all x • Independence of two random variables: PX,Y(x,y) = PX(x) PY(y), for all x and y; conditioned on an event A: P(X=x, Y=y|A) = P(X=x|A) P(Y=y|A), for all x and y • For two independent RVs: E[XY] = E[X] E[Y] (see the sketch below) • Also, E[g(X)h(Y)] = E[g(X)] E[h(Y)]
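
A minimal sketch checking E[XY] = E[X]E[Y] when the joint PMF factors into the marginals (the marginal PMFs below are made up for illustration):

```python
# Made-up marginal PMFs for two independent RVs.
p_X = {0: 0.4, 1: 0.6}
p_Y = {1: 0.5, 2: 0.3, 3: 0.2}

# For independent X and Y the joint PMF factors: pX,Y(x,y) = pX(x) pY(y).
p_XY = {(x, y): px * py for x, px in p_X.items() for y, py in p_Y.items()}

E_X = sum(x * p for x, p in p_X.items())
E_Y = sum(y * p for y, p in p_Y.items())
E_XY = sum(x * y * p for (x, y), p in p_XY.items())

assert abs(E_XY - E_X * E_Y) < 1e-12
print(E_X, E_Y, E_XY)  # 0.6, 1.7, 1.02
```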

  22. Multiple RVs, sum of RVs • Three RVs X, Y, and Z are said to be independent if • PX,Y,Z(x,y,z) = PX(x) PY(y) PZ(z), for all x, y, z • For sums: E[X+Y] = E[X] + E[Y] always, and Var(X+Y) = Var(X) + Var(Y) when X and Y are independent
