


Tutorial 5

Generating Functions & Sum of Independent Random Variables

• Generating functions are tools for studying distributions of R.V.’s in a different domain. (c.f. Fourier transform of a signal from time to frequency domain)

• Moment generating function: g_X(t) = E[e^(tX)]

• Ordinary generating function: h_X(z) = E[z^X]

• The o.g.f. is also called the z-transform, which applies to discrete r.v.'s only.

• We illustrate the use of g.f.'s with the z-transform:

• Let X be a non-negative discrete r.v. with p.m.f. {p_k, k = 0, 1, …}, and let z be a complex number.

• The z-transform of {p_k} is

h_X(z) = p_0 + p_1 z + p_2 z^2 + …

= Σ_k p_k z^k

• It can be easily seen that

Σ_k p_k z^k = E[z^X]

• We can obtain many useful properties of r.v. X from hX(z).

• First, we can observe that

• h_X(0) = p_0 + p_1·0 + p_2·0^2 + … = p_0

• h_X(1) = p_0 + p_1 + p_2 + … = 1
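These two boundary values can be checked numerically; the p.m.f. below is a made-up example, not one from the tutorial:

```python
# Verify h_X(0) = p_0 and h_X(1) = 1 for an example p.m.f.
# (the probabilities below are assumed for illustration only).
pk = [0.25, 0.5, 0.25]  # p_0, p_1, p_2

def h(z):
    # h_X(z) = sum_k p_k z^k
    return sum(p * z ** k for k, p in enumerate(pk))

print(h(0.0))  # equals p_0 = 0.25
print(h(1.0))  # equals 1.0 (total probability)
```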

• By differentiating h_X(z), we can get the mean and variance of X:

h_X'(z) = Σ_k k p_k z^(k-1)

• Putting z = 1, we get

h_X'(1) = Σ_k k p_k = E[X]

• Thus h_X'(1) is the mean of X.

• Similarly, h_X''(1) = Σ_k k(k-1) p_k = E[X(X-1)] = E[X^2] - E[X].

• E[X^2] is called the 2nd moment of X.

• In general, E[X^k] is called the k-th moment of X. We can get E[X^k] from successive derivatives of h_X(z).

• Since Var(X) = E[X^2] - (E[X])^2, we get Var(X) = h_X''(1) + h_X'(1) - (h_X'(1))^2.
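These identities can be checked numerically; the p.m.f. is again an assumed example:

```python
# Check E[X] = h'(1) and Var(X) = h''(1) + h'(1) - h'(1)^2
# for an example p.m.f. (assumed values, for illustration only).
pk = {0: 0.25, 1: 0.5, 2: 0.25}

h1 = sum(k * p for k, p in pk.items())            # h'(1)  = E[X]
h2 = sum(k * (k - 1) * p for k, p in pk.items())  # h''(1) = E[X(X-1)]

mean = h1
var = h2 + h1 - h1 ** 2

# Direct moments for comparison
mean_direct = sum(k * p for k, p in pk.items())
var_direct = sum(k ** 2 * p for k, p in pk.items()) - mean_direct ** 2
print(mean, var)  # 1.0 0.5
```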

• Example: find the mean and variance of a Bernoulli distribution by z-transform.

P(X=1) = p, P(X=0) = 1-p, so h_X(z) = (1-p) + pz

• E[X] = h_X'(1) = p

• Var(X) = h_X''(1) + h_X'(1) - (h_X'(1))^2 = 0 + p - p^2 = p(1-p)
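The same bookkeeping in code, with an assumed parameter value:

```python
p = 0.25  # assumed Bernoulli parameter, for illustration
# h_X(z) = (1-p) + p*z, so h'(z) = p and h''(z) = 0
mean = p                # h'(1)
var = 0 + p - p ** 2    # h''(1) + h'(1) - h'(1)^2 = p(1-p)
print(mean, var)  # 0.25 0.1875
```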

Finding pj from g(t) and h(z)

• If we know g(t), then we know h(z) (since g_X(t) = h_X(e^t)), and we can recover the p_j as the Taylor coefficients of h_X(z) about z = 0:

p_j = h_X^(j)(0) / j!
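A sketch of this recovery, assuming SymPy is available; the Poisson o.g.f. h_X(z) = e^(λ(z-1)), a standard result, is used as the illustration:

```python
import math
import sympy as sp

z = sp.symbols('z')
lam = 2.0  # assumed Poisson rate, for illustration
h = sp.exp(lam * (z - 1))  # o.g.f. of a Poisson(lam) r.v.

# p_j = h^(j)(0) / j!  (Taylor coefficients of h about z = 0)
j = 3
pj = float(sp.diff(h, z, j).subs(z, 0)) / math.factorial(j)

# Compare with the Poisson p.m.f. lam^j e^(-lam) / j!
pj_expected = lam ** j * math.exp(-lam) / math.factorial(j)
print(pj)
```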

• Let X, Y be two independent continuous r.v.'s.

• The cumulative distribution function (c.d.f.) of X+Y is

F_X+Y(a) = P(X+Y ≤ a) = ∫ F_X(a-y) f_Y(y) dy

• By differentiating the above equation, we obtain the p.d.f. of X+Y:

f_X+Y(a) = ∫ f_X(a-y) f_Y(y) dy

• f_X+Y(a) is the convolution of f_X and f_Y.
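A numerical sketch of the convolution, using X, Y ~ Uniform(0,1) as an assumed illustration (their sum has the triangular density, which peaks at a = 1):

```python
# Numerically approximate f_{X+Y}(a) = ∫ f_X(x) f_Y(a-x) dx
# for X, Y independent Uniform(0,1) (assumed for illustration).
def f_uniform(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_sum(a, n=10_000):
    dx = 1.0 / n
    # Riemann sum over the support of f_X
    return sum(f_uniform(i * dx) * f_uniform(a - i * dx)
               for i in range(n)) * dx

print(f_sum(0.5))  # triangular density: ~0.5
print(f_sum(1.0))  # peak of the triangle: ~1.0
```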

• On the other hand, the moment generating function of the p.d.f. f_X is

g_X(t) = E[e^(tX)] = ∫ e^(tx) f_X(x) dx

• The m.g.f. of f_X+Y is

g_X+Y(t) = E[e^(t(X+Y))] = E[e^(tX)] E[e^(tY)] = g_X(t) g_Y(t)

using the independence of X and Y.

• We have obtained an important property: if S = X+Y, where X and Y are independent, then g_S(t) = g_X(t) g_Y(t).

• In general, if S = X_1 + X_2 + … + X_n with the X_i independent, then

p.d.f.: f_S = f_X1 * f_X2 * … * f_Xn (convolution)

m.g.f.: g_S(t) = g_X1(t) g_X2(t) … g_Xn(t)
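The product rule can be sanity-checked numerically; two independent Bernoulli(p) variables (parameters assumed for illustration) sum to a Binomial(2, p):

```python
import math

p, t = 0.3, 0.7  # assumed parameter values, for illustration

g_bern = (1 - p) + p * math.exp(t)  # m.g.f. of Bernoulli(p)
product = g_bern * g_bern           # g_X(t) * g_Y(t)

# Direct m.g.f. of S = X + Y computed from the Binomial(2, p) p.m.f.
g_sum = sum(math.comb(2, k) * p ** k * (1 - p) ** (2 - k) * math.exp(t * k)
            for k in range(3))
print(product, g_sum)  # the two values agree
```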

• You are in a casino and confronted by two slot machines. Each machine pays off either one dollar or nothing. The probability that the first machine pays off a dollar is x and that the second machine pays off a dollar is y. We assume that x and y are random numbers chosen independently from the interval [0,1] and unknown to you. You are permitted to make a series of ten plays, each time choosing one machine or the other.

• How should you choose to maximize the number of times that you win?

• Strategies described in Grinstead and Snell (p. 170):

• Play-the-best (calculate the prob. that each machine will pay off at each stage and choose the machine with the higher prob. )

• Play-the-winner (choose the same machine when we win and switch machines when we lose)
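A minimal simulation of the play-the-winner strategy (hypothetical helper code, not from the tutorial), with assumed payoff probabilities x = 0.8 and y = 0.3:

```python
import random

def play_the_winner(x, y, n_plays=10, seed=0):
    # Keep the same machine after a win; switch after a loss.
    rng = random.Random(seed)
    probs = [x, y]
    machine, wins = 0, 0
    for _ in range(n_plays):
        if rng.random() < probs[machine]:
            wins += 1              # win: stay on this machine
        else:
            machine = 1 - machine  # loss: switch machines
    return wins

# Average number of wins in 10 plays over many independent runs
runs = 2000
avg = sum(play_the_winner(0.8, 0.3, seed=s) for s in range(runs)) / runs
print(avg)  # roughly 7 wins out of 10 with these payoffs
```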

• Modified two-armed bandit problem:

both unknown probabilities vary in a linear manner over the twenty plays:

Pr(payoff at k-th play for machine i) = a_i + k·b_i

where a_i and b_i are constants.

• Make a series of 20 plays

• Design a simple strategy to maximize the number of times that you win
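One possible sketch for this exercise (all constants a_i, b_i below are assumed for illustration): if the linear form is known, a simple strategy is to play the currently better line and switch once the two lines cross:

```python
import random

def simulate(strategy, a, b, n_plays=20, seed=0):
    # Pr(payoff at k-th play for machine i) = a[i] + k * b[i]
    rng = random.Random(seed)
    wins = 0
    for k in range(1, n_plays + 1):
        i = strategy(k)  # which machine to play at step k
        if rng.random() < a[i] + k * b[i]:
            wins += 1
    return wins

# Assumed constants: machine 0 decays, machine 1 improves; the lines
# a_0 + k*b_0 and a_1 + k*b_1 cross at k = 10.
a, b = [0.6, 0.1], [-0.02, 0.03]
strategy = lambda k: 0 if k <= 10 else 1

avg = sum(simulate(strategy, a, b, seed=s) for s in range(2000)) / 2000
print(avg)  # roughly 10.5 wins out of 20
```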