
Probability Theory and Random Processes




Presentation Transcript


  1. Probability Theory and Random Processes Communication Systems, 5ed., S. Haykin and M. Moher, John Wiley & Sons, Inc., 2006.

  2. Probability • Probability theory deals with phenomena that can be modeled by an experiment whose outcome is subject to chance. • Definition: A random experiment is repeated n times (n trials) and the event A is observed m times (m occurrences). The probability of A is the relative frequency of occurrence m/n, in the limit of a large number of trials.
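The relative-frequency definition above can be checked with a short simulation. A minimal sketch: the fair-die event and the trial count are illustrative choices, not from the slides.

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

def relative_frequency(event, trials=100_000):
    """Estimate P[A] as m/n: m occurrences of event A in n trials."""
    m = sum(event() for _ in range(trials))
    return m / trials

# Event A: a fair six-sided die shows an even face (true probability 1/2).
p_even = relative_frequency(lambda: random.randint(1, 6) % 2 == 0)
```

With 100,000 trials the estimate lands close to 0.5; the residual error shrinks roughly as 1/sqrt(n).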

  3. Probability Based on Set Theory • Definition: An experiment has K possible outcomes, where each outcome is represented as the k-th sample sk. The set of all outcomes forms the sample space S. The probability measure P satisfies the • Axioms: • 0 ≤ P[A] ≤ 1 • P[S] = 1 • If A and B are two mutually exclusive events (the two events cannot occur in the same experiment), P[A∪B] = P[A] + P[B]; otherwise P[A∪B] = P[A] + P[B] – P[A∩B] • The complement is P[Ā] = 1 – P[A] • If A1, A2, …, Am are mutually exclusive events that together cover the sample space, then P[A1] + P[A2] + … + P[Am] = 1
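The axioms can be verified concretely on a finite sample space. A minimal sketch, using a fair die as the (assumed) example; the specific events A and B are illustrative.

```python
# Sample space: faces of a fair die; uniform probability measure.
S = set(range(1, 7))
P = lambda E: len(E) / len(S)

A = {2, 4, 6}   # even face
B = {1, 2, 3}   # face of at most three
# A and B are NOT mutually exclusive (they share {2}), so the
# inclusion-exclusion form applies: P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
p_union = P(A | B)
p_by_axiom = P(A) + P(B) - P(A & B)
# Complement axiom: P[Ā] = 1 - P[A]
p_complement = P(S - A)
```

Both computed values agree, and the complement of "even" has probability 1 − 1/2 = 1/2.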

  4. Venn Diagrams • Events A and B mutually exclusive in the sample space S: a sample sk can come from A, from B, or from neither, but never from both. • Events A and B not mutually exclusive in the sample space S: a sample sk can also come from the overlap A∩B, i.e. from both A and B.

  5. Conditional Probability • Definition: An experiment involves a pair of events A and B where the probability of one is conditioned on the occurrence of the other. Example: P[A|B] is the probability of event A given the occurrence of event B. • In terms of the sets and subsets • P[A|B] = P[A∩B] / P[B] • P[A∩B] = P[A|B]P[B] = P[B|A]P[A] • Definition: If events A and B are independent, then the conditional probability is simply the elementary probability, e.g. P[A|B] = P[A], P[B|A] = P[B].
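The defining relation P[A|B] = P[A∩B] / P[B] can be checked by direct counting on a finite sample space. A minimal sketch; the die events are illustrative assumptions.

```python
# Fair die; conditional probability by counting outcomes.
S = set(range(1, 7))
P = lambda E: len(E) / len(S)

A = {2, 4, 6}   # even face
B = {4, 5, 6}   # face greater than three
# P[A | B] = P[A ∩ B] / P[B]: of B's three faces, two ({4, 6}) are even.
p_a_given_b = P(A & B) / P(B)
```

This also confirms the product rule P[A∩B] = P[A|B]·P[B] used on the slide.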

  6. Random Variables • Definition: A random variable is the assignment of a variable to represent a random experiment. X(s) denotes a numerical value for the event s. • When the sample space is a number line, x = s. • Definition: The cumulative distribution function (cdf) assigns a probability value for the occurrence of x within a specified range such that FX(x) = P[X ≤ x]. • Properties: • 0 ≤ FX(x) ≤ 1 • FX(x1) ≤ FX(x2), if x1 ≤ x2

  7. Random Variables • Definition: The probability density function (pdf) is an alternative description of the probability of the random variable X: fX(x) = d/dx FX(x) • P[x1 ≤ X ≤ x2] = P[X ≤ x2] – P[X ≤ x1] = FX(x2) – FX(x1) = ∫ fX(x) dx over the interval [x1, x2]
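The cdf/pdf relation can be verified numerically: integrating the pdf over [x1, x2] should reproduce FX(x2) − FX(x1). A minimal sketch using an exponential distribution as the (assumed) example, with a midpoint-rule integral.

```python
import math

lam = 2.0
f = lambda x: lam * math.exp(-lam * x)   # pdf of an exponential RV, x >= 0
F = lambda x: 1 - math.exp(-lam * x)     # its cdf

# P[x1 <= X <= x2] = F(x2) - F(x1) = integral of f over [x1, x2]
x1, x2, n = 0.5, 1.5, 10_000
dx = (x2 - x1) / n
integral = sum(f(x1 + (i + 0.5) * dx) for i in range(n)) * dx
```

The midpoint sum matches the cdf difference e^(−1) − e^(−3) ≈ 0.318 to high accuracy.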

  8. Example Distributions • Uniform distribution: fX(x) = 1/(b – a) for a ≤ x ≤ b and zero elsewhere; the cdf FX(x) rises linearly from 0 at x = a to 1 at x = b.

  9. Several Random Variables • CDF: FX,Y(x,y) = P[X ≤ x, Y ≤ y] • Marginal cdf: FX(x) = FX,Y(x,∞) • PDF: fX,Y(x,y) = ∂²FX,Y(x,y)/∂x∂y • Marginal pdf: fX(x) = ∫ fX,Y(x,y) dy • Conditional pdf: fY|X(y|x) = fX,Y(x,y) / fX(x)

  10. Statistical Averages • Expected value: E[X] = ∫ x fX(x) dx • Function of a random variable: E[g(X)] = ∫ g(x) fX(x) dx • Text Example 5.4
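The expectation of a function of a random variable, E[g(X)] = ∫ g(x) fX(x) dx, can be evaluated numerically. A minimal sketch: X uniform on [0,1] and g(x) = x² are illustrative choices, giving the known value E[X²] = 1/3.

```python
# E[g(X)] = integral of g(x) * f_X(x); for X ~ Uniform(0,1), f_X(x) = 1.
# Midpoint-rule approximation of E[X^2] = 1/3.
n = 10_000
dx = 1.0 / n
eg = sum(((i + 0.5) * dx) ** 2 for i in range(n)) * dx
```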

  11. Statistical Averages • nth moments: E[Xⁿ] = ∫ xⁿ fX(x) dx; the second moment E[X²] is the mean-square value of X • Central moments: E[(X – μX)ⁿ]; the second central moment E[(X – μX)²] is the variance of X
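The standard identity linking the second moment and the second central moment, Var[X] = E[X²] − (E[X])², can be checked on a discrete example. A minimal sketch with a fair die (an illustrative choice), whose variance is 35/12.

```python
# Fair die: P[X = x] = 1/6 for x in 1..6.
faces = range(1, 7)
p = 1 / 6
mean = sum(x * p for x in faces)               # E[X] = 3.5
mean_sq = sum(x * x * p for x in faces)        # E[X^2], the mean-square value
var = sum((x - mean) ** 2 * p for x in faces)  # second central moment
```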

  12. Joint Moments • Correlation: E[XY], the expected value of the product – also seen as a weighted inner product • Covariance: Cov[X,Y] = E[(X – μX)(Y – μY)] = E[XY] – μXμY, the correlation of the central moments • Correlation coefficient: ρ = Cov[X,Y] / (σXσY)
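The covariance and correlation coefficient can be estimated from samples. A minimal sketch, assuming the illustrative model Y = X + N with X and N independent standard Gaussians, for which the true correlation coefficient is 1/√2 ≈ 0.707.

```python
import random

random.seed(1)  # reproducible sample set
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]     # Y = X + independent noise

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n  # sample covariance
vx = sum((x - mx) ** 2 for x in xs) / n
vy = sum((y - my) ** 2 for y in ys) / n
rho = cov / (vx * vy) ** 0.5   # sample correlation coefficient
```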

  13. Random Processes • Definition: a random process X(t) is described as a time-varying random variable • Mean of the random process: μX(t) = E[X(t)] • Definition: a random process is first-order stationary if its pdf does not change with time; it then has constant mean and variance • Definition: the autocorrelation is the expected value of the product of two random variables at different times, RX(t1,t2) = E[X(t1)X(t2)]; if this depends only on the time difference (and the mean is constant), the process is stationary to second order

  14. Random Processes • Definition: the autocorrelation is the expected value of the product of two random variables at different times, RX(t1,t2) = E[X(t1)X(t2)] • Definition: the autocovariance of a random process stationary to second order is CX(t1,t2) = RX(t1,t2) – μX², where μX is the constant mean

  15. Properties of Autocorrelation • Definition: the autocorrelation of a stationary process only depends on the time difference: RX(τ) = E[X(t + τ)X(t)] • Mean-square value: RX(0) = E[X²(t)] • Autocorrelation is an even function: RX(–τ) = RX(τ) • Autocorrelation has its maximum at zero: |RX(τ)| ≤ RX(0)

  16. Example • Sinusoidal signal with random phase: X(t) = A cos(2πfct + Θ), with Θ uniformly distributed over [–π, π] • Autocorrelation: RX(τ) = (A²/2) cos(2πfcτ) • As X(t) is compared to itself at another time, we see there is a periodic behavior in its correlation
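The random-phase sinusoid result RX(τ) = (A²/2) cos(2πfcτ) can be reproduced by ensemble averaging over the random phase. A minimal sketch; the amplitude, frequency, reference time t, and trial count are illustrative assumptions.

```python
import math
import random

random.seed(2)  # reproducible ensemble

A, fc = 1.0, 5.0

def autocorr_estimate(tau, trials=100_000, t=0.13):
    """Ensemble average of X(t) X(t + tau) over the uniform random phase."""
    acc = 0.0
    for _ in range(trials):
        theta = random.uniform(-math.pi, math.pi)
        acc += (A * math.cos(2 * math.pi * fc * t + theta)
                * A * math.cos(2 * math.pi * fc * (t + tau) + theta))
    return acc / trials

tau = 0.05
r_est = autocorr_estimate(tau)
r_theory = (A ** 2 / 2) * math.cos(2 * math.pi * fc * tau)
```

Note the estimate does not depend on the reference time t: the process is stationary, as the slide's formula implies.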

  17. Cross-correlation • Two random processes have the cross-correlation RXY(t, u) = E[X(t)Y(u)] • For wide-sense stationary processes the cross-correlation depends only on the time difference: RXY(τ) = E[X(t + τ)Y(t)]

  18. Example • Output of an LTI system when the input is a random process • Text 5.7

  19. Power Spectral Density • Definition: the Fourier transform of the autocorrelation function is called the power spectral density: SX(f) = ∫ RX(τ) e^(–j2πfτ) dτ • Consider the units of X(t): volts or amperes • Autocorrelation is the projection of X(t) onto itself • Resulting units of watts (normalized to 1 ohm)
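One consequence of the definition is that the PSD at zero frequency equals the total area under the autocorrelation, SX(0) = ∫ RX(τ) dτ. A minimal sketch verifying this numerically for the illustrative autocorrelation RX(τ) = e^(−|τ|), whose area is exactly 2.

```python
import math

# Assumed example autocorrelation: R_X(tau) = exp(-|tau|).
# S_X(0) = integral of R_X over all tau = 2 for this choice.
n, T = 100_000, 20.0          # integrate over |tau| <= T (tails are ~e^-20)
dt = 2 * T / n
s0 = sum(math.exp(-abs(-T + (i + 0.5) * dt)) for i in range(n)) * dt
```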

  20. Properties of PSD • Zero frequency of PSD: SX(0) = ∫ RX(τ) dτ • Mean-square value: E[X²(t)] = ∫ SX(f) df • PSD is non-negative: SX(f) ≥ 0 • PSD of a real-valued RP is even: SX(–f) = SX(f) Which theorem does this property resemble?

  21. Example • Text Example 5.12 • Mixing of a random process with a sinusoidal process: Y(t) = X(t) cos(2πfct + Θ), where X(t) is wide-sense stationary (to make it easier) and Θ is uniformly distributed but not time-varying • Autocorrelation: RY(τ) = (1/2)RX(τ) cos(2πfcτ) • PSD: SY(f) = (1/4)[SX(f – fc) + SX(f + fc)]

  22. PSD of LTI System • Start with what you know and work through the math: the output autocorrelation follows from passing the input through the impulse response twice, RY(τ) = ∫∫ h(τ1)h(τ2)RX(τ – τ1 + τ2) dτ1 dτ2

  23. PSD of LTI System • The PSD reduces to SY(f) = |H(f)|² SX(f) • The system shapes the power spectrum of the input, as expected from a filtering-like operation
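The relation SY(f) = |H(f)|² SX(f) can be exercised on a concrete case. A minimal sketch, assuming (illustratively) white noise of PSD N0/2 into a first-order RC lowpass H(f) = 1/(1 + j2πfRC); the output power ∫ SY(f) df then has the closed form N0/(4RC).

```python
import math

N0, RC = 2.0, 1.0e-3   # illustrative noise level and filter time constant

def S_Y(f):
    # |H(f)|^2 * S_X(f) with S_X(f) = N0/2 (white) and an RC lowpass H(f)
    return (N0 / 2) / (1 + (2 * math.pi * f * RC) ** 2)

# Output power: integrate S_Y over frequency (midpoint rule, |f| <= F).
F, n = 1e6, 200_000
df = 2 * F / n
power = sum(S_Y(-F + (i + 0.5) * df) for i in range(n)) * df
# Closed form for comparison: N0 / (4 RC) = 500 W here.
```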

  24. Gaussian Process • The Gaussian probability density function for a single variable is fY(y) = (1/(√(2π)σY)) exp(–(y – μY)²/(2σY²)) • When the distribution has zero mean and unit variance, fY(y) = (1/√(2π)) exp(–y²/2) • The random variable Y is said to be normally distributed as N(0,1)
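The Gaussian density above can be coded directly and sanity-checked: it must integrate to 1, and for N(0,1) its peak value is 1/√(2π). A minimal sketch using a midpoint-rule integral.

```python
import math

def gaussian_pdf(y, mu=0.0, sigma=1.0):
    """Gaussian pdf f_Y(y) = exp(-(y - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return (math.exp(-((y - mu) ** 2) / (2 * sigma ** 2))
            / (sigma * math.sqrt(2 * math.pi)))

# N(0,1): total probability should integrate to 1 (tails beyond |y| = 10 are negligible).
n, L = 100_000, 10.0
dy = 2 * L / n
total = sum(gaussian_pdf(-L + (i + 0.5) * dy) for i in range(n)) * dy
```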

  25. Properties of a Gaussian Process • The output of an LTI system is Gaussian if the input is Gaussian • The joint pdf is completely determined by the set of means and autocovariance functions of the samples of the Gaussian process • If a Gaussian process is wide-sense stationary, then it is also stationary in the strict sense • If the samples of a Gaussian process are uncorrelated, then they are statistically independent

  26. Noise • Shot noise • Thermal noise • White noise • Narrowband noise
