
Stochastic signals and processes

Lecture 1. Stochastic signals and processes. Welcome. Introduction to probability and random variables. A deterministic signal can be described by mathematical expressions.





  1. Lecture 1: Stochastic signals and processes. Welcome.

  2. Introduction to probability and random variables • A deterministic signal can be described by mathematical expressions. • A deterministic model (or system) always produces the same output from a given starting condition or initial state. • Stochastic (random) signals or processes • Counterpart to a deterministic process • Described in a probabilistic way • Given the same initial condition, many realizations of the process are possible

  3. Introduction to probability • Define the probability of an event A as P(A) = N_A / N, where N is the number of possible outcomes of the random experiment and N_A is the number of outcomes favorable to the event A. • For example: • A 6-sided die has 6 outcomes. • 3 of them are even, • thus P(even) = 3/6 = 1/2
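The counting definition above can be sketched in a few lines of Python. The function name `classical_probability` is ours, not the lecture's; exact fractions avoid floating-point noise:

```python
from fractions import Fraction

def classical_probability(outcomes, event):
    """P(A) = N_A / N: favorable outcomes over all equally likely outcomes."""
    favorable = [o for o in outcomes if event(o)]
    return Fraction(len(favorable), len(outcomes))

die = range(1, 7)  # the 6 faces of a fair die
p_even = classical_probability(die, lambda o: o % 2 == 0)
# p_even == Fraction(1, 2), i.e. 3/6
```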

  4. Axiomatic Definition of Probability • A probability law (measure or function) assigns probabilities to events such that: • P(A) ≥ 0 • P(S) = 1 • If A and B are disjoint events (mutually exclusive), i.e. A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B) • That is, if A occurs, B cannot occur.

  5. Some Useful Properties • P(∅) = 0: probability of the impossible event • P(Ā) = 1 − P(A), where Ā is the complement of A • If A and B are two events, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B) • If the sample space consists of n mutually exclusive events A₁, …, Aₙ such that A₁ ∪ A₂ ∪ … ∪ Aₙ = S, then P(A₁) + P(A₂) + … + P(Aₙ) = 1

  6. Joint and marginal probability • Joint probability: • the likelihood of two events occurring together. Joint probability is the probability of event A occurring at the same time as event B. It is written P(A ∩ B) or P(AB). • Marginal probability: • the probability of one event, ignoring any information about the other event. Thus P(A) and P(B) are the marginal probabilities of events A and B.

  7. Conditional probability • Let A and B be two events. The probability of event B given that event A has occurred is called the conditional probability: P(B|A) = P(A ∩ B) / P(A), for P(A) > 0 • If the occurrence of A has no effect on B, we say A and B are independent events. In this case P(B|A) = P(B) • Combining both, we get P(A ∩ B) = P(A) P(B) when A and B are independent
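A quick numerical check of the conditional-probability formula P(B|A) = P(A ∩ B)/P(A), using two dice as a hypothetical example; the events A and B below are our own choices, not from the slides:

```python
from itertools import product
from fractions import Fraction

# Sample space: all 36 equally likely outcomes of rolling two dice.
space = list(product(range(1, 7), repeat=2))

def prob(event):
    """Classical probability of an event over the dice sample space."""
    return Fraction(sum(1 for o in space if event(o)), len(space))

A = lambda o: o[0] == 6            # first die shows 6
B = lambda o: o[0] + o[1] >= 10    # the sum is at least 10

# P(B|A) = P(A ∩ B) / P(A)
p_B_given_A = prob(lambda o: A(o) and B(o)) / prob(A)
# P(A ∩ B) counts (6,4), (6,5), (6,6) -> 3/36; P(A) = 6/36; so P(B|A) = 1/2
```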

  8. Random variables • A random variable is a function, which maps events or outcomes (e.g., the possible results of rolling two dice: {1, 1}, {1, 2}, etc.) to real numbers (e.g., their sum). • A random variable can be thought of as a quantity whose value is not fixed, but which can take on different values. • A probability distribution is used to describe the probabilities of different values occurring.
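The dice example above can be made concrete: this sketch maps each two-dice outcome to its sum and tabulates the resulting probability distribution (all names are illustrative):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Outcomes of rolling two dice; the random variable X maps each outcome to its sum.
outcomes = list(product(range(1, 7), repeat=2))
X = lambda o: o[0] + o[1]

# Probability distribution of X: P(X = x) for each possible value x.
pmf = {x: Fraction(n, len(outcomes)) for x, n in Counter(map(X, outcomes)).items()}
# e.g. pmf[7] == Fraction(1, 6): six of the 36 outcomes sum to 7
```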

  9. Random variables • Notation: • random variables with capital letters: X, Y, …, Z • real values of the random variable with lowercase letters (x, y, …, z) • Types: • Continuous random variables: map outcomes to values of an uncountable set; the probability of any specific value is zero. • Discrete random variables: map outcomes to values of a countable set; each value has probability ≥ 0, P(xᵢ) = P(X = xᵢ) • Mixed random variables

  10. Continuous random variables • Distribution function: F_X(x) = P(X ≤ x) • Properties: • 1. F_X(x) is nondecreasing: it either increases or remains constant. • 2. F_X(−∞) = 0 • 3. F_X(+∞) = 1 • 4. 0 ≤ F_X(x) ≤ 1 • 5. P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁)

  11. Distribution functions • Cumulative distribution function (CDF) for a normal distribution
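As an illustration of the normal CDF mentioned above, it can be evaluated with the error function; the identity F(x) = ½(1 + erf((x − μ)/(σ√2))) is a standard one, not something stated on the slide:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2): F(x) = (1/2) * (1 + erf((x - mu) / (sigma * sqrt(2))))."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
```

The CDF properties listed on the previous slide are easy to spot here: the function is nondecreasing, tends to 0 on the left, 0.5 at the mean, and 1 on the right.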

  12. Probability density function (pdf) • Definition: f_X(x) = dF_X(x)/dx • Properties: • f_X(x) ≥ 0 • ∫ f_X(x) dx = 1, integrating over all x • P(x₁ < X ≤ x₂) = ∫ f_X(x) dx from x₁ to x₂ • Thus integration of f_X(x) gives probability
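A numerical sanity check that integrating a pdf yields probability, using the standard normal pdf and a simple midpoint rule; both choices are ours, for illustration only:

```python
import math

def normal_pdf(x):
    """Standard normal pdf: exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def integrate(f, a, b, n=10_000):
    """Midpoint rule: approximates the area under f between a and b."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# P(x1 < X <= x2) = integral of the pdf from x1 to x2
p = integrate(normal_pdf, -1.0, 1.0)  # ~0.6827 for the standard normal
```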

  13. Expectation operator • Is defined only for random variables or functions of them. • Expected value: the weighted average of all possible values the random variable can take on: E[X] = ∫ x f_X(x) dx, integrating over all x • Let Y = g(X); then E[Y] = E[g(X)] = ∫ g(x) f_X(x) dx

  14. Definition of moments • A moment of order k of a random variable is defined as m_k = E[X^k] • Order 1: m₁ = E[X], the mean of X • Order 2: m₂ = E[X²], the mean-square value of X

  15. Central moments • Defined in a similar way as moments, but about the mean: μ_k = E[(X − E[X])^k] • Order 2: μ₂ = E[(X − E[X])²] = σ_X², the variance of X
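The moment definitions can be checked on a fair die; the helper `moment` and the exact fractions below are illustrative, not from the lecture:

```python
from fractions import Fraction

def moment(values, probs, k):
    """k-th moment of a discrete random variable: E[X^k] = sum of x_i^k * P(x_i)."""
    return sum(Fraction(x) ** k * p for x, p in zip(values, probs))

faces = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6

mean = moment(faces, probs, 1)          # first moment: the mean
mean_square = moment(faces, probs, 2)   # second moment: the mean-square value
variance = mean_square - mean ** 2      # second central moment
# mean = 7/2, variance = 35/12
```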

  16. Two random variables • Let X and Y be two random variables. We define the joint distribution function F_XY(x, y) = P(X ≤ x, Y ≤ y) • and the joint probability density function as f_XY(x, y) = ∂²F_XY(x, y)/∂x∂y ≥ 0

  17. Useful Properties • F_XY(−∞, −∞) = 0, F_XY(+∞, +∞) = 1 • Marginal distributions: F_X(x) = F_XY(x, +∞), F_Y(y) = F_XY(+∞, y) • Marginal densities: f_X(x) = ∫ f_XY(x, y) dy, f_Y(y) = ∫ f_XY(x, y) dx

  18. Useful Properties • The probability that X lies between x₁ and x₂ and Y lies between y₁ and y₂ is P(x₁ < X ≤ x₂, y₁ < Y ≤ y₂) = ∫∫ f_XY(x, y) dx dy, integrating x from x₁ to x₂ and y from y₁ to y₂

  19. Independent and uncorrelated random variables • Let X and Y be two random variables. We say that • X and Y are independent if f_XY(x, y) = f_X(x) f_Y(y) • X and Y are uncorrelated if E[XY] = E[X] E[Y] • Independence ⇒ uncorrelatedness (the converse does not hold in general)
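A Monte Carlo sketch of the point that independence implies uncorrelatedness but not conversely: with X uniform on (−1, 1) and Y = X², the pair is clearly dependent, yet approximately uncorrelated, because E[X³] = 0 by symmetry. The sample size and seed are arbitrary choices:

```python
import random

random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(200_000)]
ys = [x * x for x in xs]  # Y is a deterministic function of X: clearly dependent

def mean(v):
    return sum(v) / len(v)

# Sample estimate of E[XY] - E[X]E[Y]; near 0 because E[X^3] = 0 by symmetry
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
# |cov| is small, yet X and Y are dependent: uncorrelated does not imply independent
```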

  20. Expected value • The correlation of X and Y is R_XY = E[XY] • If X and Y are independent, we get E[XY] = E[X] E[Y]

  21. Uncorrelated • If R_XY = E[XY] = 0, we say X and Y are orthogonal

  22. Covariance • C_XY = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X] E[Y] • Correlation coefficient: ρ_XY = C_XY / (σ_X σ_Y), with −1 ≤ ρ_XY ≤ 1
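A sample-based estimate of the correlation coefficient ρ_XY = C_XY/(σ_X σ_Y); for a perfectly linear relation such as Y = 3X + 2, the estimate is 1 up to rounding. The helper name and constants are illustrative:

```python
import random

random.seed(1)

def mean(v):
    return sum(v) / len(v)

def corr_coef(xs, ys):
    """Sample estimate of rho = C_XY / (sigma_X * sigma_Y)."""
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    sx = mean([(x - mx) ** 2 for x in xs]) ** 0.5
    sy = mean([(y - my) ** 2 for y in ys]) ** 0.5
    return cov / (sx * sy)

xs = [random.gauss(0, 1) for _ in range(10_000)]
rho_linear = corr_coef(xs, [3 * x + 2 for x in xs])  # perfectly linear -> rho = 1
```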

  23. Random processes • A random process may be viewed as a collection of random variables, with time t as a parameter running through all real numbers.

  24. Example • X(t) = A cos(ωt + Φ), with Φ a random variable uniformly distributed between 0 and 2π.
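Assuming the slide's example is the standard random-phase sinusoid X(t) = A cos(ωt + Φ), the ensemble mean at a fixed time can be approximated by averaging over many realizations; the amplitude, frequency, seed, and sample count below are arbitrary choices:

```python
import math
import random

random.seed(2)
A, omega = 1.0, 2.0 * math.pi  # amplitude and angular frequency (assumed values)

def realization(phi):
    """One sample function of the process X(t) = A*cos(omega*t + phi)."""
    return lambda t: A * math.cos(omega * t + phi)

# Many realizations: the same rule, but a different random phase Phi ~ U(0, 2*pi)
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]
t0 = 0.3
ensemble_mean = sum(realization(phi)(t0) for phi in phases) / len(phases)
# E[X(t)] = 0 at every t, because the phase is uniform over a full period
```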

  25. First and second order statistics • Expected value: μ_X(t) = E[X(t)] • Autocorrelation function: R_X(t₁, t₂) = E[X(t₁) X(t₂)] • Properties: stationarity, ergodicity, power spectrum
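Continuing the random-phase sinusoid example (an assumption on our part), the autocorrelation can be estimated across the ensemble and compared with the closed form R_X(t₁, t₂) = (A²/2) cos(ω(t₁ − t₂)); constants are illustrative:

```python
import math
import random

random.seed(3)
A, omega = 1.0, 2.0 * math.pi  # assumed amplitude and angular frequency

phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]
t1, t2 = 0.1, 0.2

# Ensemble estimate of R_X(t1, t2) = E[X(t1) X(t2)]
R_est = sum(A * math.cos(omega * t1 + p) * A * math.cos(omega * t2 + p)
            for p in phases) / len(phases)

# Closed form: depends only on t1 - t2, so the process is wide-sense stationary
R_theory = (A ** 2 / 2.0) * math.cos(omega * (t1 - t2))
```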

  26. Homework • Workshop: get familiar with the following terms • Probability density function, independence, autocorrelation, stationarity, ergodicity, power spectrum. • Exercises on the web.
