Communication Theory

Presentation Transcript


  1. Communication Theory I. Frigyes 2009-10/II.

  2. http://docs.mht.bme.hu/~frigyes/hirkelmhirkelm01bEnglish

  3. Topics • (0. Math. introduction: stochastic processes, complex envelope) • 1. Basics of decision and estimation theory • 2. Transmission of digital signals over analog channels: noise effects • 3. Transmission of digital signals over analog channels: dispersion effects • 4. Transmission of analog signals – analog modulation methods (?) • 5. Channel characterization: wireless channels, optical fibers • 6. Basics of digital signal processing: sampling, quantization, signal representation • 7. Theoretical limits of information transmission • 8. Basics of coding theory • 9. Correcting transmission errors: error-correcting coding; adaptive equalization • 10. Spectral efficiency – efficient digital transmission methods

  4. (0. Stochastic processes, the complex envelope)

  5. Stochastic processes • Also called random waveforms. • 3 different meanings: • As a function of ξ (the realization variable): a series of an infinite number of random variables ordered in time • As a function of time t: a member of a family of time functions of irregular variation • As a function of ξ and t: one member of a family of time functions drawn at random

  6. Stochastic processes • Example: [figure: sample functions f(t,ξ1), f(t,ξ2), f(t,ξ3) plotted versus t, and the random variables f(t1,ξ), f(t2,ξ), f(t3,ξ) obtained at fixed times]

  7. Stochastic processes: how to characterize them? • According to the third definition • And with some probability distribution. • As the number of random variables is infinite: with their joint distribution (or density) • (not only infinite but of continuum cardinality) • Taking these into account:

  8. Stochastic processes: how to characterize them? • (Say: density) • First: probability density of x(t) • Second: joint density at t1, t2 • n-th: n-fold joint density • The stochastic process is completely characterized if there is a rule to compose the density of any order (even for n→∞). • (We’ll see processes depending on 2 parameters)

  9. Stochastic processes: how to characterize them? • Comment: although, precisely speaking, the process (a function of t and ξ) and one sample function (a function of t belonging to, say, ξ16) are distinct, we will not always make this distinction.

  10. Stochastic processes: how to characterize them? • Example: semi-random binary signal: • Values: ±1 (P0 = P1 = 0.5) • Change: only at t = k×T • First density: • Second:
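With values ±1 taken with probability 1/2 and sign changes only at t = kT, these densities can be written out (δ denoting the Dirac impulse):

f_x(x; t) = ½ δ(x − 1) + ½ δ(x + 1)

f_x(x1, x2; t1, t2) = f_x(x1; t1) δ(x2 − x1) if t1 and t2 are in the same time-slot (then x(t1) = x(t2)),
f_x(x1, x2; t1, t2) = f_x(x1; t1) f_x(x2; t2) if they are in different time-slots (independent values).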

  11. Continuing the example: [figure: the second-order density in two distinct time-slots and in the same time-slot (mass concentrated on the 45° line)]

  12. Stochastic processes: the Gaussian process • A stochastic process is Gaussian if its n-th density is that of an n-dimensional vector random variable • m is the expected-value vector, K the covariance matrix. • The n-th density can be produced if m and K are given
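Written out, the n-th order density referred to here is the usual n-dimensional Gaussian density

f_x(x) = (2π)^{−n/2} (det K)^{−1/2} exp{ −½ (x − m)ᵀ K^{−1} (x − m) },

where m = [E x(t1), …, E x(tn)]ᵀ and K_{ik} = Cov[x(ti), x(tk)].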

  13. Stochastic processes: the Gaussian process • An interesting property of Gaussian processes (more precisely: of Gaussian variables): • These can be realizations of one process at different times

  14. Stochastic processes: stationary processes • A process is stationary if it does not change (much) as time passes • E.g. the semi-random binary signal is (almost) like that • Phone: transmitting 300-3400 Hz is sufficient (always, for everybody). (What could we do if this didn’t hold?) • etc.

  15. Stochastic processes: stationary processes • Precise definitions of what is almost unchanged: • A process is stationary (in the strict sense) if its distribution function of any order is invariant to any time shift, at any time and for any time difference • It is stationary in order n if the first n distributions are stationary • E.g.: the example seen above is first-order stationary • In general: if stationary in order n, it is also stationary in any order < n
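In formula form (a standard way to state the condition): the process is strict-sense stationary if for every order n, every t1, …, tn and every time shift τ

F_x(x1, …, xn; t1 + τ, …, tn + τ) = F_x(x1, …, xn; t1, …, tn),

and it is stationary in order n if this holds for the first n distribution functions.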

  16. Stochastic processes: stationary processes • Comment: strict-sense stationarity is difficult to prove • But: if a Gaussian process is second-order stationary (i.e. in this case: if K(t1,t2) does not change when time is shifted) it is strict-sense (i.e. any order) stationary. As: if we know K(t1,t2), the n-th density can be computed (for any n)

  17. Stochastic processes: stationarity in the wide sense • Wide-sense stationary: if the correlation function (to be defined) is unchanged when time is shifted • A few definitions: • a process is called a Hilbert process if • (That means: instantaneous power is finite.)
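The condition meant here is finite mean instantaneous power: E[x²(t)] < ∞ for every t.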

  18. Stochastic processes: wide sense stationary processes • (Auto)correlation function of a Hilbert-process: • The process is wide sense stationary if • the expected value is time-invariant and • R depends only on τ=t2-t1 for any time and any τ.
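Spelled out, the autocorrelation is R_x(t1, t2) = E[x(t1) x(t2)], and the process is wide-sense stationary if E[x(t)] = m_x is constant and R_x(t1, t2) depends only on τ = t2 − t1, i.e. R_x(t1, t2) = R_x(τ).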

  19. Stochastic processes: wide sense – strict sense stationary processes • If a process is strict-sense stationary then also wide-sense • If at least second order stationary: then also wide sense. • I.e.:

  20. Stochastic processes: wide sense – strict sense stationary processes • Further: wide-sense stationarity does not imply strict-sense stationarity of any order • Exception: the Gaussian process. This one, if wide-sense stationary, is also strict-sense stationary.

  21. Stochastic processes: once again on binary transmission • As seen: only first order stationary (Ex=0) • Correlation: • if t1 and t2 in the same time-slot: • if in different:

  22. Stochastic processes: once again on binary transmission • The semi-random binary transmission can be transformed into a random one by introducing a dummy random variable e, distributed uniformly in (0,T) and independent of x, and shifting x by e:

  23. Stochastic processes: once again on binary transmission • Correlation: • If |t1−t2| > T, (as e ≤ T) • if |t1−t2| ≤ T • so
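A sketch of the computation behind these bullets: if |t1 − t2| > T the two samples always fall into different slots, so by independence and zero mean R_x(t1, t2) = E[x(t1)] E[x(t2)] = 0; if |t1 − t2| ≤ T they fall into the same slot with probability 1 − |t1 − t2|/T (product equal to 1) and into neighbouring slots otherwise (product with zero mean), hence

R_x(τ) = 1 − |τ|/T for |τ| ≤ T, and R_x(τ) = 0 otherwise.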

  24. Stochastic processes: once again on binary transmission • I.e.: [figure: triangular correlation function R_x(τ), nonzero only for −T < τ < T]
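A minimal numerical check of this triangular correlation (an illustrative sketch, not part of the slides; the slot length T, the sampling rate, the realization count and the assumption e ~ U(0, T) are arbitrary choices):

import numpy as np

# Monte-Carlo estimate of R_x(T/2) for the random binary signal:
# values +/-1 with P = 1/2, slot length T, random slot offset e ~ U(0, T).
rng = np.random.default_rng(0)
T = 1.0            # slot length (arbitrary choice)
fs = 100           # samples per slot
n_slots = 20       # slots per realization
n_real = 2000      # number of realizations
tau_idx = fs // 2  # lag tau = T/2 expressed in samples

t = np.arange(n_slots * fs) * (T / fs)   # time axis
acc = 0.0
for _ in range(n_real):
    bits = rng.choice([-1.0, 1.0], size=n_slots + 1)   # one value per slot
    e = rng.uniform(0.0, T)                            # random timing offset
    x = bits[np.floor((t + e) / T).astype(int)]        # one sample function
    acc += np.mean(x[:-tau_idx] * x[tau_idx:])

print("estimated R_x(T/2):", acc / n_real)

The printed estimate should be close to 0.5, the value of the triangle 1 − |τ|/T at τ = T/2.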

  25. Stochastic processes: other types of stationarity • Given two processes, x and y, these are jointly stationary if their joint distributions are all invariant to any time shift τ. • Thus a complex process is stationary in the strict sense if x and y are jointly stationary. • A process is periodic (or cyclostationary) if its distributions are invariant to time shifts of kT

  26. Stochastic processes: other type of stationarity • Cross-correlation: • Two processes are jointly stationary in the wide sense if their cross correlation is invariant on any time shift
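The cross-correlation referred to here is R_xy(t1, t2) = E[x(t1) y(t2)]; joint wide-sense stationarity means this depends only on τ = t2 − t1.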

  27. Stochastic processes: comment on complex processes • Appropriate definition of correlation for these: • A complex process is stationary in the wide sense if both real and imaginary parts are wide sense stationary and they are that jointly as well
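One common convention for the complex case (with * denoting complex conjugation): R_x(t1, t2) = E[x(t1) x*(t2)].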

  28. Stochastic processes: continuity • There are various definitions • Mean square continuity • That is valid if the correlation is continuous
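Mean-square continuity at t presumably means lim_{ε→0} E[(x(t + ε) − x(t))²] = 0; expanding the square shows that this holds whenever R_x(t1, t2) is continuous at t1 = t2 = t.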

  29. Stochastic processes: stochastic integral • Let x(t) be a stochastic process. It may be that the Riemann integral exists for all realizations: • Then s is a random variable (RV). But if it does not, we can define an RV as the (e.g. mean-square) limit of the approximating sums:

  30. Stochastic processes: stochastic integral • For this
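Written out, with s = ∫_a^b x(t) dt understood as the mean-square limit above, the first two moments are

E[s] = ∫_a^b E[x(t)] dt,   σ_s² = ∫_a^b ∫_a^b C_x(t1, t2) dt1 dt2,

where C_x(t1, t2) = R_x(t1, t2) − E[x(t1)] E[x(t2)] is the autocovariance function.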

  31. Stochastic processes: stochastic integral - comment • In σs2 the integrand is the (auto)covariance function: • This depends only on t1−t2=τ if x is stationary (at least in the wide sense)

  32. Stochastic processes: time average • The integral is needed – among other things – to define the time average • The time average of a process is its DC component; • the time average of its square is the mean power • Definition:

  33. Stochastic processes: time average • In general this is a random variable. It would be nice if this were the statistical average. This is really the case if • Similarly we can define

  34. Stochastic processes: time average • This is in general also a RV. But equal to the correlation if • If these equalities hold the process is called ergodic • The process is mean square ergodic if
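In formula form (a standard way to write these): the time average is

⟨x⟩ = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt, and similarly ⟨x(t) x(t + τ)⟩ for the correlation;

the process is mean-square ergodic in the mean if the variance of the finite-time average tends to zero as T → ∞, so that the time average converges (in mean square) to the statistical average E[x(t)].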

  35. Stochastic processes: spectral density • Spectral density of a process is, by definition the Fourier transform of the correlation function
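Explicitly, for a wide-sense stationary process (the Wiener–Khinchin relation):

S_x(ω) = ∫_{−∞}^{∞} R_x(τ) e^{−jωτ} dτ,   R_x(τ) = (1/2π) ∫_{−∞}^{∞} S_x(ω) e^{jωτ} dω,

so in particular the mean power is R_x(0) = (1/2π) ∫ S_x(ω) dω.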

  36. Stochastic processes: spectral density • A property: • Consequently this integral is ≥ 0; (we’ll see: S(ω) ≥ 0)

  37. Spectral density and linear transformation • [block diagram: x(t) → FILTER h(t) → y(t)] • As known, for time functions the output is the convolution of the input with the impulse response • h(t): impulse response

  38. Spectral density and linear transformation • Comment: h(t<0) ≡ 0; (why?); and: h(t) = F−1[H(ω)] • It is plausible that the same holds for stochastic processes • Based on that it can be shown (see below): • (And also:)
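The relations meant here are presumably the standard input–output formulas of a linear time-invariant filter:

R_y(τ) = R_x(τ) * h(τ) * h(−τ)   and   S_y(ω) = |H(ω)|² S_x(ω),

where * denotes convolution.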

  39. Spectral density and linear transformation • Further: S(ω) ≥ 0 (at all frequencies) • For: if this did not hold, there would be a domain (ω1, ω2) where S(ω) < 0; an ideal filter H(ω) passing only this band would produce an output spectral density Sy(ω) whose integral – the output power – is negative. [block diagram: x(t) → FILTER h(t) → y(t); sketches of Sx(ω), H(ω), Sy(ω)]

  40. Spectral density and linear transformation • S(ω) is the spectral density (per rad/s). As: [figure: narrow-band filter H(ω) around ω]

  41. Modulated signals – the complex envelope • In previous studies we have seen that in radio and optical transmission • one parameter of a sinusoidal carrier is influenced by (e.g. made proportional to) the modulating signal. • A general modulated signal:

  42. Modulated signals – the complex envelope • Here d(t) and/or φ(t) carries the information – e.g. they are in a linear relationship with the modulating signal • Another description method (quadrature form): • d, φ, a and q are real time functions – deterministic or realizations of a stochastic process

  43. Modulated signals – the complex envelope • Their relationship: • As known x(t) can also be written as:
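Spelled out, with x(t) = d(t) cos(ωct + φ(t)) and the quadrature form x(t) = a(t) cos ωct − q(t) sin ωct, the relationship is

a = d cos φ,   q = d sin φ,   d = √(a² + q²),   φ = arctan(q/a),

and x(t) can also be written as x(t) = Re{ [a(t) + j q(t)] e^{jωct} }.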

  44. Modulated signals – the complex envelope • Here a+jq is the complex envelope. Question: when and how to apply it. • To begin with: the Fourier transform of a real function is conjugate symmetric: • But if so, X(ω>0) describes the signal completely: knowing it we can form the ω<0 part and re-transform.
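That is, for a real x(t): X(−ω) = X*(ω), so the ω < 0 half of the spectrum carries no additional information.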

  45. Modulated signals – the complex envelope • Thus instead of X(ω) we can take that: [„Hilbert” filter] • By the way: • The relevant time function:

  46. Modulated signals – the complex envelope • We can write: • The inverse Fourier transform shown is 1/t. • So • The imaginary part is the so-called Hilbert transform of x(t)

  47. Modulated signals – the complex envelope • The function introduced here is the analytic function assigned to x(t) (as it is an analytic function of the complex variable z = t + ju). • An analytic function can be assigned to any (baseband or modulated) function; the relationship between the time function and the analytic function is
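In the usual notation the relationship referred to here is

x̊(t) = x(t) + j x̂(t),   x̂(t) = (1/π) p.v. ∫ x(τ)/(t − τ) dτ,

i.e. x(t) = Re x̊(t), and in the frequency domain X̊(ω) = 2X(ω) for ω > 0 and X̊(ω) = 0 for ω < 0.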

  48. Modulated signals – the complex envelope • It is applicable to modulated signals: the analytic signal of cos ωct is e^{jωct}. Similarly that of sin ωct is −je^{jωct}. So if the quadrature components a(t), q(t) of the modulated signal are • band-limited and • their band-limiting frequency is < ωc/2π (narrow-band signal) • then • NB. Modulation is a linear operation in a, q: a frequency displacement.
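Under these narrow-band conditions the analytic signal and the complex envelope are related by (the standard relations)

x̊(t) = [a(t) + j q(t)] e^{jωct},   x̃(t) = a(t) + j q(t) = x̊(t) e^{−jωct},

so the modulated signal itself is x(t) = Re{ x̃(t) e^{jωct} }.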

  49. Modulated signals – the complex envelope • Thus the complex envelope uniquely determines the modulated signal. In the time domain • Comment: in accordance with its name, it can be complex. (X(ω) is not conjugate symmetric around ωc.) • Comment 2: if the bandwidth B > fc, the resulting function is not analytic and its real part does not define the modulated signal. • Comment 3: a and q can be independent signals (QAM) or can be related (FM or PM).

  50. Modulated signals – the complex envelope • In the frequency domain? For the analytic signal we have seen it. [figure: spectra X(ω), X̊(ω) and X̃(ω) along the ω axis]
