
Communication Theory/2





  1. Communication Theory/2 I. Frigyes 2009-10/II.

  2. http://docs.mht.bme.hu/~frigyes/hirkelmhirkelm01bEnglish

  3. 2. Transmission of digital signals over analog channels: effect of noise

  4. Introductory comments • Theory of digital transmission is (at least partly) an application of decision theory • Definition of digital signals/transmission: • Finite number of signal shapes (M) • Each has the same finite duration (T) • The receiver knows (a priori) the signal shapes (they are stored) • So the task of the receiver is hypothesis testing.

  5. Introductory comments • Quality parameter: error probability • (i.e. the costs are Cii = 0 for a correct decision and Cij = 1, i ≠ j, for an error) • Erroneous decision may be caused by: • additive noise • linear distortion • nonlinear distortion • additive interference (CCI, ACI) • false knowledge of a parameter • e.g. synchronization error

  6. Introductory comments – degrading effects causing false decision. [Block diagram: the signal s(t) passes through a nonlinear amplifier, a bandpass filter, a fading channel and a second bandpass filter before the decision maker; along the way additive noise n(t), co-channel interference (CCI) at ωc and adjacent-channel interference (ACI) at ω1 and ω2 are added; z0(t), z1(t), z2(t) denote the intermediate signals.]

  7. Introductory comments • Often it is not the error probability of a single signal that is of interest but that of a group of signals – e.g. of a frame. • (A second quality parameter: erroneous recognition of the signal duration T – i.e. of its estimate T̂ – the jitter.)

  8. Introductory comments – improvement of performance parameters. Both performance parameters can be improved. [Block diagram, error probability: digital source – encoder – transmission channel (raw error probability PE) – decoder – sink, giving the decoded error probability PE,dec. Block diagram, jitter: data and clock pass through an elastic store read out by a jitter-free clock.]

  9. Introductory comments – degrading effects causing false decision • Comments: • 1. These effects cannot be described by the two performance parameters. The channel at this level is an analog channel producing the effects seen in slide #6. • 2. The behavior of radio and optical channels is rather different. First we deal with the former and then show the differences of the latter.

  10. 0. Transmission of single signals in additive Gaussian noise • Among the many sources of error we now regard only this one • Model to be investigated: [Block diagram: the source emits message mi (set {mi}, a-priori probabilities Pi); the signal generator produces si(t); noise n(t) is added; the decision maker, driven by the timing (T), observes r(t) = si(t) + n(t) and delivers the decision m̂ to the sink.]

  11. 0. Transmission of single signals in additive Gaussian noise • Specifications: • the a-priori probabilities Pi are known • the support of the real time functions si(t) is (0,T) • their energy Ei (square integral of the time functions) is finite • the mapping mi ↔ si(t) is one-to-one (i.e. there is no error in the transmitter)

  12. 0. Transmission of single signals in additive Gaussian noise • Noise: Gaussian • 0-mean • stationary • additive (it is drawn so in the model) • white • Comment: for white noise σn² is infinite (flat power spectral density N0/2)

  13. 0. Transmission of single signals in additive Gaussian noise • Comment: "white" is an approximation. More exact is the Planck formula: S(f) = hf / (e^(hf/kBT0) − 1) • If hf/kBT0 << 1: S(f) ≈ kBT0 • If hf/kBT0 >> 1: S(f) ≈ hf·e^(−hf/kBT0) → 0 • f = 300 GHz, T0 = 30 K: S(f) vs. kBT0: −0.1 dB • f = 200 THz, T0 = 270 K: −127 dB
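The following sketch (an illustration only, not part of the original slides) evaluates the Planck spectral density S(f) = hf/(e^(hf/kBT0) − 1) against the flat kBT0 approximation at the two operating points quoted above; the exact dB figures depend on the convention used in the slides, so the printout is for orientation rather than a reproduction of the quoted numbers.

```python
# Hedged sketch: thermal-noise PSD from the Planck formula vs. the "white" kB*T0 value.
import numpy as np

h = 6.62607015e-34   # Planck constant [J s]
kB = 1.380649e-23    # Boltzmann constant [J/K]

def planck_psd(f, T0):
    """Noise power spectral density [W/Hz] from the Planck formula."""
    x = h * f / (kB * T0)
    return h * f / np.expm1(x)

# The two illustrative operating points mentioned on the slide.
for f, T0 in [(300e9, 30.0), (200e12, 270.0)]:
    S = planck_psd(f, T0)
    ratio_dB = 10 * np.log10(S / (kB * T0))
    print(f"f = {f:.3g} Hz, T0 = {T0:g} K: S(f) relative to kB*T0 = {ratio_dB:.1f} dB")
```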

  14. 0. Transmission of single signals in additive Gaussian noise • Decision: based on r(t) = si(t) + n(t). • Application of the general method: independent samples would result in too much noise; correlated samples yield less information; and neither specifies the bandwidth. • Continuous observation and appropriate processing of the signals yield the most information. This is the subject of our next investigations.

  15. 0. Transmission of single signals in additive Gaussian noise • Questions to be investigated: • 0. Vectorial representation of digital signals • 1. The optimal receiver • 2. Error probability • 3. Coherent – non-coherent reception • 4. Optimal signal set • 5. Bandwidth occupation

  16. 0. Transmission of single signals in additive Gaussian noise • Given the – somehow chosen – signal set si(t), i = 1 … M • We choose an orthonormal base φj(t), j = 1 … D • (orthonormal: ∫0T φj(t)φk(t)dt = δjk) • so that si(t) = Σj ai,j φj(t) • of course ai,j = ∫0T si(t)φj(t)dt

  17. 0. Transmission of single signals in additive Gaussian noise • Thus: the time functions are uniquely represented by D numbers (ai,1, ai,2 … ai,D) • But: any structure represented by D numbers can be regarded as a vector of D dimensions • I.e. si = (ai,1, ai,2 … ai,D) • So we defined a vector space: the signal space • D is the dimensionality of the signal space

  18. 0. Comment • As said: D ≤ M • Earlier we saw: in the general case the dimensionality of the decision space is D = M−1. In the case of concretely defined signal waveforms (like now) the decision can be made in the signal space; then D < M−1 is possible. • (We also had the observation space, with D = N. In the case of continuous observation D = ∞; this space is not too important now.)

  19. 0. How to choose the base?

  20. 0. How to choose the base? • This can be done as long as we have signals (Gram–Schmidt orthogonalization; a small numerical sketch follows below) • We see: there are M base functions at most. • But if some signal waveforms are linear combinations of others, these don't introduce new dimensions • E.g. the dimensionality of an M-ary PAM signal set is 1, of a QAM signal set it is 2.
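A minimal numerical sketch of the Gram–Schmidt procedure mentioned above, assuming sampled waveforms and a rectangular 4-ary PAM set as the test case (the signal set and sampling grid are illustrative choices, not taken from the slides):

```python
# Hedged sketch: Gram-Schmidt orthogonalization on sampled waveforms; it builds an
# orthonormal base for a given signal set and reports the dimensionality D.
import numpy as np

def gram_schmidt(signals, dt, tol=1e-9):
    """Return orthonormal base functions (rows) spanning the sampled signal set."""
    basis = []
    for s in signals:
        residual = np.asarray(s, dtype=float).copy()
        for phi in basis:
            residual -= dt * np.dot(residual, phi) * phi   # remove projection on phi
        energy = dt * np.dot(residual, residual)
        if energy > tol:                                   # a new dimension was found
            basis.append(residual / np.sqrt(energy))
    return np.array(basis)

T, N = 1.0, 1000
dt = T / N
rect = np.ones(N)                              # rectangular pulse on (0, T)
pam4 = [a * rect for a in (-3, -1, 1, 3)]      # 4-ary PAM waveforms (illustration)
basis = gram_schmidt(pam4, dt)
print("dimensionality D =", len(basis))
```

As expected for PAM, only one base function survives (D = 1), matching the statement on the slide.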

  21. 0. Scalar product • The scalar product of two signal-space vectors is the integral of the product of their time functions: ⟨si, sj⟩ = ∫0T si(t)sj(t)dt = si·sj • By the way, from that: |si|² = Ei

  22. 0. Single signals: vectorial form of noise • After the signal, the noise should also be given in vectorial form. • Of course, the components of the noise vector can be written as nj = ∫0T n(t)φj(t)dt • And by that the noise vector is n = (n1, n2 … nD) • But it is not true for the noise process that n(t) = Σj nj φj(t)

  23. 0. Single signals: vectorial form of noise • (A Gaussian process cannot be a linear combination of a finite number of functions.) • So n(t) = Σj nj φj(t) + n′(t) • We know that n′(t) is orthogonal to the signal space; as the signal is in the signal space, an efficient receiver can filter out this part of the noise – it is thus irrelevant from the point of view of reception. I.e. n contains the part of the noise that is relevant. (We'll briefly come back to that.)

  24. 0. Single signals: vectorial form of the link • Thus we can investigate the vectorial model of this connection: [Block diagram: the source emits mi ({mi}, Pi); the signal-vector generator produces si; the noise vector n is added; the decision block observes r = si + n and delivers m̂ to the sink.]

  25. 0. Single signals: vectorial representation • Pdf of the noise in the signal space: σ² was doubtful – that of white noise is infinite. • Without details: in the interval [0,T] Gaussian noise can be expanded according to any complete orthogonal series. • The individual terms are independent and have equal σ². • The base of the signal space is part of such a complete base (and we are interested only in that part).

  26. 0. Single signals: vectorial representation • Thus the pdfs can be written (each component having variance N0/2): p(r|si) = (πN0)^(−D/2)·exp(−|r − si|²/N0)
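A small check of this claim (illustrative, not from the slides): white noise of two-sided PSD N0/2 is approximated by i.i.d. samples of variance N0/(2Δt), projected onto two orthonormal base functions; the resulting components should be nearly uncorrelated with variance N0/2 each.

```python
# Hedged sketch: noise-vector components obtained by projecting (approximately) white
# Gaussian noise onto an orthonormal base have variance N0/2 and are uncorrelated.
import numpy as np

rng = np.random.default_rng(0)
T, N, N0 = 1.0, 1000, 2.0
dt = T / N
t = (np.arange(N) + 0.5) * dt

phi1 = np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)   # an orthonormal pair on (0, T)
phi2 = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)

trials = 5000
noise = rng.normal(0.0, np.sqrt(N0 / (2 * dt)), size=(trials, N))  # "white" samples
n1 = dt * noise @ phi1        # n_j = integral of n(t) * phi_j(t) dt
n2 = dt * noise @ phi2
print("var(n1), var(n2) vs. N0/2 =", N0 / 2, ":", n1.var(), n2.var())
print("correlation(n1, n2) close to 0:", np.corrcoef(n1, n2)[0, 1])
```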

  27. 0. The signal space

  28. 0. The signal space – examples • 1. (Antipodal) baseband NRZ signals: M = 2, D = 1 • [Figure: base function φ(t) = 1/√T on (0,T); s1(t) = A = √(E/T) on (0,T), s2(t) = −s1(t); signal-space points s1 and s2 = −s1 on the single axis.]

  29. 0. The signal space – examples • 2. BPSK signals: M = 2, D = 2 • [Figure: the two vectors s1 and s2 with angle Φ between them; if Φ = π they are antipodal and D = 1.]

  30. 0. The signal space – examples • 3. QPSK signals: M = 4, D = 2 • [Figure: the four points s1 … s4 in the plane.]

  31. 0. The signal space – examples • 4. Orthogonal QFSK signals: M = 4, D = 4 • [Figure: the four points s1 … s4, one on each orthogonal axis.]

  32. 0. The signal space – examples • 5. Biorthogonal signals: M = 4, D = 2 • [Figure: the four points s1 … s4.] Note: just like QPSK.

  33. 0. The signal space – examples • 6. M-QAM signals: arbitrary M, D = 2 • Example: M = 16. (Sample constellations for some of these examples are generated in the sketch below.)
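A sketch of signal-space points for some of the examples above, assuming the usual unit-circle PSK and square-grid QAM layouts (the slides state only M and D, so the exact geometries are an assumption):

```python
# Hedged sketch: 2-D signal-space points for BPSK, QPSK and square 16-QAM,
# each normalized to average symbol energy Es.
import numpy as np

def psk_points(M, Es=1.0):
    """M-PSK constellation on a circle of radius sqrt(Es)."""
    k = np.arange(M)
    return np.sqrt(Es) * np.exp(1j * 2 * np.pi * k / M)

def square_qam_points(M, Es=1.0):
    """Square M-QAM grid (M a perfect square), normalized to average energy Es."""
    m = int(np.sqrt(M))
    levels = np.arange(-(m - 1), m, 2)              # ..., -3, -1, 1, 3, ...
    grid = np.array([x + 1j * y for x in levels for y in levels])
    return grid * np.sqrt(Es / np.mean(np.abs(grid) ** 2))

bpsk = psk_points(2)            # example 2 with Phi = pi (antipodal)
qpsk = psk_points(4)            # example 3
qam16 = square_qam_points(16)   # example 6
print(len(bpsk), len(qpsk), len(qam16))
```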

  34. 1. Single signals: the optimal decision rule • Decision rule now: the optimal partitioning of the signal space (resulting in minimal error probability) • E.g. D = 2: [Figure: decision regions drawn around the points s1 … s5; a received vector r = si + n is assigned to the region it falls into.] • Before, we had to partition the decision space.

  35. 1. Single signals: the optimal decision rule • We've seen: the risk is minimal if the a-posteriori probability is maximal. We then decide on what is most likely, i.e. m̂ = mi with i maximizing P(mi|r) • To proceed, apply the Bayes theorem: P(mi|r) = Pi·p(r|si) / p(r)

  36. 1. Single signals: the optimal decision rule • Thus the decision rule: decide on the i maximizing Pi·p(r|si) / p(r) • Or, as the denominator does not depend explicitly on i: maximize Pi·p(r|si)

  37. 1. Single signals: the optimal decision rule • Take the logarithm of the a-posteriori pdf: ln[Pi·p(r|si)] = ln Pi − |r − si|²/N0 + const. • Finally: decide on the i maximizing ⟨r, si⟩ + ½(N0 ln Pi − Ei)

  38. 1. Single signals: the optimal decision rule • For an instant come back to the noise vector • We've seen: n(t) = Σj nj φj(t) + n′(t), where n′(t) is orthogonal to the signal space • Writing the decision rule with the whole noise taken into consideration, the correlation ⟨r(t), si(t)⟩ picks up no contribution from n′(t), since n′(t) is orthogonal to every si(t) • I.e. n′(t) in the optimal receiver is really irrelevant

  39. 1. Optimal decider – vectorial form • [Block diagram: the received vector r is scalar-multiplied by each of s1, s2 … sM; the bias ½(N0 ln Pi − Ei) is added to the i-th product; a comparator selects the maximum.] • If the energies Ei are equal they can be omitted from the bias. If in addition Pi = 1/M, the whole bias can be omitted.
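A minimal vector-domain sketch of this decider, assuming an arbitrary 2-D signal set with priors and N0 chosen purely for illustration:

```python
# Hedged sketch: correlate the received vector r with every signal vector s_i,
# add the bias 0.5*(N0*ln(P_i) - E_i), and pick the maximum.
import numpy as np

def map_decide(r, signals, priors, N0):
    """Return the index i maximizing <r, s_i> + 0.5*(N0*ln(P_i) - E_i)."""
    energies = np.sum(signals ** 2, axis=1)
    bias = 0.5 * (N0 * np.log(priors) - energies)
    metrics = signals @ r + bias
    return int(np.argmax(metrics))

rng = np.random.default_rng(1)
signals = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])  # QPSK-like set
priors = np.array([0.4, 0.3, 0.2, 0.1])
N0 = 0.5

i_true = rng.choice(len(signals), p=priors)
noise = rng.normal(0.0, np.sqrt(N0 / 2), size=2)   # variance N0/2 per dimension
r = signals[i_true] + noise
print("sent:", i_true, "decided:", map_decide(r, signals, priors, N0))
```

With equal energies the bias reduces to ½N0 ln Pi, and with equal priors as well it can be dropped altogether, as noted on the slide.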

  40. 1. Optimal decider (correlation) • [Block diagram: r(t) is correlated with each waveform s1(t), s2(t) … sM(t), i.e. multiplied by it and integrated over (0,T); the bias ½(N0 ln Pi − Ei) is added; a comparator selects the maximum.] • The sense of the scalar product is known. • Question: are all elements of the model needed?

  41. [Model block diagram again: the source emits mi ({mi}, Pi); the signal generator produces si(t); noise n(t) is added; the decision maker, driven by the timing (T), observes r(t) = si(t) + n(t) and delivers m̂ to the sink. The M signal shapes s(t) are known to the receiver.]

  42. 1. Single signals: the optimal decision rule • Comment: if Pi ≡ 1/M (equal a-priori probabilities), the rule reduces to minimizing |r − si| • I.e. we have to decide on the signal point closest to r (a numerical check of this equivalence follows below)
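A quick numerical check of this equivalence (the random 2-D signal set is an arbitrary illustration, not from the slides): with equal priors, maximizing ⟨r, si⟩ − Ei/2 and minimizing the Euclidean distance |r − si| always give the same decision.

```python
# Hedged sketch: correlation-with-bias and minimum-distance decisions coincide.
import numpy as np

rng = np.random.default_rng(2)
signals = rng.normal(size=(8, 2))            # arbitrary 2-D signal set
energies = np.sum(signals ** 2, axis=1)

for _ in range(1000):
    r = rng.normal(size=2)                   # arbitrary received vector
    by_metric = np.argmax(signals @ r - energies / 2)
    by_distance = np.argmin(np.linalg.norm(signals - r, axis=1))
    assert by_metric == by_distance
print("the two decision rules agreed in every trial")
```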

  43. 1. Optimal decider (matched filter) • Correlation is a linear operation (multiplication by a signal independent of r(t) and integration). • But: a linear operation can also be done with a linear filter, thus an equivalent filter can also be found – its impulse response is h(t) = si(T − t), sampled at t = T. It is causal!

  44. 1. Optimal decider (matched filter) • [Block diagram: r(t) is fed to the bank of matched filters with impulse responses s1(T−t), s2(T−t) … sM(T−t); the outputs are sampled at t = T, the biases ½(N0 ln Pi − Ei) are added, and a comparator selects the maximum.]
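A sketch of the equivalence stated on slide 43: a filter with impulse response h(t) = s(T − t), sampled at t = T, reproduces the correlator output. The test waveform and noise level below are arbitrary illustration choices.

```python
# Hedged sketch: matched-filter output sampled at t = T equals the correlator integral.
import numpy as np

T, N = 1.0, 1000
dt = T / N
t = (np.arange(N) + 0.5) * dt

s = 1.0 - np.cos(2 * np.pi * t / T)          # some signal supported on (0, T)
rng = np.random.default_rng(3)
r = s + rng.normal(0.0, 0.5, size=N)         # received signal r(t) = s(t) + noise

correlator = dt * np.dot(r, s)               # integral of r(t) * s(t) dt over (0, T)

h = s[::-1]                                  # matched filter: h(t) = s(T - t)
y = dt * np.convolve(r, h)                   # filter output y(t) = (r * h)(t)
matched_at_T = y[N - 1]                      # sample the output at t = T

print(correlator, matched_at_T)              # the two numbers coincide
```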

  45. 1. Some of the previous examples together with decision boundaries • 1. (Antipodal) NRZ baseband signals: M = 2, D = 1, s2(t) = −s1(t) • [Figure: the waveforms s1(t) = A and φ(t) = 1/√T on (0,T); the two signal-space points s1 and s2 on the φ axis with the decision boundary between them.]

  46. 1. Some of the previous examples together with decision boundaries • 2. M-QAM signals: arbitrary M, D = 2 • [Figure: the constellation with its decision boundaries.]

  47. 1. Some of the previous examples together with decision boundaries • 3. QPSK: M = 4, D = 2 • [Figure: the four points s1 … s4 with their decision boundaries.]

  48. 1. Optimal reception in the optical band • We've seen that there is no thermal noise in the optical band. • On the other hand there is shot noise. (We've seen – without referring to optics – the effect of Poisson noise.) • More details later.

  49. 2. Error probability in the optimal detector • Based on the preceding: the conditional probability of a correct decision (condition: si is transmitted) is P(C|si) = P(r falls in the decision region of si | si) • Total probability of correct decision: P(C) = Σi Pi·P(C|si) • And the error probability: PE = 1 − P(C)

  50. 2. Error probability in the optimal detector • If the a-priori probabilities are equal: P(C) = (1/M)·Σi P(C|si) • Or, if the constellation is in addition symmetric: P(C) = P(C|si) for any single i
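A sketch that estimates the error probability by Monte Carlo for the antipodal (D = 1) example with equal priors and compares it with the known closed form Q(√(2E/N0)); the E/N0 value and trial count are illustrative assumptions, not from the slides.

```python
# Hedged sketch: simulated vs. theoretical error probability for antipodal signaling.
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(4)
E, N0, trials = 1.0, 0.5, 1_000_000

bits = rng.integers(0, 2, size=trials)
s = np.where(bits == 1, np.sqrt(E), -np.sqrt(E))        # signal-space points +-sqrt(E)
r = s + rng.normal(0.0, np.sqrt(N0 / 2), size=trials)   # noise variance N0/2 per dim.
decisions = (r > 0).astype(int)                         # minimum-distance decision

pe_sim = np.mean(decisions != bits)
pe_theory = 0.5 * erfc(sqrt(E / N0))                    # = Q(sqrt(2E/N0))
print(f"simulated Pe = {pe_sim:.4g}, theoretical Pe = {pe_theory:.4g}")
```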
