
DCSP-2: Fourier Transform I




  1. DCSP-2: Fourier Transform I Jianfeng Feng Jianfeng.feng@warwick.ac.uk http://www.dcs.warwick.ac.uk/~feng/dcsp.html

  2. Time • Tuesday (L) 11.00-12.00, CS1.01 • Wednesday (L) 12.00-13.00, R1.13 • Thursday (S) 10.00-11.00, CS1.01. Seminars start this week (Tian Ge).

  3. Data transmission: channel characteristics, signalling methods, interference and noise, synchronisation, data compression and encryption. Everyday terminology: information, information rate, sampling, etc.

  4. The range of frequencies occupied by the signal is called its bandwidth.

  5. The ADC process is governed by an important law, the Nyquist-Shannon sampling theorem (discussed in Chapter 3): an analogue signal of bandwidth B can be completely recreated from its sampled form provided it is sampled at a rate of at least twice its bandwidth. That is, S >= 2B.
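A quick MATLAB sketch of the theorem (the 3 kHz tone and both sampling rates are values chosen purely for illustration): sampling above 2B preserves the tone, while sampling below 2B folds it down to a false, lower frequency.

    f   = 3000;               % tone frequency (Hz), so 2B = 6 kHz here
    fs1 = 8000;               % 8 kHz > 2B: adequate sampling
    fs2 = 4000;               % 4 kHz < 2B: too slow; tone aliases to 1 kHz
    t1  = 0:1/fs1:0.005;      % 5 ms of samples at each rate
    t2  = 0:1/fs2:0.005;
    x1  = cos(2*pi*f*t1);     % faithfully sampled 3 kHz cosine
    x2  = cos(2*pi*f*t2);     % indistinguishable from a 1 kHz cosine
    plot(t1, x1, 'o-', t2, x2, 'x--')
    legend('fs = 8 kHz', 'fs = 4 kHz (aliased)')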

  6. Example: a speech signal has an approximate bandwidth of 4 kHz. If this is sampled by an 8-bit ADC at the Nyquist rate, the bit rate R is R = 8 x 2B = 64000 b/s = 64 kb/s.
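The same arithmetic in MATLAB, using the values from the slide:

    B = 4000;      % speech bandwidth (Hz)
    S = 2*B;       % Nyquist sampling rate: 8000 samples/s
    R = 8*S        % 8 bits per sample -> 64000 b/s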

  7. The relationship between information, bandwidth and noise. The most important question associated with a communication channel is the maximum rate at which it can transfer information.

  8. Is there a limit on the number of levels?

  9. The limit is set by the presence of noise: if we continue to subdivide the magnitude of the changes into ever-decreasing intervals, we reach a point where we cannot distinguish the individual levels because of the presence of noise.

  10. Noise therefore places a limit on the maximum rate at which we can transfer information.

  11. Obviously, what really matters is the signal-to-noise ratio (SNR).

  12. The SNR is defined as the ratio of signal power S to noise power N, and is often expressed in decibels (dB): SNR = 10 log10(S/N) dB.

  13. Most signals carried by communication channels are modulated forms of sine waves.

  14. A sine wave is described mathematically by the expression s(t) = A cos(ωt + φ). The quantities A, ω and φ are termed the amplitude, (angular) frequency and phase of the sine wave.
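A small sketch of how the three parameters shape the wave (the particular A, ω and φ below are arbitrary illustrative values):

    A = 2; w = 2*pi*5; phi = pi/4;   % amplitude 2, frequency 5 Hz, phase pi/4
    t = 0:0.001:1;                   % one second at 1 ms resolution
    s = A*cos(w*t + phi);            % s(t) = A cos(wt + phi)
    plot(t, s), xlabel('t (s)'), ylabel('s(t)')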

  15. When referring to measurements of amplitude it is usual to consider the ratio of the squares of A (measured amplitude) and A0 (reference amplitude).

  16. This is because in most applications power is proportional to the square of amplitude. Thus the following definition is used: SNR = 10 log10(A^2/A0^2) dB.
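As a numeric check (the amplitudes are chosen arbitrarily), this amplitude form agrees with the equivalent rule 20 log10(A/A0):

    A = 5; A0 = 1;         % measured and reference amplitudes
    10*log10(A^2/A0^2)     % 13.98 dB
    20*log10(A/A0)         % identical: 13.98 dB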

  17. Noise sources. Input noise is common in low-frequency circuits and arises from electric fields generated by electrical switching. It appears as bursts at the receiver, and when present can have a catastrophic effect due to its large power. Other people's signals can generate noise: cross-talk is the term given to the pick-up of radiated signals from adjacent cabling.

  18. Noise sources When radio links are used, interference from other transmitters can be problematic. Thermal noise is always present. It is due to the random motion of electric charges present in all media. It can be generated externally, or internally at the receiver.

  19. There is a theoretical maximum to the rate at which information passes error-free over the channel.

  20. This maximum is called the channel capacity, C. The famous Hartley-Shannon law (discussed in detail in Chapter 3) states that the channel capacity is given by C = B log2(1 + S/N) b/s, where S/N = A^2/A0^2 is the linear signal-to-noise power ratio.

  21. For example, a 10 kHz channel operating at an SNR of 15 dB has a theoretical maximum information rate of 10000 log2(1 + 31.623) ≈ 50278 b/s.

  22. The theorem makes no statement as to how the channel capacity is achieved.

  23. In fact, in practice channels only approach this limit.

  24. The task of providing high channel efficiency is the goal of coding techniques.
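Reproducing the worked example in MATLAB:

    B      = 10000;              % channel bandwidth (Hz)
    snr_db = 15;                 % SNR in dB
    snr    = 10^(snr_db/10);     % linear power ratio: 31.623
    C      = B*log2(1 + snr)     % capacity: about 50278 b/s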

  25. Two basic laws: • Nyquist-Shannon sampling theorem • Hartley-Shannon law (channel capacity). Among the best pieces of applied mathematics.

  26. [Block diagram: analogue signal → sampling → quantisation → coding → channel (of limited bandwidth) → receiver]

  27. Communication techniques: time, frequency and bandwidth (the Fourier transform).

  28. We can describe a signal such as s(t) = A cos(ωt + φ) in two ways. One way is to describe its evolution in the time domain, as in the equation itself.

  29. The other way is to describe its frequency content, in the frequency domain.

  30. The cosine wave s(t) has a single frequency, ω = 2π/T, where T is the period, i.e. s(t + T) = s(t).

  31. This representation is quite general. In fact we have the following theorem, due to Fourier: any signal x(t) of period T can be represented as the sum of a set of cosine and sine waves of different frequencies and phases.
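A sketch of the theorem in action (my own construction, not from the slides): partial Fourier sums of a square wave of period T = 1, which needs only the odd sine harmonics. Adding more terms sharpens the corners.

    t = 0:0.001:2;                        % two periods of a T = 1 square wave
    x = zeros(size(t));
    for k = 1:2:19                        % odd harmonics 1, 3, ..., 19
        x = x + (4/(pi*k))*sin(2*pi*k*t); % Fourier coefficient 4/(pi*k)
    end
    plot(t, x)                            % increasingly square as k grows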

  32. In mathematics, the continuous Fourier transform is one of the specific forms of Fourier analysis.

  33. As such, it transforms one function into another, which is called the frequency-domain representation of the original function (which is often a function in the time domain).

  34. In this specific case, both domains are continuous and unbounded. The term Fourier transform can refer to either the frequency-domain representation of a function or to the process/formula that "transforms" one function into the other.
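For reference, one standard convention for the transform pair the slide describes (other texts place the 2π factor differently):

    X(ω) = ∫ x(t) e^(-jωt) dt          (integral over all t: the Fourier transform)
    x(t) = (1/2π) ∫ X(ω) e^(jωt) dω    (integral over all ω: the inverse transform)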


  36. MATLAB example: three cosines at 1, 2 and 4 Hz, sampled at intervals of h = 0.1 s (the plot shows the 1 Hz component).

    h = 0.1;                        % sampling interval (s)
    for i = 1:100
        t(i) = i*h;                 % time axis: 0.1 s to 10 s
        x(i) = cos(2*pi*t(i));      % 1 Hz cosine
        y(i) = cos(2*2*pi*t(i));    % 2 Hz cosine
        z(i) = cos(2*2*2*pi*t(i));  % 4 Hz cosine
    end
    plot(t, x)

  37. Continuous time (analogue signals): FT (Fourier transform) • Discrete time: DTFT (infinite-length digital signals) • DFT: discrete Fourier transform (finite digital signals).

  38. Theory only: FT (continuous time) and DTFT (infinite-length digital signals). Computable and useful: the DFT (finite digital signals).
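A minimal DFT example building on the code in slide 36 (same 10 Hz sampling; MATLAB's fft computes the DFT):

    fs = 10; t = (1:100)/fs;         % same sampling as slide 36
    x  = cos(2*pi*t);                % 1 Hz cosine
    X  = fft(x);                     % 100-point DFT
    f  = (0:99)*fs/100;              % frequency axis (Hz)
    stem(f(1:50), abs(X(1:50))/100)  % single peak at 1 Hz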

  39. Hi, Prof. Feng

  40. Fourier's Song
  Integrate your function times a complex exponential
  It's really not so hard you can do it with your pencil
  And when you're done with this calculation
  You've got a brand new function - the Fourier Transformation
  What a prism does to sunlight, what the ear does to sound
  Fourier does to signals, it's the coolest trick around
  Now filtering is easy, you don't need to convolve
  All you do is multiply in order to solve.

  From time into frequency - from frequency to time

  Every operation in the time domain
  Has a Fourier analog - that's what I claim
  Think of a delay, a simple shift in time
  It becomes a phase rotation - now that's truly sublime!
  And to differentiate, here's a simple trick
  Just multiply by j omega, ain't that slick?
  Integration is the inverse, what you gonna do?
  Divide instead of multiply - you can do it too.

  From time into frequency - from frequency to time

  Let's do some examples... consider a sine
  It's mapped to a delta, in frequency - not time
  Now take that same delta as a function of time
  Mapped into frequency - of course - it's a sine!

  Sine x on x is handy, let's call it a sinc.
  Its Fourier Transform is simpler than you think.
  You get a pulse that's shaped just like a top hat...
  Squeeze the pulse thin, and the sinc grows fat.
  Or make the pulse wide, and the sinc grows dense,
  The uncertainty principle is just common sense.
