Presentation Transcript


  1. ECE 4331, Fall, 2009 Zhu Han Department of Electrical and Computer Engineering Class 12 Oct. 1st, 2009

  2. Outline • Project and Homework Due • Exam • Line Coding Spectrum • MPEG • Estimation and Detection Basics • Homework • 3.3, 3.5, 3.8, 3.9, 3.14, 3.16, 3.18, 3.20, 3.25, (3.30), 3.32, (3.34) • Due 10/20

  3. Line coding schemes

  4. Digital Communication System • Spectrum of line coding: • Basic pulse p(t) and its spectrum P(w); for example, a rectangular pulse has a sinc spectrum • The input x is the sequence of pulse amplitudes; the sign and amplitude carry the information • The autocorrelation Rn of the amplitude sequence determines its spectrum Sx(w) • Overall spectrum: Sy(w) = |P(w)|^2 Sx(w)

  5. NRZ • R0 = 1, Rn = 0 for n > 0 (independent, equiprobable polar symbols) • With half-width (Tb/2) pulses the essential bandwidth doubles; with full-width (Tb) pulses, Sy(w) = |P(w)|^2/Tb = Tb (sinc(wTb/2))^2 • First-null bandwidth Rb = 1/Tb for pulse width Tb
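
A quick numerical sanity check of the NRZ spectrum above: the Python sketch below (bit rate, samples per bit and segment length are arbitrary assumptions) generates a random polar NRZ waveform and compares its estimated PSD with Tb*sinc^2, whose first null falls at the bit rate Rb.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)

Rb = 1e3            # bit rate (assumed)
Tb = 1.0 / Rb       # bit duration
sps = 32            # samples per bit (assumed)
fs = sps * Rb       # sampling rate
nbits = 20000

bits = rng.integers(0, 2, nbits)
symbols = 2 * bits - 1                        # polar mapping: 0 -> -1, 1 -> +1
waveform = np.repeat(symbols, sps).astype(float)   # full-width (NRZ) rectangular pulses

# Estimated PSD (two-sided, so it matches the analytic expression directly)
f, Sy_est = welch(waveform, fs=fs, nperseg=4096, return_onesided=False)

# Analytic PSD for polar NRZ: Sy(f) = Tb * sinc^2(f*Tb), with np.sinc(x) = sin(pi x)/(pi x)
Sy_theory = Tb * np.sinc(f * Tb) ** 2

idx = np.argmin(np.abs(f - Rb))               # first spectral null should sit at f = Rb
print("estimated PSD near f = Rb :", Sy_est[idx])
print("theoretical PSD at f = Rb :", Sy_theory[idx])
print("PSD near f = 0 (theory Tb):", Sy_est[np.argmin(np.abs(f))], "vs", Tb)
```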

  6. RZ scheme • DC Nulling • Split phase

  7. Polar biphase: Manchester and differential Manchester schemes • In Manchester and differential Manchester encoding, the transition at the middle of the bit is used for synchronization. • The minimum bandwidth of Manchester and differential Manchester is twice that of NRZ. • Used in IEEE 802.3 Ethernet and IEEE 802.4 token bus.
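
A minimal sketch of Manchester encoding, to make the mid-bit transition explicit; the polarity convention (0 -> high-then-low, 1 -> low-then-high) is an assumption, since both conventions appear in practice.

```python
import numpy as np

def manchester_encode(bits, sps=8):
    """Manchester line code: every bit carries a mid-bit transition (used for clock recovery).
    Polarity convention assumed here: 0 -> high-then-low, 1 -> low-then-high."""
    half = sps // 2
    out = []
    for b in bits:
        first, second = (-1, +1) if b else (+1, -1)
        out.extend([first] * half + [second] * half)
    return np.array(out)

print(manchester_encode([1, 0, 1, 1], sps=2))
# -> [-1  1  1 -1 -1  1 -1  1]
```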

  8. Bipolar schemes: AMI and pseudoternary • R0 = 1/2, R1 = -1/4, Rn = 0 for n > 1 • Reason: successive marks alternate in polarity, so the pulse polarity changes more slowly; this gives the negative R1 and a spectral null at DC

  9. Multilevel: 2B1Q scheme • Four-level NRZ in which each amplitude level (one quaternary symbol) represents two bits, halving the symbol rate

  10. Pulse Shaping • Sy(w) = |P(w)|^2 Sx(w) • Last class: Sx(w) was improved by choosing different line codes, while p(t) was assumed to be a square pulse • Now improve p(t) and P(w): • Reduce the bandwidth • Reduce interference to other bands • Remove inter-symbol interference (ISI) • In wireless communication, pulse shaping further saves bandwidth • Pulse shaping is covered in Chapter 6
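
As a preview of the pulse-shaping ideas deferred to Chapter 6, here is a sketch of a raised-cosine pulse, a standard band-limited shape whose zeros at nonzero multiples of Tb give no ISI at the sampling instants; the roll-off value is an arbitrary assumption.

```python
import numpy as np

def raised_cosine(t, Tb, beta):
    """Raised-cosine pulse with p(0) = 1 and roll-off 0 < beta <= 1.
    The pulse vanishes at every nonzero multiple of Tb, so sampling at t = k*Tb sees no ISI."""
    t = np.asarray(t, dtype=float)
    num = np.sinc(t / Tb) * np.cos(np.pi * beta * t / Tb)
    den = 1.0 - (2.0 * beta * t / Tb) ** 2
    sing = np.isclose(den, 0.0)                         # removable singularity at t = +-Tb/(2*beta)
    safe = num / np.where(sing, 1.0, den)
    return np.where(sing, (np.pi / 4.0) * np.sinc(1.0 / (2.0 * beta)), safe)

Tb, beta = 1.0, 0.35                                    # symbol period and roll-off (assumed values)

# ISI check: the pulse is ~0 at every other symbol's sampling instant
print(np.round(raised_cosine(np.arange(-4, 5) * Tb, Tb, beta), 6))
# approximately [0, 0, 0, 0, 1, 0, 0, 0, 0]
```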

  11. Motion Pictures Expert Group • Established in 1988 with a remit to develop standards for the coded representation of audio, video and their combination • Operates within the framework of the joint ISO/IEC technical committee (JTC1 on Information Technology), organised into committees and sub-committees • Originally 25 experts, now approximately 350 experts from 200 companies and academic institutions, who meet approximately 3 times per year (depending on the committee) • All standards work takes a long time, requires international agreement, and is potentially of great industrial and strategic importance

  12. MPEG-1 standards • Video standard for low-fidelity video, implemented in software codecs, suitable for transmission over computer networks • The audio standard has 3 layers; encoding complexity increases and data rates decrease as the layer number increases: • Layer 1 - 192 kbps • Layer 2 - 128 kbps • Layer 3 - 64 kbps (MPEG-1 Layer 3 = MP3) • (These data rates are doubled for a stereo signal)

  13. MPEG1 - Layer 3 Audio encoding • Encoders analyze the audio signal and compare it to psychoacoustic models representing the limitations of human auditory perception • Encode as much useful information as possible within the restrictions set by the bit rate and sampling frequency • Discard samples whose amplitude is below the minimum audition threshold at the corresponding frequency • Auditory masking - a louder sound masks a softer sound played simultaneously or close together in time, so the softer sound's samples can be discarded

  14. Psychoacoustic model Throw away samples that will not be perceived, i.e. those under the threshold curve

  15. MPEG1 - Layer 3 Audio encoding • Temporal masking - if two tones are close together in frequency and are played in quick succession, they may be indistinguishable from one another • Reservoir of bytes - data is organised into 'frames'; space left over in one frame can be used to store data from adjacent frames that need additional space • Joint stereo - very high and very low frequencies cannot be located in space with the same precision as sounds towards the centre of the audible spectrum, so these are encoded as mono • Huffman encoding removes redundancy in the encoding of repetitive bit patterns (can reduce file sizes by 20%)

  16. Masking effects • Throw away samples in the region masked by a louder tone

  17. Schematic of MPEG1 - Layer 3 encoding http://www.iis.fhg.de/amm/techinf/layer3/index.htm

  18. MPEG - 2 standards • Video standard for high-fidelity video • 'Levels' define parameters: maximum frame size, data rate and chrominance subsampling • 'Profiles' may be implemented at one or more levels • MP@ML ("main profile at main level") uses CCIR 601 scanning, 4:2:0 chrominance subsampling and supports a data rate of 15 Mbps • MP@ML is used for digital television broadcasting and DVD • The audio standard is essentially the same as MPEG-1, with extensions to cope with surround sound

  19. MPEG - 4 • The MPEG-4 standard activity aimed to define an audiovisual coding standard addressing the needs of the communication, interactive (computing) and broadcasting (TV/film/entertainment) service models • In MPEG-1 and MPEG-2, 'systems' referred to the overall architecture, multiplexing and synchronisation • In MPEG-4, systems also includes scene description, interactivity, content description and programmability • Initial call for proposals - July 1995; version 2 amendments - December 2000

  20. MPEG -4 Systems - mission "Develop a coded, streamable representation for audio-visual objects and their associated time-variant data along with a description of how they are combined" • 'coded representation' as opposed to 'textual representation' - binary encoding for bandwidth efficiency • 'streamable' as opposed to 'downloaded' - presentations have a temporal extent rather than being based on files of a finite size • 'audio-visual objects and their associated time-variant data' as opposed to 'individual audio or visual streams' - MPEG-4 deals with combinations of streams to create an interactive visual scene, not with the encoding of audio or visual data

  21. Estimation Theory • Consider a linear process y = Hq + n, where y = observed data, q = transmitted information (0 or 1), H = channel, n = additive noise • If q is known but H is unknown, estimation is the problem of finding the statistically optimal H given y, q and knowledge of the noise properties • If H is known, detection is the problem of finding the most likely transmitted information q given y, H and knowledge of the noise properties • In a practical system, the two steps are performed iteratively: estimate the channel to track its changes, then detect the transmitted data

  22. Different Approaches for Estimation • Minimum variance unbiased estimators • Subspace estimators • Least squares - has no statistical basis • Maximum likelihood - uses knowledge of the noise PDF • Maximum a posteriori - uses prior information about q

  23. Least Squares Estimator • Least squares: q_LS = argmin_q ||y - Hq||^2 • A natural estimator - we want the solution to match the observation • Does not use any information about the noise • There is a simple closed-form solution (the pseudo-inverse): q_LS = (H^T H)^-1 H^T y • What if we know something about the noise? Say we know Pr(n)...
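
A minimal numerical sketch of the pseudo-inverse solution, assuming a made-up H, q and white Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear model y = Hq + n (dimensions and values are made up for illustration)
H = rng.standard_normal((100, 4))             # known model/channel matrix
q_true = np.array([1.0, -0.5, 2.0, 0.3])      # parameters to recover
y = H @ q_true + 0.1 * rng.standard_normal(100)

# Pseudo-inverse solution: q_LS = (H^T H)^-1 H^T y
q_ls = np.linalg.solve(H.T @ H, H.T @ y)

# Same estimate, computed in a numerically safer way
q_lstsq, *_ = np.linalg.lstsq(H, y, rcond=None)

print(np.round(q_ls, 3), np.round(q_lstsq, 3))
```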

  24. Maximum Likelihood Estimator • If the noise is jointly Gaussian with covariance matrix C = E(n n^T), then Pr(n) ∝ exp(-1/2 n^T C^-1 n) • Negative log-likelihood (up to a constant): L(y|q) = 1/2 (y - Hq)^T C^-1 (y - Hq) • q_ML = argmin_q 1/2 (y - Hq)^T C^-1 (y - Hq) • This also has a closed-form solution: q_ML = (H^T C^-1 H)^-1 H^T C^-1 y • If n is not Gaussian, ML estimators become complicated and non-linear • Fortunately, channel noise is usually Gaussian
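
A sketch of the closed-form ML (weighted least-squares) estimate for colored Gaussian noise with known covariance C; the dimensions and covariance are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Linear model y = Hq + n with correlated Gaussian noise of known covariance C
m, p = 100, 4
H = rng.standard_normal((m, p))
q_true = np.array([1.0, -0.5, 2.0, 0.3])
A = 0.1 * rng.standard_normal((m, m))
C = A @ A.T + 0.01 * np.eye(m)                 # a valid positive-definite covariance (assumed)
n = rng.multivariate_normal(np.zeros(m), C)
y = H @ q_true + n

# ML estimate for Gaussian noise: q_ML = (H^T C^-1 H)^-1 H^T C^-1 y
Ci = np.linalg.inv(C)
q_ml = np.linalg.solve(H.T @ Ci @ H, H.T @ Ci @ y)

# Ordinary LS ignores the noise correlation; ML (weighted LS) accounts for it
q_ls = np.linalg.solve(H.T @ H, H.T @ y)
print("ML:", np.round(q_ml, 3))
print("LS:", np.round(q_ls, 3))
```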

  25. Maximum a Posteriori (MAP) Estimate • Bayes' theorem: Pr(x|y) = Pr(y|x) Pr(x) / Pr(y), i.e. posterior = likelihood x prior / evidence • This is an example of using prior information about the signal • Priors are generally expressed in the form of a PDF Pr(x) • Once the likelihood L(x) and the prior are known, we have complete statistical knowledge • LS/ML are suboptimal in the presence of a prior • MAP (a.k.a. Bayesian) estimates are optimal
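
A sketch of a MAP estimate under an assumed zero-mean Gaussian prior on q and white Gaussian noise; with these choices MAP reduces to a ridge-regularized least squares.

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear model y = Hq + n, white Gaussian noise, plus a zero-mean Gaussian prior on q
m, p, sigma_n, sigma_q = 100, 4, 0.5, 1.0      # all values assumed
H = rng.standard_normal((m, p))
q_true = rng.normal(0.0, sigma_q, p)
y = H @ q_true + sigma_n * rng.standard_normal(m)

# MAP with Gaussian likelihood and Gaussian prior:
# q_MAP = (H^T H / sigma_n^2 + I / sigma_q^2)^-1 (H^T y / sigma_n^2)
A = H.T @ H / sigma_n**2 + np.eye(p) / sigma_q**2
q_map = np.linalg.solve(A, H.T @ y / sigma_n**2)

# Compare with plain LS/ML, which ignores the prior
q_ls = np.linalg.solve(H.T @ H, H.T @ y)
print("MAP:", np.round(q_map, 3))
print("LS :", np.round(q_ls, 3))
```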

  26. Expectation and Maximization (EM) • The EM algorithm alternates between an expectation (E) step, which computes the expected likelihood with the latent variables treated as if they were observed, and a maximization (M) step, which computes maximum-likelihood estimates of the parameters by maximizing the expected likelihood found in the E step. The parameters found in the M step are then used to begin another E step, and the process repeats. • E-step: estimate the unobserved event (which Gaussian generated each sample), conditioned on the observations, using the parameter values from the last maximization step • M-step: maximize the expected log-likelihood of the joint event
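
A compact EM sketch for the "which Gaussian is used" situation mentioned above: fitting a two-component 1-D Gaussian mixture, with the component labels as the latent variables (all data and initial values are made up).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 1-D data from two Gaussians; the component labels are the latent variables
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

# Initial guesses (assumed)
w = np.array([0.5, 0.5])           # mixture weights
mu = np.array([-1.0, 1.0])         # means
var = np.array([1.0, 1.0])         # variances

def gauss(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: posterior probability that each sample came from each component
    resp = w * gauss(x[:, None], mu, var)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the expected ("soft") assignments
    Nk = resp.sum(axis=0)
    w = Nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk

print("weights:", np.round(w, 3), "means:", np.round(mu, 3), "variances:", np.round(var, 3))
```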

  27. Minimum-variance unbiased estimator • Biased and unbiased estimators • The MVU estimator is an unbiased estimator of the parameters whose variance is minimized for all values of the parameters • The Cramer-Rao lower bound (CRLB) sets a lower bound on the variance of any unbiased estimator • A biased estimator may still outperform an unbiased one in terms of variance • Subspace methods: MUSIC, ESPRIT • Widely used in radar, e.g. helicopter and weapon detection (from features)
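
A small illustration of the CRLB statement: for the textbook problem of estimating a constant A in white Gaussian noise, the sample mean is unbiased and its variance matches the bound sigma^2/N (all values assumed).

```python
import numpy as np

rng = np.random.default_rng(6)

# Estimate a constant A from N samples in white Gaussian noise: x[n] = A + w[n]
A, sigma, N, trials = 1.0, 2.0, 50, 20000      # assumed values
x = A + sigma * rng.standard_normal((trials, N))

estimates = x.mean(axis=1)                     # sample-mean estimator, one estimate per trial
print("empirical variance of the sample mean:", estimates.var())
print("CRLB sigma^2 / N                     :", sigma**2 / N)
```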

  28. What is Detection • Deciding whether, and when, an event occurs • a.k.a. decision theory, hypothesis testing • Examples: presence or absence of a signal (radar), whether the received bit is 0 or 1, whether a stock goes up or not, whether a criminal is convicted or set free • Measures whether a statistically significant change has occurred or not

  29. Detection • “Spot the Money”

  30. Hypothesis Testing with Matched Filter • Let the received signal be y(t) and the known signal model be h(t). Hypothesis testing: H0: y(t) = n(t) (noise only); H1: y(t) = h(t) + n(t) (signal present) • The optimal decision is given by the likelihood ratio test (Neyman-Pearson theorem): select H1 if L(y) = Pr(y|H1)/Pr(y|H0) > g, otherwise select H0
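
A discrete-time sketch of the test above for white Gaussian noise, where the likelihood ratio reduces to a matched-filter (correlation) statistic and the threshold g is set by the Neyman-Pearson criterion for an assumed false-alarm probability.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)

# Known pulse h and white Gaussian noise of standard deviation sigma (all values assumed)
N, sigma = 64, 1.0
h = np.sin(2 * np.pi * np.arange(N) / 16)      # an assumed pulse shape

def matched_filter_stat(y, h):
    # For white Gaussian noise the likelihood ratio test reduces to comparing y^T h with g
    return y @ h

# Neyman-Pearson threshold for a target false-alarm probability:
# under H0 the statistic is Gaussian with mean 0 and variance sigma^2 * ||h||^2
Pfa_target = 1e-2
g = norm.ppf(1 - Pfa_target) * sigma * np.linalg.norm(h)

y_h0 = sigma * rng.standard_normal(N)          # H0 trial: noise only
y_h1 = h + sigma * rng.standard_normal(N)      # H1 trial: signal plus noise
print("decide H1 on H0 trial:", matched_filter_stat(y_h0, h) > g)
print("decide H1 on H1 trial:", matched_filter_stat(y_h1, h) > g)
```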

  31. Signal detection paradigm • Signal trials • Noise trials

  32. Signal Detection

  33. Receiver operating characteristic (ROC) curve
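
A Monte Carlo sketch of how an ROC curve is produced: sweep the threshold g of the matched-filter statistic from the previous sketch and record the resulting (Pfa, Pd) pairs (setup values assumed).

```python
import numpy as np

rng = np.random.default_rng(5)

# Monte Carlo trials of the matched-filter statistic under both hypotheses (assumed setup)
N, sigma, trials = 64, 1.0, 5000
h = np.sin(2 * np.pi * np.arange(N) / 16)

stat_h0 = (sigma * rng.standard_normal((trials, N))) @ h        # statistic under H0
stat_h1 = (h + sigma * rng.standard_normal((trials, N))) @ h    # statistic under H1

# Sweep the threshold g; each value gives one (Pfa, Pd) point on the ROC curve
thresholds = np.linspace(stat_h0.min(), stat_h1.max(), 200)
Pfa = np.array([(stat_h0 > g).mean() for g in thresholds])
Pd = np.array([(stat_h1 > g).mean() for g in thresholds])

for i in range(0, len(thresholds), 40):
    print(f"Pfa = {Pfa[i]:.3f}  Pd = {Pd[i]:.3f}")
```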
