
Source Coding Procedure:





  1. Source Coding Procedure:
  • Check whether the channel capacity* can cope with the source information rate; if yes, source coding can proceed. Understand why.
  • (i) Calculate the source entropy from the symbol probabilities*.
  • (ii) Source information rate = source entropy × symbol rate*.
  • Coder design:
  • Step 1: Start by coding symbols singly.
  • Step 2: (i) Code by your chosen method; (ii) calculate the average codeword length; (iii) coder rate = average length × symbol rate*; (iv) check whether this rate is within the channel capacity; if yes, go to Step 4.
  • Step 3: Change to a higher level of group coding (from coding singly to coding in pairs, or from pairs to threes); then repeat Step 2.
  • Step 4: Coding finishes. You may still be asked to calculate the probabilities of 0’s and 1’s in the output stream. A sketch of steps (i) and (ii) follows below.
  ____________________________________________________________
  *These will be given.
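As a quick illustration of steps (i) and (ii), here is a minimal Python sketch; the function names and interface are our own, not part of the slides:

```python
import math

def source_entropy(probs):
    # Source entropy H = -sum(p * log2(p)), in bits/symbol
    return -sum(p * math.log2(p) for p in probs if p > 0)

def check_capacity(probs, symbol_rate, channel_capacity):
    # Step (ii): source information rate = source entropy x symbol rate
    info_rate = source_entropy(probs) * symbol_rate   # bits/sec
    return info_rate, info_rate <= channel_capacity
```

For example, `check_capacity([0.8, 0.2], 80, 60)` returns roughly `(57.8, True)`, matching the worked example on the next slide.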

  2. Example: An information source produces a long sequence of two independent symbols A and B with probabilities 0.8 and 0.2 respectively; 80 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 60 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced.
  • Check if the channel capacity is sufficient for the source information rate.
  • Source entropy = -(0.8log0.8 + 0.2log0.2) = 0.722 bits/symbol.
  • So, source information rate (bits/sec) = source entropy (bits/symbol) × symbol rate (symbols/sec) = 0.722 × 80 = 57.8 bits/sec.
  • Because one binary digit carries at most one bit of information, and 57.8 < 60 (the channel capacity), the channel can cope and coding can proceed.
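A quick numeric check of the entropy and rate calculation, in Python:

```python
import math

p = [0.8, 0.2]                           # probabilities of A and B
H = -sum(x * math.log2(x) for x in p)    # ~0.722 bits/symbol
rate = H * 80                            # ~57.8 bits/sec
print(H, rate, rate < 60)                # 57.8 < 60, so the channel copes
```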

  3. Coder design:
  1. Coding singly: A = 0, B = 1, so L = 1 binary digit/symbol. The coder produces 80 binary digits/sec, which is too fast for the 60 digits/sec channel rate.
  2. Coding in pairs: L2 = 0.64×1 + 0.16×2 + 0.16×3 + 0.04×3 = 1.56 binary digits/pair, so L = 1.56/2 = 0.78 binary digits/symbol. The coder produces 80×0.78 = 62.4 binary digits/sec > 60. Still too fast.
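The pair calculation can be reproduced as follows; the slide only gives the codeword lengths (1, 2, 3, 3), so the specific codewords below are an assumed, but valid, Huffman assignment:

```python
pairs = {'AA': 0.64, 'AB': 0.16, 'BA': 0.16, 'BB': 0.04}   # 0.8/0.2 source
code = {'AA': '0', 'AB': '10', 'BA': '110', 'BB': '111'}   # assumed codewords

L2 = sum(p * len(code[s]) for s, p in pairs.items())  # 1.56 digits/pair
L = L2 / 2                                            # 0.78 digits/symbol
print(L2, L, 80 * L)                                  # 62.4 > 60: too fast
```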

  4. 3. Coding in threes (Huffman): L3 = 0.512×1 + 0.128×9 + 0.032×15 + 0.008×5 = 2.184 binary digits/three symbols, so L = 2.184/3 = 0.728 binary digits/symbol. The coder produces 80×0.728 ≈ 58.2 binary digits/sec < 60. OK.
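A sketch of the Huffman average-length calculation for the triples, using the standard fact that the average codeword length equals the sum of the internal-node probabilities of the Huffman tree (the helper function is ours, not from the slides):

```python
import heapq
from itertools import product

def huffman_avg_length(probs):
    # Average length of a binary Huffman code, in digits per block:
    # each merge creates one internal node, and the average leaf depth
    # equals the sum of all internal-node probabilities.
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    total, next_id = 0.0, len(probs)
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2
        heapq.heappush(heap, (p1 + p2, next_id))
        next_id += 1
    return total

# Probabilities of the 8 triples of the 0.8/0.2 source
triples = [0.8 ** s.count('A') * 0.2 ** s.count('B')
           for s in (''.join(t) for t in product('AB', repeat=3))]
L3 = huffman_avg_length(triples)   # ~2.184 digits per three symbols
print(L3, L3 / 3, 80 * L3 / 3)     # ~0.728 digits/symbol, ~58.2 < 60
```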

  5. Probabilities of 0 and 1 in the output sequence:
  • singly: p(0) = 0.8; p(1) = 0.2;
  • in pairs: p(0) + p(1) = 1, with p(0) = 0.615, p(1) = 0.385;
  • in threes: p(0) = 0.52; p(1) = 0.48.
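For group coding, p(0) is the expected number of 0’s per block divided by the expected codeword length. A sketch for the pair code, with the same assumed codeword assignment as above:

```python
pairs = {'AA': 0.64, 'AB': 0.16, 'BA': 0.16, 'BB': 0.04}
code = {'AA': '0', 'AB': '10', 'BA': '110', 'BB': '111'}  # assumed codewords

zeros = sum(p * code[s].count('0') for s, p in pairs.items())  # 0.96 per pair
digits = sum(p * len(code[s]) for s, p in pairs.items())       # 1.56 per pair
p0 = zeros / digits
print(p0, 1 - p0)   # ~0.615 and ~0.385, as on the slide
```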

  6. Noise: In communication theory, noise is any interference or distraction that disrupts the communication process. Noise is a random electrical signal on a communications channel that interferes with the desired signal or data. It can either be generated by the circuitry of the communications device itself or come from one of a variety of external sources, including radio and TV signals, lightning, and nearby transmission lines. What are the effects of noise? Noise may reach a sufficiently high level to cause spurious errors, thereby corrupting some of the information contained in the signal being propagated. It also reduces information: the presence of noise reduces confidence in the information at the receiver, leaving some uncertainty about the validity of the transmission. For example, if numerical data are read over a telephone line, the listener may be unsure that he heard some of the values correctly; or, in the transmission of data by binary pulses, some pulses may be received in error.

  7. Information in a Noisy Channel
  Transmitted: 1 0 1 1 1 0 1 1 1 0 …
  Received: 0 1 1 1 1(x) 0(x) 1 1 0 …   ((x) marks digits received in error)

  8. Transmitting an event of probability p over a noiseless channel:
  Transmitter → Receiver (noiseless)
  The information gained over the transmission is log2(1/p).
  Over a noisy channel:
  Transmitter → Receiver (noisy)
  the information gained must be derived differently, as follows.

  9. Transmitting an event of a priori probability p1 over a noisy channel raises its probability at the receiver to p2.
  Consider the transmission decomposed into 2 stages:
  (1) Transmit the event of probability p1 over the noisy channel, raising its probability at the receiver to p2;
  (2) Retransmit the same event via a noiseless channel, starting with probability p2 (gained in the first transmission) and ending at 1.
  Transmitter → (1: noisy) → Receiver → (2: noiseless) → certainty

  10. Quantity of information in a noisy channel:
  Information needed to make reception certain, starting from p1: log2(1/p1).
  Information still needed after the noisy stage (supplied by noiseless stage 2): log2(1/p2).
  The information received over the noisy channel is hence
  I = log2(1/p1) - log2(1/p2) = log2(p2/p1),
  where p1 is the a priori probability (probability at the receiver of the event before transmission) and p2 is the a posteriori probability (probability at the receiver of the event after transmission).

  11. Example: A binary system produces Marks and Spaces with equal probabilities, 1/8 of all pulses being received in error. Find the information received for all combinations of input and output.
  There are four possibilities:
  (i) Correct (M → M or S → S), probability 7/8;
  (ii) Wrong (S → M or M → S), probability 1/8.
  Correct transmission (M → M or S → S): a posteriori probability p2 = 7/8; a priori probability of a Mark or Space being transmitted p1 = 1/2.
  Information received: I = log2((7/8)/(1/2)) = log2(7/4) ≈ 0.807 bits.

  12. Incorrect transmission (S → M or M → S): a posteriori probability p2 = 1/8; a priori probability p1 = 1/2.
  Information received: I = log2((1/8)/(1/2)) = log2(1/4) = -2 bits.
  Average information received (per binary digit):
  I_av = (7/8)(0.807) + (1/8)(-2) ≈ 0.456 bits.
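These numbers are easy to verify in Python:

```python
import math

p1 = 0.5           # a priori probability of a Mark (or Space)
p2_ok = 7 / 8      # a posteriori probability after a correct reception
p2_err = 1 / 8     # a posteriori probability after an erroneous reception

I_ok = math.log2(p2_ok / p1)     # ~ +0.807 bits
I_err = math.log2(p2_err / p1)   # = -2 bits
avg = (7 / 8) * I_ok + (1 / 8) * I_err
print(I_ok, I_err, avg)          # average ~0.456 bits per binary digit
```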

  13. Channel diagram:
  Inputs: P(M) = 0.5, P(S) = 0.5
  Correct reception: P(M|M) = 7/8, P(S|S) = 7/8
  Errors: P(S|M) = 1/8, P(M|S) = 1/8

  14. Transmitted source symbols Xi, i = 1, …, n; received symbols Yj, j = 1, …, n.
  P(Xi, Yj): joint probability of Xi having been transmitted and Yj having been received.
  P(Xi|Yj): conditional probability of Xi having been transmitted when Yj is known to have been received.
  P(Yj|Xi): conditional probability of receiving Yj when Xi is known to have been transmitted.
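These distributions are linked by P(Xi, Yj) = P(Xi)·P(Yj|Xi) and Bayes’ rule. A small sketch using the binary channel of slide 13 as a concrete 2×2 case (the variable names are ours):

```python
import numpy as np

Px = np.array([0.5, 0.5])            # P(Xi): source probabilities (M, S)
Py_given_x = np.array([[7/8, 1/8],   # P(Yj|Xi): channel matrix,
                       [1/8, 7/8]])  # rows = Xi, columns = Yj

Pxy = Px[:, None] * Py_given_x       # joint P(Xi, Yj) = P(Xi) P(Yj|Xi)
Py = Pxy.sum(axis=0)                 # P(Yj), summing over inputs
Px_given_y = Pxy / Py                # Bayes: P(Xi|Yj) = P(Xi, Yj) / P(Yj)
print(Pxy, Px_given_y, sep='\n')
```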

  15. A binary system generates 0’s and 1’s with equal probability; 5% of all pulses received are in error (e.g. 0 → 1, or 1 → 0). Find the information received for all possible combinations of inputs and outputs, and the average information received.
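Following the method of slides 11 and 12, a sketch of the solution; the numbers simply apply I = log2(p2/p1) to this exercise:

```python
import math

p_err = 0.05
p1 = 0.5                                 # a priori probability of 0 (or 1)
I_ok = math.log2((1 - p_err) / p1)       # log2(0.95/0.5) ~ +0.926 bits
I_err = math.log2(p_err / p1)            # log2(0.05/0.5) ~ -3.32 bits
avg = (1 - p_err) * I_ok + p_err * I_err
print(I_ok, I_err, avg)                  # average ~0.714 bits per digit
```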
