
Some Common Binary Signaling Formats:



  1. Some Common Binary Signaling Formats:

  2. [Figure: the bit sequence 1 0 1 0 0 1 1 1 0 1 drawn as waveforms in five line-code formats: AMI, RZ, Manchester, NRZ-B, NRZ.]
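The waveform figure itself isn't reproducible in a transcript, but the encoding rules behind it are easy to sketch. The Python below (illustrative; the function names are mine) maps the slide's bit sequence to signal levels for NRZ, RZ, Manchester, and AMI using the common textbook conventions. NRZ-B is omitted since its rule isn't given on the slide.

```python
# Sketch: map a bit sequence to signal levels for a few common line codes.
# One output sample per half bit-period, so mid-bit transitions show up.
# Encoding rules are the common textbook ones; names here are illustrative.

def nrz(bits):
    # NRZ: 1 -> +1 for the whole bit period, 0 -> -1 for the whole bit period.
    return [lvl for b in bits for lvl in (2 * b - 1,) * 2]

def rz(bits):
    # RZ: 1 -> +1 for the first half, then return to 0; 0 -> 0 throughout.
    return [lvl for b in bits for lvl in (b, 0)]

def manchester(bits):
    # Manchester: 1 -> high-to-low mid-bit transition, 0 -> low-to-high
    # (conventions vary; this is the G. E. Thomas convention).
    return [lvl for b in bits for lvl in ((1, -1) if b else (-1, 1))]

def ami(bits):
    # AMI: 0 -> 0; successive 1s alternate polarity between +1 and -1.
    out, polarity = [], 1
    for b in bits:
        if b:
            out += [polarity, polarity]
            polarity = -polarity
        else:
            out += [0, 0]
    return out

bits = [1, 0, 1, 0, 0, 1, 1, 1, 0, 1]   # the sequence from the slide
for name, enc in [("NRZ", nrz), ("RZ", rz), ("Manchester", manchester), ("AMI", ami)]:
    print(f"{name:10s}", enc(bits))
```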

  3. Symbols, Words, Messages
  A “Code” is a system of arranging symbols from a symbol alphabet into words (sequences) which have an agreed-upon meaning between the sender and receiver. Examples:
  • Written languages
  • Spoken languages
  • The radix system of conveying value/quantity
  • Roman numerals
  • Morse code
  • ASCII codes
  • Semaphores
  Codes may be hierarchical, or embedded: Binary > ASCII > Roman letters > Words > Sentences. A “Symbol” in one code may be a “Word” or “Message” in another.

  4. Quantity of Information
  One bit of information is contained in the answer to a question which has two equally likely outcomes: “I flipped a coin, and it came up . . . ?”
  Now consider: “Is Dr. Lyall going to give everyone a gold bar after class today?” The two outcomes are not equally likely. You might guess that there is a 99% probability that the answer is “no”, so when I tell you that the answer is “no”, it contains very little information. But if I tell you the answer is “yes”, that is a big deal, because it contains a great deal of information.
  In general, the information content of outcome i is Ii = -log2(pi) bits, where pi is the probability of outcome i.
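A quick numerical check of the formula, using the probabilities from this slide (the function name is mine):

```python
from math import log2

def info_bits(p):
    """Information content of an outcome with probability p: -log2(p) bits."""
    return -log2(p)

print(info_bits(0.5))    # fair coin flip: exactly 1 bit
print(info_bits(0.99))   # the expected "no": ~0.014 bits
print(info_bits(0.01))   # the surprising "yes": ~6.64 bits
```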

  5. Examples
  The decimal number system uses 10 symbols (0 . . 9). Assuming the occurrence of each symbol is equally likely (10% probability), the information content of each digit is -log2(0.1) = 3.32 bits/symbol = 1 “dit”.
  • You wish to encode the 26 (lower case) letters of the English alphabet using decimal digits.
  • Method 1: -log2(1/26) = 4.7 bits = (4.7 bits)/(3.32 bits/dit) = 1.415 dits.
  • Method 2: -log10(1/26) = 1.415 dits.
  • Since you can’t send a fraction of a symbol, you need two decimal digits for the encoding, but each pair only carries 1.415 dits of information, so the Coding Efficiency is 1.415/2 = 0.708 = 70.8%.
  • For binary encoding (two symbols: 0, 1), you need 5 symbols to express 4.7 bits. The Coding Efficiency is 4.7/5 = 0.94 = 94%.
  • Suppose we wanted to use a three-symbol alphabet, {*, #, +}. Each symbol expresses -log2(1/3) = 1.585 bits/symbol. The number of symbols required to express 4.7 bits of information is (4.7 bits)/(1.585 bits/symbol) = 2.96 symbols, so three are required. Each group of three symbols carries 3 x 1.585 bits = 4.755 bits. The Coding Efficiency is 4.7/4.755 = 0.988 ≈ 99%.
  • Decode the following: #+###*####**++++#+
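A small sketch that reproduces the three efficiency figures above (the function name is mine; it assumes fixed-length codes over equally likely letters):

```python
from math import ceil, log2

def coding_efficiency(n_messages, radix):
    """Efficiency of a fixed-length encoding of one of n_messages equally
    likely messages using symbols from an alphabet of size radix."""
    info_needed = log2(n_messages)        # bits of information per message
    bits_per_symbol = log2(radix)         # information capacity of one symbol
    symbols_needed = ceil(info_needed / bits_per_symbol)
    return info_needed / (symbols_needed * bits_per_symbol)

for radix in (10, 2, 3):                  # decimal, binary, ternary {*, #, +}
    print(radix, round(coding_efficiency(26, radix), 3))
    # -> 10: 0.708, 2: 0.94, 3: 0.988
```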

  6. Every weekend I ask Dad for $50 to go out partying. 90% of the time he says NO, 10% of the time he says YES. There are two symbols in the alphabet. The information content of YES is -log2(0.1) = 3.32 bits. The information content of NO is -log2(0.9) = 0.152 bits.
  On average, how many bits of information are in his answer? 90% of the time I get 0.152 bits, 10% of the time I get 3.32 bits. On average, I get (0.9)(0.152) + (0.1)(3.32) = 0.469 bits.
  What if YES and NO were equally likely (50% each)? Then each answer carries -log2(0.5) = 1 bit, and so does the average.
  In general, we attach importance to a message in relation to its ‘unexpectedness.’ An unlikely message (or symbol) carries more information than a likely message (or symbol). The average information per symbol is called Entropy: H = -Σ pi log2(pi) bits/symbol. Entropy is maximum when all symbols are equally likely.
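The entropy formula in a few lines of Python, checked against the numbers above (the function name is mine):

```python
from math import log2

def entropy(probs):
    """Average information per symbol: H = -sum(p * log2(p)), skipping p = 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.9, 0.1]))   # Dad's answers: ~0.469 bits/symbol
print(entropy([0.5, 0.5]))   # equally likely YES/NO: 1.0 bit, the maximum
```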

  7. Entropy Study of English Text

  8. Definitions
  • Baud rate (signaling rate): fB = 1/TB symbols/second
  • Minimum one-sided channel bandwidth: fc(min) = 1/(2·TB) Hz
  • Average information transfer rate: fi = H·fB bits/second
  • System capacity (maximum information transfer rate, i.e. maximum H): C = fB·log2(M) = 2·fc·log2(M) bits/second
  • Maximum information transfer in T seconds: IT(max) = C·T bits
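A sketch tying these definitions together; the channel numbers here are made up purely for illustration:

```python
from math import log2

# Illustrative numbers only (not from the slides):
T_B = 1e-3        # symbol period TB, seconds
M = 4             # symbols in the alphabet
H = 1.8           # entropy, bits/symbol (at most log2(M) = 2)

f_B = 1 / T_B                 # baud / signaling rate: 1000 symbols/second
f_c_min = 1 / (2 * T_B)       # minimum one-sided bandwidth: 500 Hz
f_i = H * f_B                 # average information rate: 1800 bits/second
C = f_B * log2(M)             # system capacity: 2000 bits/second
T = 10                        # observation time, seconds
I_T_max = C * T               # max information in T seconds: 20000 bits

print(f_B, f_c_min, f_i, C, I_T_max)
```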

  9. Shannon Limit
  System capacity (maximum information transfer rate, maximum H): C = fB·log2(M) = 2·fc·log2(M)
  Shannon limit for system capacity: C = BW·log2(SINAD), if BW > fc(min)
  With BW = fc(min), equating the two expressions gives 2·fc·log2(M) = fc·log2(SINAD), i.e. M = sqrt(SINAD).
  Example: For SINAD = 1000 (30 dB), M < sqrt(1000) = 31.6. Or: for a 32-symbol channel (5 bits/symbol) we must have SINAD > 32² = 1024, i.e. > 30 dB.
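The same bookkeeping as a sketch (the function names are mine; SINAD is taken as a power ratio):

```python
from math import log10, sqrt

def max_alphabet_size(sinad):
    """Largest M satisfying 2*fc*log2(M) <= fc*log2(SINAD), i.e. M <= sqrt(SINAD)."""
    return sqrt(sinad)

def required_sinad(m):
    """SINAD (as a power ratio) needed to support an m-symbol alphabet: m**2."""
    return m ** 2

def to_db(ratio):
    # Convert a power ratio to decibels.
    return 10 * log10(ratio)

print(max_alphabet_size(1000))                         # ~31.6 at SINAD = 30 dB
print(required_sinad(32), to_db(required_sinad(32)))   # 1024, ~30.1 dB
```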
