
Information theory




Presentation Transcript


  1. Information theory: Cryptography and Authentication. A.J. Han Vinck, Essen, 2003

  2. Cryptographic model (diagram): sender → encrypt M → channel → decrypt M → receiver. The attacker threatens secrecy by trying to read M or find the key, and authentication (the sender signs, the receiver tests validity) by trying to modify or generate messages.

  3. General (classical) communication model (diagram): source → M → encrypter → C → decrypter → M → destination. The key K reaches both sides over a secure key channel; the analyst taps C and produces an estimate M′.

  4. No-information-providing ciphers. Shannon's (1949) perfect secrecy condition: the message distribution is unchanged by observing the cipher, P(M) = P(M|C), and thus H(M|C) = H(M) (there is no gain in guessing the message when given the cipher).
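A small numerical check of this condition (a sketch, not part of the slides): for a one-bit one-time pad with a deliberately biased message prior, the posterior P(M|C) still equals the prior P(M).

```python
from fractions import Fraction
from itertools import product

# One-time pad over bits: C = M XOR K, with K uniform.
# The biased message prior is an illustrative assumption.
p_m = {0: Fraction(3, 4), 1: Fraction(1, 4)}
p_k = {0: Fraction(1, 2), 1: Fraction(1, 2)}

# Joint distribution over (M, C).
joint = {}
for m, k in product(p_m, p_k):
    c = m ^ k
    joint[(m, c)] = joint.get((m, c), Fraction(0)) + p_m[m] * p_k[k]

# For every ciphertext c, P(M|C=c) equals the prior P(M).
for c in (0, 1):
    p_c = sum(p for (m, cc), p in joint.items() if cc == c)
    for m in p_m:
        assert joint[(m, c)] / p_c == p_m[m]
print("P(M|C) == P(M) for all M, C: perfect secrecy")
```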

  5. Perfect secrecy condition. Furthermore, perfect secrecy requires H(M) ≤ H(K):
  H(M|C) ≤ H(M,K|C) = H(K|C) + H(M|C,K) = H(K|C) ≤ H(K),
  where H(M|C,K) = 0 because C and K together determine M. Hence, with perfect secrecy, H(M) = H(M|C) ≤ H(K).

  6. Imperfect secrecy. How much ciphertext do we need before the key-message pair consistent with the cipher becomes unique? The minimum amount is called the unicity distance.

  7. Imperfect secrecy. Suppose we observe a piece of ciphertext C^L of L symbols, produced from key K (entropy H(K)) and source message M^L (entropy H(M^L)); K and M^L determine C^L, and H(C^L) ≤ L·log2|C|. The key equivocation is H(K|C^L) = H(K, C^L) − H(C^L).

  8. Question: when is H(K|C^L) = 0?
  H(K|C^L) = H(K) + H(C^L|K) − H(C^L) = H(K) + H(M^L) − H(C^L), since given K the ciphertext C^L and the message M^L determine each other.
  Let H_S(M) be the source entropy per output symbol and let U be the least value of L such that H(K|C^L) = 0. Using H(C^L) ≤ L·log2|C| and H(M^L) ≈ L·H_S(M):
  U ≥ H(K) / [log2|C| − H_S(M)].  (K and M^L determine C^L.)

  9. Conclusion: make H_S(M) as large as possible: USE DATA REDUCTION!
  (Plot: the key equivocation H(K|C^L) decreases from H(K) to 0 as L grows; the point L = U ≈ H(K) / [log2|C| − H_S(M)] where it reaches 0 is called the unicity point.)

  10. Examples of U ≈ H(K) / [log2|C| − H_S(M)]:
  Substitution cipher: H(K) = log2 26! ≈ 88 bits; for English, H_S(M) ≈ 2 and |M| = |C| = 26, so U ≈ 32 symbols.
  DES: U ≈ 56 / (8 − 2) ≈ 9 ASCII symbols.
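Both example values follow from the formula; a quick numerical check (a sketch; 26! and the English estimate H_S(M) ≈ 2 are the slide's values):

```python
import math

def unicity(H_K, cipher_alphabet_size, H_S):
    """Unicity distance bound U ~ H(K) / (log2|C| - H_S(M))."""
    return H_K / (math.log2(cipher_alphabet_size) - H_S)

# Substitution cipher on English text: the key is a permutation of 26 letters.
H_K_sub = math.log2(math.factorial(26))   # ~ 88.4 bits
print(unicity(H_K_sub, 26, 2))            # ~ 32-33 symbols

# DES on ASCII text: 56-bit key, 8 bits per cipher symbol, H_S(M) ~ 2.
print(unicity(56, 256, 2))                # ~ 9 symbols
```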

  11. More examples of U ≈ H(K) / [log2|C| − H_S(M)]:
  Permutation cipher with period 26: H(K) = log2 26!; for English, H_S(M) ≈ 2 and |M| = |C| = 26, so U ≈ 32 symbols.
  Vigenère with key length 80: U ≈ 140!

  12. Plaintext-ciphertext attack. Expanding H(K, M^L, C^L) in two ways:
  H(K, M^L, C^L) = H(K|M^L, C^L) + H(C^L|M^L) + H(M^L) = H(C^L|K, M^L) + H(K|M^L) + H(M^L).
  Since C^L is determined by K and M^L, H(C^L|K, M^L) = 0, and since K is independent of M^L, H(K|M^L) = H(K). Thus H(K|M^L, C^L) = H(K) − H(C^L|M^L), and with H(C^L|M^L) ≤ L·log2|C|:
  U ≥ H(K) / log2|C|.
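Plugging the DES parameters from slide 10 into this known-plaintext bound gives a much shorter unicity distance (an illustrative computation, not a value from the slides):

```python
import math

# Known-plaintext unicity bound U >= H(K) / log2|C|, with the DES
# parameters used earlier: H(K) = 56 bits, 8-bit ASCII cipher symbols.
U = 56 / math.log2(256)
print(U)  # -> 7.0 symbols, versus ~9 in the ciphertext-only case
```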

  13. Wiretapping model (diagram): the sender transmits X^n over a noiseless channel to the receiver; the wiretapper sees Z^n, a copy of X^n corrupted by noise with bit-error probability p.
  Send one bit S as n binary digits: S = 0 → X^n of even weight; S = 1 → X^n of odd weight.
  Wiretapper: Pe = P(Z^n has an odd number of errors) = ½(1 − (1 − 2p)^n).

  14. Wiretapping: Pe = ½(1 − (1 − 2p)^n).
  Result: for p → ½, Pe → ½ and H(S|Z^n) → 1; for p → 0, Pe ≈ np and H(S|Z^n) ≈ h(np).
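The closed form for Pe can be checked against a direct sum over odd numbers of channel errors (a sketch):

```python
import math

def p_error_closed(p, n):
    # Pe = 1/2 * (1 - (1 - 2p)^n): probability of an odd number of errors.
    return 0.5 * (1 - (1 - 2 * p) ** n)

def p_error_direct(p, n):
    # Sum the binomial probabilities of all odd error counts.
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(1, n + 1, 2))

# The two expressions agree for a range of (p, n).
for p in (0.001, 0.1, 0.5):
    for n in (1, 5, 20):
        assert abs(p_error_closed(p, n) - p_error_direct(p, n)) < 1e-12

print(p_error_closed(0.5, 20))    # -> 0.5: the wiretapper learns nothing
print(p_error_closed(0.001, 20))  # ~ n*p = 0.02 for small p
```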

  15. Wiretapping, general strategy. Encoder: use a rate R = k/n error-correcting code C; carrier c ∈ {2^k codewords}; message m ∈ {2^{nh(p)} vectors usable as correctable noise}. Select c at random and transmit c ⊕ m.
  Note: 2^k · 2^{nh(p)} ≤ 2^n, so k/n ≤ 1 − h(p).

  16. Communication sender-receiver: transmitted = received = c ⊕ m. First decode c, then calculate (c ⊕ m) ⊕ c = m.
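Slides 15-16 can be made concrete with the smallest possible instance, the [3,1] repetition code (my choice for illustration; the slides treat a general rate-k/n code): the carrier is one of 2 codewords, and the message is one of the 4 correctable noise patterns of weight at most 1.

```python
import random

codewords = [(0, 0, 0), (1, 1, 1)]        # the 2^k carriers, k = 1
messages = [(0, 0, 0), (0, 0, 1),
            (0, 1, 0), (1, 0, 0)]         # the 4 correctable noise vectors

def xor(a, b):
    return tuple(x ^ y for x, y in zip(a, b))

def decode(y):
    # Minimum-distance decoding: the nearest codeword is the carrier.
    return min(codewords, key=lambda c: sum(xor(c, y)))

c = random.choice(codewords)              # carrier selected at random
for m in messages:
    y = xor(c, m)                         # transmit c XOR m
    c_hat = decode(y)                     # receiver first decodes the carrier,
    assert c_hat == c                     # which succeeds since m is correctable,
    assert xor(y, c_hat) == m             # then recovers m = y XOR c
print("all messages recovered")
```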

  17. Wiretapper: receives z = c ⊕ m ⊕ n′, where n′ is the wiretap channel noise.
  - First decode c; this is possible when m ⊕ n′ is decodable noise.
  - Then calculate (c ⊕ m ⊕ n′) ⊕ c = m ⊕ n′.
  The result m′ = m ⊕ n′ is one of 2^{nh(p′)} possible messages, since the number of noise sequences n′ is |n′| ≈ 2^{nh(p′)}.

  18. Wiretapping, general strategy. Result: information rate R = h(p).
  For small p′, c is decodable and H(S^k|Z^n) = nh(p′); for p′ → p, H(S^k|Z^n) → nh(p).
  (Plot: H(S^k|Z^n) rises with p′ up to nh(p) at p′ = p.)

  19. Wiretapping, general strategy (picture): in the space of 2^n vectors there are 2^k codewords; around each codeword sits a decoding region of volume 2^{nh(p)}, containing a smaller region of volume 2^{nh(p′)}.

  20. Authentication. Encryption table: message X and key K determine a unique cipher Y: (X, K) → Y.

  21. Authentication: impersonation. Pi is the probability that an injected cipher is valid; the attacker selects a cipher y at random. With P(key = i) = 1/4 and P(message = i) = 1/2:

  message:   0    1
  key 00:   00   10
  key 01:   01   00
  key 10:   11   01
  key 11:   10   11

  Every cipher appears in exactly two of the four key rows, so Pi(y = correct) = ½.
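The claim Pi = ½ for this table can be verified exhaustively (a sketch):

```python
from fractions import Fraction

# Encryption table from the slide: rows are keys, columns are messages 0 and 1.
table = {
    "00": ("00", "10"),
    "01": ("01", "00"),
    "10": ("11", "01"),
    "11": ("10", "11"),
}
p_key = Fraction(1, 4)  # keys are uniform

# Impersonation: an injected cipher y is accepted iff it lies in the row
# selected by the (unknown) key.
ciphers = {y for row in table.values() for y in row}
for y in sorted(ciphers):
    p_valid = sum(p_key for row in table.values() if y in row)
    assert p_valid == Fraction(1, 2)
print("every injected cipher is valid with probability 1/2")
```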

  22. Authentication: bound. Let |X| be the number of messages, |K| the number of keys, and |Y| the number of ciphers. Then Pi = Prob(random cipher is valid) ≥ |X|/|Y|: the probability of choosing one of the |X| valid entries in the row specified by the key.

  23. Cont'd. Since, given K, the pair (Y, K) determines X, we have H(X) = H(Y|K). An improved (Simmons) bound gives:
  Pi ≥ 2^{H(X)} / 2^{H(Y)} = 2^{H(Y|K) − H(Y)} = 2^{−I(Y;K)}.

  24. Cont'd. Pi ≥ 2^{−I(Y;K)} = 2^{H(K|Y) − H(K)}.
  For a low probability of success we want H(K|Y) = 0; for perfect secrecy we want H(K|Y) = H(K). Contradiction!

  25. Cont'd. With Prob(key = 0) = Prob(key = 1) = ½ and P(X = 0) = P(X = 1) = ½:

  X:         0    1          X:         0    1
  key 0:    00   01          key 0:    00   01
  key 1:    10   11          key 1:    01   00

  Left: prob. of success = ½, H(K|Y) = 0, no secrecy. Right: prob. of success = 1, H(K|Y) = 1, perfect secrecy.

  26. Authentication: impersonation. X ∈ {0, 1} with P(X=0) = P(X=1) = ½; K ∈ {0, 1} with P(K=0) = P(K=1) = ½. Cipher table:

  X:        0    1
  key 0:    0    1
  key 1:    1    2

  H(K) = 1; H(K|Y) = ½; P(Y = 0, 1, 2) = (¼, ½, ¼).
  Injecting y drawn according to P(Y) gives Pi = ½ + 2·¼·½ = 0.75; a uniformly random choice gives Pi = 2/3.
  Bound: Pi ≥ 2^{H(Y|K) − H(Y)} = 2^{1 − 1.5} = 2^{−0.5} ≈ 0.7.
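The three numbers on this slide (0.75, 2/3 and 2^{−0.5}) can be reproduced directly from the table (a sketch):

```python
import math
from fractions import Fraction

# Cipher table from the slide: Y = table[key][x]; keys and messages uniform.
table = {0: (0, 1), 1: (1, 2)}
half = Fraction(1, 2)

# Probability that an injected cipher y is accepted (valid under the true key).
def p_valid(y):
    return sum(half for row in table.values() if y in row)

# Injecting y with distribution P(Y) gives 3/4; a uniform y gives 2/3.
p_y = {0: Fraction(1, 4), 1: half, 2: Fraction(1, 4)}
assert sum(p_y[y] * p_valid(y) for y in p_y) == Fraction(3, 4)
assert sum(Fraction(1, 3) * p_valid(y) for y in p_y) == Fraction(2, 3)

# Simmons bound: Pi >= 2^(H(Y|K) - H(Y)) = 2^(1 - 1.5) ~ 0.707.
H_Y = -sum(float(p) * math.log2(p) for p in p_y.values())  # = 1.5
H_Y_given_K = 1.0  # given the key, Y is uniform over two values
print(2 ** (H_Y_given_K - H_Y))  # ~ 0.707
```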

  27. Authentication: substitution. Cipher table:

  message:   0    1
  key 0:     0    2
  key 1:     1    3
  key 2:     0    3
  key 3:     1    2

  Active wiretapping: replace an observed cipher by another cipher. Example: observe 0 and replace it by 3; probability of success = ½ (3 is accepted only if the key is 2).
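The substitution probability can be computed from the table by conditioning on the keys consistent with the observed cipher (a sketch):

```python
from fractions import Fraction

# Cipher table from the slide: cipher = table[key][message]; keys uniform.
table = {0: (0, 2), 1: (1, 3), 2: (0, 3), 3: (1, 2)}

# Substitution attack: observe cipher y, replace it by y2. The substitute is
# accepted iff it is valid under the true key, whose posterior is uniform
# over the keys whose row contains y.
def p_success(y, y2):
    keys = [k for k, row in table.items() if y in row]
    return Fraction(sum(1 for k in keys if y2 in table[k]), len(keys))

# The slide's example: observe 0, inject 3 -> accepted only if key = 2.
assert p_success(0, 3) == Fraction(1, 2)
print("P(success | observe 0, inject 3) = 1/2")
```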

  28. Authentication: substitution, examples. In each table H(K) = 2, H(K|Y) = 1 and Pi ≥ ½; Ps = probability that a substitution is successful.

  Table 1 (Ps = ½, H(X|Y) = 0):
  message:  0  1
  key 0:    0  2
  key 1:    1  3
  key 2:    0  3
  key 3:    1  2

  Table 2 (Ps = 1, H(X|Y) = 1):
  message:  0  1
  key 0:    0  3
  key 1:    1  2
  key 2:    2  1
  key 3:    3  0

  Table 3 (Ps = ½, H(X|Y) = 1):
  message:  0  1
  key 0:    0  2
  key 1:    1  0
  key 2:    3  1
  key 3:    2  3
