Information-Theoretic Secrecy

CSCI381 Fall 2005, GWU

Reference: Stinson

  • Probability theory: Bayes’ theorem
  • Perfect secrecy: definition, some proofs, examples: one-time pad; simple secret sharing
  • Entropy: definition, Huffman coding property, unicity distance

Bayes’ Theorem

If Pr[y] > 0 then Pr[x|y] = Pr[x] Pr[y|x] / ∑x′∈X Pr[x′] Pr[y|x′]

What is the probability that the first die roll is 2, given that the sum of the two rolls is 5?

What is the probability that the second die roll is 3, given that the product of the two rolls is 6? Given that it is 5?
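Both questions can be checked by enumerating the 36 equally likely outcomes; a minimal sketch in Python (mine, not from the slides):

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely (d1, d2) pairs

def pr(event, given):
    """Pr[event | given], by counting equally likely outcomes."""
    cond = [r for r in rolls if given(r)]
    return Fraction(sum(1 for r in cond if event(r)), len(cond))

print(pr(lambda r: r[0] == 2, lambda r: r[0] + r[1] == 5))   # 1/4
print(pr(lambda r: r[1] == 3, lambda r: r[0] * r[1] == 6))   # 1/4
print(pr(lambda r: r[1] == 3, lambda r: r[0] * r[1] == 5))   # 0
```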

(Im)Perfect Secrecy: Example

P = {1, 2, 3}, K = {K1, K2, K3}, and C = {2, 3, 4, 5, 6} (the slide’s encryption table is not reproduced in this transcript)

Keys chosen equiprobably

Pr[1] = Pr[2] = Pr[3] = 1/3

Pr[c=3] = ?

Pr[m|c=3] = ?

Pr[k|c=3] = ?
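Since the encryption table was a slide figure, the sketch below assumes the shift rule eKi(x) = x + i, which does produce C = {2, …, 6}; treat it as a stand-in for the slide’s actual table. Brute-force enumeration then answers all three questions:

```python
from fractions import Fraction
from itertools import product

P, K = (1, 2, 3), (1, 2, 3)        # key Ki identified with the shift amount i
enc = lambda k, m: m + k           # assumed rule: eKi(x) = x + i
third = Fraction(1, 3)             # keys and messages equiprobable

pr_c = {}                          # Pr[c] = sum over (k, m) pairs mapping to c
for k, m in product(K, P):
    pr_c[enc(k, m)] = pr_c.get(enc(k, m), Fraction(0)) + third * third

print(pr_c[3])                                      # Pr[c = 3] = 2/9
for m in P:                                         # Bayes: Pr[m | c = 3]
    num = sum(third * third for k in K if enc(k, m) == 3)
    print(m, num / pr_c[3])                         # 1/2, 1/2, 0

# The posterior (1/2, 1/2, 0) differs from the prior (1/3, 1/3, 1/3),
# so this cryptosystem does not have perfect secrecy; Pr[k | c = 3]
# works out the same way by symmetry.
```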

(Im)Perfect Secrecy: Example (continued)

How should the above ciphers be changed to improve the cryptosystem?

What defines a good cryptosystem?

(Im)Perfect Secrecy Example: Latin Square

Assume all keys and messages equiprobable

What’s good about this?

P(k|c)

P(m|c)
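The Latin square itself appeared as a figure on the slide. Taking a representative 3×3 square (an assumption, not the slide’s figure), enumeration verifies that every posterior equals the prior:

```python
from fractions import Fraction

# Rows = keys, columns = messages, entry = ciphertext; each ciphertext
# appears exactly once in every row and every column (a Latin square).
square = {(k, m): (k + m) % 3 + 1 for k in range(3) for m in range(3)}

third = Fraction(1, 3)                       # keys and messages equiprobable
for c in (1, 2, 3):
    pr_c = sum(third * third for cc in square.values() if cc == c)
    for m in range(3):
        num = sum(third * third for k in range(3) if square[(k, m)] == c)
        assert num / pr_c == third           # P(m|c) = P(m) = 1/3
        # symmetrically, P(k|c) = P(k) = 1/3 for every key
print("posterior = prior for every (m, c) pair: perfect secrecy")
```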

Perfect Secrecy: Definition

A cryptosystem has perfect secrecy if

Pr[x|y] = Pr[x] ∀ x ∈ P, y ∈ C

a posteriori probability = a priori probability (posterior = prior)

Example: one-time pad

P = C = K = (Z2)^n

eK(x1, x2, …, xn) = (x1+K1, x2+K2, …, xn+Kn) mod 2, and decryption dK is the same map (adding K twice mod 2 returns x)

Show that it provides perfect secrecy
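A minimal sketch in Python (an illustration, not the course’s code): encryption and decryption are the same bitwise XOR, and for a uniform key the ciphertext is uniform regardless of the plaintext, which is exactly the perfect-secrecy condition.

```python
import secrets

def xor(a, b):
    """Componentwise addition mod 2 of two equal-length bit tuples."""
    return tuple(x ^ y for x, y in zip(a, b))

n = 8
x = (1, 0, 1, 1, 0, 0, 1, 0)                      # plaintext in (Z2)^n
K = tuple(secrets.randbits(1) for _ in range(n))  # uniform one-time key

y = xor(x, K)               # eK(x) = x + K mod 2
assert xor(y, K) == x       # dK is the same map: (x + K) + K = x mod 2

# Perfect secrecy intuition: for fixed x, as K ranges uniformly over
# (Z2)^n, y = x XOR K is itself uniform, so Pr[x | y] = Pr[x].
```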

Some proofs: Thm. 2.4

Thm 2.4: Suppose (P, C, K, E, D) is a cryptosystem where |K| = |P| = |C|. Then the cryptosystem provides perfect secrecy if and only if every key is used with equal probability 1/|K| and, for every x ∈ P and y ∈ C, there is a unique key K such that eK(x) = y

(e.g., the Latin square example above)

Entropy

H(X) = −∑i pi log2 pi

Example: pi = 1/n (uniform), giving H(X) = log2 n

Examples: ciphertext and plaintext entropies for the earlier cryptosystem examples.
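A small helper for those computations (mine, not from the slides); the second call uses the ciphertext distribution of the shift rule assumed earlier:

```python
from math import log2

def entropy(ps):
    """H(X) = -sum pi log2 pi, with 0 log 0 taken as 0."""
    return -sum(p * log2(p) for p in ps if p > 0)

print(entropy([1/4] * 4))     # uniform pi = 1/n with n = 4: log2 4 = 2 bits

# Ciphertext distribution Pr[c] for c = 2..6 under the assumed shift rule:
print(entropy([1/9, 2/9, 3/9, 2/9, 1/9]))   # ~2.20 bits
print(entropy([1/3] * 3))                   # plaintext entropy: log2 3 ~ 1.58 bits
```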

Huffman encoding

f: X* → {0, 1}*

String of random variables to string of bits

e.g., X = {a, b, c, d}; f(a) = 1, f(b) = 10, f(c) = 100, f(d) = 1000

Huffman encoding algorithm

X = {a, b, c, d, e} with p(a) = 0.05, p(b) = 0.1, p(c) = 0.12, p(d) = 0.13, p(e) = 0.6

Resulting code: a: 000, b: 001, c: 010, d: 011, e: 1

Average length = ?

Entropy = ?
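To answer both questions, here is a minimal Huffman construction in Python (a sketch, not the course’s code); the exact codewords depend on tie-breaking, but the average length does not:

```python
import heapq
from math import log2

def huffman(probs):
    """Return {symbol: codeword} for a Huffman code of the given pmf."""
    # Heap entries: (probability, tiebreak id, partial code table).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)      # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next_id, merged))
        next_id += 1
    return heap[0][2]

probs = {"a": 0.05, "b": 0.1, "c": 0.12, "d": 0.13, "e": 0.6}
code = huffman(probs)
print(code)                                               # a:000 b:001 c:010 d:011 e:1
print(sum(probs[s] * len(w) for s, w in code.items()))    # average length = 1.8
print(-sum(p * log2(p) for p in probs.values()))          # entropy ~ 1.74 bits
```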

Theorem

H(X) ≤ average length of Huffman encoding ≤ H(X) + 1

Without proof

Properties of Entropy
  • H(X) ≤ log2 n
  • H(X, Y) ≤ H(X) + H(Y)
  • H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)

where H(X|Y) = −∑x ∑y p(y) p(x|y) log2 p(x|y)

  • H(X|Y) ≤ H(X)

With proofs and examples
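As a numeric illustration (not from the slides), the chain rule and properties 2 and 4 can be checked on the (message, ciphertext) pair of the shift example assumed earlier:

```python
from math import log2
from itertools import product

# Joint pmf of (message, ciphertext) under the assumed shift rule c = m + k,
# with keys and messages uniform on {1, 2, 3}.
joint = {}
for k, m in product((1, 2, 3), repeat=2):
    joint[(m, m + k)] = joint.get((m, m + k), 0) + 1/9

def H(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

pM, pC = {}, {}                      # marginals
for (m, c), p in joint.items():
    pM[m] = pM.get(m, 0) + p
    pC[c] = pC.get(c, 0) + p

HM, HC, HMC = H(pM.values()), H(pC.values()), H(joint.values())
HM_given_C = -sum(p * log2(p / pC[c]) for (m, c), p in joint.items())

assert abs(HMC - (HC + HM_given_C)) < 1e-12   # H(X, Y) = H(Y) + H(X|Y)
assert HM_given_C <= HM + 1e-12               # conditioning cannot increase entropy
assert HMC <= HM + HC + 1e-12                 # subadditivity
```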

Theorem

H(K|C) = H(K) + H(P) – H(C)

Examples: the previous (im)perfect secrecy squares

Proof: C = eK(P) is determined by (K, P), and P = dK(C) is determined by (K, C), so H(K, P, C) = H(K, P) = H(K, C). Since K and P are independent, H(K|C) = H(K, C) − H(C) = H(K, P) − H(C) = H(K) + H(P) − H(C)
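A numeric sanity check of the theorem, again under the assumed shift rule (an illustration rather than the slides’ own example):

```python
from collections import Counter
from math import log2
from itertools import product

# Triples (key, message, ciphertext) under the assumed shift rule c = m + k;
# all 9 triples are equiprobable (Pr = 1/9).
triples = [(k, m, m + k) for k, m in product((1, 2, 3), repeat=2)]

def H(values):
    """Entropy of the distribution induced by the 9 equiprobable triples."""
    return -sum((c / 9) * log2(c / 9) for c in Counter(values).values())

HK, HP, HC = (H(t[i] for t in triples) for i in (0, 1, 2))
HKC = H((t[0], t[2]) for t in triples)            # H(K, C)
HK_given_C = HKC - HC                             # H(K|C) = H(K, C) - H(C)

assert abs(HK_given_C - (HK + HP - HC)) < 1e-12   # matches the theorem
print(HK_given_C)                                 # ~0.97 bits of key equivocation
```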

Language Entropy and Redundancy

HL = lim n→∞ H(Pn)/n

(lies between 1 and 1.5 bits per letter for English)

RL = 1 – HL /log2 |P|

(the amount of “space” in a letter of English for other information)

Need, on average, about n ciphertext characters to break a substitution cipher where:

n = H(K) / (RL log2 |P|), where H(K) is the key entropy

n is the “unicity distance” of the cryptosystem
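Plugging in numbers for a simple substitution cipher over English, with HL taken as 1.25 bits per letter (an assumed value inside the slide’s 1 to 1.5 range), gives the classic estimate of roughly 25 ciphertext letters:

```python
from math import factorial, log2

HK = log2(factorial(26))        # key entropy of a substitution cipher: ~88.4 bits
HL = 1.25                       # assumed entropy of English, bits per letter
RL = 1 - HL / log2(26)          # redundancy of English: ~0.73

n = HK / (RL * log2(26))        # unicity distance
print(round(n, 1))              # ~25.6 ciphertext characters
```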

Proof
  • H(K|Cn) = H(K) + H(Pn) – H(Cn)
  • With H(Pn) ≈ n HL = n(1 − RL) log2 |P| and H(Cn) ≤ n log2 |P|, this gives H(K|Cn) ≈ H(K) − n RL log2 |P|, which reaches zero at n ≈ H(K) / (RL log2 |P|)
