
Key Derivation from Noisy Sources with More Errors Than Entropy


Presentation Transcript


  1. Benjamin Fuller. Joint work with Ran Canetti, Omer Paneth, and Leonid Reyzin. May 5, 2014. Key Derivation from Noisy Sources with More Errors Than Entropy

  2. Authenticating Users • Users’ private data exists online in a variety of locations • We must authenticate users before granting access to private data • Passwords are widely used but guessable • Are there alternatives to passwords with high entropy (uncertainty)?

  3. Key Derivation from Noisy Sources • Physically Unclonable Functions (PUFs) [PappuRechtTaylorGershenfeld02] • Biometric data [Daugman04] • Entropic sources are noisy: the source differs over time, with first reading w and later reading x, and the distance is bounded, d(w, x) ≤ dmax • Goal: derive a stable and strong key from the noisy source • w and x should map to the same key • Different samples from the source should produce independent keys: Gen(w) ≠ Gen(w’)

  4. Fuzzy Extractors • Fuzzy extractors derive reliable keys from noisy data [DodisOstrovskyReyzinSmith04,08] (interactive setting in [BennettBrassardRobert88]) • Assume our source is strong; traditionally, this means high entropy • Goals — Correctness: Gen and Rep give the same key if d(w, x) ≤ dmax; Security: (key, p) ≈ (U, p), which can be statistical or computational [FullerMengReyzin13] • [Diagram: Generate takes the reading w and outputs key and a public value p; Reproduce takes a later reading x and p and outputs key]

  5. Fuzzy Extractors • Traditional construction: derive the key using a randomness extractor Ext • Ext converts high-entropy sources to uniform: if H∞(W0) ≥ k then Ext(W0) ≈ U • [Diagram: Generate and Reproduce each apply Ext to their reading to produce key]
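To make the extractor step concrete, here is a minimal Python sketch of a randomness extractor built from multiplication by a random binary matrix, which is a universal hash family to which the leftover hash lemma applies. The function name and the parameter choices are illustrative, not taken from the talk.

    import secrets

    def extract(w_bits, seed_matrix):
        # Inner product of each seed row with the source bits, mod 2:
        # multiplication by a random binary matrix is a universal hash,
        # so the output is close to uniform when w has enough min-entropy.
        return [sum(r & b for r, b in zip(row, w_bits)) % 2 for row in seed_matrix]

    # Example: derive a 128-bit key from a 1024-bit reading.
    n, out_len = 1024, 128
    w = [secrets.randbelow(2) for _ in range(n)]                               # stand-in for the source reading
    seed = [[secrets.randbelow(2) for _ in range(n)] for _ in range(out_len)]  # public random seed
    key = extract(w, seed)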

  6. Fuzzy Extractors • Traditional construction, continued: first error-correct back to w with a secure sketch, then derive the key with the randomness extractor • [Diagram: Generate runs Sketch and Ext on w, outputting p and key; Reproduce runs Rec on (x, p) to recover w, then Ext to get key]

  7. Error-Correcting Codes • A code is a subset C of the metric space • For ec1, ec2 in C, d(ec1, ec2) > 2dmax • Decoding: for any point ec’, find the closest codeword ec1 in C • Linear codes: C is the span of an expanding generating matrix G • [Diagram: codewords ec1 and ec2 at distance more than 2dmax; a nearby point ec’ decodes to ec1]

  8. Secure Sketches • Code-Offset Sketch: choose a random codeword ec = Gc and publish p = ec ⊕ w, where G is the generating matrix of a code that corrects dmax errors • [Diagram: Generate computes the sketch p and key = Ext(w)]

  9. Secure Sketches • Reproduce decodes ec’ = Dec(p ⊕ x); if w and x are close, then w = ec’ ⊕ p • [Diagram: Rec computes p ⊕ x, decodes to ec’, recovers w, and Ext gives key]
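Below is a minimal sketch of the code-offset construction over GF(2), using a simple repetition code as a stand-in for a real code with generating matrix G; all names and parameters are illustrative.

    import secrets

    REP = 5  # repetition factor: each block tolerates up to 2 flipped bits

    def encode(msg):                     # "generating matrix" view: repeat every message bit REP times
        return [b for b in msg for _ in range(REP)]

    def decode(cw):                      # nearest codeword = majority vote within each block
        return [int(sum(cw[i*REP:(i+1)*REP]) > REP // 2) for i in range(len(cw) // REP)]

    def sketch(w):                       # Generate: p = ec XOR w for a random codeword ec
        ec = encode([secrets.randbelow(2) for _ in range(len(w) // REP)])
        return [e ^ b for e, b in zip(ec, w)]

    def recover(x, p):                   # Reproduce: ec' = Dec(p XOR x), then w = ec' XOR p
        ec = encode(decode([a ^ b for a, b in zip(p, x)]))
        return [e ^ b for e, b in zip(ec, p)]

    w = [secrets.randbelow(2) for _ in range(40)]
    p = sketch(w)
    x = list(w); x[7] ^= 1               # a noisy re-reading differing in one position
    assert recover(x, p) == w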

  10. Secure Sketches • Ext must be able to extract from distributions where w is unknown given p: this is the (k − k’) entropy loss • [Diagram: Generate/Reproduce with the code-offset sketch p = ec ⊕ w]

  11. Entropy Loss From Fuzzy Extractors • Entropy is at a premium for physical sources: iris ≈ 249 bits [Daugman1996], fingerprint ≈ 82 bits [RathaConnellBolle2001], passwords ≈ 31 bits [ShayKomanduri+2010] • Fuzzy extractors have two losses: secure sketches lose the error-correcting capability of the code (k − k’), roughly 200 bits at the iris error rate; randomness extractors lose 2·log(1/ε), between 60 and 100 bits • After these losses the key may be too short to be useful (30-60 bits), and there may not be any key left at all!

  12. Entropy Loss From Fuzzy Extractors • Can we eliminate either of these entropy losses? • [DodisOstrovskyReyzinSmith]: a secure sketch yields a code correcting random errors, which means k − k’ ≥ log |Bdmax| (the ball of radius dmax)

  13. Error Tolerance and Security at Odds • The adversary shouldn’t be able to guess an x* where d(w, x*) ≤ dmax; this gets easier as dmax increases • Consider a source W whose initial readings w (for different physical devices) are close together • If there is a point x* close to all points in W, no security is possible: any input to Rep in the ball around w produces key • [Diagram: metric space M with a ball of radius dmax around w]

  14. Error Tolerance and Security at Odds • By providing x* to Rep, the adversary always learns key • Let Bdmax denote a ball of radius dmax: there is a W supported entirely inside one such ball, so its entropy can be as large as log |Bdmax| while no security is possible

  15. Error Tolerance and Security at Odds • Call the entropy that remains after accounting for log |Bdmax| the minimum usable entropy, Husable(W)

  16. Minimum Usable Entropy • Standard fuzzy extractors provide worst-case security guarantees, which implies |key| ≤ Husable(W) • Many sources have no minimum usable entropy: irises are thought to be the “best” biometric, yet for irises Husable(W) ≈ -707 • We need a property other than entropy to secure these sources (e.g., that points are not close together) • Can we find reasonable properties and accompanying constructions?

  17. Hamming Metric • Security parameter n • Sources W = W1,…, Wk of k symbols, each symbol Wi over an alphabet Z (which grows with n) • d(w, x) = number of symbol positions in which w and x differ • [Diagram: example readings w and x with d(w, x) = 4]
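For concreteness, the symbol-wise Hamming distance used throughout is just the following (a sketch that models symbols over the alphabet Z as arbitrary Python values):

    def hamming(w, x):
        # Number of symbol positions in which the two readings differ.
        assert len(w) == len(x)
        return sum(wi != xi for wi, xi in zip(w, x))

    assert hamming(list("abcdefgh"), list("aXcXeXgX")) == 4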

  18. Results • Security relies on point obfuscation (secure under strong vector DDH [BitanskyCanetti10])

  19. Point Obfuscation • An obfuscator transforms a program I into a “black box” [BarakGoldreichImpagliazzoRudichSahaiVadhanYang01] • Possible for point programs; we need a version achievable under number-theoretic assumptions, due to [BitanskyCanetti10]

  20. Point Obfuscation • An obfuscator transforms a program I into a “black box” [BarakGoldreichImpagliazzoRudichSahaiVadhanYang01] • Possible for point programs [Canetti97] • We use a strong version achievable under number-theoretic assumptions (composable virtual gray-box obfuscation [BitanskyCanetti10])

  21. Point Obfuscation • An obfuscator transforms a program I into a “black box” [BarakGoldreichImpagliazzoRudichSahaiVadhanYang01] • Possible for point programs [Canetti97] • We need a strong version achievable under strong vector DDH (composable virtual gray-box obfuscation [BitanskyCanetti10]) • [Diagram: obfuscated point program for w]
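The interface of a point obfuscation can be illustrated with a toy hash-based sketch. This is only a random-oracle-style heuristic, not the DDH-based virtual gray-box construction of [BitanskyCanetti10] that the talk relies on, and the function name is hypothetical.

    import hashlib, os

    def obfuscate_point(w: bytes):
        # Store only a salted hash of w; the returned program accepts exactly w.
        # Toy random-oracle heuristic, NOT the [BitanskyCanetti10] construction.
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + w).digest()
        def program(x: bytes) -> bool:
            return hashlib.sha256(salt + x).digest() == digest
        return program

    P = obfuscate_point(b"first reading of the source")
    assert P(b"first reading of the source") and not P(b"some other reading")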

  22. Construction Attempt #1 • Hide w using obfuscation: p is an obfuscated point program for w • Rep can check whether x = w without revealing w • Two problems: no key, and no error tolerance • [Diagram: Generate publishes the obfuscation of w; Reproduce runs it on x and gets 1/0]

  23. Construction Attempt #2 • Obfuscate each symbol separately (recall w = w1,…, wk) • Can now learn which symbols match • Still two problems: no key, and no error tolerance

  24. Construction Attempt #2 • [Diagram: Generate publishes obfuscations of w1,…, wk; Reproduce runs each on the corresponding symbol of x and gets 1/0 per symbol]

  25. Construction Attempt #2 • Knowing where errors occur is useful in coding theory • To get a key, leverage a technique from point obfuscation

  26. • Can specify the output of a point function [CanettiDakdouk08]: the obfuscated program returns c when its input equals w • Let’s try this on our construction • [Diagram: point program for w with output c]

  27. Construction Attempt #3 • Knowing where errors occur is useful in coding theory • For each symbol i, flip a random bit ci, then obfuscate the point program for wi with output ci

  28. Construction Attempt #3 • [Diagram: Generate samples the bits c1,…, ck]

  29. Construction Attempt #3 • [Diagram: each obfuscated symbol wi carries its output bit ci; together the obfuscations form p]

  30. Construction Attempt #3 • Reproduce can run the obfuscations on x and recover most bits of c

  31. Construction Attempt #3 • [Diagram: bits ci are recovered at positions where xi = wi]

  32. Construction Attempt #3 • [Diagram: bits at non-matching positions remain unknown]

  33. Construction • Sample a codeword c ∈ C from a binary error-correcting code • For each symbol i, obfuscate the point program for wi with output ci

  34. Construction • Reproduce runs the obfuscations, recovers most bits of c, and decodes

  35. Construction • Use c as the output (run c through a computational extractor [Krawczyk10] to create key)
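Putting the pieces together, here is a minimal end-to-end sketch of the Gen/Rep flow just described: sample a codeword c, hide each bit ci behind a point program for wi, then recover the bits at matching positions and decode. The hash-based obf helper is only a toy stand-in for the composable virtual gray-box point obfuscation the construction actually requires, the repetition code stands in for a code correcting Θ(k) errors, and all names and parameters (K, REP, etc.) are illustrative; as the slide says, the real key would come from running c through a computational extractor.

    import hashlib, os, secrets

    K, REP = 20, 5                                  # number of symbols, repetition factor (illustrative)

    def H(*parts):                                  # hash helper (random-oracle heuristic)
        return hashlib.sha256(b"|".join(parts)).digest()

    def obf(wi, ci):
        # Toy obfuscated point program with one-bit output: reveals ci only on input wi.
        # Stand-in for the VGB point obfuscation the construction really needs.
        salt = os.urandom(16)
        return (salt, H(salt, wi), ci ^ (H(salt, b"pad", wi)[0] & 1))

    def run_obf(prog, xi):
        salt, tag, masked = prog
        if H(salt, xi) != tag:
            return None                             # wrong symbol: the bit stays hidden
        return masked ^ (H(salt, b"pad", xi)[0] & 1)

    def gen(w):
        msg = [secrets.randbelow(2) for _ in range(K // REP)]
        c = [b for b in msg for _ in range(REP)]    # codeword of a repetition code
        p = [obf(wi, ci) for wi, ci in zip(w, c)]
        return c, p                                 # the key would be a comp. extractor applied to c

    def rep(x, p):
        bits = [run_obf(prog, xi) for prog, xi in zip(p, x)]
        c = []
        for i in range(K // REP):                   # decode: majority vote over the recovered bits
            block = [b for b in bits[i*REP:(i+1)*REP] if b is not None]
            c += [int(sum(block) * 2 > len(block))] * REP
        return c

    w = [os.urandom(4) for _ in range(K)]           # K symbols over a large alphabet
    c, p = gen(w)
    x = list(w); x[0] = os.urandom(4)               # noisy re-reading: one symbol differs
    assert rep(x, p) == c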

  36. Correctness and Security • Correctness: Reproduce recovers all but at most d(w, x) ≤ dmax bits of c, and there exist binary error-correcting codes with error tolerance Θ(k) • Security question: what about w and c is revealed by the obfuscations?

  37. What is revealed by the obfuscations? • Need to argue that the adversary learns little through equality-oracle queries to individual symbols • Enough to argue that, with overwhelming probability, the adversary’s queries are rejected; that is, it rarely guesses a stored value wi

  38. Block Unguessable Distributions • Let A be an algorithm asking polynomially many queries of the form: is wi = xi? • Def: W = W1,…, Wk is block unguessable if there exists a set J of block indices such that no such A guesses wj for any j ∈ J, except with negligible probability • Caution: adaptivity is crucial; there are distributions with high overall entropy that can be guessed using equality queries to individual blocks

  39. Block Unguessable: Proceed with Caution • An adversary can guess “easy” blocks and use the gained information to guess the next block • [Diagram: blocks W1, W2,…, Wk queried one after another]

  40. Block Unguessable Distributions • Positive examples: block-fixing sources [KampZuckerman07]; sources whose blocks are independent and many are entropic; sources where all blocks are entropic

  41. Security • Thm: when the source is block unguessable, C has computational entropy (convertible to a pseudorandom key by a computational extractor)

  42. Security • Thm: when the source is block unguessable, C has computational entropy

  43. Security • Thm: when the source is block unguessable, C has log |C| − (k − |J|) bits of computational entropy: the size of the code minus the number of “guessable” positions

  44. Security • Note: in the computational setting the size of the key isn’t as crucial, since it can be expanded by a computational extractor
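To see what the bound gives, consider a purely hypothetical parameter setting (these numbers are not from the talk): k = 1000 symbols, a binary code with |C| = 2^500 codewords, and an unguessable set of size |J| = 900. Then c retains about log |C| − (k − |J|) = 500 − 100 = 400 bits of computational entropy, which a computational extractor can stretch into a full-length pseudorandom key.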

  45. Error Tolerance and Security at Odds • The adversary shouldn’t be able to guess an x* with d(w, x*) ≤ dmax • A block unguessable distribution has more unguessable symbols than the code corrects errors • So there is at least one symbol the adversary would have to guess • Security comes from the adversary’s inability to guess this one symbol

  46. Error Tolerance and Security at Odds • [Diagram: ball of radius dmax in the metric space M]

  47. Results • The construction handles sources with Husable ≤ 0, provided |Z| = ω(poly(n)) and C corrects Θ(k) errors • [Diagram: the full Generate/Reproduce construction]

  48. Reducing Required Entropy • Obfuscating symbols individually leaks equality; entropy ensures A can’t guess the stored values • Can we reduce the necessary entropy if we obfuscate multiple symbols together? • Obfuscating all symbols together works but eliminates error tolerance

  49. Reducing Required Entropy • Instead of having symbols and obfuscations in 1-1 correspondence, introduce a level of indirection • Create a random bipartite graph between symbols and obfuscations (published in p) • Each obfuscation has degree α

  50. Reducing Required Entropy • [Diagram: symbols w1, w2,…, wk wired to obfuscations carrying the bits c1, c2,…, ck]
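A minimal sketch of the indirection step: publish, as part of p, a random bipartite graph in which each obfuscation is wired to α symbol positions. The sampling routine and its parameter names are illustrative; the talk only specifies a random bipartite graph of degree α.

    import secrets

    def sample_graph(k, num_obf, alpha):
        # Random bipartite graph: obfuscation j is wired to alpha distinct
        # symbol positions; the whole graph is public (it is part of p).
        edges = []
        for _ in range(num_obf):
            chosen = set()
            while len(chosen) < alpha:
                chosen.add(secrets.randbelow(k))
            edges.append(sorted(chosen))
        return edges

    # Obfuscation j would then hide the concatenation of the symbols on its
    # edge list (w_i for i in graph[j]) with output bit c_j, rather than a single symbol.
    graph = sample_graph(k=1000, num_obf=200, alpha=3)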
