
Linux Pseudo Random Number Generator (LPRNG)




Presentation Transcript


  1. Real-life cryptography Pfeiffer Alain Linux Pseudo Random Number Generator (LPRNG)

  2. Index • Types of PRNGs • History • General Structure • User space • Entropy types • Initialization process • Building Blocks • Security requirements • Conclusion

  3. Types • Non-cryptographic deterministic: should not be used for security (e.g. Mersenne Twister) • Cryptographically secure: an algorithm with properties that make it suitable for use in cryptography (e.g. Fortuna) • Entropy inputs: produces bits non-deterministically, as the internal state is frequently refreshed with unpredictable data from one or several external entropy sources (the LPRNG)

  4. History • Part of the Linux kernel since 1994 • Written by Theodore Ts'o • Modified by Matt Mackall • About 1,700 lines of C code

  5. General Structure • Internal states: • Input pool (128 32-bit words = 4096 bits) • Blocking pool (32 32-bit words = 1024 bits) • Nonblocking pool (32 32-bit words = 1024 bits) • Output function: SHA-1 • Mixing function: linear mixing function ≠ hash • Entropy counter: • Decremented when bits are extracted • Incremented when new bits are collected

  6. User space • /dev/random • Reads from the blocking pool • Limits the number of generated bits • Blocks when there is not enough entropy • Resumes when new entropy reaches the input pool • /dev/urandom • Reads from the nonblocking pool • Generates random bits WITHOUT blocking • Writing data to it does NOT change the entropy counter! • get_random_bytes() • Kernel space • Reads random bytes from the nonblocking pool
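The user-space interfaces above can be exercised with a few lines of C. This is a minimal sketch (error handling cut down to the essentials), not kernel code; `get_urandom` is an illustrative helper name, not a real API:

```c
#include <stdio.h>

/* Fill buf with n bytes from the nonblocking pool via /dev/urandom.
 * Returns 0 on success, -1 on failure. The read never blocks. */
int get_urandom(unsigned char *buf, size_t n)
{
    FILE *f = fopen("/dev/urandom", "rb");
    if (!f)
        return -1;
    size_t got = fread(buf, 1, n, f);
    fclose(f);
    return got == n ? 0 : -1;
}
```

Reading /dev/random works the same way, except that the read may block until the blocking pool has been credited with enough entropy.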

  7. Entropy inputs • Backbone of security • Injected: • Into the generator for initialization • Through the updating mechanism • Usable independently • Does NOT rely on physical non-deterministic phenomena • Hardware RNGs • Available to user space • NOT mixed into the LPRNG • Entropy gathering daemon: • Collects the outputs • Feeds them into the LPRNG

  8. Entropy sources • Reliable entropy: • User inputs (keyboard, mouse) • Disk timings • Interrupt timings are NOT reliable: • Regular interrupts • Misuse of the "IRQF_SAMPLE_RANDOM" flag

  9. Entropy events • "num" value (type of event, 32 bits) • Mouse (12 bits) • Keyboard (8 bits) • Interrupts (4 bits) • Hard drive (3 bits) • CPU "cycle" count • Max: 32 bits • Avg: 15 bits • "jiffies" count (32 bits) • Kernel counter of timer interrupts (avg. 3–4 bits) • Frequency 100–1000 ticks/sec ⇒ The generator never assumes maximum entropy.

  10. Entropy Estimation Conditions • Unknown distribution: inputs vary a lot • Unknown correlation: correlations between inputs are likely • Large sample space: hard to keep track of 2^32 jiffies values • Limited time: estimation happens after interrupts, so it must be fast • Estimation at runtime: an estimation for every input! • Unknown knowledge of the attacker

  11. Initialization Not much entropy in the Linux boot process! • At shutdown: • Generate data from /dev/urandom • Save it into a file • At startup: • Write the saved data to /dev/random • Mix the data into: • Blocking pool • Nonblocking pool without changing the counter!

  12. Building Blocks • Mixing Function • Entropy Estimator • Output Function • Entropy Extraction

  13. Linear feedback shift register (diagram)

  14. 1. Mixing Function • Mixes one byte at a time • Extends it to a 32-bit word • Rotates it by 0–31 bits • Linear shifting (LFSR) into the pool ⇒ No entropy gets lost
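The per-byte mixing step can be sketched in C as follows. This is a hedged model of the idea, not the kernel's code: the tap offsets, the twist constant, and the rotation schedule are simplified stand-ins for the exact kernel constants.

```c
#include <stdint.h>

#define POOL_WORDS 128  /* input pool: 128 x 32-bit words */

static uint32_t pool[POOL_WORDS];
static unsigned add_ptr;      /* current insertion position */
static unsigned input_rotate; /* running rotation amount, 0..31 */

static uint32_t rol32(uint32_t w, unsigned r)
{
    return r ? (w << r) | (w >> (32 - r)) : w;
}

/* Widen one input byte to 32 bits, rotate it, XOR it with pool words
 * at fixed tap positions, and shift the result back into the pool. */
void mix_byte(uint8_t b)
{
    /* illustrative tap offsets; the kernel derives its taps from
     * the pool's feedback polynomial */
    static const unsigned tap[] = { 104, 76, 51, 25, 1 };
    uint32_t w = rol32((uint32_t)b, input_rotate);

    add_ptr = (add_ptr - 1) & (POOL_WORDS - 1);
    for (int i = 0; i < 5; i++)
        w ^= pool[(add_ptr + tap[i]) & (POOL_WORDS - 1)];
    w ^= pool[add_ptr];
    /* toy "twist": the kernel uses a small lookup table here */
    pool[add_ptr] = (w >> 3) ^ ((w & 7) * 0x3b6e20c8u);

    /* advance the rotation so later bytes land on different bits */
    input_rotate = (input_rotate + (add_ptr ? 7 : 14)) & 31;
}
```

Because every operation is an XOR, rotation, or shift of pool contents with the input, the step is invertible given the input byte, which is why no entropy already in the pool can be destroyed.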

  15. 1. Mixing WITHOUT Input Linear feedback shift register (LFSR) over the Galois field GF(2^32) with feedback polynomial Q(X) = α^3 (P(X) − 1) + 1, where • Primitive element: α • Pool polynomial: P(X) • Input pool: P(X) = X^128 + X^103 + X^76 + X^51 + X^25 + X + 1 • Output pool: P(X) = X^32 + X^26 + X^20 + X^14 + X^7 + X + 1 • Input pool period: 2^(92·32) − 1 ≠ 2^(128·32) − 1 • Output pool period: 2^(26·32) − 1 ≠ 2^(32·32) − 1

  16. 1. Mixing WITHOUT Input (cont.) • Input pool: P(X) = X^128 + X^103 + X^76 + X^51 + X^25 + X + 1 • Output pool: P(X) = X^32 + X^26 + X^20 + X^14 + X^7 + X + 1 • P(X) is NOT irreducible! • But by changing one feedback position • Input pool: P(X) = X^128 + X^104 + X^76 + X^51 + X^25 + X + 1 • Output pool: P(X) = X^32 + X^26 + X^19 + X^14 + X^7 + X + 1 • P(X) is irreducible but NOT primitive! • However, by changing α to: • α^2 (X^32 + X^26 + X^23 + X^14 + X^7 + X + 1) • α^4 • α^7 • … • P(X) is irreducible AND primitive! • Periods: 2^(128·32) − 1 & 2^(32·32) − 1

  17. 1. Mixing WITH Input • Function L1: {0,1}^8 → {0,1}^32 • Rotation • Multiplication in GF(2^32) • Feedback function L2: ({0,1}^32)^5 → {0,1}^32

  18. 2. Entropy Estimator (1) • Random variables: • Identically distributed • From a single source • Sample space: D where |D| ≫ 2 • Jiffies count: δ_i[1] at time i • Estimator with input T_i: • Logarithm function: • Outcome:
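The jiffies-based estimator can be modeled roughly in C. This is a simplified sketch of the idea, first, second, and third differences of the event timestamps, smallest magnitude, then a logarithm; the 11-bit cap and the function name are assumptions for illustration, not taken from the slides:

```c
#include <stdlib.h>

static long last_time, last_delta, last_delta2;

/* Credit roughly floor(log2(min(|d1|,|d2|,|d3|))) bits for an event
 * at time jiffies_now, where d1..d3 are the first, second and third
 * timing differences. Regular events earn 0 bits; credit is capped. */
unsigned estimate_entropy(long jiffies_now)
{
    long delta  = jiffies_now - last_time;
    long delta2 = delta - last_delta;
    long delta3 = delta2 - last_delta2;

    last_time   = jiffies_now;
    last_delta  = delta;
    last_delta2 = delta2;

    delta  = labs(delta);
    delta2 = labs(delta2);
    delta3 = labs(delta3);
    if (delta2 < delta) delta = delta2;
    if (delta3 < delta) delta = delta3;

    /* never assume maximum entropy: cap the credit at 11 bits */
    unsigned bits = 0;
    for (delta >>= 1; delta; delta >>= 1)
        bits++;
    return bits > 11 ? 11 : bits;
}
```

Note how perfectly regular interrupts (constant delta) drive the second difference to zero and thus earn no credit, matching the "regular interrupts are NOT reliable" point above.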

  19. 2. Entropy Estimator (2) • To compute it • We must know: • Time t_{i−1} • Jiffies count: δ_{i−1}[1] where [1] = event 1 • Jiffies count: δ_{i−1}[2] where [2] = event 2 • Property: invariance under a permutation • Permutation: • Distribution q: • Distribution p: ⇒ H(p) ≠ H(q), since the estimator uses the value of a given element and not its probability!

  20. 3. Output Function • Transfer: input pool → output pool • Generates data from the output pool • Uses the SHA-1 hash • Feedback phase • Extraction phase

  21. 3. Output – Feedback phase • SHA-1 • Hashes all pool bytes (as 32-bit words) • Produces a 5-word hash • Sends it to: • The mixing function • The extraction phase • Mixing function • Takes the 5-word hash • Mixes it back into the pool • Shifts 20 times (20 words = 640 bits)

  22. 3. Output – Extraction phase • SHA-1 • Initial value (hash) • Takes (16) pool words • Overlaps with the last word from the feedback phase • Overlaps with the first 3 words of the output pool • Produces a 5-word hash • Folds it in half • Extracts w0 ⊕ w1 ⊕ w2 ⊕ w3 ⊕ w4 • Produces a 10-byte output
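The folding step halves the 20-byte (5-word) SHA-1 result to 10 bytes, so raw hash output never leaves the generator. One possible pairing is sketched below; the exact pairing (w0 with w3, w1 with w4, w2 folded onto itself) is an assumption for illustration, chosen so the output really is 10 bytes:

```c
#include <stdint.h>
#include <string.h>

/* Fold a 5-word (160-bit) hash in half to a 10-byte (80-bit) output:
 * XOR word pairs, and XOR the two halves of the middle word. */
void fold_hash(const uint32_t w[5], uint8_t out[10])
{
    uint32_t a = w[0] ^ w[3];
    uint32_t b = w[1] ^ w[4];
    uint16_t c = (uint16_t)((w[2] >> 16) ^ (w[2] & 0xffff));

    memcpy(out,     &a, 4);
    memcpy(out + 4, &b, 4);
    memcpy(out + 8, &c, 2);
}
```

Since every output bit depends on two hash bits, an observer of the 10-byte output cannot reconstruct the full 160-bit hash, which is what prevents the pattern-recognition attacks mentioned in the pseudorandomness slide.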

  23. 4. Entropy Extraction • Random variable: X • Rényi entropy: H_2(X) • Hash function: • Random choice of the hash: G • IF • H_2(X) ≥ r • G is uniformly distributed ⇒ The extracted entropy is close to r bits

  24. 4. Entropy Extraction – LPRNG • LPRNG uses a fixed hash function: • Assumptions: • Each element has size of • The attacker knows all permutations ⇒ Universal hash function: • If the pool contains: • k bits of Rényi entropy • m ≤ k ⇒ Entropy close to m bits:

  25. Security requirements • Sound entropy estimation: • Estimate the amount of entropy correctly • Guarantee that an attacker who knows the input can NOT guess the output! • Pseudorandomness: • Impossible to compute the: • Internal state • Future outputs • Unable to recover the: • Internal state • Future outputs with partial knowledge of the entropy

  26. Sound entropy estimation • Samples: N = 7M • Empirical frequency: • Estimators: • LPRNG entropy: • Shannon entropy: • Min-entropy: • Rényi entropy: • Results:

  27. Pseudorandomness • SHA-1: one-way function ⇒ An adversary can NOT recover the contents of the • output pool • input pool if he only knows the outputs! • Folding: avoids recognizable patterns ⇒ The output of the hash is NOT directly recognizable ⇒ Secure if the internal state is NOT compromised!

  28. Security resilience • Backtracking resistance: an attacker with knowledge of the current state should NOT be able to recover previous outputs! • Prediction resistance: given enough fresh entropy input, an attacker should NOT be able to predict future outputs!

  29. Security resilience – LPRNG • Forward security: knowledge of the current state does NOT provide information on previous states, even if the state was not refreshed by new entropy inputs. ⇒ Backtracking resistance provided by: the one-way output function • Backward security: an adversary who knows the internal state is able to predict • Outputs • Future outputs because the output function is deterministic… (Bad!) ⇒ Prediction resistance provided by: reseeding the internal state between requests!

  30. Forward Security • Attacker knows: • Input pool • Output pool ⇒ The attacker knows the previous states EXCEPT the 160 bits that were fed back. ⇒ BUT without additional knowledge a generic attack would have: • 2^160 overhead • 2^80 solutions

  31. Backward Security • Transferring k bits of entropy means that after: • Generating data from the UNKNOWN S1 • Mixing S1 into the KNOWN S2 • Guessing the NEW S2 costs the attacker 2^(k−1) trials on average! • Collecting k bits of entropy means that after: • Processing unknown data from the KNOWN S1 • Guessing the NEW S1 costs the observer 2^(k−1) trials on average!

  32. Backward Security – Attacks • Attacker 1: • Knows the output pool • Does NOT know the input pool • Attacker 2 knows: • Input pool • Output pool

  33. Backward Security – Attack 1 Enough entropy (k ≥ 64 bits)? • Yes! • k bits are transferred from the input pool • The attacker loses k bits of knowledge • NO output before the k bits are mixed ⇒ Generic attack (2^(k−1)): k bits of resistance! • No! • NO bits are transferred • The attacker keeps his knowledge • NO output before k bits are sent from the input pool ⇒ Generic attack (2^(k−1)): k bits of resistance!

  34. Backward Security – Attack 2 • // k = 64 bits • Collect k bits of entropy (2^(k−1) guesses) • If (counter ≥ k bits) then • counter−− • Else • counter++ • Transfer k bits from the input pool ⇒ 64 bits of resistance

  35. Conclusion • Good level of security • The mixing function could be improved! • A newer hash function could be used (SHA-3)
