
Power, EM and all that: Is your crypto device really secure?


Presentation Transcript


  1. Power, EM and all that: Is your crypto device really secure? Pankaj Rohatgi, Dakshi Agrawal, Bruce Archambeault, Suresh Chari, Josyula R. Rao. IBM T.J. Watson Research Center

  2. Side-Channel Analysis:Some recent advances • I: A new side channel: EM emanations • Extends power analysis style attacks to large classes of cryptographic hardware • SSL Accelerators, Cryptographic tokens. • Additional leakages compared to power. • II: Better Analysis: Template Attacks. • (Near) optimal use of side-channel data. • Single sample attacks for ephemeral keys.

  3. EM History • Classified TEMPEST standards. • Partly declassified Jan '01, http://www.cryptome.org. • Other, openly published work on EM. • EM Leakages from Peripherals, • E.g., Monitors: Van Eck, Anderson & Kuhn. • EM Leakage from smart-cards during computation. • Quisquater & Samyde, E-smart 2001, Gemplus Team [GMO ’01], CHES ’01. • SEMA/DEMA attacks. • Best results required decapsulation of chip packaging and/or precise micro-antenna positioning on chip surface

  4. Our EM Work • Deeper understanding of EM leakages. • Similar to now declassified TEMPEST literature. • Plenty of EM signals are available, provided you know what to look for and where. • Superior signals and attacks possible without micro-antennas or decapsulation. • Some attacks possible from a distance. • EM side-channel(s) >> Power side-channel • EM can break DPA-resistant implementations.

  5. EM Emanations Background • Types of EM Emanations • Direct emanations from intended current flows. • Maxwell’s equations, Ampere’s and Faraday’s laws. • See [Quisquater,Samyde 01], Gemplus [GMO ’01] • Unintentional emanations from coupling effects. • Depend on physical factors, e.g., circuit geometry. • Most couplings ignored by circuit designers. • Manifest as modulation of carriers (e.g. clock harmonics) present/generated/introduced in device. • AM or Angle (FM/Phase) Modulation. • Compromising signals available via demodulation.
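
The following is a minimal sketch (not from the talk) of how a compromising AM-modulated carrier can be recovered by demodulation: band-pass around a suspected carrier such as a clock harmonic, then envelope-detect. The trace x, sample rate fs, carrier frequency and 150 kHz bandwidth default are illustrative assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def am_demodulate(x, fs, f_carrier, bw=150e3):
        # Assumes f_carrier + bw/2 is below the Nyquist frequency fs/2.
        nyq = fs / 2.0
        # Isolate the suspected carrier with a band-pass filter of width bw.
        b, a = butter(4, [(f_carrier - bw / 2) / nyq, (f_carrier + bw / 2) / nyq], btype="band")
        narrow = filtfilt(b, a, x)
        # The magnitude of the analytic signal is the AM envelope, i.e. the
        # compromising baseband riding on the carrier.
        envelope = np.abs(hilbert(narrow))
        # Low-pass the envelope to suppress residual carrier ripple.
        b, a = butter(4, (bw / 2) / nyq, btype="low")
        return filtfilt(b, a, envelope)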

  6. AM: Example 1 • ECCLib on a 16 MHz Palm Pilot. • Posted on the Internet by Feng Zhu, Northeastern Univ. • Point multiplication (k * P) on a Koblitz curve over GF(2^163) using Solinas's technique. • Doubling replaced by the Frobenius map τ: k = Σ s_i τ^i (τ-adic NAF decomposition), s_i ∈ {0, 1, -1}. • Cost of kP ~ |k|/3 ~ 54 point additions/subtractions. • The structure can be heard in the demo. • Good signal via AM demodulation at 241 MHz.
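
As a rough illustration of why this add/subtract structure leaks, here is a sketch of the τ-adic NAF scalar multiplication loop. The names frobenius, point_add, point_neg and identity are hypothetical placeholders for the library's curve arithmetic, not its actual API.

    def point_mul_tnaf(naf, P, frobenius, point_add, point_neg, identity):
        # naf holds the digits s_i in {-1, 0, +1}, least-significant digit first.
        Q = identity
        for s in reversed(naf):
            Q = frobenius(Q)                    # cheap Frobenius map replaces doubling
            if s == 1:
                Q = point_add(Q, P)             # point addition for digit +1
            elif s == -1:
                Q = point_add(Q, point_neg(P))  # point subtraction for digit -1
        return Q
    # Roughly one digit in three is nonzero, so ~|k|/3 additions/subtractions are
    # performed, and the add/no-add pattern is exactly what shows up in the envelope.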

  7. Map/(Add/Subtract) Sequence

  8. 3 successive Frobenius Map computations

  9. Point Add/Sub

  10. Comparing two different point additions/subtractions

  11. AM: Example 2 • PCI-bus based RSA accelerator S inside an Intel/Linux server. • Multiple AM-modulated carriers. • Several carriers at clock harmonics propagate up to 50 feet and through walls. • Precise RSA timing available at 50 feet.

  12. Precise Timing (SSL Music) • S looping with ~3 s each of • 512-bit RSA • 1024-bit RSA • 2048-bit RSA • 4096-bit RSA • The music can be heard 50 feet away by AM-demodulating the 299 MHz clock-harmonic carrier. • Enables better timing attacks.

  13. RSA Internals in S • AM-demodulating an intermodulated carrier at 461.4 MHz, bandwidth 150 kHz. • Clear signal available up to 3-4 feet. • Greater distances require statistical techniques. • A large class of attack techniques is applicable. • Such techniques were earlier restricted to power-analysis attacks on RSA in smart-cards.

  14. S: Two identical 2048-bit exponentiations with a 12-bit exponent. (Trace annotations: data/modulus-dependent initialization; data- and key-dependent exponentiation.)

  15. S: Initialization (fixed modulus, 2 exponents × 2 data values; trace panels: E1/D2, E2/D2, E1/D1, E2/D1).

  16. S: Data- and key-dependent exponentiation (2 exponents × 2 data values).

  17. Angle Modulation: Example 1 • PCI-based RSA/crypto accelerator R inside an Intel/Linux server. • AM-demodulate a 99 MHz carrier (clock harmonic).

  18. R performing an RSA operation in a loop

  19. Internals of RSA exponentiation in R with a small exponent. (Trace annotations: data/modulus-dependent initialization; key- and data-dependent exponentiation.)

  20. RSA exponentiation: the key/data-dependent internals are obscured by an interfering signal G, generated asynchronously during the operation.

  21. Can we get RSA internals? • Not directly… • But the timing of the asynchronously generated G is affected by the ongoing computation, due to coupling effects. • Timing statistics of G (using ~1000 samples) give information about the internals! • G is strong enough to be captured at 10-15 feet.
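
A minimal sketch of the kind of timing statistic this suggests, under the assumption that each demodulated trace is an array in which occurrences of G appear as upward threshold crossings; the threshold, bin count and trace layout are assumptions, not details from the talk.

    import numpy as np

    def event_times(trace, threshold):
        # Sample indices where the signal crosses the threshold upward,
        # i.e. where an occurrence of the asynchronous signal G begins.
        above = trace > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1

    def g_timing_histogram(traces, threshold, nbins=200):
        # Pool G arrival times over many runs (~1000 in the talk) and histogram them;
        # different exponents/data shift this distribution through coupling effects.
        times = np.concatenate([event_times(t, threshold) for t in traces])
        hist, edges = np.histogram(times, bins=nbins, range=(0, len(traces[0])))
        return hist, edges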

  22. Timing statistics of G for a fixed modulus and 3 exponents (2 the same)

  23. EM vs. Power • EM may be the only side channel available. • Is EM useful in the presence of the power channel? • Direct emanations: micro-sensor positioning, decorrelated noise [QS01, GMO 01]. • Unintended emanations: several EM carriers. • DEMA/DPA correlation plots show the extent of leakage from different EM carriers and a comparison with the power signal. • Different carriers carry different information. • Some EM leakages are substantially different from, and better than, the power leakages.
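
For concreteness, here is a sketch of the classic difference-of-means statistic behind such DPA/DEMA correlation plots. Here traces is an (n_runs × n_samples) array for one channel (power, or one demodulated EM carrier) and bits is the predicted value of the target bit for each run under a key guess; both names are illustrative assumptions.

    import numpy as np

    def dpa_trace(traces, bits):
        # Partition the runs by the predicted bit value and subtract the mean traces;
        # peaks mark the times at which this channel leaks the targeted bit.
        traces = np.asarray(traces, dtype=float)
        bits = np.asarray(bits, dtype=bool)
        return traces[bits].mean(axis=0) - traces[~bits].mean(axis=0)

Computing this per channel on time-synchronized acquisitions gives the kind of side-by-side plots shown on the next two slides.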

  24. 4 Time Synchronized DPA/DEMA Correlation Plots

  25. 4 Time Synchronized DPA/DEMA Correlation Plots

  26. Bad Instructions • Instructions where some EM leakage >> power leakage. • Typically CPU-intensive rather than bus-intensive. • All architectures have BAD instructions. • Caution: bad instructions can break power-analysis-resistant implementations. • Bad instruction example: the bit-test instruction on several 6805-based systems leaks the tested bit.

  27. Tested bit = 0 in both traces

  28. Tested bit different in the two traces

  29. Part II: Template Attacks • Sometimes only a single (or a few) side-channel sample is available. • Stream ciphers, ephemeral keys. • "System-level countermeasures" to side-channel attacks. • Higher-level protocols limit key usage. • Non-linear key update countermeasure (Kocher et al. [KJJ '99]). • Are these inherently immune to side-channel attacks? • Immune to traditional simple/differential attacks. • Easy to secure implementations against SPA/SEMA: ensure signal differences < noise level. • DPA/DEMA and higher-order DPA/DEMA inapplicable: cannot remove noise by averaging over multiple samples. • But not immune to template attacks (under some assumptions).

  30. Example: RC4 on a smart-card • At best, a single trace during RC4 state initialization with an ephemeral key is available:

    i = j = 0;
    for (ctr = 0; ctr < 256; ctr++) {
        j = key[i] + state[ctr] + j;   /* byte arithmetic, wraps mod 256 */
        SwapByte(state[ctr], state[j]);
        i = i + 1;                     /* next key byte (wraps mod key length in the full KSA) */
    }

• Can avoid SPA. • No DPA-style attack possible. • One key byte used per iteration. • Is a single sample enough to recover the whole key? • Can two fixed keys differing in the 1st byte be distinguished during the 1st iteration?

  31. Power Sample showing 6 iterations of loop

  32. Sample = Signal + Noise

  33. Signals (and the signal difference) for two fixed keys with different first bytes. Differences start in the first iteration (contamination) and then spread to later iterations (diffusion).

  34. Signal difference for the two fixed keys with different first bytes during the first iteration of the loop (contamination): { j = key[i] + state[ctr] + j; SwapByte(state[ctr], state[j]); i = i + 1; }

  35. Sample noise vs. Signal differences (6 iterations)

  36. Sample noise vs. Signal difference in first iteration

  37. Template Attack Basics: How to distinguish between the two keys? • Don't (cannot) eliminate the sample noise; use it! • How? • Use an identical device (assumption) to build signal-and-noise templates T1 and T2 for keys K1 and K2 in the 1st iteration: T1 = {s1(t), D1(t)}, T2 = {s2(t), D2(t)}. • Given a sample S = s(t), use the theoretically optimal maximum-likelihood estimator: which noise is more likely, s(t) - s1(t) under D1(t) OR s(t) - s2(t) under D2(t)?

  38. Theory vs. Practice • Need T1 = {s1(t), D1(t)}, T2 = {s2(t), D2(t)}. • s1(t), s2(t) are easily estimated by averaging. • What about D1(t) and D2(t)? • Can be restricted to the L sample points where s1(t) and s2(t) differ, e.g., L = 42 in the example. • Still infeasible to estimate a general probability distribution over R^L. • Borrow from the large body of work in signal detection and estimation theory, which deals with precisely this problem! • Several realistic and computable noise models are available. • We used the popular multivariate Gaussian noise model.

  39. Multivariate Gaussian Noise Model • Estimating D1(t) and D2(t) reduces to estimating the L×L noise covariance matrices Σ1 and Σ2 for keys K1 and K2. • The maximum-likelihood test is then easy to compute: a matrix inverse, matrix multiplications, a determinant, and taking logs.
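
A minimal sketch of template building and the resulting maximum-likelihood test under this model. The names profiling_traces (traces captured on an identical device for a known key byte), points (the L sample positions where the mean signals differ), and the templates dictionary are illustrative assumptions.

    import numpy as np

    def build_template(profiling_traces, points):
        X = np.asarray(profiling_traces, dtype=float)[:, points]
        mean = X.mean(axis=0)            # s_k(t) at the L chosen points
        cov = np.cov(X, rowvar=False)    # L x L noise covariance matrix for this key
        return mean, cov

    def log_likelihood(sample_at_points, template):
        mean, cov = template
        noise = np.asarray(sample_at_points, dtype=float) - mean
        _, logdet = np.linalg.slogdet(cov)
        # Gaussian log-density up to a constant: -(log|Sigma| + noise' Sigma^-1 noise)/2
        return -0.5 * (logdet + noise @ np.linalg.solve(cov, noise))

    def classify(sample_at_points, templates):
        # Maximum likelihood: keep the key hypothesis whose template makes the
        # observed noise most probable.
        return max(templates, key=lambda k: log_likelihood(sample_at_points, templates[k]))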

  40. Generalization and Results • The technique generalizes to multiple choices for the first key byte; e.g., for a set of 10 candidate key bytes, the correct-classification probabilities are shown in the original slide's table. • These 10 keys were deliberately chosen to be very close. • Widely different keys are easily separated with 100% success even with weaker noise models. • Estimated only a 5-6% error probability in identifying the correct 1st byte out of 256 possibilities.

  41. Attacking multi-byte keys • A 5-6% error in identifying a single byte is too much. • Maximum likelihood: retain the hypothesis with the maximum probability for the observed noise (Pmax). • Approximate approach: retain ALL hypotheses with noise probability > Pmax/c, for a constant c. • Tradeoff between the number of hypotheses retained and correctness.
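
In the log domain, this retention rule amounts to keeping every hypothesis within log(c) of the best log-likelihood. A small sketch, reusing the log-likelihood scores from the previous sketch (names are assumptions):

    import math

    def prune(log_likelihoods, c):
        # log_likelihoods: key hypothesis -> log-likelihood of the observed noise.
        # Keep every hypothesis whose likelihood is within a factor c of the maximum.
        best = max(log_likelihoods.values())
        return [k for k, ll in log_likelihoods.items() if ll > best - math.log(c)]

Larger c keeps more hypotheses (fewer misses, but more surviving candidates); the next slides use c = e^6, i.e. a window of 6 in log-likelihood.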

  42. Approximate Approach Tradeoff

  43. Attacking Multi-byte Keys: Extend and Prune • Base case: narrow the candidates for the first byte down to a small number (e.g., on average 1.3 possibilities with 98.67% correctness for c = e^6). • Assume Ti candidates for the first i key bytes. • Extend: for each candidate, build larger templates for all 256 possible values of the next byte. • Prune: use the approximate approach to reduce the 256·Ti candidates down to Ti+1 < 1.3·Ti (thanks to diffusion). • Tedious but feasible for reasonable key sizes. • N bytes: failure probability ≈ 1.33·N% (N = 16: failure prob. ≈ 21.5%). • Number of remaining candidates < 1.3^N (N = 16: < 67 candidates).
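
A sketch of the extend-and-prune loop. Here make_templates(prefix) (which builds one template per value of the next key byte, given a candidate prefix) and score(sample, template) (the log-likelihood of the observed trace under a template) are hypothetical helpers standing in for the template machinery sketched earlier.

    import math

    def extend_and_prune(sample, key_len, make_templates, score, c=math.e ** 6):
        candidates = [()]                       # start from the empty key prefix
        for _ in range(key_len):
            scored = {}
            for prefix in candidates:           # Extend: try all 256 next-byte values
                for byte, template in make_templates(prefix).items():
                    scored[prefix + (byte,)] = score(sample, template)
            best = max(scored.values())         # Prune: approximate-approach rule
            candidates = [k for k, s in scored.items() if s > best - math.log(c)]
        return candidates                       # small candidate set containing the key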

  44. For more information on topics such as • multiple EM/power-channel attack techniques • EM vulnerability assessment, see http://www.research.ibm.com/intsec and the upcoming CHES 2003. Questions?
