
Round-Optimal Secure Two-Party Computation




  1. Round-Optimal Secure Two-Party Computation Jonathan Katz U. Maryland Rafail Ostrovsky U.C.L.A. 1/48

  2. Motivation • Round complexity is a central measure of protocol efficiency. • Minimizing the number of rounds is often important in practice. • Lower and upper bounds have deepened our understanding of various tasks… 2/48

  3. For example… • ZK [FS89, GO94, GK96a, GK96b, BLV03, etc.], NIZK [BFM88, etc.], WI [FS89, DN00, BOV03] • Concurrent ZK [DNS98, KPR01, CKPR01, PRS02] • Commitment, identification schemes, … • … • 2-party and multi-party computation [BMR90, IK00, GIKR01, L01, KOS03, etc.] 3/48

  4. This work • We concentrate on secure two-party computation • Encompasses many functionalities of independent interest (e.g., ZK) • Important “special case” of MPC without honest majority • Interestingly, exact round complexity of 2PC was not previously known! 4/48

  5. This work (1) • We exactly characterize black-box round complexity of secure 2PC! • THM1: Impossibility result for any black-box 4-round coin-tossing (also XOR, other functionalities…) 5/48

  6. This work (2) • THM2: 5-round secure 2PC protocol for any functionality, based on trapdoor permutations* (e.g., RSA, Rabin) or homomorphic encryption (e.g., under DDH). 6/48

  7. This work (3) • THM3: A 5-round secure 2PC protocol tolerating an adaptive adversary corrupting either party, without erasures. 7/48

  8. Prior work (2PC) • Honest-but-curious setting • 4 rounds using trapdoor perms. [Yao86] • 3 rounds using number-theoretic assumptions (optimal) [Folklore] • Malicious case • “Compiler” for any protocol secure in honest-but-curious setting [GMW87] • Round complexity? 8/48

  9. Round complexity of 2PC? • Upper bounds • O(k) rounds [GMW87] • O(1) rounds [Lindell01] • Unspecified, but roughly 20-30 rounds • Lower bounds (black-box) • No 3-round ZK [GK96] • No 3-round coin-tossing [Lindell01] 9/48

  10. Security definition • We use the standard definitions of [GMW87, GL90, MR91, Ca00] • This will be an informal review, focusing on a static adversary 10/48

  11. Set-up • Functionality F = (F1, F2), possibly randomized; player Pi gets Fi(x, y) • In real world, players execute a protocol to compute F • In ideal world, a trusted party computes F for the players 11/48

  12. Ideal model • Players send x, y to TTP • Malicious player can send any value it likes; honest party sends its input value • If no value sent, a default value is used • TTP chooses uniformly-random r; sends v1 = F1(x, y; r) to P1 • If P1 aborts, TTP sends v2 = ⊥ to P2 • Else, TTP sends v2 = F2(x, y; r) to P2 12/48
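
A minimal Python sketch of this ideal-world experiment may help fix the notation; the function names, the use of None for the abort symbol ⊥, and the coin-tossing example are illustrative only, not part of the paper.

```python
import secrets

ABORT = None  # stands in for the abort symbol "⊥"

def ideal_execution(F1, F2, x, y, p1_aborts_after_output=lambda v1: False):
    """One run of the ideal model for F = (F1, F2) with a trusted party (TTP).

    x and y are whatever values P1 and P2 send to the TTP (a malicious
    party may send anything it likes; if nothing is sent, a default is used).
    """
    r = secrets.token_bytes(16)        # TTP's uniformly random coins
    v1 = F1(x, y, r)                   # TTP sends v1 = F1(x, y; r) to P1 first
    if p1_aborts_after_output(v1):     # P1 may abort after seeing its output...
        v2 = ABORT                     # ...in which case P2 receives only ⊥
    else:
        v2 = F2(x, y, r)               # otherwise P2 receives F2(x, y; r)
    return v1, v2

# Example: coin tossing of a single bit, F1 = F2 = first bit of the TTP's coins.
coin = lambda x, y, r: r[0] & 1
v1, v2 = ideal_execution(coin, coin, x=None, y=None)
assert v1 == v2
```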

  13. Ideal model • Let Viewi denote the view of Pi • Let (B1, B2) be strategies • Define IDEAL = (B1(View1), B2(View2)) • Note: for Bi honest, Bi(Viewi) = vi 13/48

  14. Real model • Players execute protocol… • Let (A1, A2) be strategies • Define REAL = (A1(View1), A2(View2)) • Again, if Ai honest, then Ai(Viewi) = vi 14/48

  15. Security… • A pair of strategies is admissible if at least one is honest • Protocol is secure if for all admissible PPT (A1, A2) in the real world, there exist admissible expected poly-time (B1, B2) in ideal world such that REAL and IDEAL are comp. indistinguishable • Even with auxiliary inputs… 15/48
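
Written out, the requirement on this slide is the standard one (indexing over the inputs, the auxiliary input z, and the security parameter k):

```latex
\forall \text{ admissible PPT } (A_1,A_2)\;\exists \text{ admissible expected-poly-time } (B_1,B_2):\quad
\{\mathrm{REAL}_{\Pi,(A_1,A_2)}(k,x,y,z)\}\;\stackrel{c}{\approx}\;\{\mathrm{IDEAL}_{F,(B_1,B_2)}(k,x,y,z)\}
```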

  16. Black-box security • The definition of security requires: ∀ (malicious) Ai, ∃ (malicious) Bi, s.t. Bi satisfies the condition…. • Black-box security imposes a stronger requirement: ∃ (S1, S2), ∀ (malicious) Ai, (malicious) Bi = Si^(Ai) satisfies the condition… 16/48

  17. More formally… • For malicious A1, define B1 as follows: • B1(x, z; r, r’) = S1^(A1(x, z; r))(x; r’) • S1 not given auxiliary input z • Expected running time of S1 is a fixed polynomial, independent of A1 • But running time of B1 depends on A1 • The above formulation avoids some technical problems… 17/48

  18. Lower bound 18/48

  19. Theorem 1 • No secure (black-box) 4-round protocol for flipping ω(log k) coins • This rules out 4-round protocols for other functionalities as well (e.g., XOR) • (Note: 3-round protocols for O(log k) coins do exist [Bl82, GMW87]) • Details: (next) 19/48

  20. Intuition • W.l.o.g., P2 sends the first message • No way to simulate for a malicious P1 who aborts “very often” • Sending different msg1 doesn’t help • P1 starts over with “new randomness” [GK] • Sending different msg3 doesn’t help • P1 anyway aborts “very often” 20/48

  21. Proof details I • Let s(k) be the expected running time of S1 • Define A1 as follows: • Use msg1 to define the random string for an “honest” execution of the protocol (using an O(s)-wise independent hash function) • After msg3, compute coin c; abort unless the first 3·log s bits of c are 0 • Note: here we use |c| = ω(log k) 21/48

  22. Proof details II • REAL is “non-aborting” with noticeable probability 1/s³ • Thus, IDEAL must be “non-aborting” with roughly the same probability • Conditioned on “good” coin from TTP, S1 must “force” A1 not to abort with probability essentially 1 22/48

  23. Proof details III • Run S1 for at most 2s steps • Now, strict poly-time • Conditioned on “good” coin from TTP, “forces” A1 not to abort with probability essentially 1/2 23/48

  24. Proof details IV • Define A2 as follows: • Feed “good” coin to S1; guess i, j • Send ith query of S1 to P1 as msg1, return msg2 to S1 • Send jth query of S1 to P1 as msg3 • Answer other queries of S1 internally, by either aborting or playing the role of A1 24/48

  25. Proof details V • Analysis: • Conditioned on “correct” guesses of i, j, honest player P1 outputs “good” coin with probability essentially 1/2 • Probability of correct guess > 1/(4s²) • So probability that honest P1 outputs “good” coin is at least 1/(8s²) > 1/s³ • A2 noticeably biases the coin! 25/48
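
Putting the numbers from slides 21-25 together (this is only the heuristic chain behind the analysis, with s the fixed polynomial bounding S1's expected running time, truncated to 2s steps):

```latex
\Pr[\text{``good'' coin}] = 2^{-3\log s} = s^{-3},
\qquad
\Pr[\text{correct guess of } i, j \text{ among} \le 2s \text{ queries}] \;\ge\; \frac{1}{(2s)^{2}} = \frac{1}{4s^{2}},
\\[4pt]
\Pr[\text{honest } P_1 \text{ outputs a ``good'' coin against } A_2]
\;\ge\; \frac{1}{2}\cdot\frac{1}{4s^{2}} = \frac{1}{8s^{2}} \;>\; s^{-3} \quad (\text{for } s > 8),
```

so A2 noticeably biases the coin, contradicting security.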

  26. Implications • No 4-round (black-box) protocol for general secure computation • Note: Could also derive from [GK]… • …but our techniques rule out 4-round protocols for wider class of functions 26/48

  27. THM2: A 5-round protocol for secure two-party computation (for malicious adversaries) We construct a 5-round protocol where we “force” good behavior on both sides and can “simulate” a malicious adversary's view from either side… 27/48

  28. Somewhat easier task • [folklore]: k-round protocol with one player learning the output ⇒ (k+1)-round protocol with both players learning their outputs • the output in the kth round includes an encrypted and MAC’ed output for the other player (sketched below). • SO: we need a 4-round protocol where, say, player 1 gets the output. 28/48
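
A minimal Python sketch of this folklore step, assuming the modified functionality lets P2 feed one-time encryption and MAC keys in as part of its input, so that P1's kth-round output already contains the wrapped value it forwards in round k+1. The helper names and the choice of one-time pad plus HMAC-SHA256 are illustrative, not from the paper.

```python
import hmac, hashlib, secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def p2_keys(out_len: int = 32):
    """P2 samples keys and includes them in its input to the modified functionality."""
    return {"otp": secrets.token_bytes(out_len),   # one-time pad key
            "mac": secrets.token_bytes(32)}        # MAC key

def wrap_for_p2(keys, v2: bytes):
    """What the modified functionality appends to P1's round-k output:
    an encryption of P2's output v2 plus a MAC over the ciphertext."""
    c = xor(keys["otp"], v2)
    t = hmac.new(keys["mac"], c, hashlib.sha256).digest()
    return c, t

def p2_unwrap(keys, c: bytes, t: bytes):
    """Round k+1: P1 forwards (c, t); P2 verifies and decrypts (None = abort)."""
    if not hmac.compare_digest(t, hmac.new(keys["mac"], c, hashlib.sha256).digest()):
        return None
    return xor(keys["otp"], c)

keys = p2_keys()
v2 = b"P2 output, padded to 32B".ljust(32, b"\x00")
c, t = wrap_for_p2(keys, v2)
assert p2_unwrap(keys, c, t) == v2
```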

  29. Observation: It suffices to consider deterministic functionalities. Rest of the talk: we show a 4-round protocol tolerating malicious players where player 1 learns the output. 29/48

  30. Rest of the talk • 3-round protocol for semi-honest players • Background tools • Some of our new techniques • Our 4-round protocol (if time permits) • Proof of security (if time permits) • Modifications needed for Dynamic Adv. • Conclusions. 30/48

  31. Recall: 1-2-OT [EGL] • Sender has (v0, v1); • Receiver has bit b. After 1-2-OT: • Receiver gets vb • Sender learns nothing 31/48

  32. Semi-honest 1-2-OT [EGL,GMW] • S: generate td perm. (f, f⁻¹); send f • R: yb = f(zb), y1-b random; send (y0, y1) • S: send ui = h(f⁻¹(yi)) ⊕ vi, for i = 0,1 • R computes vb = h(zb) ⊕ ub Note: extends easily to strings in the semi-honest setting 32/48
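
A toy Python sketch of these four steps, instantiated with RSA as the trapdoor permutation and the least-significant bit as the hard-core predicate h. The tiny primes are purely illustrative (a real instantiation needs full-size keys), and the function names are mine, not the paper's.

```python
import secrets

# Toy RSA as the trapdoor permutation f over Z_N (insecure toy parameters).
P, Q = 10007, 10009
N, E = P * Q, 65537
D = pow(E, -1, (P - 1) * (Q - 1))      # the trapdoor, known only to the sender S

f     = lambda z: pow(z, E, N)          # easy direction, computable from the public key
f_inv = lambda y: pow(y, D, N)          # requires the trapdoor
h     = lambda z: z & 1                 # hard-core bit of RSA (least-significant bit)

def receiver_msg(b: int):
    """R: pick z, set y_b = f(z) and y_{1-b} random; send (y0, y1), keep z."""
    z = secrets.randbelow(N - 1) + 1
    y = [secrets.randbelow(N - 1) + 1, secrets.randbelow(N - 1) + 1]
    y[b] = f(z)
    return (y[0], y[1]), z

def sender_msg(v0: int, v1: int, y0: int, y1: int):
    """S: send u_i = h(f^{-1}(y_i)) XOR v_i for i = 0, 1."""
    return h(f_inv(y0)) ^ v0, h(f_inv(y1)) ^ v1

def receiver_output(b: int, z: int, u0: int, u1: int) -> int:
    """R: recover v_b = h(z) XOR u_b; u_{1-b} hides v_{1-b} behind a hard-core bit."""
    return h(z) ^ (u0, u1)[b]

# Example run: the sender holds bits (0, 1); the receiver asks for index b = 1.
(y0, y1), z = receiver_msg(b=1)
u0, u1 = sender_msg(0, 1, y0, y1)
assert receiver_output(1, z, u0, u1) == 1
```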

  33. Yao’s “garbled circuit” • Algorithms (Y1, Y2) s.t.: • Y1(y) outputs “circuit” C, input-wire labels {Z_{i,b}} • [C “represents” F(·, y)] • Y2(C, Z_{1,x1}, …, Z_{k,xk}) outputs v Correctness: v = F(x, y) 33/48
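
A minimal Python sketch of the idea behind (Y1, Y2) for a single AND gate: Y1-style garbling produces wire labels and four shuffled ciphertexts, and Y2-style evaluation recovers exactly one output label from one label per input wire. The hash-pad encryption, the all-zero tag used to recognize the correct row, and the label length are simplifications of mine; real constructions use point-and-permute and handle whole circuits.

```python
import hashlib, secrets

LABEL_LEN = 16  # bytes per wire label (illustrative)

def enc(k_a: bytes, k_b: bytes, label: bytes) -> bytes:
    """Encrypt an output-wire label under a pair of input-wire labels via a hash pad;
    an all-zero 16-byte tag lets the evaluator recognize a correct decryption."""
    pad = hashlib.sha256(k_a + k_b).digest()               # 32-byte pad
    return bytes(x ^ y for x, y in zip(label + b"\x00" * 16, pad))

def garble_and_gate():
    """Y1 for one AND gate: sample labels for wires A, B, C and build the garbled table."""
    A = {v: secrets.token_bytes(LABEL_LEN) for v in (0, 1)}
    B = {v: secrets.token_bytes(LABEL_LEN) for v in (0, 1)}
    C = {v: secrets.token_bytes(LABEL_LEN) for v in (0, 1)}
    table = [enc(A[a], B[b], C[a & b]) for a in (0, 1) for b in (0, 1)]
    secrets.SystemRandom().shuffle(table)                  # hide which row is which
    return A, B, C, table

def eval_gate(table, k_a: bytes, k_b: bytes) -> bytes:
    """Y2 for one gate: given one label per input wire, recover the output label."""
    pad = hashlib.sha256(k_a + k_b).digest()
    for ct in table:
        pt = bytes(x ^ y for x, y in zip(ct, pad))
        if pt[LABEL_LEN:] == b"\x00" * 16:                 # tag checks out
            return pt[:LABEL_LEN]
    raise ValueError("no row decrypted correctly")

A, B, C, table = garble_and_gate()
assert eval_gate(table, A[1], B[1]) == C[1]   # labels for (1, 1) yield the label of 1 AND 1
assert eval_gate(table, A[0], B[1]) == C[0]   # only the label for the correct output is learned
```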

  34. 3-round semi-honest 2PC • Player 2 sends Yao’s C, f for OT • Player 1 sends OT pairs {(y_{i,0}, y_{i,1})} • Player 2 sends {(u_{i,0}, u_{i,1})} to Player 1. Player 1 recovers v. 34/48

  35. Malicious 2PC? • Standard method [GMW87] increases round-complexity: • Coin tossing into the well to fix random tapes of players; • Players commit to their inputs; • ZK arguments of correctness after every round; High round complexity of compilation 35/48

  36. Malicious 2PC in 4 rounds • Our goal: do everything in 4 rounds (player 1 gets the output), forcing “good” behavior from both sides! • Intuition: do everything “as early as possible”, but things “don’t fit”; we need new tricks to cram it all in. • Surprise: we must “delay” proofs to make it work. 36/48

  37. Reminder: 3-Round WI proofs [FS] P claims that graph G has a HC • P→V: commit to n cycle graphs C1..Cn • V→P: random n-bit string Q • P→V: for each bit of Q, either • open entire matrix Ci, OR • show a permutation of G onto Ci and open the non-edges of G in Ci. 37/48
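
A toy Python sketch of one repetition of this proof, showing only the second kind of response (challenge bit = 1): the prover commits to a random cycle graph C and, only when answering, exhibits a permutation of G onto C and opens the commitments at the non-edges of G. The hash-based commitment, the helper names, and the 4-vertex example graph are all illustrative; the real protocol runs n repetitions in parallel and also supports the "open the entire matrix" challenge.

```python
import hashlib, secrets

def commit(bit: int):
    r = secrets.token_bytes(16)
    return hashlib.sha256(bytes([bit]) + r).hexdigest(), r

def open_ok(c: str, bit: int, r: bytes) -> bool:
    return c == hashlib.sha256(bytes([bit]) + r).hexdigest()

def prover_commit(n: int):
    """Round 1 (P to V): commit entry-wise to the adjacency matrix of a random n-cycle C."""
    order = list(range(n)); secrets.SystemRandom().shuffle(order)
    cycle = {frozenset((order[i], order[(i + 1) % n])) for i in range(n)}
    coms, opens = {}, {}
    for i in range(n):
        for j in range(i + 1, n):
            bit = 1 if frozenset((i, j)) in cycle else 0
            coms[(i, j)], opens[(i, j)] = commit(bit)
    return coms, {"cycle": cycle, "opens": opens}

def prover_answer(state, G_edges, ham_cycle, n):
    """Round 3 (P to V), challenge bit 1: G and its Hamiltonian cycle are needed only now.
    Send a permutation pi mapping G's cycle onto C, and open C at pi(non-edges of G)."""
    adj = {v: [] for v in range(n)}
    for e in state["cycle"]:
        a, b = tuple(e); adj[a].append(b); adj[b].append(a)
    order_C, prev, cur = [0], None, 0                 # walk C to recover its vertex order
    while len(order_C) < n:
        nxt = adj[cur][0] if adj[cur][0] != prev else adj[cur][1]
        order_C.append(nxt); prev, cur = cur, nxt
    pi = {ham_cycle[i]: order_C[i] for i in range(n)}
    openings = {}
    for i in range(n):
        for j in range(i + 1, n):
            if frozenset((i, j)) not in G_edges:      # every non-edge of G...
                a, b = sorted((pi[i], pi[j]))
                openings[(i, j)] = (a, b, state["opens"][(a, b)])
    return pi, openings

def verify_answer(coms, G_edges, n, pi, openings) -> bool:
    """V checks: each non-edge of G maps, under pi, to a commitment that opens to 0."""
    for i in range(n):
        for j in range(i + 1, n):
            if frozenset((i, j)) in G_edges:
                continue
            a, b, r = openings[(i, j)]
            if {a, b} != {pi[i], pi[j]} or not open_ok(coms[(a, b)], 0, r):
                return False
    return True

# Example: G on 4 vertices with edges {01, 12, 23, 30, 02}; Hamiltonian cycle 0-1-2-3.
n, G = 4, {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]}
coms, state = prover_commit(n)
pi, openings = prover_answer(state, G, [0, 1, 2, 3], n)
assert verify_answer(coms, G, n, pi, openings)
```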

  38. OBSERVATION • Graph G can be determined in the last round. • IF G is determined in the 1st round ⇒ this is a WI proof of knowledge • IF G is determined in the 3rd round ⇒ this is only a WI proof, but it is still sound! 38/48

  39. NEW PROPERTIES FOR 3-ROUND WI-PROOF 39/48

  40. Next: [FS] 4-round ZK • Q: can we get a similar result for the [FS] 4-round ZK argument? 40/48

  41. [FS] 4-round ZK argument: 2 interleaved WI-proofs 41/48

  42. [FS] 4-round ZK-argument 2 interleaved WI proofs: • V→P: gives y1, y2 s.t. f(a1) = y1, f(a2) = y2, and a WI proof of this fact (3 rounds) • P→V: WI proof of a witness w that x is in L or w is one of the a’s (starting on the 2nd round). Total of 4 rounds. Proof of knowledge; also ZK. 42/48

  43. New FS properties needed: • Observation: In FS, the prover needs to determine the statement by the second round. • Goal: defer parts of the statement to the last (4th) round. Previous ideas are not sufficient… 43/48

  44. Technical lemma - we extend [FS] to FS’ so that: • FS’ is a 4-round zero-knowledge argument where statements can be “postponed”. • FS’ defines conjunctive parts of the statement in the second round (with knowledge extraction) and part of the statement in the 4th round (without extraction, but still sound!) • It is of independent interest (requires equivocal commitments and some other tools) 44/48

  45. [FS] 4-round ZK argument: 2 interleaved WI-proofs 45/48

  46. OUR PROTOCOL PROOF-FLOWS 46/48

  47. Simulation on both sides? We need more tools… • Malicious player 2 gains nothing by using a non-random tape in Yao’s protocol. • Player 1 cannot freely choose his random tape, but full-blown coin-tossing is not necessary (i.e., we don’t need simulatability on both sides) • Player 2 has to commit to Yao’s garbled circuit in round 2, but the simulator needs to open it arbitrarily, so use equivocal commitments. 47/48

  48. Equivocal commitments • (Informal): in a real execution, the sender is committed to a single value; in simulation, it can open arbitrarily • Construction: Equiv(b) = Com(b0), Com’(b1), plus a ZK argument that b0 = b1. Open by opening either b0 or b1. • Can “fold” the ZK argument into the larger statement already used in the 4th round of FS’ 48/48
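
A minimal Python sketch of this construction with a hash-based Com, leaving out the ZK argument that b0 = b1 (in the protocol it is folded into FS'; here it is only marked in comments). The simulator's trick is simply to commit to 0 in one slot and 1 in the other, so it can later open whichever slot matches the bit it needs. Names are illustrative.

```python
import hashlib, secrets

def com(bit: int):
    r = secrets.token_bytes(16)
    return hashlib.sha256(bytes([bit]) + r).hexdigest(), (bit, r)

def equiv_commit(b: int):
    """Real sender: Equiv(b) = Com(b0), Com'(b1) with b0 = b1 = b,
    plus a ZK argument that b0 = b1 (elided here)."""
    c0, o0 = com(b)
    c1, o1 = com(b)
    return (c0, c1), (o0, o1)

def equiv_commit_simulated():
    """Simulator: commit to different bits in the two slots; the simulated ZK
    argument still claims b0 = b1, so the commitment can later be opened either way."""
    c0, o0 = com(0)
    c1, o1 = com(1)
    return (c0, c1), {0: (0, o0), 1: (1, o1)}   # target bit -> (slot to open, opening)

def equiv_open(commitment, slot: int, opening) -> int:
    bit, r = opening
    assert commitment[slot] == hashlib.sha256(bytes([bit]) + r).hexdigest()
    return bit

# The simulator can open the very same commitment as 0 or as 1:
cm, table = equiv_commit_simulated()
slot, opening = table[0]
assert equiv_open(cm, slot, opening) == 0
slot, opening = table[1]
assert equiv_open(cm, slot, opening) == 1
```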

  49. And now… the 4-round protocol… (only 4 slides, 1 msg per slide) 49/48

  50. Round 1: P1(x) → P2(y) • P1 commits to {(r_{i,0}, r_{i,1})} (random); • starts 3-round WI PoK of either r_{i,0} or r_{i,1}; • Starts FS’1 (statement TBA by P2, partly in round 2, partly in round 4) 50/48
