
Point-Based Trust: Define How Much Privacy is Worth

Danfeng Yao (Brown University), Keith B. Frikken (Miami University), Mikhail J. Atallah (Purdue University), Roberto Tamassia (Brown University)





Presentation Transcript


  1. Point-Based Trust: Define How Much Privacy is Worth
Danfeng Yao (Brown University), Keith B. Frikken (Miami University), Mikhail J. Atallah (Purdue University), Roberto Tamassia (Brown University)
Funded by NSF IIS-0325345, IIS-0219560, IIS-0312357, and IIS-0242421, ONR N00014-02-1-0364, CERIAS, and Purdue Discovery Park
ICICS, December 2006, Raleigh, NC

  2. Outline of the talk
1. Introduction to privacy protection in authorization
2. Point-based authorization and optimal credential selection
2.1 New York State Department of Motor Vehicles 6-point authentication system
2.2 Knapsack problem
3. Secure 2-party protocol for the knapsack problem
4. Applications

  3. Protecting private information
• Scenario: Alice requests a discount; the provider requests Alice's UID credential, Alice requests the provider's BBB credential, the BBB and UID credentials are exchanged, and the discount is granted
• Trust negotiation protocols [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03]

  4. Our goals
• Prevent premature information leaking by both parties: credentials should be exchanged only if the service can be established
• Support cumulative privacy loss quantitatively: disclosing more credentials should incur higher privacy loss
• Support a flexible service model: allow customized (or personalized) access policies, and adjustable services based on qualifications
• Our ultimate goal is to encourage users to participate in e-commerce

  5. What can we learn from the New York State DMV?
6-point proof-of-identity for getting a NY driver's license

  6. Another motivation – adjustable services
Adjustable services based on the private information revealed

  7. Point-based authorization model
• Credential types C1, C2, …, Cn
• The service provider defines
  • Point values p1, p2, …, pn of credentials (private)
  • Threshold T for accessing a resource (private)
• The user defines sensitivity scores a1, a2, …, an of credentials (private)
• Credential selection problem: the user (or client) wants to satisfy threshold T with the minimum disclosure of privacy
  • Let xi = 1 mean disclose Ci, and xi = 0 mean do not disclose Ci
  • Minimize Σ ai xi subject to Σ pi xi ≥ T (sums over i = 1, …, n)
• This can be converted to a knapsack problem
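To make the credential selection problem concrete, here is a brute-force sketch in Python. The point values, sensitivity scores, and threshold are made-up illustration data, not numbers from the talk:

```python
from itertools import product

def min_disclosure(points, scores, T):
    """Brute-force credential selection: find the disclosure vector x
    minimizing total sensitivity sum(a_i * x_i), subject to the disclosed
    credentials' points sum(p_i * x_i) reaching threshold T.
    Returns (best_cost, x), or (None, None) if no selection suffices."""
    best_cost, best_x = None, None
    for x in product([0, 1], repeat=len(points)):
        if sum(p * xi for p, xi in zip(points, x)) >= T:
            cost = sum(a * xi for a, xi in zip(scores, x))
            if best_cost is None or cost < best_cost:
                best_cost, best_x = cost, x
    return best_cost, best_x

points = [6, 4, 3, 2]   # provider's private point values p_i (hypothetical)
scores = [5, 2, 3, 1]   # user's private sensitivity scores a_i (hypothetical)
cost, x = min_disclosure(points, scores, T=10)
# Disclosing C1 and C2 gives 6 + 4 = 10 points at sensitivity cost 5 + 2 = 7
```

Brute force is exponential in n, which is why the talk turns to knapsack dynamic programming; this sketch only fixes intuition for what the protocol computes.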

  8. Example
Threshold for accessing a resource: 10

  9. Where do points come from?
• Reputation systems [Beth Borcherding Klein 94, Tran Hitchens Varadharajan Watters 05, Zouridaki Mark Hejmo Thomas 05]
• This is future work, but here is an idea: members of a reputation system evaluate one another to derive point values

  10. Converting CSP into a knapsack problem
• Define the binary vector y1, y2, …, yn, where yi = 1 − xi (yi = 1 means Ci is kept private)
• Let T' = Σ pi − T: a bag of size T' (here n = 6); what to pick and "steal"?
• Maximize Σ ai yi subject to Σ pi yi ≤ T' (sums over i = 1, …, n)
• {ai}: private to user; {pi}: private to provider
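A small sanity check of the conversion, again with made-up numbers: maximizing the sensitivity of the credentials the user keeps, subject to the kept points not exceeding T' = Σ pi − T, leaves a disclosed set that still reaches T, and its complement is the minimum-disclosure selection:

```python
from itertools import product

def keep_max(points, scores, T):
    """Complement view of credential selection: y_i = 1 means credential C_i
    is KEPT (not disclosed). Maximize kept sensitivity subject to kept points
    <= T' = sum(p_i) - T, so disclosed points still reach T. Brute force,
    for illustration only."""
    Tp = sum(points) - T
    best_val, best_y = -1, None
    for y in product([0, 1], repeat=len(points)):
        if sum(p * yi for p, yi in zip(points, y)) <= Tp:
            val = sum(a * yi for a, yi in zip(scores, y))
            if val > best_val:
                best_val, best_y = val, y
    return best_val, best_y

points, scores, T = [6, 4, 3, 2], [5, 2, 3, 1], 10   # hypothetical data
val, y = keep_max(points, scores, T)
x = tuple(1 - yi for yi in y)        # disclosed set = complement of kept set
disclosed_cost = sum(scores) - val   # total sensitivity actually revealed
```

Maximizing what is kept and minimizing what is revealed are the same problem: total sensitivity is fixed, so the two objectives always sum to Σ ai.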

  11. Dynamic programming for the knapsack problem
• Dynamic programming for the 0/1 knapsack problem
• Construct an n-by-T' table M, where T' = Σ pi − T (sum over i = 1, …, n)
• M[i, j] = M[i−1, j] if j < pi
• M[i, j] = max{M[i−1, j], M[i−1, j−pi] + ai} if j ≥ pi
• {ai}: private to user; {pi}: private to provider
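The recurrence above is the standard 0/1 knapsack fill. A plain (non-private) Python sketch with illustrative values; the protocol's contribution is running this same fill with {ai} and {pi} held by different parties:

```python
def knapsack_table(points, scores, T):
    """Fill the dynamic-programming table M from the slides:
    M[i][j] = max kept sensitivity using credentials 1..i
              with kept points at most j.
    M[i][j] = M[i-1][j]                              if j <  p_i
            = max(M[i-1][j], M[i-1][j - p_i] + a_i)  if j >= p_i
    Rows are 0..n, columns 0..T' where T' = sum(p_i) - T."""
    n = len(points)
    Tp = sum(points) - T                    # marginal threshold T'
    M = [[0] * (Tp + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        p, a = points[i - 1], scores[i - 1]
        for j in range(Tp + 1):
            M[i][j] = M[i - 1][j]           # j < p_i case
            if j >= p:
                M[i][j] = max(M[i][j], M[i - 1][j - p] + a)
    return M

points, scores, T = [6, 4, 3, 2], [5, 2, 3, 1], 10   # hypothetical data
M = knapsack_table(points, scores, T)
# M[n][T'] is the maximum sensitivity that can be kept private
```

The table has (n+1)(T'+1) entries and each costs O(1), matching the O(nT') communication bound stated later in the talk.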

  12. Overview of privacy-preserving knapsack computation
• Recurrence: M[i, j] = M[i−1, j] if j < pi; max{M[i−1, j], M[i−1, j−pi] + ai} if j ≥ pi
• Uses a 2-party maximization protocol [Frikken Atallah 04]
• Uses a homomorphic encryption scheme: E(x)E(y) = E(x + y) and E(x)^c = E(xc)
• Preserves privacy for both parties
• Two phases: table-filling and traceback
• In the j < pi case the parties still compute max{M[i−1, j], −∞ + ai}, adding the maximization and the addition of ai to make the two computation procedures indistinguishable
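The two homomorphic identities the protocol relies on are satisfied by, for example, Paillier encryption; the slides do not commit to a specific scheme. A textbook Paillier sketch with toy, insecure parameters, purely to illustrate the identities:

```python
import math
import random

# Toy Paillier parameters -- NOT secure, illustration only.
p, q = 17, 19
n = p * q                        # public modulus
n2 = n * n
g = n + 1                        # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)     # private key lambda
mu = pow(lam, -1, n)             # lambda^{-1} mod n (valid when g = n + 1)

def enc(m):
    """Randomized encryption: E(m) = g^m * r^n mod n^2."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    """Decryption: L(c^lambda mod n^2) * mu mod n, with L(u) = (u-1)/n."""
    u = pow(c, lam, n2)
    return ((u - 1) // n) * mu % n

# The two identities used by the protocol:
assert dec(enc(3) * enc(4) % n2) == 7      # E(x)E(y) decrypts to x + y
assert dec(pow(enc(3), 5, n2)) == 15       # E(x)^c decrypts to x * c
```

These identities let the provider add a plaintext constant to, or scale, an encrypted table entry without decrypting it, which is exactly what the table-filling phase needs.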

  13. Preliminary: 2-party maximization protocol in a split format
• Alice's share + Amazon's share = max(Alice1 + Amazon1, Alice2 + Amazon2)
• Each input value is additively split between Alice and Amazon, and the protocol outputs fresh shares of the maximum
• Comparison can be done similarly [Frikken Atallah 04]
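A non-private reference implementation of the split-format interface: each value is the sum of Alice's and the provider's shares, and the output is a fresh random splitting of the maximum. The real protocol of [Frikken Atallah 04] computes this without either party learning the values; this sketch only checks the share algebra:

```python
import random

def split_max(alice_shares, server_shares, modulus=2**32):
    """Reference (NON-private) split-format maximization.
    Inputs: additive shares of two values v1 and v2, i.e.
        v1 = alice_shares[0] + server_shares[0]
        v2 = alice_shares[1] + server_shares[1]
    Output: fresh additive shares (alice_out, server_out) with
        alice_out + server_out == max(v1, v2)."""
    v1 = alice_shares[0] + server_shares[0]
    v2 = alice_shares[1] + server_shares[1]
    m = max(v1, v2)
    alice_out = random.randrange(modulus)   # re-randomized output share
    return alice_out, m - alice_out

# Shares of v1 = 10 + 2 = 12 and v2 = -3 + 20 = 17
ao, so = split_max((10, -3), (2, 20))
assert ao + so == 17
```

Re-randomizing the output share matters: reusing old shares across table entries would let one party correlate entries and leak information about the other party's inputs.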

  14. Our protocol for dynamic programming of the 0/1 knapsack problem
• Recurrence: M[i, j] = M[i−1, j] if j < pi; max{M[i−1, j], M[i−1, j−pi] + ai} if j ≥ pi (with −∞ + ai as the second candidate when j < pi)
• Computed entries are encrypted and stored by the provider
• The provider splits the two candidates for M[i, j], E(M[i−1, j]) and E(M[i−1, j−pi]), while the client (Alice) holds ai
• The client and provider (e.g., Amazon) engage in a 2-party private maximization protocol to compute the maximum
• The client encrypts her share of the maximum and sends it to the provider
• The provider computes and stores the encrypted M[i, j]

  15. Our protocol for knapsack (cont'd)
• At the end of the 2-party dynamic programming, the provider has an n-by-T' table of encrypted entries, where T' = Σ pi − T
• Example: number of credentials n = 4
• How does the client find out the optimal selection of credentials?

  16. Traceback protocol: get the optimal credential selection
• The provider holds encrypted indicator entries E(F[i, j]), where F[i, j] = 0 or 1 records whether item i (items 0 through 4 in the example) is taken
• Security in a semi-honest (honest-but-curious) model
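The traceback logic, run non-privately over a plaintext table for illustration (credential values are hypothetical): whenever M[i][j] differs from M[i−1][j], credential i was taken into the kept set, and the disclosure vector is the complement. The actual protocol performs the same walk over encrypted entries:

```python
def fill_and_traceback(points, scores, T):
    """Fill the knapsack table (as in the dynamic-programming step), then
    walk it backwards from M[n][T'] to recover which credentials are kept
    (y_i = 1) and therefore which to disclose (x_i = 1 - y_i)."""
    n, Tp = len(points), sum(points) - T
    M = [[0] * (Tp + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        p, a = points[i - 1], scores[i - 1]
        for j in range(Tp + 1):
            M[i][j] = M[i - 1][j]
            if j >= p:
                M[i][j] = max(M[i][j], M[i - 1][j - p] + a)
    y, j = [0] * n, Tp
    for i in range(n, 0, -1):
        if M[i][j] != M[i - 1][j]:   # credential i was kept private
            y[i - 1] = 1
            j -= points[i - 1]       # step left by its point value
    return [1 - yi for yi in y]      # x: credentials to disclose

x = fill_and_traceback([6, 4, 3, 2], [5, 2, 3, 1], 10)
# Optimal disclosure here: C1 and C2 (10 points, minimum sensitivity)
```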

  17. Security and efficiency of our privacy-preserving knapsack computation
• Informally, security means that private information is not leaked
• Security definitions: semi-honest adversarial model; a protocol securely implements a function f if the views of the participants are simulatable with an ideal implementation of the protocol
• Theorem: The basic protocol of the private two-party dynamic programming computation in the point-based trust management model is secure in the semi-honest adversarial model.
• Theorem: The communication complexity between the provider and the client of our basic secure dynamic programming protocol is O(nT'), where n is the total number of credentials and T' is the marginal threshold.

  18. Fingerprint protocol: an improved traceback protocol
• We want to exclude the provider from the traceback, to prevent tampering and reduce costs
• 1. Fill the knapsack table; 2. the provider sends the (encrypted) last entry; 3. the client decrypts and identifies the optimal credential selection
• The fingerprint protocol is a general solution for traceback in dynamic programming

  19. Fingerprint protocol (cont’d)

  20. Application of point-based authorization: fuzzy location query in presence systems
• Alice's mom, Alice's boss, and Alice's ex each ask: "Where is Alice?"
• Each querier's points determine how precise an answer is revealed

  21. Related work
• Hidden credentials [Bradshaw Holt Seamons 04, Frikken Li Atallah 06]
• Private policy negotiation [Kursawe Neven Tuyls 06], optimizing trust negotiation [Chen Clarke Kurose Towsley 05], trust negotiation protocols/frameworks [Winsborough Seamons Jones 00, Yu Ma Winslett 00, Winsborough Li 02, Li Du Boneh 03, Li Li Winsborough 05]
• Anonymous credential approaches [Chaum 85, Camenisch Lysyanskaya 01]
• Secure multiparty computation [Atallah Li 04, Atallah Du 01]
• OCBE [Li Li 06]
• MANET trust [Zouridaki Mark Hejmo Thomas 05]
• Platform for Privacy Preferences (P3P) [W3C]

  22. Conclusions and future work
• Our point-based model allows a client to choose the optimal selection of credentials
• We presented a private 2-party protocol for the knapsack problem
• Our fingerprint protocol is a general solution for traceback
• Future work
  • Add typing to credentials
  • Reputation systems and points
