Trust and Privacy in Authorization


Presentation Transcript


  1. Trust and Privacy in Authorization
  Bharat Bhargava, Yuhui Zhong, Leszek Lilien
  CERIAS Security Center, CWSA Wireless Center
  Department of CS and ECE, Purdue University
  Supported by NSF IIS-0209059 and NSF IIS-0242840

  2. Applications/Broad Impacts
  • Guidelines for the design and deployment of security-sensitive applications in next-generation networks
  • Data sharing for medical research and treatment
  • Collaboration among government agencies for homeland security
  • Transportation systems (travel security checks, hazardous material disposal)
  • Collaboration among government officials, law enforcement, security personnel, and health care facilities during bio-terrorism and other emergencies

  3. Trust-based Authorization
  Authorization based on:
  • Role-Based Access Control model
  • Uncertain evidence
  • Dynamic trust
  Authorization process considering:
  • Tradeoff between privacy and trust

  4. A. Trust-based Authorization
  • Problem
    • Dynamically establish and maintain trust among entities in an open environment
  • Research directions
    • Handling uncertain evidence
    • Modeling dynamic trust
  • Challenges
    • Uncertain information complicates inference
    • Subjectivity leads to varying interpretations of the same information
    • Trust is multi-faceted and context-dependent; hence trust modeling requires tradeoffs: representation comprehensiveness vs. computational simplicity

  5. Uncertain Evidence
  • Evaluating the uncertainty of a role-assignment policy given a set of uncertain evidence
  • Probability-based approach
    • Atomic formulas: Bayesian network + causal inference + conditional-probability interpretation of opinions
    • AND/OR expressions: rules [Jøsang'01]
    • Subjectivity handled by the discounting operator [Shafer'76]
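The discounting operator mentioned above can be illustrated in subjective-logic terms: a recommender's opinion about some evidence is weighted by the relying party's opinion about the recommender, which deflates belief and inflates uncertainty. This is a minimal sketch of Jøsang-style discounting; the `Opinion` class and names are illustrative, not the project's API.

```python
# Sketch of a subjective-logic discounting operator [Jøsang'01].
# An opinion is a (belief, disbelief, uncertainty) triple summing to 1.
from dataclasses import dataclass

@dataclass
class Opinion:
    belief: float       # b
    disbelief: float    # d
    uncertainty: float  # u, with b + d + u = 1

def discount(a_about_b: Opinion, b_about_x: Opinion) -> Opinion:
    """A's derived opinion about x, obtained via recommender B."""
    b = a_about_b.belief * b_about_x.belief
    d = a_about_b.belief * b_about_x.disbelief
    # A's distrust and uncertainty in B both become uncertainty about x.
    u = (a_about_b.disbelief + a_about_b.uncertainty
         + a_about_b.belief * b_about_x.uncertainty)
    return Opinion(b, d, u)

ab = Opinion(0.8, 0.1, 0.1)  # A's opinion about recommender B
bx = Opinion(0.6, 0.2, 0.2)  # B's opinion about evidence x
ax = discount(ab, bx)        # A's discounted opinion about x
```

Note that discounting never increases belief: the less A trusts B, the more of B's report ends up as uncertainty rather than belief.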

  6. Dynamic Trust
  • Trust established based on direct interaction
    • Identify behavior patterns and their characteristic features
    • Determine which pattern best matches the current interaction sequence
    • Develop algorithms for establishing trust
    • Unique feature: we consider behavior patterns
  • Reputation evaluation
    • Choose reputation information providers
    • Scale reputation ratings, e.g., Bob's 0.7 means 0.5 to Alice but 0.8 to Carol
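The rating-scaling point above (Bob's 0.7 reading as 0.5 to Alice but 0.8 to Carol) can be sketched as a linear alignment of rating ranges: a provider's rating is mapped from the provider's observed range onto the requester's local range. The linear scheme and the example ranges are assumptions for illustration, not the paper's algorithm.

```python
# Hedged sketch: rescale a reputation rating from the provider's observed
# rating range onto the requester's local rating range.
def rescale(rating, provider_min, provider_max, local_min, local_max):
    """Linearly map a rating between two rating ranges."""
    span = provider_max - provider_min
    if span == 0:
        return (local_min + local_max) / 2  # degenerate provider range
    frac = (rating - provider_min) / span
    return local_min + frac * (local_max - local_min)

# Bob rates on an observed [0.4, 1.0] range (hypothetical values).
alice_view = rescale(0.7, 0.4, 1.0, 0.3, 0.7)  # Bob's 0.7 -> 0.5 for Alice
carol_view = rescale(0.7, 0.4, 1.0, 0.6, 1.0)  # Bob's 0.7 -> 0.8 for Carol
```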

  7. TERA Architecture

  8. Trust-Enhanced Role Assignment (TERA) Prototype
  • Trust-enhanced role mapping (TERM) server assigns roles to users based on
    • Uncertain and subjective evidence
    • Dynamic trust
  • Reputation server
    • Dynamic trust information repository
    • Evaluates reputation from trust information using algorithms specified by the TERM server
  Prototype and demo are available at http://www.cs.purdue.edu/homes/bb/NSFtrust/

  9. B. Trading Privacy for Trust
  • Problems
    • Minimize the loss of privacy necessary to gain the required level of trust
    • Control dissemination of "traded" private data
  • Research directions
    • Measuring privacy
    • Modeling the privacy-trust tradeoff
    • Controlling private data dissemination
  • Challenges
    • Specify policies through metadata and establish guards as procedures
    • Efficient implementation: self-descriptiveness, apoptosis, evaporation
    • Define context-dependent privacy disclosure policies, depending on who will get the information, its possible uses, information disclosed in the past, etc.
    • Propose more universal privacy metrics (existing ones are usually ad hoc and customized)
  Details at: http://www.cs.purdue.edu/homes/bb/priv_trust_cerias.ppt

  10. Privacy Metrics
  • Determine the degree of data privacy
    • Size-of-anonymity-set metrics
    • Entropy-based metrics
  • Privacy metrics should account for:
    • Dynamics of legitimate users
    • Dynamics of violators
    • Associated costs
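The entropy-based metric above can be illustrated directly: privacy is measured as the attacker's uncertainty (in bits) about which member of the anonymity set is the data subject. A uniform distribution over the set gives maximal privacy; any attacker knowledge that skews the distribution lowers it. The function below is a generic Shannon-entropy sketch, not the project's specific metric.

```python
# Entropy-based privacy metric sketch: Shannon entropy (in bits) of the
# attacker's probability assignment over the anonymity set.
import math

def anonymity_entropy(probabilities):
    """Attacker's uncertainty about the subject's identity, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

uniform = anonymity_entropy([0.25] * 4)            # 4 members, no side knowledge
skewed = anonymity_entropy([0.7, 0.1, 0.1, 0.1])   # side knowledge singles one out
```

With four equally likely members the metric yields 2 bits; the skewed distribution yields strictly less, quantifying the privacy lost to the attacker's extra knowledge.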

  11. Privacy-Trust Tradeoff
  • Gain the required level of trust with minimal privacy loss
  • Build trust based on users' digital credentials that contain private information
  • Formulate the privacy-trust tradeoff problem
    • Estimate privacy loss due to disclosing a set of credentials
    • Estimate trust gain due to disclosing credentials
    • Develop algorithms that minimize privacy loss for a required trust gain
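The optimization problem formulated above can be sketched with a toy greedy heuristic: disclose credentials in order of best trust gain per unit of privacy lost until the required trust is reached. The credential names, gains, and losses are hypothetical, and a real solver would treat this as a knapsack-style optimization rather than a single greedy pass.

```python
# Illustrative greedy sketch of the privacy-trust tradeoff: reach a required
# trust gain while keeping the total privacy loss small.
def select_credentials(credentials, required_gain):
    """credentials: list of (name, trust_gain, privacy_loss) triples."""
    # Rank by privacy cost per unit of trust gained (cheaper first).
    ranked = sorted(credentials, key=lambda c: c[2] / c[1])
    chosen, gain, loss = [], 0.0, 0.0
    for name, g, p in ranked:
        if gain >= required_gain:
            break
        chosen.append(name)
        gain += g
        loss += p
    return chosen, gain, loss

# Hypothetical credentials: (name, trust gain, privacy loss).
creds = [("address", 0.2, 0.5), ("employer", 0.3, 0.3), ("credit_score", 0.5, 0.9)]
chosen, gain, loss = select_credentials(creds, required_gain=0.4)
```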

  12. Controlling Private Data Dissemination
  • Design self-descriptive private objects
  • Construct a mechanism for apoptosis of private objects (apoptosis = clean self-destruction)
  • Develop proximity-based evaporation of private objects
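The two mechanisms above can be sketched together: apoptosis irreversibly destroys the payload, while evaporation gradually reduces the detail visible as the object travels farther from its original guardian. The class and the linear decay rule are assumptions for illustration; the prototype's actual mechanisms may differ.

```python
# Sketch of apoptosis and proximity-based evaporation for a private object.
class PrivateObject:
    def __init__(self, data, max_distance):
        self.data = data
        self.max_distance = max_distance  # beyond this, nothing is visible

    def apoptosis(self):
        """Clean self-destruction: irreversibly drop the payload."""
        self.data = None

    def visible_at(self, distance):
        """Evaporation: visible detail decays linearly with distance."""
        if self.data is None or distance >= self.max_distance:
            return None
        keep = 1 - distance / self.max_distance
        return self.data[: max(1, int(len(self.data) * keep))]

obj = PrivateObject("1234-5678-9012-3456", max_distance=4)
near = obj.visible_at(1)  # close to the guardian: most digits visible
far = obj.visible_at(3)   # far away: heavily evaporated
obj.apoptosis()           # clean self-destruction: nothing left to disclose
```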

  13. Examples of Proximity Metrics
  [Figure: a proximity graph with Bank I as the original guardian; edges labeled with distances 1 to Banks II-III, 2 to Insurance Companies A-C, and 5 to Used Car Dealers 1-3.]
  • Examples of one-dimensional distance metrics
    • Distance ~ business type
    • Distance ~ distrust level: more trusted entities are "closer"
  • Multi-dimensional distance metrics
    • Security/reliability as one of the dimensions
  • If a bank is the original guardian, then:
    • any other bank is "closer" than any insurance company
    • any insurance company is "closer" than any used car dealer
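The business-type metric above can be sketched as a symmetric distance table. The numeric distances mirror the edge labels in the slide's figure (1 for other banks, 2 for insurance companies, 5 for used car dealers when a bank is the guardian); the table and function names are illustrative only.

```python
# Sketch of a one-dimensional, business-type proximity metric.
# Distances follow the slide's example with a bank as original guardian.
DISTANCES = {
    ("bank", "bank"): 1,
    ("bank", "insurance"): 2,
    ("bank", "used_car_dealer"): 5,
}

def proximity(guardian_type, requester_type):
    """Symmetric lookup of the distance between two business types."""
    return DISTANCES.get((guardian_type, requester_type),
                         DISTANCES.get((requester_type, guardian_type)))
```

Evaporation would then key off this distance: the larger `proximity()` returns, the less of the private object remains visible to the requester.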

  14. Private and Trusted System (PRETTY) Prototype
  [Figure: PRETTY architecture diagram showing numbered interactions (1)-(4), with sub-steps [2a]-[2d] and [2c1]-[2c2], among the user application, server application, and TERA server; a user role is assigned in step (3). TERA = Trust-Enhanced Role Assignment.]

  15. Information Flow in PRETTY
  1. User application sends a query to the server application.
  2. Server application sends user information to the TERA server for trust evaluation and role assignment.
  3. If a higher trust level is required for the query, the TERA server sends a request for more user credentials to the privacy negotiator.
  4. Based on the server's privacy policies and the credential requirements, the server's privacy negotiator interacts with the user's privacy negotiator to build a higher level of trust.
  5. The trust gain and privacy loss evaluator selects credentials that will increase trust to the required level with the least privacy loss. The calculation considers credential requirements and credentials disclosed in previous interactions.
  6. According to privacy policies and the calculated privacy loss, the user's privacy negotiator decides whether or not to supply the credentials to the server.
  7. Once the trust level meets the minimum requirements, appropriate roles are assigned to the user for execution of the query.
  8. Based on the query results, the user's trust level, and privacy policies, the data disseminator determines: (i) whether to distort the data and, if so, to what degree, and (ii) what privacy-enforcement metadata should be associated with it.
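The negotiation loop in steps 2-7 can be condensed into a toy, runnable sketch. All names and the trust/loss arithmetic are placeholder stand-ins for the prototype's components, reduced to a few lines per step.

```python
# Condensed sketch of PRETTY's information flow (steps 2-7 above).
def tera_evaluate(disclosed_credentials):
    """(2)/(4) Toy trust evaluation: each disclosed credential adds 0.2 trust."""
    return min(1.0, 0.2 * len(disclosed_credentials))

def handle_query(required_trust, available_credentials, max_privacy_loss):
    """Return the credentials disclosed to run the query, or None on refusal."""
    disclosed, privacy_loss = [], 0.0
    remaining = list(available_credentials)           # (name, privacy_loss) pairs
    while tera_evaluate(disclosed) < required_trust:  # (3) higher trust needed
        if not remaining:
            return None                               # trust cannot be reached
        name, loss = min(remaining, key=lambda c: c[1])  # (5) least privacy loss
        if privacy_loss + loss > max_privacy_loss:    # (6) user's policy refuses
            return None
        remaining.remove((name, loss))
        disclosed.append(name)
        privacy_loss += loss
    return disclosed                                  # (7) roles assigned, query runs

creds = [("email", 0.1), ("employer", 0.3), ("ssn", 0.9)]
result = handle_query(required_trust=0.4, available_credentials=creds,
                      max_privacy_loss=0.5)
```

Here two cheap credentials suffice to reach the trust threshold, so the expensive one is never disclosed; if the user's privacy budget cannot cover the needed trust, the negotiation ends with a refusal instead.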
