
Presentation overview


Presentation Transcript


  1. Presentation overview • Introduction to automated privacy and identity management • Ontologies: what they are, how they can help • Conceptual mediation: lawyers, users, businesses • Ontologies and reasoning: anonymizing access control • Reasoning in access control demo

  2. A typical human readable privacy policy (http://privacy.yahoo.com/)

  3. Automating privacy protection: Scenario 1: Client-Side Architecture

  4. Example XML Statement in a P3P Policy
     <STATEMENT>
       <PURPOSE> <admin/> <develop/> <pseudo-decision/> </PURPOSE>
       <RECIPIENT> <ours/> </RECIPIENT>
       <RETENTION> <indefinitely/> </RETENTION>
       <DATA-GROUP>
         <DATA ref="#dynamic.cookies">
           <CATEGORIES> <preference/> <navigation/> </CATEGORIES>
         </DATA>
       </DATA-GROUP>
     </STATEMENT>
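The point of a statement like this is that software can consume it directly. As a minimal illustration (not part of the original slides; the dictionary keys are my own), the standard-library XML parser is enough to pull the machine-readable fields out of the statement above:

```python
# Sketch: extracting the machine-readable fields of the P3P STATEMENT above
# using only the Python standard library. Element names follow the statement
# as printed on the slide; the returned dict keys are illustrative.
import xml.etree.ElementTree as ET

P3P_STATEMENT = """
<STATEMENT>
  <PURPOSE><admin/><develop/><pseudo-decision/></PURPOSE>
  <RECIPIENT><ours/></RECIPIENT>
  <RETENTION><indefinitely/></RETENTION>
  <DATA-GROUP>
    <DATA ref="#dynamic.cookies">
      <CATEGORIES><preference/><navigation/></CATEGORIES>
    </DATA>
  </DATA-GROUP>
</STATEMENT>
"""

def parse_statement(xml_text):
    """Return the purposes, recipients, retention and data refs of a statement."""
    root = ET.fromstring(xml_text)
    return {
        "purposes": [e.tag for e in root.find("PURPOSE")],
        "recipients": [e.tag for e in root.find("RECIPIENT")],
        "retention": [e.tag for e in root.find("RETENTION")],
        "data_refs": [d.get("ref") for d in root.iter("DATA")],
    }

stmt = parse_statement(P3P_STATEMENT)
print(stmt["purposes"])   # ['admin', 'develop', 'pseudo-decision']
```

A user agent would then compare fields like `retention` against the user's preferences, which is exactly what the APPEL rule on the next slide does declaratively.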

  5. Example APPEL Rule
     <appel:RULE behavior="block"
         description="Site sets cookies which are used beyond what is required for stated purpose">
       <p3p:POLICY>
         <p3p:STATEMENT>
           <p3p:RETENTION appel:connective="non-and">
             <p3p:stated-purpose/>
           </p3p:RETENTION>
         </p3p:STATEMENT>
       </p3p:POLICY>
     </appel:RULE>
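The intent of the rule (block when retention goes beyond the stated purpose) can be restated as a one-line check. This is a hedged paraphrase in plain Python, not an APPEL evaluator; the dict key is illustrative:

```python
# Sketch of the APPEL rule's intent: with the "non-and" connective, the rule
# fires (behavior="block") whenever the statement's retention is anything
# other than <stated-purpose/>. The statement dict is illustrative.

def evaluate_retention_rule(statement):
    """Return 'block' if retention goes beyond the stated purpose."""
    if statement.get("retention") != "stated-purpose":
        return "block"
    return "allow"

# The slide-4 statement retains data indefinitely, so this rule blocks it.
assert evaluate_retention_rule({"retention": "indefinitely"}) == "block"
assert evaluate_retention_rule({"retention": "stated-purpose"}) == "allow"
```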

  6. Automating Privacy Protection: Scenario 2: Enterprise Architecture
     [architecture diagram: data flows from a GUI through a privacy layer (ontology, rules & rule engine, privacy-based access policies) sitting above a security layer (security policies)]

  7. Scenario 3: Automated Identity Management
     [diagram: users interact with an applications framework offering single sign-on, workflow automation, access control, directory services, personalization and delegated administration, governed by management policies & profiles]

  8. Automated Identity Management Based on Credentials
     [diagram: as in the previous slide, but the user presents tokens/credentials to the applications framework]

  9. XML-based policies describe • Business practices (enterprise policies) • User preferences • Obligations • Access conditions • Audit logs

  10. Automated Privacy – Stakeholders • End users, e.g. my mother • Law enforcement, e.g. police, Data Protection Authorities, the Article 29 Working Party • Businesses: privacy concerns cost eCommerce $15 billion a year (Forrester Research) • Application developers, e.g. browser developers, EPAL implementations

  11. 4 Key Problems 1. Each group of stakeholders speaks a completely different language. E.g. many users have never heard of identity management; they just want to sign on to multiple web sites. 2. Enterprises need to be user-friendly, but at the same time control liability. 3. Existing languages are not expressive or extensible enough to model all aspects of data protection. 4. The law says you should only collect the minimum data required to carry out the service. BUT how do you work out the minimum data required? Applications are not yet intelligent enough to know what to ask for.

  12. Ontologies Ornithology: the study of birds. Oncology: the study of cancer. Onychology: the study of fingernails and toenails. Ontology: a formal, machine-readable specification of terms and their relationships in a specific domain.

  13. How Ontologies can Help Automated Privacy and IDM • Machine-readable description of concepts and the relationships between: • Data protection law • User metaphors • Enterprise business rules • Application logic → Can translate between legal-ese, user-ese, business-ese and Java/C++

  14. Alignment of Legal, User and Technical Models
     [diagram: an ontology mediates between legal rule systems, developers' program logic, end-users and the enterprise]

  15. How Ontologies can Help Automated Privacy and IDM • Richly expressive, precise and interoperable policy languages • Reasoning capabilities → more powerful policy evaluation, e.g. to figure out the minimum data required, or to accept flexible credentials • A standard language used in user interfaces, so businesses can trust policy translations

  16. How Ontologies can Help Automated Privacy and IDM • Extensible to include other ontologies (e.g. a geographical ontology for location-based services) • Language independence (privacy ↔ riservatezza) • Separating business logic, conceptual models and program logic → more efficient development

  17. Technical Details of Ontologies

  18. Description Logics Description logics are languages for describing concepts, their properties and their relations, used to build a knowledge base (e.g. a privacy policy). E.g.: • OWL (W3C standard) • RDFS (W3C standard) • DAML+OIL (www.daml.org)

  19. Semantics Semantics specify the connection between terms (names) and concepts (meanings) (see e.g. Fodor, Chomsky, and RDF Semantics: http://www.w3.org/TR/rdf-mt/)

  20. What is an ontology? Description logics describe: • Concepts: classes and subclasses, e.g. data, health data, data controller • Properties: features and attributes, e.g. isCollectedBy • Restrictions on properties and concepts, e.g. if a person is Italian and has a driving license, they are over 18; health data is a subclass of data
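The subclass part of this is easy to make concrete. Below is a minimal sketch (not from the slides; the class names and the subclass table are illustrative) of the transitive subclass reasoning a description-logic reasoner performs:

```python
# Minimal sketch of transitive subclass reasoning. Each class maps to its
# direct superclass; the table and names are illustrative.
SUBCLASS_OF = {
    "HealthData": "SensitiveData",
    "SensitiveData": "Data",
}

def is_subclass(cls, ancestor):
    """Follow subclass links transitively until the ancestor is found."""
    while cls in SUBCLASS_OF:
        cls = SUBCLASS_OF[cls]
        if cls == ancestor:
            return True
    return False

assert is_subclass("HealthData", "Data")      # health data is a kind of data
assert not is_subclass("Data", "HealthData")  # but not vice versa
```

Real reasoners handle multiple superclasses, property restrictions and consistency checking on top of this, but subsumption chains like the one above are the core mechanism.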

  21. RDF
     [diagram: religion data type (http://www.prime-project.eu.org/dpontology/religiondatatype) – is in category – Sensitive Data]
     • OWL uses RDF, a graph description language which is very well suited to describing concepts • Based on a very simple graph modelling language (the core RDF specification is only 2-3 pages long!) • "Triple" – a statement: [Subject – Predicate – Object], e.g. [Religious data – is of type – Sensitive Data] • RDF (in contrast to XML) can describe arbitrarily complex statements and relationships.

  22. OWL uses RDF to describe relationships between concepts
     [diagram: Sensitive Data, Contact Data (address, religion, email), Data, Data Controller and Data Subject linked by properties such as "subclass of", "about", "collects", "must specify" and "related/unrelated"]

  23. Policies are expressed in RDF (but XML may also be used for backward compatibility)
     [example RDF graph: a data object collected at http://p3p.jrc.it/form.php* (an email in category "contact details of", "Via Enrico Fermi" in category "street name") is transferred, performed by the data controller, for the purpose of third-party marketing, linked to the data subject]

  24. How ontologies standardize application semantics
     [the same example graph, with its terms typed against a data protection ontology and a data-typing ontology, both based on P3P]

  25. Ontology Development Tools

  26. Ontology Development Tools: Java Libraries • Jena, developed by HP Labs, provides a complete suite of Java tools for processing RDF and OWL, and for reasoning using OWL and Prolog-style rules. • Downloadable from http://jena.sourceforge.net

  27. Ontology Capture Processes • The most important factor in the success of an ontology • Methodologies: • Each concept is defined by a traceable and repeatable process • Text analysis: automated or semi-automated analysis of key documents (e.g. legislation) • Interviews and group exercises (e.g. legal modelling) • Conflict resolution methodologies: describe and resolve situations where groups disagree • Alignment of different ontologies covering similar domains

  28. Formal and Informal Ontologies • XML languages such as P3P and XACML are informal ontologies: the semantics of their terms are informally defined. E.g. P3P: <p3p:purpose> <p3p:ours/> </p3p:purpose> = current purpose, with a human-readable definition • XML is not a rigorous or complete framework for semantics, but it has a high adoption level • Informal ontologies represent a huge body of work towards conceptual consensus.

  29. Example Scenarios for Privacy and IDM • Conceptual mediation between users, lawyers and businesses • Access control: credential reasoning • Demo

  30. Users • Need to: • Specify preferences • Receive warnings • Understand policies • Using simple metaphors, e.g. the town/house metaphor

  31. Lawyers • Need to: • Ensure that business policies are compliant with legislation • Ensure that users' preferences are compliant with the law • Provide tools for businesses to check legal compliance • Using precise, unambiguous language

  32. Enterprises • Need to • Create privacy policies • Enforce privacy policies • Communicate good practice to users • Collect and store consent • Protect against liabilities • Using Precise, unambiguous business-process concepts

  33. Application developers
     String rules = "[(?d rdf:type eg:studentdoctor) (?n rdf:type eg:nurse) -> (?d eg:superiorTo ?n) (?n eg:subordinateTo ?d)]";
     rules += "[(?d rdf:type eg:surgeon) (?n rdf:type eg:studentdoctor) -> (?d eg:superiorTo ?n) (?n eg:subordinateTo ?d)]";
     rules += "[(?d eg:canShowCredential eg:drivinglicense) -> (?d eg:hasAge ?n) (?n eg:greaterThan 18)]";
     • Need to: • Implement enterprise policies consistently • Implement user preferences • Translate user metaphors into real practice • Easily updatable applications • Using pragmatic tools: Java/C++/UML/Prolog

  34. Example 1 Policy states: • Company X • DISCLOSES data about EMAIL ADDRESS • To UNRELATED THIRD PARTIES • Without CONSENT • Ontology + Rules can then translate this into descriptions and actions which are appropriate to the context:

  35. Example 1: Conceptual Alignment
     Users: "data which might lead to spam" | Applications: email address | Regulators: sensitive data

  36. Example 1: Conceptual Alignment
     Users: "I ticked a box" | Applications: consent | Regulators: consent to data processing

  37. Example 1: Conceptual Alignment
     Users: "remember my details" | Applications: cookies | Regulators: clickstream data

  38. Example 1: Conceptual Alignment
     Users: "private information" | Applications: religion, medical data, criminal record | Regulators: sensitive data

  39. Example 1: the same concepts in the policy are translated by the rules: Users: • Display a warning in language users can understand: "Warning – submitting this form could cause spam" Lawyers: • Alert service about illegal practices Application: • Don't submit any data to this company, or create a pseudonymous email address • Warn the policy creator of illegal practices (e.g. JRC Policy Editor) Business: • Change data handling practices (e.g. display legal language to users, e.g. for collecting consent)
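A crude way to picture this translation step: one policy concept, several audience-specific renderings driven by a lookup the rules populate. This sketch is mine, not from the slides; the messages and keys are illustrative:

```python
# Sketch: one policy concept translated per stakeholder. In a real system the
# translations would be derived from the ontology by rules; here they are a
# hand-written, illustrative table.
TRANSLATIONS = {
    "email-disclosed-without-consent": {
        "user": "Warning - submitting this form could cause spam",
        "lawyer": "Alert: possible illegal disclosure practice",
        "application": "Withhold data or supply a pseudonymous email address",
    }
}

def translate(concept, audience):
    """Render a policy concept in the vocabulary of the given audience."""
    return TRANSLATIONS[concept][audience]

assert "spam" in translate("email-disclosed-without-consent", "user")
```

The value of putting the table in an ontology rather than hard-coding it, as the next slide argues, is that it can then be reused, shared and standardized.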

  40. Architectural note: • All this can be done with program logic. • BUT: if you encode this knowledge in an ontology (e.g. email address leads to spam), you can • Reuse it • Share it • Standardize it • Put it under the control of the stakeholders

  41. Ontology Reasoning for Access Control • Access control applications need to be able to minimize the information required to authenticate an access request. • E.g. instead of asking for my age to access a service (e.g. a gambling service), it could check whether I can prove I have a driving license.

  42. Example 2: Anonymizing access control • I want to access a service, but I do not want to reveal my age. • The service however, needs to know that I am over 18 to satisfy legal requirements. • The service already knows that I have a driving license

  43. Example 2: anonymizing access control Suppose the service has access to an ontology which contains (e.g.) the following concepts and relationships: • Concepts: • DRIVERS LICENSE • CREDENTIAL • PERSON • Properties: • HOLDS CREDENTIAL (can exist between Persons and Credentials, e.g. Giles Hogben HOLDS CREDENTIAL a British Passport) • HAS AGE (can exist between Persons and integers, e.g. Giles Hogben HAS AGE X, where X is an integer) • Restrictions: • If a Person HOLDS CREDENTIAL a DRIVERS LICENSE → that Person HAS AGE > 18
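The restriction on this slide is a single forward-chaining rule. A minimal Python sketch of the inference (names and the rule encoding are illustrative, not the PRIME implementation):

```python
# Sketch of Example 2 as forward chaining over triples: holding a driver's
# license lets the service derive "over 18" without ever asking for the age.
facts = {("GilesHogben", "holdsCredential", "DriversLicense")}

def apply_rules(facts):
    """Derive isOver18 for anyone who holds a driver's license."""
    derived = set(facts)
    for (s, p, o) in facts:
        if p == "holdsCredential" and o == "DriversLicense":
            derived.add((s, "isOver18", "true"))
    return derived

def access_allowed(person, facts):
    """Grant access iff the derived facts prove the person is over 18."""
    return (person, "isOver18", "true") in apply_rules(facts)

# Access is granted although the person's actual age is never disclosed.
assert access_allowed("GilesHogben", facts)
assert not access_allowed("SomeoneElse", facts)
```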

  44. Example 2 • Using the above Ontology, the access control application can allow me access, without asking me what my age is, because it can deduce what it needs to know from the fact that I have a driving license.

  45. Example 3: anonymizing access control • I am a doctor and I want to access the medical records of a certain patient. • In order to have access, I must be a health professional of a grade superior to a nurse. • I can present a credential which certifies that I am a surgeon.

  46. Example 3: anonymizing access control Suppose the service has access to an ontology which contains (e.g.) the following concepts and relationships: • Concepts: • StudentDoctor (is a Doctor) • Surgeon (is a Doctor) • Nurse (is a Health Professional) • Doctor (is a Health Professional) • Health Professional • Properties: • SuperiorTo (can exist between Persons) • Restrictions: • SuperiorTo is transitive (i.e. if x SuperiorTo y and y SuperiorTo z, then x SuperiorTo z) • Student Doctors are SuperiorTo Nurses • Surgeons are SuperiorTo Student Doctors
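Because SuperiorTo is declared transitive, the reasoner only has to walk the chain Surgeon → StudentDoctor → Nurse. A stdlib sketch of that transitive check (role names illustrative, matching the slide):

```python
# Sketch of Example 3: SuperiorTo declared between roles, with the transitive
# closure computed on demand. Role names follow the slide's ontology.
SUPERIOR_TO = {
    "Surgeon": {"StudentDoctor"},
    "StudentDoctor": {"Nurse"},
}

def superior_to(role, other):
    """Transitive check: walk SuperiorTo links depth-first."""
    stack = list(SUPERIOR_TO.get(role, ()))
    seen = set()
    while stack:
        r = stack.pop()
        if r == other:
            return True
        if r not in seen:
            seen.add(r)
            stack.extend(SUPERIOR_TO.get(r, ()))
    return False

# A surgeon credential alone proves superiority to a nurse; nothing else
# about the requester needs to be revealed.
assert superior_to("Surgeon", "Nurse")
assert not superior_to("Nurse", "Surgeon")
```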

  47. Example 3 Using the above ontology, and only the fact that I can prove I am a surgeon, the application can allow me access to the patient's records. (See Java app.)

  48. What do these examples show? • Ontologies can translate between different views of the world, i.e. those of users, lawyers, enterprises and developers. • Flexible use of credentials and easy reasoning, e.g. the ability to accept the credential with greater anonymity. A further-developed ontology could make judgements about the level of anonymity of a credential, in order to select the most anonymous one.

  49. Questions ? (giles.hogben att jrc.it)

  50. Ontology-based architecture: Policy / Ontology / Application Logic • Policy contains data specific to the individual or enterprise (and may also contain rules) • Ontology defines general concepts and relationships • Application logic contains generic rules • All 3 may contain rules: ontologies hold rules valid for the whole domain (e.g. one controller per data-collection act), while policies hold rules specific to the enterprise
