Framework for Managing the Assured Information Sharing Lifecycle
2008 MURI project with UMBC, Purdue, U. Texas Dallas, U. Illinois, U. Texas San Antonio, and U. Michigan
Objectives:
• Create a new framework for assured information sharing, recognizing that sharable information has a lifecycle of production, release, advertising, discovery, acquisition and use
• Develop techniques grounded in this model to promote information sharing while maintaining appropriate security, privacy and accountability
• Evaluate, adapt and improve the AIS concepts and algorithms in relevant demonstration systems and test beds
See http://aisl.umbc.edu/ for papers and more information
AIS Lifecycle Approach
Information value chain:
• Design a service-oriented architecture to support the assured information sharing lifecycle
• Create new policy models & languages to express and enforce AIS rules & constraints
• Develop new data mining techniques and algorithms to track provenance, increase quality and preserve privacy
• Model underlying organizational social networks to estimate trust and information novelty
• Design incentive structures to motivate sharing in organizations and coalitions
Information has a lifecycle involving a web of producers and consumers. All aspects of the lifecycle are shaped by distributed information sharing policies. Integration and mining create new information that may be shared; access may involve negotiating policy-defined obligations.
Selected AISL Recent Results
• Progress on models, architectures, languages and mechanisms for trustworthiness-centric assured information sharing (UTSA, Purdue)
• Techniques for resolving conflicting facts extracted from different resources (UIUC)
• Study of information sharing motivation and quality in online forums (Michigan)
• Modeling incentives & trust in information sharing (UTD)
• Learning statistically sound trust metrics (UTD)
• Inferring access policies from logs (UMBC)
• Policies for privacy in mobile information systems (UMBC, Purdue)
Trustworthiness-centric AIS Framework
• Objective: create a trustworthiness-centric assured information sharing framework
• Approach: design models, architectures, languages and mechanisms to realize it
• Key challenges:
  • Trustworthiness and risk management for end-user decision making
  • Usage management that extends access control
  • Attack management, including trustworthiness of infrastructure services
  • Identity management extending the current generation
  • Provenance management for managing trustworthiness of data, software, and requests
Trustworthiness-centric Assured Information Sharing Framework
• Usage management (of authorized activities)
• Attack management (of unauthorized activities)
• Risk management
• Trustworthiness management
• Identity management (of people, organizations, and devices)
• Provenance management (of data, software, and requests)
Note: "trustworthiness risk" in general
Progress on Trustworthiness-centric AIS
• Initial framework will be published as: S. Xu, R. Sandhu & E. Bertino, "Trustworthiness-centric Assured Information Sharing" (invited paper), 3rd IFIP Int. Conf. on Trust Management, 2009
• Design for identity & provenance management underway
• Group-centric information sharing model extends the traditional dissemination model with new intuitive metaphors: secure meeting room and subscription service
• Developed a family of security models for the semantics of basic group operations (join, leave, add, remove) and proved security properties about them
• Results published in recent conference papers
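The group-operation semantics above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the project's actual models: it implements one "strict" variant of the secure-meeting-room metaphor, in which a member may read an object only if the object was added while the member was already in the group, and both are still present.

```python
from dataclasses import dataclass, field

@dataclass
class Group:
    """Toy group-centric sharing sketch (illustrative assumption):
    strict-join/strict-add semantics over a logical clock."""
    clock: int = 0
    joined: dict = field(default_factory=dict)   # user -> join time
    added: dict = field(default_factory=dict)    # object -> add time

    def _tick(self):
        self.clock += 1
        return self.clock

    def join(self, user):
        self.joined[user] = self._tick()

    def leave(self, user):
        self.joined.pop(user, None)

    def add(self, obj):
        self.added[obj] = self._tick()

    def remove(self, obj):
        self.added.pop(obj, None)

    def can_read(self, user, obj):
        # Readable only if the user joined before the object was added
        # and neither has since left/been removed.
        return (user in self.joined and obj in self.added
                and self.joined[user] <= self.added[obj])

g = Group()
g.join("alice")   # alice joins first
g.add("doc1")     # doc1 added afterwards -> alice may read it
g.join("bob")     # bob joins after doc1 was added -> no access
print(g.can_read("alice", "doc1"))  # True
print(g.can_read("bob", "doc1"))    # False
```

Other variants in the family (e.g., liberal join, where latecomers see earlier objects) would change only the comparison in `can_read`.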
Truth Discovery with Multiple Conflicting Information Providers [TKDE'08]
Problem: Multiple information providers may supply conflicting facts about the same object, e.g., different author names for a book. Which is the true fact?
Heuristic Rule 1: False facts on different web sites are less likely to be the same or similar, since false facts are often introduced by random factors
Heuristic Rule 2: A web site that provides mostly true facts for many objects will likely provide true facts for other objects
[Figure: bipartite graph linking web sites w1–w4 through facts f1–f5 to objects o1 and o2]
2 February 2009
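The two heuristic rules suggest an iterative fixed-point computation: fact confidence is derived from the trustworthiness of supporting sites, and site trustworthiness from the confidence of its facts. The sketch below is an illustrative simplification with made-up data, not the TKDE'08 algorithm itself:

```python
# Hypothetical claims: site -> {object: asserted value}.
claims = {
    "w1": {"book": "A. Smith"},
    "w2": {"book": "A. Smith"},
    "w3": {"book": "A. Smyth"},
}

trust = {w: 0.8 for w in claims}   # initial site trustworthiness
for _ in range(20):                # iterate toward a fixed point
    # Rule-2 direction: a fact is likely true unless every supporting
    # site is wrong (independence is an assumption here).
    conf = {}
    for w, facts in claims.items():
        for obj, val in facts.items():
            conf.setdefault((obj, val), 1.0)
            conf[(obj, val)] *= (1 - trust[w])
    conf = {k: 1 - v for k, v in conf.items()}
    # Rule-2 feedback: a site's trust is the average confidence
    # of the facts it provides.
    for w, facts in claims.items():
        trust[w] = sum(conf[(o, v)] for o, v in facts.items()) / len(facts)

best = max((v for (o, v) in conf if o == "book"),
           key=lambda v: conf[("book", v)])
print(best)  # sites agreeing on "A. Smith" reinforce each other
```

Rule 1 enters implicitly: independent false values rarely coincide, so agreement between sites boosts confidence, while the lone dissenting value stays lower.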
Truth Discovery: Framework Extension
• Multiple versions of the truth
  • Democrats vs. Republicans may have different views
• Truth may change with time
  • A player may win first but then lose
• Truth is a relative, dynamically changing judgment
  • Incremental updates with recent data in data streams
• Method: Veracity-Stream
  • Dynamic information network mining for veracity analysis in multiple data streams
• Current testing data sets
  • Google News: a dynamic news feed that provides facilities to search and browse 4,500 news sources, updated continuously
Knowledge iN: Motivation & Quality in Information Sharing
• Analyzed online Q&A forums: 2.6M questions, 4.6M answers, and interviews with 26 top answerers
• Motivations to contribute include altruism, learning, competition (via a point system), and as a hobby
• Users who contribute more often and less intermittently contribute higher quality information
• Users prefer to answer unanswered questions and to respond to incorrect answers
• See "Questions in, Knowledge iN? A Study of Naver's Question Answering Community", Nam, Ackerman, Adamic, CHI 2009
Incentives & Trust in Assured Information Sharing
• Goal: Create means of encouraging desirable behavior within an environment that lacks or cannot support a central governing agent
• Approach: Combine intelligence through a loose alliance
  • Bridges gaps due to sovereign boundaries
  • Maximizes yield of resources
  • Enables discovery of new information through correlation and analysis of the 'big picture'
  • Information is exchanged privately between two participants
• Drawbacks to sharing include misinformation and freeloading
Our Model
• Players assumed to be rational
• The game of information trading
  • Strategies: be truthful, lie, refuse to participate
  • One game played for each possible pair of players; all games played simultaneously in a single round; game repeated 'infinitely'
• Players may verify the information they received, at some cost
  • When to verify becomes an aspect of the game
  • Always verifying works poorly in light of honest equilibrium behavior, but never verifying may yield the game to lying opponents
• Add EigenTrust to the game
  • A distributed trust metric where each player asks others for their opinion of a third
  • Based on known perfect information
Simulation Results
• We set δmin = 3, δmax = 7, CV = 2
• Lie threshold is set to 6.9
• Honest behavior wins 97% of the time if all behaviors exist
• Experiments show that without LivingAgent behavior, honest behavior cannot flourish
"Incentive and Trust Issues in Assured Information Sharing", Ryan Layfield, Murat Kantarcioglu, and Bhavani Thuraisingham, International Conference on Collaborative Computing, 2008
Learning Statistically Sound Trust Scores
• Goal: Build a statistically sound trust-based scoring system for effective access control, modeled on credit scoring systems
• Approach:
  • Find appropriate predictive variables by applying concepts and methodologies used in credit scoring systems
  • Incorporate a utility function into the scoring system to set up score-related access policies
[Figure: Trust-Based Access Control Processes]
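A credit-scoring-style trust score can be sketched as a logistic model over predictive variables, with the utility function fixing the access threshold. The variables, weights, and costs below are illustrative assumptions, not the ones used in the project:

```python
import math

# Hypothetical predictive variables and weights (illustrative only).
weights = {"past_violations": -1.5, "tenure_years": 0.4, "endorsements": 0.3}
bias = -0.5

def trust_score(subject):
    """Logistic score: estimated probability of trustworthy behavior."""
    z = bias + sum(weights[k] * subject.get(k, 0) for k in weights)
    return 1 / (1 + math.exp(-z))

# Utility-based cutoff: grant when expected utility is positive.
# With benefit B for a correct grant and cost C for granting to a bad
# actor, p*B - (1-p)*C > 0  <=>  p > C / (B + C).
BENEFIT, COST = 1.0, 10.0
threshold = COST / (BENEFIT + COST)

alice = {"past_violations": 0, "tenure_years": 5, "endorsements": 3}
mallory = {"past_violations": 3, "tenure_years": 1, "endorsements": 0}
for name, subj in [("alice", alice), ("mallory", mallory)]:
    p = trust_score(subj)
    print(name, round(p, 2), "grant" if p > threshold else "deny")
```

The utility function is what makes the score actionable: raising the cost of a bad grant raises the threshold, tightening access without retraining the score itself.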
Inferring RBAC Policies
• Problem: A system whose access policy is known is more vulnerable to attacks and insider threat; attackers may infer likely policies from access observations, partial knowledge of subject attributes, and background knowledge
• Objective: Strengthen policies against discovery
• Approach: Explore techniques to propose policy theories via machine learning, such as ILP
• Results: promising initial results for simple Role-Based Access Control policies
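To make the attacker's side of this concrete, here is a deliberately simplistic stand-in for the ILP learner: grouping users by identical observed permission sets to propose candidate roles. The log and names are hypothetical, and real ILP-based inference would generalize over subject attributes as well:

```python
from collections import defaultdict

# Hypothetical access observations: (user, resource) pairs.
log = [
    ("ann", "payroll"), ("ann", "hr_db"),
    ("bob", "payroll"), ("bob", "hr_db"),
    ("carl", "repo"),
    ("dana", "repo"), ("dana", "wiki"),
]

# Collect each user's observed permission set.
perms = defaultdict(set)
for user, res in log:
    perms[user].add(res)

# Users with identical permission sets suggest a shared role.
roles = defaultdict(list)
for user, p in perms.items():
    roles[frozenset(p)].append(user)

for i, (p, users) in enumerate(sorted(roles.items(),
                                      key=lambda kv: sorted(kv[1]))):
    print(f"candidate role {i}: users={sorted(users)} perms={sorted(p)}")
```

Even this crude clustering recovers plausible role structure from logs alone, which is the vulnerability the slide's objective (strengthening policies against discovery) aims to mitigate.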
Privacy Policies for Mobile Computing
• Problem: Mobile devices collect and integrate sensitive private data about their users, which users would like to selectively share with others
• Objective: Develop a policy-based system for information sharing with an interface enabling end users to write & adapt privacy policies
• Approach: Prototype a component for iConnect on an iPhone and evaluate it in a university environment
• Example policy rules: share my exact location with my family; share current activity with my close friends, …
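The example rules above can be read as group-scoped attribute-release policies. The sketch below is an illustrative assumption about how such rules might be represented and evaluated; the group names, attributes, and rule format are hypothetical, not the prototype's actual policy language:

```python
# Hypothetical social groups assigned by the user.
groups = {
    "mom": {"family"},
    "joe": {"close_friend"},
    "eve": set(),            # stranger: no group membership
}

# "Share my exact location with my family; share current activity
# with my close friends" expressed as release rules.
policies = [
    {"attribute": "exact_location", "allow_group": "family"},
    {"attribute": "current_activity", "allow_group": "close_friend"},
]

def may_share(requester, attribute):
    """Release an attribute only if some rule covers one of the
    requester's groups."""
    return any(p["allow_group"] in groups.get(requester, set())
               for p in policies if p["attribute"] == attribute)

print(may_share("mom", "exact_location"))    # family sees location
print(may_share("joe", "exact_location"))    # friend does not
print(may_share("joe", "current_activity"))  # friend sees activity
```

An end-user interface would then only need to edit the `groups` and `policies` tables, which is what makes policies adaptable without touching the enforcement logic.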