## From Chinese Wall Security Policy Models to Granular Computing

Tsau Young (T.Y.) Lin (tylin@cs.sjsu.edu, dr.tylin@sbcglobal.net)
Computer Science Department, San Jose State University, San Jose, CA 95192, and Berkeley Initiative in Soft Computing, UC-Berkeley, Berkeley, CA 94720


**From Chinese Wall Security Policy . . .**
• The goal of this talk is to illustrate how granular computing can be used to solve a long-outstanding problem in computer security.

**Outline**
1. Overview (Main Ideas)
2. Detailed Theory
   • Background
   • Brewer and Nash's Vision
   • Formal Theory

**Overview**
• New methodology: Granular Computing
• Classical problem: Trojan Horses

**Overview - Granular Computing**
Historical notes:
1. Zadeh (1979): fuzzy sets and granularity
2. Pawlak, Tony Lee (1982): partition theory (rough sets)
3. Lin (1988/9): Neighborhood Systems (NS) and the Chinese Wall (a set of binary relations; a non-reflexive . . .)
4. Stefanowski (1989): fuzzified partitions
5. Qing Liu & Lin (1990): neighborhood systems

**Overview - Granular Computing**
Historical notes (continued):
6. Lin (1992): topological and fuzzy rough sets
7. Lin & Liu (1993): operator view of RS and NS
8. Lin & Hadjimichael (1996): non-classificatory hierarchy

**Overview - Problem-Solving Paradigm**
Divide and Conquer
1. Divide: partition (= equivalence relation)
2. Conquer: quotient sets (Bo Zhang: knowledge-level processing)
3. Could this be generalized?

**Overview - Example**
Partition: disjoint granules (equivalence classes)
• [0]4 = {. . ., 0, 4, 8, . . .} = {4n}
• [1]4 = {. . ., 1, 5, 9, . . .} = {4n+1}
• [2]4 = {. . ., 2, 6, 10, . . .} = {4n+2}
• [3]4 = {. . ., 3, 7, 11, . . .} = {4n+3}
Quotient set = Z/4 (in general, Z/m)

**Overview - New Challenge?**
Granulation: overlapping granules
• B0 = {. . ., 0, 4, 8, 12, . . ., 5, 9}
• B1 = {. . ., 1, 5, 9, . . .}
• B2 = {. . ., 2, 6, 10, . . ., 7}
• B3 = {. . ., 3, 7, 11, . . ., 6}
Quotient set?
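The contrast above can be checked mechanically. A minimal sketch in Python (the granules are finite truncations of the infinite classes on the slides, restricted to 0..15 for illustration):

```python
# A sketch contrasting the mod-4 partition with the overlapping granulation
# from the slides, using finite truncations (0..15) of the infinite classes.
N = range(16)

# Partition of Z by residue mod 4: granules are pairwise disjoint.
partition = [{n for n in N if n % 4 == r} for r in range(4)]

# Overlapping granulation from the slides: B0 also contains 5 and 9,
# B2 also contains 7, and B3 also contains 6.
granulation = [
    {n for n in N if n % 4 == 0} | {5, 9},
    {n for n in N if n % 4 == 1},
    {n for n in N if n % 4 == 2} | {7},
    {n for n in N if n % 4 == 3} | {6},
]

def pairwise_disjoint(granules):
    """True iff no two granules overlap (the defining partition property)."""
    return all(a.isdisjoint(b)
               for i, a in enumerate(granules)
               for b in granules[i + 1:])

print(pairwise_disjoint(partition))    # True: the quotient set Z/4 is well defined
print(pairwise_disjoint(granulation))  # False: no classical quotient set exists
```

Because the granules overlap, the classical quotient-set construction fails, which is exactly the "new challenge" the talk addresses.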
**Overview - Granular Computing: A New Paradigm?**
• The classical paradigm is unavailable for general granulation.
• Research direction: a new paradigm?

**Overview - Granular Computing, a New Problem-Solving Paradigm**
Divide and Conquer (incremental development)
1. Divide: granulation (a binary relation); topological partition
2. Conquer: topological quotient set

**Application - New Paradigm?**
Report: applying incremental progress in granulation to a classical problem in computer security.

**Overview - Trojan Horses**
• Classical problem: Trojan horses, e.g., virus propagation.

**Overview - Trojan Horses**
• Grader G is a conscientious student but lacks computer skills, so a classmate C sets up a tool box for him that includes, e.g., an editor, a spreadsheet, . . .

**Overview - Trojan Horses**
• C embeds a "copy program" in G's tools; it sends a copy of G's file to C (the university system normally allows students to exchange information).

**Overview - Trojan Horses**
• As the Grader is not aware of such Trojan horses, he cannot stop them;
• the system has to stop them! Can it?

**Overview - Trojan Horses**
Can it?
• In general, NO.
• With constraints, YES: the Chinese (Great) Wall Security Policy.

**Overview - Trojan Horses**
• A direct information flow (DIF) is a single step; a CIF, a sequence of DIFs, leaks the information "legally"!
• (Diagram: Grader -DIF-> Trojan horse -DIF-> . . .; Professor -CIF-> Student.)

**Overview**
• End of Overview

**Details**
Background

**Background**
In the UK, a financial service company may be consulted by competing companies; therefore it is vital to have a lawfully enforceable security policy.
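The "legal" leak above can be modeled as reachability in a directed graph of DIFs. A minimal sketch, assuming the edge set below for the slides' Grader scenario (the exact flows are an illustrative assumption, not taken from the deck):

```python
from collections import deque

# Assumed DIF edges for the scenario: the professor's data flows to the
# grader, and the embedded "copy program" forwards the grader's file to C.
DIF = {
    ("Professor", "Grader"),
    ("Grader", "Student C"),  # the Trojan horse
}

def cif(dif, x, y):
    """Composite information flow: is there a chain of DIFs from x to y?"""
    graph = {}
    for a, b in dif:
        graph.setdefault(a, set()).add(b)
    seen, queue = {x}, deque([x])
    while queue:
        for nxt in graph.get(queue.popleft(), ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return y in seen

print(("Professor", "Student C") in DIF)   # False: no single direct flow
print(cif(DIF, "Professor", "Student C"))  # True: the sequence of DIFs leaks it
```

Each individual DIF is legal, yet their composition moves information across a boundary no single agent could cross, which is why the system, not the user, must stop it.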
**Background**
• Brewer and Nash (BN) proposed the Chinese Wall Security Policy Model (CWSP) in 1989 for this purpose.

**Background**
• The idea of CWSP was, and still is, fascinating;
• unfortunately, BN made a technical error.

**Outline**
• BN's Vision

**BN: Intuitive Wall Model**
• Build a set of impenetrable Chinese Walls among company datasets so that
• no corporate data that are in conflict can be stored on the same side of a Wall.

**Policy: Simple CWSP (SCWSP)**
"Simple Security": BN asserted that "people (agents) are only allowed access to information which is not held to conflict with any other information that they (agents) already possess."

**Could the Policy Enforce the Goal?**
• "YES" was BN's intent, but there is a technical flaw;
• yes, but it relates to an outstanding, difficult problem in computer security.

**First Analysis**
Simple CWSP (SCWSP): no single agent can read data X and Y that are in CONFLICT.
Is SCWSP adequate?

**Formal Simple CWSP**
SCWSP says that a system is secure if
• (X, Y) ∈ CIR implies X NDIF Y
• (X, Y) ∉ CIR implies X DIF Y (need-to-know may apply)
where CIR is the Conflict of Interest binary Relation.

**More Analysis**
SCWSP requires that no single agent can read X and Y,
• but it does not exclude the possibility that a sequence of agents may read them.
Is it secure?

**Aggressive CWSP (ACWSP)**
The Intuitive Wall Model implicitly requires that no sequence of agents can read X and Y:
A0 reads X = X0 and X1, A1 reads X1 and X2, . . ., An reads Xn = Y.

**Can SCWSP Enforce ACWSP?**
• Related to a classical problem: Trojan horses.

**Current State**
1. BN-Theory (rough computing): failed
2. Granular computing method

**Formal Model**
• When an agent who has read both X and Y makes a decision about Y,
• information in X may be used, consciously or unconsciously.

**Formal Model (DIF)**
So the fair assumptions are: if the same agent can read X and Y, then
• X has direct information flow into Y, in notation X DIF Y,
• and also Y DIF X.
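A minimal sketch of the formal condition. The agents' read sets and the conflict pairs below are illustrative assumptions; SCWSP holds iff no single agent has read two datasets that are in CIR:

```python
# Assumed example data: which datasets each agent has read.
reads = {
    "A0": {"Oil-1", "Bank-1"},
    "A1": {"Oil-2", "Bank-1"},
}

# Assumed conflict-of-interest relation, closed under symmetry.
CIR = {("Oil-1", "Oil-2"), ("Oil-2", "Oil-1")}

def scwsp_holds(reads, cir):
    """Simple CWSP: (X, Y) in CIR implies X NDIF Y, i.e. no single agent
    may read both X and Y when they are in conflict."""
    return not any(x in datasets and y in datasets
                   for datasets in reads.values()
                   for (x, y) in cir)

print(scwsp_holds(reads, CIR))  # True: no agent reads both oil datasets
reads["A0"].add("Oil-2")        # A0 now reads conflicting data
print(scwsp_holds(reads, CIR))  # False: SCWSP is violated
```

Note that this checks only the single-agent condition; as the next slides show, it says nothing about chains of agents.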
**Formal Simple CWSP**
SCWSP says that a system is secure if
• (X, Y) ∈ CIR implies X NDIF Y
• (X, Y) ∉ CIR implies X DIF Y
where CIR is the Conflict of Interest binary Relation.

**Composite Information Flow**
A composite information flow (CIF) is a sequence of DIFs
X = X0 DIF X1 DIF . . . DIF Xn = Y,
and we write X CIF Y. NCIF: no CIF.

**Formal Aggressive CWSP**
Aggressive CWSP says that a system is secure if
• (X, Y) ∈ CIR implies X NCIF Y
• (X, Y) ∉ CIR implies X CIF Y

**The Problem**
Does Simple CWSP imply Aggressive CWSP?
This is a malicious-Trojan-horse problem.

**Needed: ACWSP Theorem**
• Theorem: If CIR is anti-reflexive, symmetric, and anti-transitive, then
• Simple CWSP implies Aggressive CWSP.

**Solution**
• BN's solution
• The GrC solution

**BN-Theory (Failed)**
BN assumed:
• Corporate data are decomposed into Conflict of Interest classes (CIR-classes)
• (this implies CIR is an equivalence relation).

**BN-Theory**
BN's assumption: CIR-classes
(Figure: three disjoint classes, e.g., Class A = {l, m, n}, Class B = {i, j, k}, Class C = {f, g, h})

**BN-Theory**
• Can they be partitioned?
• (Example: France, Germany; US, Russia; where does the UK belong?)

**BN-Theory**
• Is CIR an equivalence relation? NO (we will prove this).

**Some Mathematics**
A partition corresponds to an equivalence relation.
(Figure: the same three disjoint classes as above)

**Some Mathematics**
Partition and equivalence relation are interchangeable:
• X ~ Y (equivalence relation) if and only if
• both belong to the same class/granule.

**Equivalence Relation**
A generalized identity:
• X ~ X (reflexive)
• X ~ Y implies Y ~ X (symmetric)
• X ~ Y, Y ~ Z implies X ~ Z (transitive)

**Is CIR Symmetric?**
• US (conflict) USSR implies
• USSR (conflict) US?
• YES

**Is CIR Transitive?**
• US (conflict) Russia
• Russia (conflict) UK
• UK (conflict) US? NO
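The US/Russia/UK counterexample above can be checked mechanically. A sketch, where the CIR pairs encode only the conflicts stated on the slides, closed under symmetry:

```python
# Conflicts from the slides, closed under symmetry.
DOMAIN = {"US", "Russia", "UK"}
CIR = {("US", "Russia"), ("Russia", "US"),
       ("Russia", "UK"), ("UK", "Russia")}

def is_reflexive(r, domain):
    return all((x, x) in r for x in domain)

def is_symmetric(r):
    return all((b, a) in r for (a, b) in r)

def is_transitive(r):
    return all((a, c) in r
               for (a, b) in r for (b2, c) in r if b == b2)

print(is_symmetric(CIR))          # True
print(is_reflexive(CIR, DOMAIN))  # False: CIR is anti-reflexive
print(is_transitive(CIR))         # False: US~Russia and Russia~UK, but not US~UK

# Since CIR is neither reflexive nor transitive, it is not an equivalence
# relation, so BN's assumed decomposition into CIR-classes cannot hold.
```

Note that the ACWSP theorem requires the stronger property of anti-transitivity; the check above only exhibits one failure of transitivity.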
