
An SLA Evaluation Methodology in Service Oriented Architectures


Presentation Transcript


  1. An SLA Evaluation Methodology in Service Oriented Architectures V. Casola, A. Mazzeo, N. Mazzocca, M. Rak University of Naples “Federico II”, Italy Second University of Naples, Italy

  2. Outline • Context • Objectives • Methodology • Policy Formalization • Evaluation technique • Applicability • Conclusions • Future Works

  3. Context: service cooperation, a trust point of view • Services in Service Oriented Architectures are capable of intelligent interaction and can discover and compose themselves into more complex services; • Emerging technologies and standards allow dynamic service composition to offer advanced services; • The open issue is: how to guarantee the “quality” of a service built at run-time in a potentially untrusted domain?

  4. Context: Service Level Agreement • At present, these problems are addressed by an explicit agreement among services: • Each service defines its own Service Level Agreement and publishes it in a public document; • People from the various organizations that want to cooperate manually evaluate the different SLAs and decide whether or not to agree. • SLAs are expressed as free-text documents containing “quality of service” and “security” parameters; they can also be used to decide whether to extend trust to other services (cf. “qualified” services).

  5. Objectives • To introduce a methodology to formalize SLAs and evaluate the associated quality/security level through the definition of a metric function; • Automatic adoption of the methodology helps in: • The initial agreement among cooperating services (when a service must be “qualified” to join an existing, qualified Cooperative Connection System); • The run-time agreement among services (when the aggregation of services takes place in an open network and services are located through a public registry).

  6. Methodology – target and applicability context • We have defined a methodology to: • Express security through a semi-formal, unambiguous policy; the chosen formalization must be “easy to adopt” for both technical and organizational people; • Evaluate the security level that a security infrastructure is able to guarantee, by aggregating the security associated with all policy provisions; • Compare different services according to the measured security level.

  7. The Reference Evaluation Model (REM) components The methodology core is the REM definition: REM = <Formalization, Technique, Reference Levels> • [Formalization] represents the semi-formal representation of the policy. The chosen formalization affects the final evaluation and takes into account technical and organizational aspects; • [Technique] represents the evaluation technique that can be applied to compare policies; the evaluation technique strictly depends on the policy’s formal representation; • [Reference Levels] are instances of policies which represent different security levels.

  8. Policy Formalization (1) Policy formalization needs to be: Unambiguous (this is a problem for high-level, semantically rich languages), Correct with respect to the described system, and Complete. • Textual provisions have been structured and refined at a fine grain, and a grammar of enumerative data-types has been proposed, thus reducing semantic complexity; • The defined data-structures are new atomic or enumerative types, and a total order relation among their values has been defined.

  9. Policy Formalization (2) • We have associated a Local Security Level with each provision instance (applying different security metrics to each one); • Example: Data-type: Private_Key_Protection_mechanism Enumerative and ordered values: No Protection < Protection on Floppy < Protection on Smart Card < Protection on Smart Card with Biometric Sensor
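The enumerative, totally ordered data-type above can be sketched in code. This is a minimal illustration, not the paper's implementation; the class and member names are assumptions mirroring the slide's example values.

```python
from enum import IntEnum

class PrivateKeyProtection(IntEnum):
    """Hypothetical sketch of the provision data-type from the slide:
    integer values encode the total order relation among the values."""
    NO_PROTECTION = 0
    FLOPPY = 1
    SMART_CARD = 2
    SMART_CARD_BIOMETRIC = 3

# The total order lets two policy instances be compared provision-wise,
# and the integer value can serve as the provision's Local Security Level.
assert PrivateKeyProtection.FLOPPY < PrivateKeyProtection.SMART_CARD
```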

  10. Policy Formalization (3) • The proposed structure is a hierarchical tree represented by an XML document; • Tree nodes identify complex security provisions, leaves identify simple security provisions.
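The hierarchical tree of provisions might look like the following sketch, built with Python's standard XML library; the element and attribute names are illustrative assumptions, not the paper's schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical policy tree: inner nodes are complex provisions,
# leaves are simple provisions carrying a concrete value.
policy = ET.Element("policy")
key_mgmt = ET.SubElement(policy, "provision", name="Key_Management")  # complex node
ET.SubElement(key_mgmt, "provision",
              name="Private_Key_Protection_mechanism",
              value="Protection on Smart Card")  # simple provision (leaf)

# Leaves are the provisions with no children.
leaves = [p for p in policy.iter("provision") if len(p) == 0]
```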

  11. The Evaluation Technique • How to quantify the system security? • The introduced technique is based on the definition of a metric policy space and a distance criterion by which we can represent and compare different policies. • After the policy formalization, each provision is represented by an enumerative, ordered data-type with its Local Security Level. • After the evaluation, the whole policy is represented by an aggregated value (Global Security Level).

  12. The metrical space technique • The policy space is made homogeneous thanks to threshold functions (F-functions), which allow a Local Security Level to be associated with each provision; • The policy space is represented by an n × 4 matrix; • The distance criterion for the definition of the metric space is the Euclidean distance among matrices, defined as: d(A, B) = √(σ(A − B, A − B)), where σ(A, B) = Tr(A·Bᵀ)
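Since Tr(A·Bᵀ) is simply the sum of element-wise products, the distance above reduces to the Frobenius (Euclidean) norm of A − B. A minimal stdlib-only sketch of the two definitions, with matrices as lists of rows:

```python
import math

def sigma(A, B):
    # σ(A, B) = Tr(A · Bᵀ) = sum over all i, j of A[i][j] * B[i][j]
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def distance(A, B):
    # d(A, B) = sqrt(σ(A − B, A − B)): Euclidean distance among matrices
    diff = [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
    return math.sqrt(sigma(diff, diff))
```

For example, two policy matrices differing in a single 0/1 entry are at distance 1.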

  13. The metrical space technique: the policy matrix • The policy space is represented by an n × 4 matrix (total number of provisions by the number of Local Security Levels). Example rows:
      Revocation request grace period                  1 1 1 0
      CRL issuance frequency                           1 1 1 0
      CRL checking requirements                        1 1 1 0
      Site location, construction and physical access  1 1 0 0
      CA trusted roles                                 1 1 1 0
      LRA trusted roles                                1 1 0 0
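Reading the example matrix, each row appears to be a thresholded encoding of a provision's Local Security Level: level k yields k ones followed by zeros (e.g. "1 1 1 0" for level 3 of 4). This F-function-style encoding is an assumption inferred from the slide, sketched below:

```python
def lsl_row(level, n_levels=4):
    # Threshold encoding of a Local Security Level: level k -> k ones,
    # then zeros (assumption inferred from the slide's example matrix).
    return [1 if i < level else 0 for i in range(n_levels)]

# The six example provisions from the slide as a 6 x 4 policy matrix.
matrix = [lsl_row(3),  # Revocation request grace period
          lsl_row(3),  # CRL issuance frequency
          lsl_row(3),  # CRL checking requirements
          lsl_row(2),  # Site location, construction and physical access
          lsl_row(3),  # CA trusted roles
          lsl_row(2)]  # LRA trusted roles
```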

  14. Reference Levels The last component of the REM is the set of reference security levels that can be used as a reference scale for the numerical evaluation of security. Example of evaluation of 4 security levels with the metrical technique: d10 = d(REFL1, ·) = 7.07; d20 = d(REFL2, ·) = 11.18; d30 = d(REFL3, ·) = 12; d40 = d(REFL4, ·) = 12.65

  15. The reference levels and the metric function The metric function for the evaluation of the Global Security Level of PX:
      if dX0 ≤ d10        ⇒ LPX = L0
      if d10 < dX0 < d20  ⇒ LPX = L1
      if d20 < dX0 < d30  ⇒ LPX = L2
      if d30 < dX0 < d40  ⇒ LPX = L3
      if d40 ≤ dX0        ⇒ LPX = L4
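The metric function can be sketched directly from the slide's conditions, using the example reference distances 7.07, 11.18, 12, and 12.65. Boundary handling at d20, d30, d40 is completed with half-open intervals, which is an assumption: the slide's strict inequalities leave those exact boundary values unassigned.

```python
def global_security_level(d_x0, refs=(7.07, 11.18, 12.0, 12.65)):
    """Map a policy's distance d_x0 (from the base level) to a Global
    Security Level L0..L4 using reference distances d10..d40.
    Sketch of the slide's metric function; boundary values between
    levels are resolved upward as an assumption."""
    d10, d20, d30, d40 = refs
    if d_x0 <= d10:
        return "L0"
    if d_x0 < d20:
        return "L1"
    if d_x0 < d30:
        return "L2"
    if d_x0 < d40:
        return "L3"
    return "L4"
```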

  16. Application of the metrical technique Scenario 1 – pre-defined cross qualification: There is a master who sets the REM components; The target service (TS) that wishes to be part of the cooperative system is evaluated against the master REM, so its policy is formatted according to it, too. The result of the evaluation is the service level that the TS can guarantee and the subset of services that can cooperate with it without degrading their quality. Scenario 2 – run-time cross qualification: There is NOT a master who can set the REM components; It is a peer-to-peer agreement, i.e. the requestor service and the provider service have the same role; Both services build their own REM; The result may differ for the two services (different REMs); in this case, the quality level is determined by who the requestor is in the specific transaction.

  17. Conclusions • SLA definition in SOA is a technical, organizational and standardization problem; • The proposed methodology aims at addressing all these aspects in a unifying way, proposing an evaluation model; • Its applicability lies in building trusted services able to automatically evaluate, at run-time, the SLAs associated with other cooperating services.

  18. Future Works • Definition of a framework for SLA management (and monitoring); • Definition of a set of trusted cooperative-services based on the methodology; • Performance-trustability trade-off evaluation.
