
Rajeev R. Raje, Andrew M. Olson, Barrett R. Bryant, Carol C. Burt, Mikhail Auguston

Presentation Transcript


  1. Rajeev R. Raje, Andrew M. Olson, Barrett R. Bryant, Carol C. Burt, Mikhail Auguston. Funded by the DoD and the Office of Naval Research under the CIP/SW Program

  2. Overview (SERC Showcase, Dec. 6, 7, 2001) • Objective • UniFrame Approach • Parts of UniFrame • UMM • Standards and OMG • QoS • Summary

  3. Objective To create a unified framework (UniFrame) that will allow the seamless integration of heterogeneous and distributed software components

  4. [Diagram: the publish-find-bind triangle. A Service Provider publishes its service to a Service Broker; a Service Requestor finds the service through the broker and binds to the provider.] Sounds like "Web Services"! • The Web Services focus is on using Internet protocols for messaging (SOAP/HTTP and XML) and a UDDI directory for locating services defined in WSDL
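
The publish-find-bind triangle is simple to express in code. Here is a minimal, self-contained Java sketch of the broker pattern described above; the names Service, ServiceBroker, and BrokerDemo are invented for illustration and are not part of UniFrame or the Web Services standards.

```java
// A minimal sketch of the publish-find-bind pattern. All names
// here are invented; they are not UniFrame or Web Services APIs.
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

interface Service {
    String name();
    String invoke(String request);
}

// The broker holds descriptions published by providers and
// answers find queries from requestors.
class ServiceBroker {
    private final List<Service> registry = new ArrayList<>();

    void publish(Service s) {             // provider -> broker
        registry.add(s);
    }

    Optional<Service> find(String name) { // requestor -> broker
        return registry.stream()
                       .filter(s -> s.name().equals(name))
                       .findFirst();
    }
}

public class BrokerDemo {
    public static void main(String[] args) {
        ServiceBroker broker = new ServiceBroker();
        broker.publish(new Service() {
            public String name() { return "echo"; }
            public String invoke(String request) { return request; }
        });
        // Bind: the requestor obtains a reference and invokes the service.
        broker.find("echo").ifPresent(s -> System.out.println(s.invoke("hello")));
    }
}
```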

  5. That leaves a lot of interesting problems… • Need for a component meta-model in support of generative techniques for mappings to existing component models • Need for multi-approach highly intelligent location services • QoS instrumentation and metrics • Unified approach to using generative techniques with strict QoS requirements • Validation of dynamic system compositions

  6. UniFrame Approach • UMM • Components, QoS, Infrastructure • GDM • Domain model, Composition/Decomposition Rules, Generative Programming • TLG • Formalism based on two-level grammars • Process • For integration

  7. Unified Meta-Model (UMM) • Component • Autonomous and non-uniform • Service and its guarantees • Offered by each component with QoS • Infrastructure • Environment • Headhunters • Internet Component Broker
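
To make the UMM's view of a component concrete (a service plus its advertised QoS guarantees), the following hypothetical Java sketch may help; the record names, fields, and example values are assumptions made for this example, not the actual UMM notation.

```java
import java.util.Map;

// Hypothetical sketch only: the records and values below are
// assumptions for this example, not the actual UMM notation.
record QoSGuarantee(double value, String unit) {}

record ComponentDescription(
        String name,                       // component identity
        String service,                    // functionality offered
        Map<String, QoSGuarantee> qos) {}  // advertised guarantees

public class UmmSketch {
    public static void main(String[] args) {
        ComponentDescription c = new ComponentDescription(
                "AccountManager",
                "validateTransaction",
                Map.of("endToEndDelay", new QoSGuarantee(50.0, "ms"),
                       "availability", new QoSGuarantee(0.999, "fraction")));
        System.out.println(c);
    }
}
```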

  8. Aspects of Components • Computational • Reflects the tasks carried out by an object • Cooperative • Expected collaborators • Auxiliary • Mobility, Security, Fault-tolerance

  9. Service and Guarantees • Each component must specify its QoS and ensure it • QoS Parameter Catalog • static – design oriented • dynamic – run-time oriented • QoS of a component/integrated DCS • based on “event traces”

  10. Infrastructure • Head-hunters • Pro-active discovery of new components • Multi-level and multi-approach • Internet Component Broker • Allows heterogeneous components to talk to one another • Analogous to the Object Request Broker • Generated adapter technology • Instrumentation as a part of the architecture
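
A head-hunter differs from a passive registry in that it proactively seeks out components. The Java sketch below shows one plausible reading of that behavior; the class and method names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: a head-hunter polls known hosts and records
// components it has not seen before. All names are invented here.
class HeadHunter {
    private final Set<String> known = new HashSet<>();

    // Returns the components that are new since the last poll.
    List<String> discover(Map<String, List<String>> hostComponents) {
        List<String> fresh = new ArrayList<>();
        for (List<String> components : hostComponents.values()) {
            for (String c : components) {
                if (known.add(c)) {   // add() returns false if already known
                    fresh.add(c);
                }
            }
        }
        return fresh;
    }
}

public class HeadHunterDemo {
    public static void main(String[] args) {
        HeadHunter hh = new HeadHunter();
        System.out.println(hh.discover(
                Map.of("hostA", List.of("S1", "S2"),
                       "hostB", List.of("S2", "S3")))); // S2 reported once
    }
}
```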

  11. Architecture For Component Discovery [Diagram: a client's query flows through a search engine to the Internet Component Broker (ICB) and Domain Security Manager, down through federated Meta-Registries to the Head-hunters, which track services (S1–S8) registered in heterogeneous component models: an RMI registry, a CORBA ORB, and an EJB container. Numbered arrows (1–5) trace the discovery path from service registration up to the client.]

  12. Component Development and Deployment Process [Flowchart: a UMM specification, together with the domain knowledge base (Domain KB) and TLG, feeds a translator and interface generator (IG); the component implementation (Imp) then undergoes QoS validation (QV); if the QoS is satisfied, the component is deployed; if not, the specifications/implementations are refined and the cycle repeats.]

  13. System Integration [Flowchart: a query, interpreted against the UMM, TLG, domain knowledge base, GDM, generative rules, and QoS constraints, goes to a query processor that consults the head-hunters (HHs); a system generator assembles candidates and runs iterative experiments; if the QoS constraints are satisfied the system is deployed; if not, another option is selected or the query is refined.]

  14. Leverage & Drive OMG work • Infrastructure & Interoperability • CORBA, CORBA Services, CCM, IIOP, COM/CORBA, SOAP/CORBA, CSIv2 • Head-hunters, Internet Component Broker • Validation Metrics / Instrumentation • Model Driven Architecture • PIM to PSM mapping • Consistent with our Meta-model approach • Concept of a QoS Catalog & Interface generation

  15. Interoperability & Infrastructure • Internet Component Broker • Leverage "lessons learned" in the development of ORBs – standard protocol, standard component mappings & portable component adapters • Leverage SOAP/CORBA and XML valuetypes • Native protocol? • Headhunter • Use Naming/Trading, Interface Repository • Need Standard Implementation Repository? • Federation, Native protocol, API?

  16. Model Driven Architecture • We need a standard QoS catalog for Model Driven Architectures • Static – design oriented • Dynamic – runtime oriented • We need to standardize the way that QoS parameters are used to generate interfaces (static QoS) • We need to standardize how QoS parameters are used for generated instrumentation (dynamic QoS)
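
One way to picture static QoS parameters driving interface generation is an annotation that a generator reads when emitting stubs. The Java sketch below is purely illustrative; the @QoS annotation is invented here, and no such standard OMG annotation is implied.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical only: this annotation is invented for the sketch.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface QoS {
    String parameter(); // name drawn from a QoS catalog
    String bound();     // the design-time (static) constraint
}

interface AccountService {
    @QoS(parameter = "endToEndDelay", bound = "<= 50 ms")
    boolean validate(String transactionId);
}

public class QoSAnnotationDemo {
    public static void main(String[] args) throws Exception {
        // A generator (or instrumentation tool) could read the
        // annotation and emit a stub that checks the bound.
        QoS q = AccountService.class
                .getMethod("validate", String.class)
                .getAnnotation(QoS.class);
        System.out.println(q.parameter() + " " + q.bound());
    }
}
```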

  17. Quality of Service Reference Model • A general categorization of different kinds of QoS, including QoS that are fixed at design time as well as ones that are managed dynamically • Identification of the basic conceptual elements involved in QoS and their mutual relationships. This involves the ability to associate QoS parameters with model elements (specification)

  18. Quality of Service Parameters • These are parameters that describe the fundamental aspects of the various specific kinds of QoS, based on the QoS categorization identified in the reference model. This includes but is not limited to the following: • time-related characteristics (delays, freshness) • importance-related characteristics (priority, precedence) • capacity-related characteristics (throughput, capacity) • integrity-related characteristics (accuracy) • safety-related characteristics • availability and reliability characteristics
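
To make the categorization concrete, the categories above can be encoded as a small vocabulary, as in the following Java sketch; the enum, record, and flag names are assumptions for this example, not a proposed standard.

```java
// Illustrative encoding of the categories above; the enum and
// record are assumptions for this sketch, not a proposed standard.
enum QoSCategory {
    TIME_RELATED,             // delays, freshness
    IMPORTANCE_RELATED,       // priority, precedence
    CAPACITY_RELATED,         // throughput, capacity
    INTEGRITY_RELATED,        // accuracy
    SAFETY_RELATED,
    AVAILABILITY_RELIABILITY
}

record QoSParameter(String name, QoSCategory category, boolean dynamic) {}

public class QoSParameterSketch {
    public static void main(String[] args) {
        // dynamic = true marks a run-time (managed) parameter,
        // dynamic = false a design-time (fixed) one.
        QoSParameter throughput =
                new QoSParameter("throughput", QoSCategory.CAPACITY_RELATED, true);
        QoSParameter priority =
                new QoSParameter("priority", QoSCategory.IMPORTANCE_RELATED, false);
        System.out.println(throughput + "\n" + priority);
    }
}
```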

  19. QoS Catalog • Motivation: • Creation of a QoS Catalog for Software Components would help the component developer by: • Acting as a reference manual for incorporating QoS attributes into the components being developed • Allowing him to enhance the performance of his component in an iterative fashion by being able to quantify its QoS attributes • Enabling him to advertise the quality of his components by utilizing the QoS metrics

  20. QoS Catalog • It would help the system developer by: • Enabling him to specify the QoS requirements of the components that are incorporated into his system • Allowing him to verify and validate the claims of the component developer • Allowing him to make an objective comparison of the quality of components having the same functionality • Empowering him with the means to choose the best-suited components for his system

  21. QoS Catalog • The catalog is broadly based upon the software patterns catalog. • Each catalog entry follows the format illustrated by the Dependability example in the following slides:

  22. QoS Catalog • Incorporation of methodologies into the catalog is based on their: • Reproducibility • Indicativeness (capability to identify the parts of the component which need to be improved) • Correctness • Objectivity • Precision • Meaningfulness of measure • Suitability to the component framework • Error Situation • Aliases • Resources

  23. QoS Catalog • Name: DEPENDABILITY • Intent: It is a measure of confidence that the component is free from errors. • Description: It is defined as the probability that the component is defect free. • Motivation: • It allows an evaluation of the degree of Dependability of a given component. • It allows the Dependability of different components to be compared. • It allows for modifications to a component to increase its Dependability. • Applicability: • Can be used in any system which requires its components to offer a specific level of dependability. • Using the model, the Dependability of a given component can be calculated before it is incorporated into the system. • Model Used: Dependability model by Jeffrey Voas • Metrics Used: Testability Score, Dependability Score. Testability is a measure of the likelihood that a particular statement in a component will reveal a defect during testing.

  24. QoS Catalog • Influencing Factors: • Degree of testing • Fault-hiding ability of the code • The likelihood that a statement in a component is executed • The likelihood that a mutated statement will infect the component's state • The likelihood that a corrupted state will propagate and cause the component output to be mutated • Evaluation Procedure: • Perform Execution Analysis on the component • Perform Propagation Analysis on the component • Calculate the Testability value of each statement in the component • From the Testability scores of the component's statements, select the lowest as the component's Testability score • Calculate the Dependability Score of the component

  25. QoS Catalog • Evaluation Formulae: • T = E * P, where T = Testability Score, E = Execution Estimate, P = Propagation Estimate • D = 1 - (1 - T)^N, where D = Dependability Score, N = Number of successful tests • Result Type: floating-point value between 0 and 1 • Static/Dynamic: Static • Consequence: • Greater amounts of testing and greater Testability scores result in greater Dependability • Lower amounts of testing and lower Testability scores result in lower Dependability • Doing additional testing can improve a poor score
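
As a quick worked example of these formulae, take assumed inputs E = 0.8 and P = 0.5, so T = 0.4; after N = 20 successful tests, D = 1 - 0.6^20 ≈ 0.99996 (the numbers are illustrative, not from the slides). The same computation in Java:

```java
// Worked example of the formulae above with assumed inputs
// (E = 0.8, P = 0.5, N = 20 are illustrative, not from the slides).
public class DependabilityExample {
    public static void main(String[] args) {
        double e = 0.8;                    // Execution Estimate
        double p = 0.5;                    // Propagation Estimate
        double t = e * p;                  // Testability: T = E * P = 0.4
        int n = 20;                        // number of successful tests
        double d = 1 - Math.pow(1 - t, n); // Dependability: D = 1 - (1 - T)^N
        System.out.printf("T = %.2f, D = %.5f%n", t, d); // D ≈ 0.99996
    }
}
```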

  26. QoS Catalog (continued) • 4. Less testing is required to reach a fixed Dependability score when the Testability score is higher • Related Parameters: Availability, Error Rate, Stability • Domain of Usage: Domain independent • Error Situation: Low dependability results in: 1. Unreliable component behavior. 2. Improper execution/termination. 3. Erroneous results. • Aliases: Maturity, Fault Hiding Ability, Degree of Testing

  27. Summary of Approach • Address key issues that need to be resolved to assist organizations in managing their distributed software systems • The meta-model allows a seamless integration of heterogeneous components • Formal specifications assist in the automated construction and verification of parts and the whole of a distributed computing system (DCS) • Support a unified, iterative approach as a pragmatic solution for the software development of a DCS • Incorporation and validation of QoS implies the creation of more reliable DCSs • Interactions with industry and standards organizations provide practical feedback and enable proliferation of research results in a timely manner

  28. Salient Features • A meta-model and a unified approach • QoS-based generative process • Generation based on distributed resources in the form of components – use of HHs • Event grammars for dynamic QoS metrics • Automation (to the extent feasible) for system generation

  29. Webpage http://www.cs.iupui.edu/uniFrame.html
