
Presentation Transcript


  1. How to hear this lecture
     • Click on the icon to hear the narration for each slide.

  2. Lecture 7 - Architecture
     Dr. Rajiv Ramnath
     Director, Collaborative for Enterprise Transformation and Innovation (CETI)
     Department of Computer Science and Engineering, College of Engineering, The Ohio State University
     Ramnath.6@osu.edu
     http://www.ceti.cse.ohio-state.edu

  3. Software Architecture
     • The software architecture of a computing system is the structure(s) of the system, which comprise software elements, the externally visible properties of these elements, and the relationships between them. (Courtesy: SEI)
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  4. Things Common to All Architectures
     • Goals that lead to “quality attributes” - performance, availability, etc.
     • Modules and layers
     • The need for multiple views to describe them
     • Static structure and dynamic behavior
     • Tradeoffs
     • These commonalities lead to common ways of defining, describing, creating, analyzing, evaluating, and documenting architectures.

  5. Styles of Architectural Views
     • Module views:
       • Decomposition = "contains"
       • Uses = "depends on the correct execution of"
       • Layered = constrains what calls what (see the sketch below)
     • Component-and-connector views:
       • Process, concurrency, shared data (flow), client-server
     • Allocation views:
       • Deployment to infrastructure
       • Implementation - mapping to the file structure
       • Work assignment to teams
     • Reason for views? To systematically understand the functioning of the system and how it implements its quality attributes.
     Ref: Software Architecture in Practice, Len Bass et al., Safari
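
As a concrete illustration of the module views above, here is a minimal Java sketch of the layered "uses" relationship; the OrderService, SimpleOrderService, and OrderUi names are hypothetical, not from the lecture. The UI layer depends only on an interface in the layer below it, which is what the layering constraint ("what calls what") looks like in code.

```java
// Hypothetical layered structure: the UI layer "uses" (depends on the
// correct execution of) the service layer, never the reverse.

// Service layer: the only surface the UI layer is allowed to call.
interface OrderService {
    String placeOrder(String item, int quantity);
}

// Service layer implementation; the UI layer never references this class.
class SimpleOrderService implements OrderService {
    @Override
    public String placeOrder(String item, int quantity) {
        return "order-" + item + "-" + quantity;
    }
}

// UI layer: depends only on the OrderService interface (layered constraint).
class OrderUi {
    private final OrderService service;

    OrderUi(OrderService service) {
        this.service = service;
    }

    void submit(String item, int quantity) {
        System.out.println("Placed " + service.placeOrder(item, quantity));
    }
}

public class LayeredViewSketch {
    public static void main(String[] args) {
        new OrderUi(new SimpleOrderService()).submit("book", 2);
    }
}
```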

  6. ‘4+1 View’ Model of Software Architecture

  7. Designing Architecture

  8. Attribute Driven Design - a Recursive Decomposition Process
     • Start with a module (in the beginning, the whole system) and refine it as follows:
       • Prioritize the architectural requirements - the functional requirements and the most relevant NFRs - quantified into quality attributes
       • Starting with the highest-priority requirement, design the architecture using architectural tactics
       • Decompose into child modules and interfaces as necessary
       • Verify using use cases and quality scenarios
       • Document the design using views
     • A sketch of this recursive loop follows this list.
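
A minimal sketch of the recursive ADD loop, assuming hypothetical Module and QualityRequirement types; it only prints where tactics would be applied rather than implementing any real tactic.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Illustrative-only sketch of the Attribute Driven Design (ADD) loop.
// Module, QualityRequirement, and the method names are hypothetical.
public class AddSketch {

    static class QualityRequirement {
        final String name;
        final int priority;
        QualityRequirement(String name, int priority) {
            this.name = name;
            this.priority = priority;
        }
    }

    static class Module {
        final String name;
        final List<Module> children = new ArrayList<>();
        Module(String name) { this.name = name; }
    }

    // Refine one module: prioritize its requirements, apply tactics for each,
    // decompose into children where needed, then recurse on every child.
    static void design(Module module, List<QualityRequirement> requirements) {
        requirements.sort(
                Comparator.comparingInt((QualityRequirement r) -> r.priority).reversed());
        for (QualityRequirement req : requirements) {
            System.out.println("Apply tactics for " + req.name + " in " + module.name);
        }
        // Decomposition and verification with quality scenarios would happen here.
        for (Module child : module.children) {
            design(child, new ArrayList<>(requirements));
        }
    }

    public static void main(String[] args) {
        Module system = new Module("system");
        system.children.add(new Module("data-access"));
        system.children.add(new Module("web-ui"));
        List<QualityRequirement> reqs = new ArrayList<>();
        reqs.add(new QualityRequirement("performance", 2));
        reqs.add(new QualityRequirement("modifiability", 1));
        design(system, reqs);
    }
}
```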

  9. Quality Attributes
     • System qualities:
       • Availability
       • Modifiability (includes scalability, portability)
       • Performance
       • Security
       • Usability
       • Testability
     • Business qualities:
       • Time to market, cost/benefit, lifetime, target market, schedule, integration
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  10. Tactics for Implementing Quality Attributes - 1
     • Availability:
       • Fault detection: ping, heartbeat, exception (see the heartbeat sketch below)
       • Recovery: redundancy (active, passive), reboot
       • Prevention: demand management, transactions
     • Modifiability (including extensibility and scalability):
       • Localize changes, prevent ripple effects, defer binding time, limit allowable modifications
       • Essentially: reduce coupling and increase cohesion
     • Performance:
       • Reduce resource use (memory, communications, data)
       • Reduce switching
       • Increase resources
       • Increase parallelism
       • Control demand
     Ref: Software Architecture in Practice, Len Bass et al., Safari
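
One availability tactic from the list above, the heartbeat, can be sketched as follows; the monitor, its 500 ms timeout, and the simulated component are hypothetical choices made only for illustration.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Minimal sketch of the "heartbeat" availability tactic: a monitor expects a
// periodic heartbeat from a component and flags a fault when one is missed.
public class HeartbeatMonitor {

    private volatile long lastBeatMillis = System.currentTimeMillis();
    private final long timeoutMillis;

    public HeartbeatMonitor(long timeoutMillis) {
        this.timeoutMillis = timeoutMillis;
    }

    // The monitored component calls this on every heartbeat.
    public void beat() {
        lastBeatMillis = System.currentTimeMillis();
    }

    // Fault detection: has the component missed its heartbeat window?
    public boolean isSuspectedDown() {
        return System.currentTimeMillis() - lastBeatMillis > timeoutMillis;
    }

    public static void main(String[] args) throws InterruptedException {
        HeartbeatMonitor monitor = new HeartbeatMonitor(500);
        ScheduledExecutorService beats = Executors.newSingleThreadScheduledExecutor();
        // Healthy component: beats every 200 ms, well inside the 500 ms window.
        beats.scheduleAtFixedRate(monitor::beat, 0, 200, TimeUnit.MILLISECONDS);

        Thread.sleep(1000);
        System.out.println("Suspected down while beating? " + monitor.isSuspectedDown());

        beats.shutdownNow();          // simulate the component failing
        Thread.sleep(1000);
        System.out.println("Suspected down after failure? " + monitor.isSuspectedDown());
    }
}
```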

  11. Tactics for Implementing Quality Attributes - 2
     • Usability:
       • Runtime: feedback; use task, user, and system models
       • Design time: separate the UI (a modifiability tactic)
     • Testability:
       • Goal: show the PRESENCE of faults
       • Record/playback
       • Separate the interface from the implementation (stubs) - see the sketch below
       • Specialized testing interfaces
       • Use internal monitoring
     • Security:
       • Detect: monitor and compare access patterns, store and analyze data
       • Resist: authentication, levels of authorization, limit exposure by limiting access, encryption, preservation of integrity through checksums
       • Recover: availability tactics, audit trails for identification and investigation
     Ref: Software Architecture in Practice, Len Bass et al., Safari
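
The "separate interface from implementation" testability tactic, as a minimal sketch with hypothetical PaymentGateway and BillingService names: a test-time stub replaces the real gateway so faults in the billing logic can be shown deterministically.

```java
// The billing logic talks only to the PaymentGateway interface, so a test can
// swap in a stub instead of the real (slow, external) gateway.
interface PaymentGateway {
    boolean charge(String account, double amount);
}

class BillingService {
    private final PaymentGateway gateway;

    BillingService(PaymentGateway gateway) {
        this.gateway = gateway;
    }

    String bill(String account, double amount) {
        return gateway.charge(account, amount) ? "PAID" : "DECLINED";
    }
}

// Stub used only by tests: deterministic, fast, no network access needed.
class StubPaymentGateway implements PaymentGateway {
    @Override
    public boolean charge(String account, double amount) {
        return amount < 100.0;   // canned rule that the test controls
    }
}

public class TestabilitySketch {
    public static void main(String[] args) {
        BillingService service = new BillingService(new StubPaymentGateway());
        System.out.println(service.bill("acct-1", 50.0));   // PAID
        System.out.println(service.bill("acct-1", 500.0));  // DECLINED
    }
}
```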

  12. Analyzing and Validating Architecture Using the Architecture Tradeoff Analysis Method (ATAM)

  13. ATAM Component - Scenarios
     • Scenarios are used to understand quality attributes.
     • Scenarios should cover a range of:
       • Anticipated uses of the system (use case scenarios),
       • Anticipated changes to the system (growth scenarios), or
       • Unanticipated stresses on the system (exploratory scenarios).
     • A good scenario makes clear what the stimulus is that causes it and what responses are of interest.
     • Where have you seen this before? Acceptance tests!
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  14. ATAM Component - Utility Tree
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  15. ATAM - Scenario Examples
     • Use case scenario: A remote user requests a database report via the Web during a peak period and receives it within 5 seconds.
     • Growth scenario: Add a new data server to reduce the latency in scenario 1 to 2.5 seconds within 1 person-week.
     • Exploratory scenario: Half of the servers go down during normal operation without affecting overall system availability.
     • Scenarios should be as specific as possible. A use case scenario written as an acceptance test is sketched below.
     Ref: Software Architecture in Practice, Len Bass et al., Safari
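
Reading the use case scenario above as an acceptance test, a JUnit 5 sketch might look like the following; ReportService and fetchReport are hypothetical placeholders for the real system under test, and JUnit 5 is assumed to be on the classpath.

```java
import static org.junit.jupiter.api.Assertions.assertTimeout;

import java.time.Duration;
import org.junit.jupiter.api.Test;

// Sketch only: expresses the peak-period use case scenario as an acceptance test.
class ReportLatencyScenarioTest {

    @Test
    void remoteUserGetsDatabaseReportWithinFiveSecondsDuringPeakLoad() {
        ReportService service = new ReportService();
        // Fails if the call takes longer than the scenario's 5-second response goal.
        assertTimeout(Duration.ofSeconds(5), () -> service.fetchReport("daily-sales"));
    }

    // Stand-in for the real system; a real test would call the deployed service.
    static class ReportService {
        String fetchReport(String name) {
            return "report:" + name;
        }
    }
}
```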

  16. ATAM - Analyze Architectural Approaches
     • Scenarios are used to pose questions about the architecture.
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  17. ATAM - Analysis Examples
     • Scenario-based questions elicit the architectural decisions made.
     • Examples:
       • Performance:
         • How are priorities assigned to processes?
         • What are the message arrival rates?
         • What are the transaction processing times?
       • Modifiability:
         • Are there any places where layers/facades are circumvented?
         • What components rely on detailed knowledge of message formats?
         • What components are connected asynchronously?
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  18. ATAM - Sensitivity, Tradeoffs and Risks
     • Sensitivity point - a property of a component that is critical to the success of the system.
       • The number of simultaneous database clients affects the number of transactions a database can process per second; this is a sensitivity point for performance.
       • Keeping a backup database affects reliability.
       • The strength of encryption (security) is sensitive to the number of bits in the key.
     • Tradeoff point - a property that affects more than one attribute or sensitivity point.
       • To achieve the required level of performance in the discrete event generation component, assembly language had to be used, reducing the portability of that component.
       • Keeping the backup database also affects performance, so it is a tradeoff between reliability and performance.
     • Risk point - a sensitivity or tradeoff point that is close to its limits.
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  19. How Much Architecting Is Needed?
     Ref: Software Architecture in Practice, Len Bass et al., Safari

  20. Architecture Work-Products

  21. Architecture Work-Products
     • Target environment (= the 4+1 deployment view)
     • Subsystem model (= the 4+1 conceptual view)
     • System architecture
     • Other views

  22. Target Environment
     • Hardware, OS, and runtime environment
     • Purpose: document the deployment environment
     • Participants: customer and system architects
     • Timing: along with the NFRs, then elaborated during design
     • Technique: start with the NFRs
     • Notation: free-format text with appropriate diagrams

  23. Subsystem Model
     • Delegation of system responsibilities into subsystems
     • Clearly define the interfaces
     • Participants: architects, project manager
     • Timing: along with the system architecture
     • Technique:
       • If possible, start with analysis
       • Use facades (see the sketch below)
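
A minimal facade sketch for a hypothetical inventory subsystem: clients call only InventoryFacade, so the classes behind it can change without rippling outward. All of the names here are illustrative, not from the lecture.

```java
// Internal subsystem classes, hidden behind the facade.
class StockLevelRepository {
    int quantityOnHand(String sku) { return 42; }
}

class ReorderPolicy {
    boolean needsReorder(int quantity) { return quantity < 10; }
}

// The subsystem's published interface: one clearly defined entry point.
public class InventoryFacade {
    private final StockLevelRepository repository = new StockLevelRepository();
    private final ReorderPolicy policy = new ReorderPolicy();

    public boolean shouldReorder(String sku) {
        return policy.needsReorder(repository.quantityOnHand(sku));
    }

    public static void main(String[] args) {
        System.out.println(new InventoryFacade().shouldReorder("sku-123"));
    }
}
```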

  24. System Architecture
     • Global, project-wide design decisions on:
       • Layering, communication patterns, distribution, persistence, security, error handling and recovery, debugging, reuse
       • Application, application-support, and utility sub-domains
     • Usually done by one person
     • Technique:
       • Start with the prioritized NFRs that have global impact (performance, error handling, UI, etc.)
       • If such NFRs do not exist, create appropriate ones at this stage!
       • Make architectural decisions to meet them (an error-handling example is sketched below)
       • Transform a set of analysis sequence diagrams to validate these decisions
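
One of these global decisions, error handling and recovery, can be made concrete with a small sketch: each subsystem translates its low-level exceptions into a single project-wide exception type at its boundary. ServiceException and CustomerDirectory are hypothetical names chosen only to illustrate the decision.

```java
import java.io.IOException;

// Project-wide error-handling decision: low-level exceptions are translated
// into one application exception at each subsystem boundary, so callers handle
// a single type and the root cause is preserved for debugging.
class ServiceException extends RuntimeException {
    ServiceException(String message, Throwable cause) {
        super(message, cause);
    }
}

class CustomerDirectory {
    public String lookup(String id) {
        try {
            return readFromStore(id);
        } catch (IOException e) {
            // Boundary translation: wrap, keep the cause, add context.
            throw new ServiceException("Customer lookup failed for " + id, e);
        }
    }

    private String readFromStore(String id) throws IOException {
        if (id.isEmpty()) {
            throw new IOException("empty id");
        }
        return "customer:" + id;
    }
}

public class ErrorHandlingSketch {
    public static void main(String[] args) {
        CustomerDirectory directory = new CustomerDirectory();
        System.out.println(directory.lookup("42"));
        try {
            directory.lookup("");
        } catch (ServiceException e) {
            System.out.println("Handled: " + e.getMessage() + " (cause: " + e.getCause() + ")");
        }
    }
}
```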

  25. The End
