Presentation Transcript

Software Engineering:Software Quality Assurance

Romi Satria Wahono
romi@romisatriawahono.net
http://romisatriawahono.net
+6281586220090

Romi Satria Wahono
  • SD Sompok Semarang (1987)
  • SMPN 8 Semarang (1990)
  • SMA Taruna Nusantara, Magelang (1993)
  • S1, S2 and S3 (on leave), Department of Computer Sciences, Saitama University, Japan (1994-2004)
  • Research Interests: Software Engineering, Intelligent Systems
  • Founder and Coordinator of IlmuKomputer.Com
  • Researcher at LIPI (2004-2007)
  • Founder and CEO of PT Brainmatics Cipta Informatika
Course Contents-1-
  • Introduction to Software Engineering
    • What is Software
    • What is Software Engineering
    • Discipline and Curriculum of Software Engineering
  • Software Engineering Profession
    • Profession, Ethics and Certification
    • Software Industry and Market
    • Internet Business Model and Trends
Course Contents-2-
  • Software Engineering Process
    • Software Development Life Cycle (SDLC)
    • Software Development Methodologies
    • Software Development Notation (UML) and Tools
    • Object-Oriented Paradigm
  • Software Construction
    • Software Construction Process
    • Estimating the Size of Software Project
Course Contents-3-
  • Software Quality Assurance
    • The Uniqueness of Software Quality Assurance
    • What is Software Quality
    • Software Quality Factor
    • Software Testing
  • Software Engineering Research
    • Computing Research Methodology
    • Research Trends in Software Engineering
    • Case Study: Developing Research Proposal in Software Engineering Field
Contents
  • The Uniqueness of Software Quality Assurance
  • What is Software Quality
  • Software Quality Factor
  • Software Testing
Software vs Other Industrial Products

Product Complexity

Product Visibility

Product Development Process

Warranty Lawsuits

Mortenson vs. Timberline Software (≈1993)

Mortenson used a TS application when creating a bid to build a hospital

The software created a bid that was $2M too low

TS knew about the bug, but had not sent an update to Mortenson

The State of Washington Supreme Court ruled in favor of TS

Warranty Laws

Article 2 of the Uniform Commercial Code

Uniform Computer Information Transactions Act (UCITA) (≈1999) allows software manufacturers to:

disclaim all liability for defects

prevent the transfer of software from person to person

remotely disable licensed software during a dispute

does not apply to embedded systems

Disclaimer of Warranties

DISCLAIMER OF WARRANTIES. TO THE MAXIMUM EXTENT PERMITTED BY APPLICABLE LAW, MICROSOFT AND ITS SUPPLIERS PROVIDE TO YOU THE SOFTWARE COMPONENT, AND ANY (IF ANY) SUPPORT SERVICES RELATED TO THE SOFTWARE COMPONENT ("SUPPORT SERVICES") AS IS AND WITH ALL FAULTS; AND MICROSOFT AND ITS SUPPLIERS HEREBY DISCLAIM WITH RESPECT TO THE SOFTWARE COMPONENT AND SUPPORT SERVICES ALL WARRANTIES AND CONDITIONS, WHETHER EXPRESS, IMPLIED OR STATUTORY, INCLUDING, BUT NOT LIMITED TO, ANY (IF ANY) WARRANTIES OR CONDITIONS OF OR RELATED TO: TITLE, NON-INFRINGEMENT, MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, LACK OF VIRUSES, ACCURACY OR COMPLETENESS OF RESPONSES, RESULTS, LACK OF NEGLIGENCE OR LACK OF WORKMANLIKE EFFORT, QUIET ENJOYMENT, QUIET POSSESSION, AND CORRESPONDENCE TO DESCRIPTION. THE ENTIRE RISK ARISING OUT OF USE OR PERFORMANCE OF THE SOFTWARE COMPONENT AND ANY SUPPORT SERVICES REMAINS WITH YOU.

"Software Crisis"

Term coined by DoD years ago

Problem Today: complexity of problems addressed by software has outpaced improvements in software creation process

[Figure: demand for programmers outpacing supply over time]


"We have repeatedly reported on cost rising by millions of dollars, schedule delays, of not months but years, and multi-billion-dollar systems that don't perform as envisioned.

The understanding of software as a product and of software development as a process is not keeping pace with the growing complexity and software dependence of existing and emerging mission-critical systems."

U.S. General Accounting Office


"Few fields have so large a gap between best current practice and average current practice."

Department of Defense

Why Software Quality Assurance?
  • Software failures are costly!
    • One hour of downtime costs >$6M on average (Gartner, 1998)
    • Account for 30% of system failures (Marcus, 2000)
    • Contribute to 55% of the most severe vulnerabilities (CERT)
    • Cost billions annually (NIST, 2002)
    • $22.2B annual potential cost reduction from feasible infrastructure improvements (NIST, 2002)
The Objective of SQA
  • Assuring an acceptable level of confidence that the software will conform to functional technical requirements
  • Assuring an acceptable level of confidence that the software will conform to managerial scheduling and budgetary requirements
  • Initiation and management of activities for the improvement and greater efficiency of software development and SQA activities
Software Errors, Faults, Failures

Software errors are sections of the code that are partially or totally incorrect as a result of a grammatical, logical or other mistake made by a systems analyst, a programmer, or another member of the software development team

Software faults are software errors that cause the incorrect functioning of the software during a specific application

Software faults become software failures only when they are “activated”, that is, when a user tries to apply the specific software section that is faulty. Thus, the root of any software failure is a software error
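To make the error–fault–failure distinction concrete, here is a minimal, purely illustrative Python sketch (the function and values are invented for this example): the wrong denominator is the error, it makes the function faulty, and a failure is observed only when a caller activates the faulty code.

```python
def average(values):
    """Intended to return the arithmetic mean of a non-empty list."""
    # ERROR: the programmer wrote len(values) - 1 instead of len(values),
    # so the function as a whole is now FAULTY.
    return sum(values) / (len(values) - 1)

# The fault stays dormant until the faulty section is actually applied:
result = average([2, 4])   # FAILURE: returns 6.0; the correct mean is 3.0
```

Nothing fails until `average` is called with real data, which is exactly why faults can survive into production: the root of the eventual failure is still the original coding error.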

9 Causes of Software Errors
  • Faulty requirements definition
  • Client–developer communication failures
  • Deliberate deviations from software requirements
  • Logical design errors
  • Coding errors
  • Non-compliance with documentation and coding instructions
  • Shortcomings of the testing process
  • Procedure errors
  • Documentation errors
Cost of Errors

"Software bugs, or errors, are so prevalent and so detrimental that they cost the U.S. economy an estimated $59.5 billion annually, or about 0.6 percent of the gross domestic product. …

Although all errors cannot be removed, more than a third of these costs, or an estimated $22.2 billion, could be eliminated by an improved testing infrastructure that enables earlier and more effective identification and removal of software defects. These are the savings associated with finding an increased percentage (but not 100 percent) of errors closer to the development stages in which they are introduced. Currently, over half of all errors are not found until "downstream" in the development process or during post-sale software use."

(US Dept of Commerce, June 2002)

Some Famous Software Errors

Therac-25

Patriot Missile System

ESA's Ariane 5 Launch System

Source: www.wikipedia.org

Therac-25 - the problem

When operating in soft X-ray mode, the machine was designed to rotate three components into the path of the electron beam, in order to shape and moderate the power of the beam. …

The accidents occurred when the high-energy electron-beam was activated without the target having been rotated into place; the machine's software did not detect that this had occurred, and did not therefore determine that the patient was receiving a potentially lethal dose of radiation, or prevent this from occurring.

Therac-25 - the reasons

The design did not have any hardware interlocks to prevent the electron-beam from operating in its high-energy mode without the target in place

The hardware provided no way for the software to verify that sensors were working correctly

The equipment control task did not properly synchronize with the operator interface task, so that race conditions occurred if the operator changed the setup too quickly. This was evidently missed during testing, since it took some practice before operators were able to work quickly enough for the problem to occur

The software set a flag variable by incrementing it. Occasionally an arithmetic overflow occurred, causing the software to bypass safety checks
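The flag bug can be sketched in a few lines. This is an illustrative reconstruction in Python, not the actual Therac-25 assembly: the flag lived in a single byte and was "set" by incrementing it, so every 256th increment wrapped it back to zero and the safety check was silently bypassed.

```python
BYTE_MASK = 0xFF  # the flag was stored in one 8-bit byte

flag = 0
skipped_checks = 0
for _ in range(512):                  # repeated setup passes by the operator
    flag = (flag + 1) & BYTE_MASK     # "set" the flag by incrementing it
    if flag == 0:                     # every 256th increment wraps to zero...
        skipped_checks += 1           # ...and the safety check is bypassed

# A plain boolean assignment (flag = 1) could never have wrapped.
```

Out of 512 passes, two land on a wrapped-to-zero flag, which is the overflow window the accidents fell into.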

Patriot Missile System

On February 25, 1991, the Patriot missile battery at Dhahran, Saudi Arabia had been in operation for 100 hours, by which time the system's internal clock had drifted by one third of a second. For a target moving as fast as an inbound TBM, this was equivalent to a position error of 600 meters

The radar system had successfully detected the Scud and predicted where to look for it next, but because of the time error, looked in the wrong part of the sky and found no missile. With no missile, the initial detection was assumed to be a spurious track and the missile was removed from the system. No interception was attempted, and the missile impacted on a barracks killing 28 soldiers
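The arithmetic behind the slide's numbers can be checked directly. This sketch uses figures from the widely cited 1992 GAO analysis of the incident, so treat them as approximate: the clock counted tenths of a second, 0.1 was stored in fixed point with a truncation error of about 9.5×10⁻⁸ s per tick, and a Scud travels roughly 1,676 m/s.

```python
# 0.1 has no exact binary representation; the Patriot's fixed-point
# encoding truncated it, losing a tiny slice of time on every tick.
stored_tenth = 209715 / 2097152      # truncated fixed-point value of 0.1
error_per_tick = 0.1 - stored_tenth  # ~9.5e-8 s lost per 0.1 s tick

ticks = 100 * 3600 * 10              # 100 hours of tenth-second ticks
drift = ticks * error_per_tick       # ~0.34 s: the slide's "one third of a second"

scud_speed = 1676                    # m/s, approximate inbound TBM speed
position_error = drift * scud_speed  # ~570 m, matching the slide's ~600 m
```

A tiny per-tick rounding error, multiplied by 3.6 million ticks, becomes a gate-window error larger than the missile itself.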

Ariane 5 Rocket

June 4, 1996 was the first test flight of the Ariane 5 launch system. The rocket tore itself apart 37 seconds after launch, making the fault one of the most expensive computer bugs in history.

The Ariane 5 software reused the specifications from the Ariane 4, but the Ariane 5's flight path was considerably different and beyond the range for which the reused code had been designed. Specifically, the Ariane 5's greater acceleration caused the back-up and primary inertial guidance computers to crash, after which the launcher's nozzles were directed by spurious data. Pre-flight tests had never been performed on the re-alignment code under simulated Ariane 5 flight conditions, so the error was not discovered before launch.

Because of the different flight path, a data conversion from a 64-bit floating point to 16-bit signed integer caused a hardware exception (more specifically, an arithmetic overflow, as the floating point number had a value too large to be represented by a 16-bit signed integer). Efficiency considerations had led to the disabling of the exception handler for this error. This led to a cascade of problems, culminating in destruction of the entire flight.
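The failing conversion was in Ada, but the overflow is easy to reproduce in any language. A hedged Python sketch (the numeric values are illustrative, not the actual horizontal-bias readings):

```python
INT16_MIN, INT16_MAX = -32768, 32767

def to_int16(x: float) -> int:
    """Convert a 64-bit float to a 16-bit signed integer, raising on
    overflow, as the Ariane conversion did; on Ariane 5 the handler for
    this exception had been disabled for efficiency."""
    i = int(x)
    if not INT16_MIN <= i <= INT16_MAX:
        raise OverflowError(f"{x} does not fit in a 16-bit signed integer")
    return i

to_int16(12_000.0)        # fine: within the Ariane 4 flight envelope
try:
    to_int16(64_000.0)    # Ariane 5's larger horizontal velocity value
except OverflowError as exc:
    print("would have been unhandled on Ariane 5:", exc)
```

Reusing the Ariane 4 code meant reusing the Ariane 4 assumption that this value always fits in 16 bits, an assumption the new flight path violated.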

Question

To know that quality has improved, it would be helpful to be able to measure quality.

How can we measure quality?

McCall’s Factor Model
  • McCall’s factor model classifies all software requirements into 11 software quality factors
  • The 11 factors are grouped into three categories – product operation, product revision and product transition:
    • Product operation factors: Correctness, Reliability, Efficiency, Integrity, Usability
    • Product revision factors: Maintainability, Flexibility, Testability
    • Product transition factors: Portability, Reusability, Interoperability
Alternative Factor Models
  • Two factor models from the late 1980s offer alternatives to the classic McCall factor model:
    • The Evans and Marciniak factor model
    • The Deutsch and Willis factor model
  • These alternative models suggest adding five factors to McCall’s model. Two of these factors are very similar to two of McCall’s factors; only three factors are “new”:
    • Both models add the factor Verifiability
    • The Deutsch and Willis model adds the factors Safety and Manageability
ISO 9126 Software Quality Factors

Functionality

Reliability

Usability

Efficiency

Maintainability

Portability

Product Measurement Weights
  • Maintainable
  • Reliable
  • Secure
  • Effective and efficient
  • Usable
  • Reusable

- Game software must be interactive and responsive

- Phone-switching software must be reliable

- E-commerce and banking software must be secure

Example of Product Measurement*

Fa = w1c1 + w2c2 + … + wncn

F = factor score, w = weight, c = criterion score

*Source: "Mengukur Kualitas Perangkat Lunak" (Measuring Software Quality), RomiSatriaWahono.Net
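The weighted-sum formula translates directly into code. A minimal sketch with invented weights and criterion scores (the numbers are illustrative only, not from the cited source):

```python
def factor_score(weights, criteria):
    """Fa = w1*c1 + w2*c2 + ... + wn*cn"""
    if len(weights) != len(criteria):
        raise ValueError("need exactly one weight per criterion")
    return sum(w * c for w, c in zip(weights, criteria))

# e.g. one quality factor measured by three criteria scored 0-10,
# with weights reflecting their assumed relative importance:
weights  = [0.5, 0.3, 0.2]
criteria = [8, 6, 9]
score = factor_score(weights, criteria)   # 0.5*8 + 0.3*6 + 0.2*9 = 7.6
```

Each factor gets its own weight vector; the weights encode a management judgment about which criteria matter most, so two organizations can score the same measurements differently.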

Capability Maturity Model (CMM)
  • Level 1 – Initial: no procedures or planning; inconsistent
  • Level 2 – Repeatable: management, quality assurance and procedures in place; individual performance without a formal model
  • Level 3 – Defined: processes are defined and lead to qualitative process improvement
  • Level 4 – Managed: quantitative process improvement and prediction
  • Level 5 – Optimizing: continuous process improvement that is innovative, planned, budgeted and integral to the organization's processes

Capability Maturity Model (CMM), Software Engineering Institute

History
  • 1986 - Effort started by SEI and MITRE Corporation
    • assess capability of DoD contractors
  • First version published in 1991
  • closely related to TQM
    • goal is customer satisfaction
      • not required that customer be "delighted"
Some Fundamental Ideas
  • Process improvement is based on small steps, rather than revolutionary innovation.
  • CMM is not exhaustive or dictatorial.
  • CMM focuses on processes that are of value across the organization.
Benefits of using the model
  • helps forge a shared vision of the purpose of process improvement within the organization
  • establishes a common language for the software process
  • defines a set of priorities for attacking problems
  • supports measurement
    • via reliable appraisals
    • objective
    • supports industry-wide comparisons
Risks of using the model
  • "All models are wrong; some models are useful."
  • Not a silver bullet
  • Does not address all of the issues that are important for successful projects.
    • For example
      • how to hire and retain good people
      • expertise in the application domain
Levels
  • Initial
  • Repeatable
  • Defined
  • Managed
  • Optimizing
Level 1 : The Initial Level
  • ad hoc, sometimes chaotic
  • overcommitment leads to a series of crises
  • during a crisis, projects abandon plans
  • capability is characteristic of individuals, not the organization
      • when a good manager leaves, the success leaves with them
Level 2 : The Repeatable Level
  • Planning is based on experience with similar projects
    • past successes can be repeated
  • Policies for Managing and Implementation
    • installed basic management controls
    • track costs and schedules
    • notice and deal with problems as they arise
Level 3 : The Defined Level
  • Standard Processes defined across the organization and used by all projects
      • standard set of roles, activities, quality tracking, etc
      • each project uses a tailored version of this standard process
  • Training Program is in place to ensure everyone has the skills required for their assigned role
Level 4 : The Managed Level
  • Quantitative Quality Goals
      • for both Products and Processes
  • Organization-wide Process Database
    • meaningful variations in process performance can be distinguished from random noise
      • actions are then taken to correct the situation
  • Products are of predictably high quality
Level 5 : The Optimizing Level
  • Organization has the means to identify weaknesses and strengthen the process proactively
  • teams analyze defects to determine their cause, and disseminate lessons learned throughout the organization
  • major focus on eliminating waste
    • e.g. reduce amount of rework
Reality Check...
  • Is CMM well-suited for everyone?
Criticisms of CMM
  • CMM is well suited for bureaucratic organizations such as government agencies and large corporations.
    • Its formality is a hindrance to projects where time-to-market is more important than quality.
  • Promotes process over substance
    • For example, emphasizing predictability over service provided to end users.

en.wikipedia.org

Who uses CMM
  • 75% of organizations are probably Level One
    • To get to Level Two, you must have control over the requirements documents. Hence, shrink-wrap companies like Microsoft are Level One.
  • 75% of Level Five organizations are in India.
Level Comparison - Risk
  • Level 1
    • Just do it
  • Level 2
    • problems are recognized and corrected as they occur
  • Level 3
    • problems are anticipated and prevented, or impacts minimized
  • Levels 4 and 5
    • sources of problems are understood and eliminated
Level Comparison - People
  • Level 1
    • success depends on individual heroics
    • fire fighting is the way of life
  • Level 2
    • people are trained and supported by management
    • success depends on individuals
  • Level 3
    • people are trained for their role(s)
    • groups work together
  • Levels 4
    • strong sense of teamwork in every project
  • Level 5
    • strong sense of teamwork across the organization
    • everyone does process improvement
Level Comparison - Measurement
  • Level 1
    • ad hoc data collection and analysis
  • Level 2
    • individual projects use planning data
  • Level 3
    • data collected for all processes
    • data shared across projects
  • Levels 4
    • data standardized across the organization
  • Level 5
    • data used for process improvement
CMMI Levels - Summary
  • Level 1 - characterized by chaos, periodic panics, and heroic efforts required by individuals to successfully complete projects. Few if any processes in place; successes may not be repeatable.
  • Level 2 - software project tracking, requirements management, realistic planning, and configuration management processes are in place; successful practices can be repeated.
  • Level 3 - standard software development and maintenance processes are integrated throughout an organization; a Software Engineering Process Group is in place to oversee software processes, and training programs are used to ensure understanding and compliance.
  • Level 4 - metrics are used to track productivity, processes, and products. Project performance is predictable, and quality is consistently high.
  • Level 5 - the focus is on continuous process improvement. The impact of new processes and technologies can be predicted and effectively implemented when required.
Quiz 1
  • Your Role : SQA specialist
  • Situation :
    • Initial Unit Testing reports indicate a bug rate of 4.5 / KSLOC.
    • Further checking finds
      • Average initial bug rate is 3.1 per KSLOC
      • StdDev of 0.5
      • weighted rate is also higher than average
  • What CMM level enables this amount of visibility into the process?
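A quick sanity check of the scenario's numbers (illustrative arithmetic, not the quiz answer itself): the observed rate sits (4.5 − 3.1) / 0.5 = 2.8 standard deviations above the organizational average, and spotting that at all presumes the kind of calibrated, organization-wide process database these slides describe at Level 4.

```python
observed = 4.5   # bugs/KSLOC in the initial unit-testing reports
mean = 3.1       # organizational average initial bug rate (bugs/KSLOC)
stddev = 0.5     # organizational standard deviation

# How unusual is this project? 2.8 sigma above the mean is a meaningful
# variation, not random noise, per the Level 4 process-database idea.
z = (observed - mean) / stddev
```

Without an organization-wide baseline (mean and standard deviation collected across projects), the 4.5 figure by itself tells a manager almost nothing.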
Quiz 2
  • Your Role : Project Manager
  • Phase : Unit Testing
  • Problem : You notice that design, implementation, and testing of the database component will probably take 3 weeks instead of the planned 4 weeks. These tasks are not on the critical path.
  • Possible Actions?
    • and what level CMM characterizes that action?
Quiz 3
  • Your Role : Project Manager
  • Phase : Planning
  • Task : Schedule the Testing of the Database
    • Estimated Duration: 3 days
    • Required Resources:
      • the database requirements specs
      • the implementation (source code)
      • real data from customer
      • test person that has a DB Test certificate
  • How do the different capability levels affect your ability to schedule and monitor this task?
Quiz 4
  • Your Role : a development team leader
  • Problems :
    • Well into development, you get an email indicating a change in the interface requirements based on a demo of the prototype done with the customer. The change will require a good amount of code rework.
  • What would be the reaction of groups with
    • Level One
    • Level Three
    • Level Five
Reality Check…
  • Is an SQA plan just busy-work, or does it really pay off?
  • Hughes Aircraft
    • moved from level 2 in 1987 to level 3 in 1990
    • cost = $500K
    • benefit = $2M annually
  • Raytheon
    • moved from level 1 in 1988 to level 3 in 1993
    • productivity doubled
    • ROI = $7.70 per $1 invested
Pre-Class ReadingCMM Effectiveness Case Studies
  • Hughes Aircraft
  • Schlumberger
  • Texas Instruments
  • Tinker AFB
SQA System Components
  • Pre-Project Components
  • Development and Maintenance Activities
  • Error Reduction Infrastructure
  • SQ Management Components
  • SQA System Assessment
  • Human Components
1. Pre-Project Components
  • Contract Review
  • Development and Quality Plans
    • Development Plans
      • schedules
      • manpower requirements
      • tools
    • Quality Plans
      • measurable quality goals
      • success criteria for each project phase
      • scheduled V&V activities
2. Life Cycle Components
  • Software Testing
  • Reviews
    • varying levels of formality
    • specs, designs, code modules, documents, etc
  • Maintenance
    • corrective
    • adaptive
    • functional
3. Error Prevention and Improvement Infrastructure
  • work procedures
  • templates and checklists
  • staff training
  • preventive actions
  • configuration control
  • document control
4. Management Components
  • Project Progress
    • schedules, budgets, risk analysis, …
  • Quality Metrics
  • Quality Costs
5. SQA Assessment
  • Quality Management Standards
    • SEI CMM
    • ISO 9001
  • Process Standards
    • IEEE 1012
    • ISO 12207
6. Human Components
  • Management
  • SQA Unit
  • SQA committees and forums
SEI CMM Levels
  • Initial
    • ad hoc, perhaps chaotic
  • Repeatable
    • tracks costs, has a schedule
    • similar projects can repeat earlier successes
  • Defined
    • process is documented and standardized
  • Managed
    • detailed process and product measurements
  • Optimizing
    • continuous process improvement
ISO 15504
  • SPICE = Software Process Improvement and Capability Determination
  • framework for process improvement similar to SEI CMM
ISO Standards for Quality
  • ISO 9000 : Quality Management and Quality Assurance Standards - Guidelines for selection and use
  • ISO 9001 : Quality Systems - Model for quality assurance in design/development, installation, and servicing
  • ISO 9000-3 : Guidelines to applying 9001 to software
ISO 9000

ISO 9000 sets criteria for achieving quality goals and is not prescriptive about methods. The requirements are given in Sections 4 to 8.

  • Section 4 is entitled General Requirements
  • Section 5 is entitled Management Responsibility
  • Section 6 is entitled Resource Management
  • Section 7 is entitled Product Realization
  • Section 8 is entitled Measurement, analysis and improvement

In each of these areas, ISO 9001:2000 sets out key requirements which, if met, will help ensure quality.

http://en.wikipedia.org/wiki/ISO_9000

IEEE Std 1012 - IEEE Standard for Software Verification and Validation

1. Overview

2. Normative references

3. Definitions, abbreviations, and conventions

4. V&V software integrity levels

5. V&V processes

5.1 Process: Management

5.2 Process: Acquisition

5.3 Process: Supply

5.4 Process: Development

5.4.2 Activity: Requirements V&V

5.4.3 Activity: Design V&V

5.4.4 Activity: Implementation V&V

5.4.5 Activity: Test V&V

5.4.6 Activity: Installation and Checkout V&V

5.5 Process: Operation

5.6 Process: Maintenance

6. Software V&V reporting, administrative, and documentation requirements

Annex A Mapping of ISO/IEC 12207 V&V requirements to IEEE Std 1012 V&V activities and tasks

ISO 12207
  • ISO 12207 is an ISO standard for software life cycle processes.
  • The standard establishes a life cycle process for software, including the processes and activities applied during the acquisition and configuration of system services. Its main objective is to supply a common structure so that the buyers, suppliers, developers, maintainers, operators, managers and technicians involved with software development use a common language. This common language is established in the form of well-defined processes.

http://en.wikipedia.org/wiki/ISO_12207

Software Testing
  • Testing can never prove there are no errors
  • The purpose is not to demonstrate that the system is free of errors
  • The purpose is to detect as many errors as possible
  • Question:
    • What are considered good test results?
Testing Philosophy
  • It is dangerous to test early modules without an overall testing plan
  • It may be difficult to reproduce sequence of events causing an error
  • Testing must be done systematically and results documented carefully
Test Planning
  • Address all products created during development
    • so develop the test plan early
    • example: test the completeness of CRC cards
  • Each test:
    • has a specific objective
    • has specific test cases to examine
    • uses test specifications
  • If the tested class requires methods that aren't ready:
    • use stubs (hard-coded fake methods)
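A stub in the sense above is just a hard-coded fake. A hypothetical Python sketch (class and method names invented for illustration):

```python
class TaxServiceStub:
    """Stands in for a tax-lookup collaborator that isn't implemented yet."""
    def rate_for(self, region: str) -> float:
        return 0.10          # hard-coded canned answer, regardless of region

def total_price(net: float, tax_service) -> float:
    """Unit under test; depends on a collaborator that may not be ready."""
    return net * (1 + tax_service.rate_for("any-region"))

# The unit can be exercised now, before the real tax service exists:
assert abs(total_price(100.0, TaxServiceStub()) - 110.0) < 1e-9
```

When the real collaborator lands, the stub is swapped out and the same tests rerun, which is why the stub's interface should match the planned real one exactly.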
Stages of Testing
  • Unit testing
    • Tests each module to assure that it performs its function
  • Integration testing
    • Tests the interaction of modules to assure that they work together
  • System testing
    • Tests to assure that the software works well as part of the overall system
  • Acceptance testing
    • Tests to assure that the system serves organizational needs
Unit Testing
  • Tests a single unit (a class)
  • Types of unit testing:
    • Black Box Testing
      • Most common
      • Looks just at inputs and outputs
      • Tests whether the unit meets requirements stated in specification
    • White-Box Testing
      • Looks inside the module to test its major elements
      • Limited usefulness in OO design
        • because units are so small
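A black-box unit test exercises only the documented inputs and outputs. A self-contained sketch using Python's unittest; the triangle classifier is a standard textbook example, not taken from these slides:

```python
import unittest

def classify_triangle(a: int, b: int, c: int) -> str:
    """Unit under test: classify a triangle by its side lengths."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

class BlackBoxTriangleTest(unittest.TestCase):
    # Black box: cases derived from the specification alone,
    # with no knowledge of the branches inside the unit.
    def test_equilateral(self):
        self.assertEqual(classify_triangle(3, 3, 3), "equilateral")

    def test_isosceles(self):
        self.assertEqual(classify_triangle(5, 5, 8), "isosceles")

    def test_scalene(self):
        self.assertEqual(classify_triangle(3, 4, 5), "scalene")

    def test_violates_triangle_inequality(self):
        self.assertEqual(classify_triangle(1, 2, 3), "invalid")
```

Run with `python -m unittest <file>`. A white-box tester would instead read the if-chain and target each branch directly; in OO designs the units are often too small for that to add much over the spec-driven cases.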
Integration Testing
  • After the classes pass unit tests
  • Test classes that must work together
  • Four types of integration tests:
    • User interface testing
      • Tests each interface function
      • Move through each menu/screen
    • Use-case testing
      • Ensures that each use case works correctly
      • Step through each use case
      • Often combined with UI testing
Integration Testing

3. Interaction testing

  • Start with a package
  • Each method is a stub
  • Add methods one at a time, testing as you go
  • Once all packages are done, repeat on the package level

4. System interface testing

  • Ensures data transfer between systems
System Testing
  • Requirements Testing
  • Usability Testing
  • Security Testing
  • Performance Testing
  • Documentation Testing
System Testing
  • See that all classes work together
  • Similar to integration testing but broader
    • Requirements Testing
      • Are business requirements met?
      • Ensures that integration did not cause new errors
    • Usability Testing
      • Tests how easy and error-free the system is in use
      • Informal or formal
System Testing
  • Security Testing
    • Assures that security functions are handled properly
      • e.g. Disaster recovery
  • Performance Testing
    • Assures that the system works under high volumes of activity
  • Documentation Testing
    • Analysts check that documentation and examples work properly
Acceptance Testing
  • Done by users with support from project team
  • Ensure the system meets the originally stated requirements
  • Types of acceptance tests:
    • Alpha Testing
      • Repeat tests by users to assure they accept the system, uses known data
    • Beta Testing
      • Uses real data, not test data
References (Foundation)
  • Roger S. Pressman, Software Engineering: A Practitioner's Approach, Seventh Edition, McGraw-Hill, 2009
  • Ian Sommerville, Software Engineering, 9th Edition, Addison-Wesley, 2010
  • Albert Endres and Dieter Rombach, A Handbook of Software and Systems Engineering, Pearson Education Limited, 2003
  • Yingxu Wang, Software Engineering Foundations: A Software Science Perspective, Auerbach Publications, Taylor & Francis Group, 2008
  • Guide to the Software Engineering Body of Knowledge 2004 Version (SWEBOK), IEEE Computer Society, http://www.swebok.org, 2004
References (Process)
  • Alan Dennis et al., Systems Analysis and Design with UML, 3rd Edition, John Wiley and Sons, 2010
  • Dan Pilone and Russ Miles, Head First Software Development, O'Reilly Media, 2008
  • Barclay and Savage, Object-Oriented Design with UML and Java, Elsevier, 2004
  • Paul Kimmel, UML Demystified, McGraw-Hill, 2005
  • Kim Hamilton and Russell Miles, Learning UML 2.0, O'Reilly, 2006
  • Howard Podeswa, UML for the IT Business Analyst, Course Technology, 2009
  • Deloitte, Business Process Modeling – Basic Guideline and Tips, 2008
References (Quality Assurance)
  • Daniel Galin, Software Quality Assurance, Addison-Wesley, 2004
  • Jeff Tian, Software Quality Engineering, John Wiley & Sons, Inc., 2005
  • G. Gordon Schulmeyer, Handbook of Software Quality Assurance, Fourth Edition, Artech House, 2008
  • Kshirasagar Naik and Priyadarshi Tripathy, Software Testing and Quality Assurance, John Wiley & Sons, Inc., 2008