
IMS1002 /CSE1205 Systems Analysis and Design

Lecture 11

Implementation Issues


Lecture Objectives

  • At the completion of this lecture you should:

    • be aware of the tasks involved in the implementation phase of information system development

    • be aware of the responsibilities of the systems analyst, the client and the users in this phase

    • be able to develop a test plan for an information system and to perform testing according to that plan

    • be able to develop a suitable plan for conversion from an existing system to a new system

    • be aware of training and other user-oriented issues in installation of a new system


Systems Implementation

The implementation phase comprises the following activities, each followed by a review:

  • IMPLEMENTATION PLANNING - Acceptance Checklist, Implementation Schedule, Training Schedule, Re-estimate

  • FINALISE DOCUMENTATION - Training Guides, User Manuals

  • CONDUCT SYSTEM TESTING - Test Data Preparation, System Test: Functional & Performance, Test Conversion

  • CONDUCT ACCEPTANCE TESTING - Acceptance Test

  • OPERATIONS HANDOVER - Computer Documents, I/O Documents, Operating Guide


Systems Implementation

The remaining activities, each also followed by a review:

  • CONDUCT TRAINING / GET SYSTEM READY FOR START-UP - Distribute Manuals, Test Equipment, Conduct Training, Set up / Convert Files

  • CONDUCT SYSTEM ACCEPTANCE - System Installation, Monitor Operations, Secure Acceptance, Run Benchmark Tests, Tune System

  • WRAP UP - Hand over Technical Documentation, Post Implementation Review (What went wrong?)


Testing

  • Testing is ...

    • " the process of exercising or evaluating a system by manual or automatic means to verify that it satisfies specified requirements or to identify differences between expected and actual results "

      (IEEE, 1983)

  • " Anyone who believes that his or her program will run correctly the first time is either a fool, an optimist, or a novice programmer.” (Anon.)


Principles of Testing

  • Testing is the process of executing a program with the intention of finding errors

    • an attempt to ‘break’ the program

  • It is impossible to completely test any nontrivial module or any system

    • when do you stop testing ?


Software Errors

  • Can arise for any of several reasons

    • the specification may be wrong

    • the specification may specify something that is physically impossible given the H/W and S/W

    • the system design may be at fault

    • the program design may be at fault

    • the program code may be wrong


Testing Steps

  • All testing involves the following steps:

    • select what is to be measured by the test

    • decide how it is to be tested

    • develop the test cases

    • determine the expected or correct results (you must ensure that expected results can be measured - vagueness does not encourage adequate testing)

    • execute the test cases

    • compare actual results to expected results
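To make these steps concrete, here is a minimal sketch in Python; the calculate_discount() function and its business rule are hypothetical stand-ins for a unit under test. The expected results are determined before the tests are executed, and each actual result is compared against them.

```python
# Hypothetical unit under test: 10% discount on orders of $100 or more.
def calculate_discount(order_total):
    return order_total * 0.10 if order_total >= 100 else 0.0

# Test cases with measurable expected results, decided in advance.
test_cases = [
    (50.00, 0.00),    # below the threshold: no discount
    (100.00, 10.00),  # boundary value: discount applies
    (200.00, 20.00),  # well above the threshold
]

# Execute the test cases and compare actual results to expected results.
for order_total, expected in test_cases:
    actual = calculate_discount(order_total)
    status = "PASS" if abs(actual - expected) < 0.005 else "FAIL"
    print(f"{status}: calculate_discount({order_total}) = {actual}, expected {expected}")
```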


Testing Approaches

  • Any software can be tested in two ways:

    • White box (or glass box) - knowing the internal workings of a module, so that its logical structure and operations can be systematically tested.

    • Black box - knowing the functions that the system is supposed to perform, and testing to see whether it performs them properly.
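The difference shows up in how test cases are chosen. In the sketch below (a hypothetical is_leap_year() function), the black-box cases are derived from the specification alone, while the white-box cases are chosen so that every branch in the code is exercised.

```python
def is_leap_year(year):
    if year % 400 == 0:      # branch 1: centuries divisible by 400 are leap
        return True
    if year % 100 == 0:      # branch 2: other centuries are not
        return False
    return year % 4 == 0     # branch 3: ordinary years divisible by 4 are leap

# Black box: cases derived from the specification, ignoring the code.
assert is_leap_year(2024) is True
assert is_leap_year(2023) is False

# White box: cases chosen to exercise each branch of the implementation.
assert is_leap_year(2000) is True    # branch 1
assert is_leap_year(1900) is False   # branch 2
assert is_leap_year(1996) is True    # branch 3

print("All black-box and white-box cases passed")
```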


Stages of Testing

Each stage of testing checks the previous stage's output against a development product:

  • Unit (module) test: programs, procedures and data (checked against the systems implementation) → tested modules

  • Integration test: tested modules (checked against the program specifications / systems design) → integrated modules

  • Function test: integrated modules (checked against the systems specifications) → functioning system

  • Performance test: functioning system (checked against systems analysis & design) → validated software

  • Acceptance test: validated software (checked against user requirements) → accepted system

  • Installation test: accepted system → system in use


Module or Unit Testing

  • Each module is tested individually

    • Lists what is being tested

    • Lists expected outcome

    • Identifies the data to be used, covering all possible combinations

  • Who carries out Module Testing?

    • Programmer - tests at code level

    • Analyst - tests at application level
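A minimal module test sketch using Python's unittest module; validate_postcode() is a hypothetical module under test. Each test names the condition being tested, the data used and the expected outcome, as a module test plan requires.

```python
import unittest

# Hypothetical module under test: accepts only four-digit numeric postcodes.
def validate_postcode(postcode):
    return len(postcode) == 4 and postcode.isdigit()

class TestValidatePostcode(unittest.TestCase):
    def test_valid_postcode(self):          # valid input is accepted
        self.assertTrue(validate_postcode("3800"))

    def test_too_short(self):               # invalid length is rejected
        self.assertFalse(validate_postcode("380"))

    def test_non_numeric(self):             # letters (here O, not zero) are rejected
        self.assertFalse(validate_postcode("38OO"))

if __name__ == "__main__":
    unittest.main()
```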


Test Plan

Module Test Plan

Prepared by: ______    Date: ______    Page: ___ of ___

Module being tested: ______

Testing method: ______

Test No | Condition being tested | Expected results
--------|------------------------|-----------------


Integration Testing

  • Verifies that the components of a system work together as described in the program design and system design specifications. It is necessary because

    • data can be lost across interfaces

    • a function may not perform as expected when combined with another function

    • one module can have an adverse effect on another

  • Integrating modules is best done using an incremental approach - easier to detect and correct errors.


Integration Testing

  • There are a number of strategies that can be used to carry out integration testing:

    • Big-bang testing

    • Incremental approaches:

      • Top-down testing

      • Bottom-up testing

      • Sandwich testing

  • Any incremental integration testing needs a combination of stubs and drivers to work


Using Stubs and Drivers

  • Stubs and drivers link modules to enable them to run in an environment close to the real one of the future.

Stubs: take the place of modules that are called but have not yet been coded; they may be invoked by, and receive or transmit data to, the module under test as required.

Drivers: call the module under test and pass it test data.

[Diagram: a driver calls the module under test, which in turn calls stubs standing in for its subordinate routines, alongside the existing routines]
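A minimal sketch of this arrangement, with hypothetical names: calculate_pay() is the module under test, get_tax_rate_stub() takes the place of a subordinate module that has not yet been coded, and driver() calls the module under test and passes it test data.

```python
def get_tax_rate_stub(gross):
    # Stub: stands in for the real tax-table module, returning a fixed,
    # predictable value so the module under test can be exercised.
    return 0.30

def calculate_pay(hours, rate, get_tax_rate=get_tax_rate_stub):
    # Module under test: calls its (stubbed) subordinate as usual.
    gross = hours * rate
    return gross * (1 - get_tax_rate(gross))

def driver():
    # Driver: calls the module under test and passes it test data.
    expected = 40 * 20 * (1 - 0.30)
    actual = calculate_pay(40, 20)
    print("PASS" if actual == expected else f"FAIL: got {actual}")

driver()
```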


Big Bang Testing

  • Throw them all together at once

    • Advantages:

      • None - perceived to be faster

    • Disadvantages:

      • difficult to find and isolate the cause of any errors that appear

      • interface errors cannot easily be distinguished from other errors.


Incremental Approach to Testing

  • REPEAT UNTIL the system is complete

    • Implement and unit test a module

    • Add the module to the existing combination

    • Test and debug the new combination

  • END REPEAT

  • Deliver the system

  • Each time through the loop, the part of the system implemented will be working

    • crucial interfaces are not left till the end

    • resource usage is better distributed



Top Down Testing

  • Implement the top module of a structure chart first

  • Each subordinate module is simulated by a stub or dummy module.

  • Each stub is replaced by a real module and the structure re-tested until the bottom level of the chart has been reached.


Top Down Testing

  • Advantages

    • Feedback to users

    • Skeleton versions

    • Project less likely to be axed

    • Major system interfaces are tested

    • Testing resources are distributed more evenly

    • Implementers can see early results

    • If time is short, can begin other parts of the development cycle - is this appropriate?

    • Shows progress - working modules vs kilos of code

  • Disadvantages

    • A large number of stubs may be required

    • Writing realistic lower level stubs may be difficult and time consuming, i.e. more costly



Bottom Up Testing

  • Implement the lowest modules of a structure chart first

  • Each boss module is simulated by a driver module.

  • Each driver module is replaced by a real module and the structure re-tested until the top level of the chart has been reached.


Bottom Up Testing

  • Advantages

    • Project less likely to be axed

    • Testing resources are distributed more evenly

    • Implementers can see early results

    • Feedback to users (to some degree)

    • Driver modules are generally easier to develop than stubs ... therefore less costly

  • Disadvantages

    • No working program can be demonstrated until the last module is tested

    • Major top-level interfaces that may be critical are tested late

    • Cannot implement intermediate versions of the system



Sandwich Testing

  • Combines the top-down and bottom-up approaches

  • A target layer is chosen based on the structure and characteristics of the module hierarchy

  • The target layer is usually the one just above all the general purpose utility modules

  • A top-down approach is used above the target layer

  • A bottom-up approach is used below the target layer

  • Testing converges on the target layer


System Testing

  • The process of testing the integrated software in the context of the total system it supports

    • performed after all unit and integration testing is complete

  • Who carries out System Testing ?

    • systems analyst, systems implementers, technical support


System Testing

  • Tests conducted at this stage include

    • Function tests - demonstrate that all the functions specified for the system in the requirements specification are operational

    • Performance tests - demonstrate that the system meets the non-functional requirements specified.


Function Testing

  • Performed after all programming and integration testing is finished

    • Test cases

      • must cover every aspect of the system’s functionality

      • should have a high probability of detecting errors

    • Test plan

      • should be developed from the original specification

      • must include expected results that are measurable


Function Testing

  • Performed after all programming and integration testing is finished

    • Guidelines for function tests

      • use a test team independent of designers and programmers

      • know what the expected actions and outputs are

      • test both valid and invalid input

      • never modify the system being tested to make testing easier

      • know when the tests should stop


Performance Testing

  • Compares the integrated modules with the non-functional system requirements, such as speed and performance. Tests include:

    • Stress tests
    • Volume tests
    • Configuration tests
    • Compatibility tests
    • Regression tests
    • Security tests
    • Timing tests
    • Environmental tests
    • Quality tests
    • Recovery tests
    • Maintenance tests
    • Documentation tests
    • Human factors tests
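As one illustration, a minimal timing-test sketch: search_orders() is a hypothetical function, and the requirement that a search over 100,000 records completes within 0.5 seconds is an assumed non-functional target, not one from this lecture.

```python
import time

# Hypothetical function whose speed is being measured.
def search_orders(orders, customer_id):
    return [o for o in orders if o["customer_id"] == customer_id]

# Build a test volume of 100,000 records.
orders = [{"customer_id": i % 1000, "total": i} for i in range(100_000)]

start = time.perf_counter()
results = search_orders(orders, 42)
elapsed = time.perf_counter() - start

print(f"Found {len(results)} orders in {elapsed:.3f}s")
print("PASS" if elapsed < 0.5 else "FAIL: exceeds the 0.5s requirement")
```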


Acceptance Testing

  • Commences when the developers are confident that the system is ready to be used

  • Is where the user decides if the system is ready for use

  • Similar to system testing but politically very different

  • System testing can dispose of bugs while no one is watching

  • Acceptance testing is done under a spotlight, with the user watching (when you wish you had done more and better system testing)


Acceptance Testing

  • May be completely in user's hands, but often shared between analyst and user

  • Criteria for acceptance

    • either the specification is

      • presented to the user, and

      • signed by the user

    • or

      • a definite plan for agreement on the criteria in the specification is produced before testing begins - it must include results that can be measured


Acceptance Testing

  • Involves installing the system at user sites and is required when acceptance testing has not been performed on site

  • The test focuses on completeness of the installed system and verification of any functional or nonfunctional characteristics that may be affected by site conditions

  • Testing is complete

    • When the customer is satisfied with the results

    • The system can then be formally delivered


Implementing the System

  • Other implementation tasks

    • implementation planning

    • finalise documentation

    • prepare the site

    • convert data into required form and media

    • conduct training

    • install system

    • monitor system

    • transition to maintenance mode

    • post-implementation review


Implementation Planning

  • Implementation stage of the project

    • requires a great deal of co-ordination with professionals outside the development team

  • Implementation plan

    • will have been developed at earlier stage of project

    • will need to be extended in greater detail

    • must be updated to reflect the current situation

  • Poor planning can cause significant schedule delays!

  • Tasks

    • finalise acceptance checklist

    • complete and confirm training schedule

    • review and revise implementation plan


Finalise Documentation

  • Documentation describes how a system works to a wide audience

  • The four main areas are

  • Training documentation

    • used specifically during the training sessions

    • especially designed to put the novice user at ease

  • User documentation

    • tells users how to work with the system and perform their tasks

    • may be a user manual, on-line help, quick reference guide etc


Finalise Documentation

  • System documentation

    • a communications tool, used to review and revise the system during development

    • also facilitates maintenance and enhancement of the system

  • Operations documentation

    • aimed at a centralised operations group (not on-line operators)

    • details what tasks an operator needs to carry out for a particular program


Prepare the Site

  • Ensure that facilities are adequate

    • varies in complexity

    • may require new facilities or re-modelling of current facilities for first-time computer systems

    • consider issues such as

      • adequate space for all resources, ergonomic furniture, noise reduction, privacy, security, appropriate electrical connections, uninterrupted power, etc.


Prepare the Site

  • Ensure that facilities are adequate

    • install the hardware and software required to run the system

      • usually done to a specification

      • must be tested to ensure the product was not damaged during transportation, is not defective, and that any product changes between purchase and delivery are acceptable

    • People responsible

      • Vendor Engineer

      • Technical Support Group


Data Conversion

  • Current production data could be converted in 3 ways

    • Format, Content, Storage Medium

    • Done according to the conversion plan

    • Manual file conversion is a time-consuming task

    • Often needs specially written conversion programs, e.g.

      • Database Load Program

      • Record Transformation Program

    • Data must be confirmed to be correct
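A minimal sketch of a record transformation program, with hypothetical file names and layouts: the old master file is assumed to store a customer's name as one "Surname, Given names" field, while the new system wants separate fields and a normalised balance format.

```python
import csv

def transform_record(old):
    # Old layout stores the name as "Surname, Given names" in one field.
    surname, given = old["name"].split(",", 1)
    return {
        "surname": surname.strip(),
        "given_names": given.strip(),
        "balance": f"{float(old['balance']):.2f}",  # normalise the format
    }

with open("old_master.csv", newline="") as src, \
     open("new_master.csv", "w", newline="") as dst:
    writer = csv.DictWriter(dst, fieldnames=["surname", "given_names", "balance"])
    writer.writeheader()
    for row in csv.DictReader(src):
        writer.writerow(transform_record(row))
```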


Data Conversion

  • May be simple or complex

    • depends on system

  • May need to support both old and new files

    • can introduce time lag

    • files may be out of step

  • General procedures involved

    • prepare existing files ... no errors, up-to-date

    • prepare manual files

    • build new files and validate

    • begin maintenance of new and old files

    • work towards established cut-off date

    • final check of accuracy
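The final accuracy check can be partly automated. A minimal sketch, reusing the hypothetical file names above, that compares record counts and a control total between the old and new master files:

```python
import csv

def summarise(path):
    # Return the record count and a control total over the balance field.
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return len(rows), round(sum(float(r["balance"]) for r in rows), 2)

old_count, old_total = summarise("old_master.csv")
new_count, new_total = summarise("new_master.csv")

print("PASS" if (old_count, old_total) == (new_count, new_total)
      else f"FAIL: old {old_count}/{old_total} vs new {new_count}/{new_total}")
```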


Training

  • “If you think education is expensive and time-consuming - try ignorance.”

    Bok, 1978


Conduct Training

  • Need to consider:

    • who is the audience?

    • what level of detail should be imparted to the audience?

    • who should conduct the training?

    • where should the training be conducted?

    • when should the training be conducted?


Building User Understanding

  • Training - a complete and concentrated course in system use at the time of delivery

  • Training must be planned

    • methods

    • resources

    • should also consider Help during and after installation for new users, infrequent users and users who want to "brush up"


Building User Understanding

  • Training aids

    • must be easy to use

    • reliable

    • demonstrations and classes

    • documentation

    • on-line help and icons

    • expert users

  • Supportive User Manager who provides training, motivation, support


Install the System

  • Method of installation depends on several criteria

    • Cost - if there are cost constraints certain choices are not viable

    • System criticality - if system failure would be disastrous, the safest approach should be selected regardless of cost

    • User computer experience - the more experience the users have, the less necessary it is to delay changeover

    • System complexity - the more complex the system, the greater the chance of flaws ... a safer approach is better

    • User resistance - need to consider what the users are best able to cope with


Install the System

  • Alternatives

    • Direct installation or Abrupt cut-over

    • Parallel installation

    • Phased installation or Staged installation

    • Pilot installation or Single Location conversion


Direct Installation (Abrupt Cutover)

  • Old system stops and new system starts



Direct Installation (Abrupt Cutover)

  • This approach is meaningful when

    • the system is not replacing any other system

    • the old system is judged absolutely without value

    • the old system is either very small and/or very simple

    • the new system is completely different from the old and comparisons would be meaningless

  • Advantages

    • costs minimised

  • Disadvantages

    • high risk



Parallel Installation

  • Old and new systems operated concurrently


Parallel Installation

  • Old and new systems operated concurrently

  • Cut-over at end of a business cycle

  • Balancing between both systems

  • Advantages

    • risks low if problems occur

  • Disadvantages

    • cost of operating both systems - may require 2.5 times the resources



Phased Installation (Staged Installation)

  • System installed in stages


Phased Installation (Staged Installation)

  • System installed in stages

  • Subsequent stages provide more features

  • Phases or stages need to be identified at general design

  • Advantages

    • lower costs for earlier results

    • benefits can be realised earlier

    • rate of change for users minimised


Phased Installation (Staged Installation)

  • Disadvantages

    • close control of systems development is essential

    • costs associated with the development of temporary interfaces to old systems

    • limited applicability

    • demoralising - no sense of completing a system.


Pilot Installation

  • Old and new systems operated concurrently



Pilot Installation

  • Old and new systems operated concurrently

  • Only part of the organisation tries out the new system

  • The pilot system must prove itself at the test site

  • Advantages

    • risks relatively low if problems occur

    • errors are localised

    • can be used to train users before implementation at their own site

  • Disadvantages

    • lack of consistency between different parts of organisation


Monitor Operations

  • Monitor user satisfaction

    • with functional requirements

    • with system performance

  • Run benchmark tests

  • Tune system


Transition to Maintenance

  • Most organisations have formal procedures set up

  • A "maintenance" section is responsible!

  • Procedures should be set up to request maintenance

  • Owners of the new system must be informed of relevant procedures


Post Implementation Review

  • A PIR analyses what went right and wrong with a project. It is conducted two to six months after conversion by a team which includes user representatives, development staff, internal auditors and sometimes external consultants - the development team is not in charge!

    • look at original requirements and objectives

    • evaluate how well they were met

    • compare costs of development and operation against original estimates (maintenance costs ??)

    • compare original and actual benefits

    • new system reviewed to see whether more of original or additional benefits can be realised

Must not become a witch-hunt

What went wrong ???

Learn for the future !!!



References

Hoffer, J.A., George, J.F. and Valacich, J.S. (1999), Modern Systems Analysis and Design, 2nd ed., Benjamin/Cummings, Massachusetts. Chapter 20.

Whitten, J.L., Bentley, L.D. and Dittman, K.C. (2001), Systems Analysis and Design Methods, 5th ed., McGraw-Hill Irwin, Boston, MA. Chapters 16 and 17.