
StickyMinds.com and Better Software magazine presents…

Avoid Throwaway Test Automation

Sponsored by Cognizant

Non-streaming participants should call 1-866-761-8643

International Non-streaming participants should call 1-904-596-2362


Setting the Context

  • What we mean by “automated testing”

  • Other types of tool-assisted testing

  • Principles will apply to other types

  • Many topics deserve more attention

  • Automated testing is software development


What Typically Goes Wrong

  • Automated tests that no longer run

  • Too much time spent babysitting and maintaining the automation

  • Automated tests that are too brittle

  • Tools that don’t work in the environment

  • Automated tests that don’t provide value


Common Mistakes

  • No plan for implementation

  • No buy-in from staff or management

  • No training for automators

  • No time allotted to automate

  • No time allotted for maintenance

  • No framework for reusability

  • Good intentions, poor execution


Why Automate?

  • Sounds cool

  • Boss said so

  • Can’t keep up

  • Lots of repetitive tests

  • Lots of data-driven tests

  • Reduce time spent on regression testing


Develop an Automation Plan

  • Why you will automate

  • What to automate

  • When to automate

  • Who will automate, execute, maintain

  • How to automate – Framework

  • How to report results

  • Where to run tests


What to automate

  • Smoke tests

  • Repetitive tests

  • Can run autonomously

  • Big risks

  • Take less time to automate than to execute (see the worked sketch after this list)

    • Should run 3-5 times without changing

  • Data-intensive tests
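
The "takes less time to automate than to execute" guideline is really a break-even calculation. The short sketch below is purely illustrative, a hypothetical Python calculation with invented figures (nothing here comes from the presenters), showing how many runs it takes before scripting a test becomes cheaper than running it by hand.

    # Illustrative break-even estimate for automating a single test.
    # All figures are hypothetical, chosen only to show the arithmetic.
    manual_minutes_per_run = 30       # time to execute the test manually
    automation_minutes = 90           # one-time cost to script the test
    maintenance_minutes_per_run = 5   # expected upkeep per automated run

    runs = 1
    while (automation_minutes + runs * maintenance_minutes_per_run
           >= runs * manual_minutes_per_run):
        runs += 1
    print(f"Automation pays for itself after about {runs} runs.")
    # With these figures the break-even point is 4 runs, which is why a test
    # should be expected to run several times unchanged before it is automated.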


When to automate

  • When you’re ready

  • Depends on development lifecycle

    • Waterfall – may wait until the end

    • Agile – may need continuous automation

  • Not too early, not too late

    • Need stable-ish UI

    • Before you have to regression test

  • Plan time for automation and maintenance

    • Manage as part of regression testing time


Selecting the Right Tool

  • Define your tool requirements

    • What you need it to do

    • Compatibility with your application

    • Compatibility with your skillsets

  • Try it out

  • Beware of the hype

    • “Record and Playback” is rarely that simple, as the sketch below illustrates
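
To illustrate why recorded scripts are rarely usable as-is, the fragment below contrasts a typical recorder-generated step with the same step after hand-editing. It is a hypothetical sketch assuming Selenium WebDriver for Python; the URL and element IDs are invented, and the tool choice is not a recommendation from the presentation.

    # Hypothetical login step, assuming Selenium WebDriver for Python.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    driver.get("https://example.test/login")  # placeholder URL

    # What a recorder typically captures: a position-dependent XPath and
    # hard-coded data, which breaks as soon as the page layout changes.
    driver.find_element(
        By.XPATH, "/html/body/div[2]/form/input[1]").send_keys("jdoe")

    # The same step after hand-editing: a stable, intention-revealing locator.
    driver.find_element(By.ID, "username").send_keys("jdoe")

    driver.quit()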


Open Source Tools

  • Free to acquire, not to use

  • More time required for implementation

    • Installation and configuration

    • Learning to use the product

  • More technical skills required

  • http://opensourcetesting.org


Develop a Framework

  • Organization of artifacts

  • Aim for reusability

    • Across features, product versions, and products

    • Separate interface from functionality (see the sketch after this list)

  • Dealing with common activities

    • Object recognition

    • Navigation

    • Data validation
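
One common way to separate interface from functionality is a page-object layer: the test expresses intent, while a page class owns object recognition, navigation, and validation hooks. The sketch below is a minimal, hypothetical illustration in Python with Selenium; the class name, locators, and the test fixture that supplies the driver are assumptions, not anything prescribed by the presenters.

    # Minimal page-object sketch (hypothetical names, Python + Selenium assumed).
    from selenium.webdriver.common.by import By

    class LoginPage:
        # Object recognition lives in one place, so a UI change is fixed once.
        USERNAME = (By.ID, "username")
        PASSWORD = (By.ID, "password")
        SUBMIT = (By.ID, "login")

        def __init__(self, driver):
            self.driver = driver

        def log_in(self, user, password):
            self.driver.find_element(*self.USERNAME).send_keys(user)
            self.driver.find_element(*self.PASSWORD).send_keys(password)
            self.driver.find_element(*self.SUBMIT).click()

    def test_valid_login(driver):  # 'driver' supplied by a test fixture
        LoginPage(driver).log_in("jdoe", "secret")
        assert "Welcome" in driver.page_source  # simple data validation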


Dealing with Change

  • Plan for changes in UI

    • How to respond to test failures

    • Flexible object recognition (see the sketch after this list)

  • Make tests data independent

    • Reduce dependencies between tests

    • Set up test data in cleanup scripts

    • Script tests to use dynamic data

  • Enlist help of developers to ease automation
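
Both ideas, flexible object recognition and dynamic test data, can be implemented as small helpers. The fragment below is a hypothetical Python/Selenium sketch with invented locators and naming: it tries several locators before giving up, and it generates a unique data value per run so tests do not depend on leftovers from earlier runs.

    # Hypothetical helpers for change-tolerant tests (Python + Selenium assumed).
    import uuid
    from selenium.webdriver.common.by import By

    def find_save_button(driver):
        # Flexible object recognition: try the preferred locator first,
        # then fall back to alternatives instead of failing outright.
        for locator in [(By.ID, "save"),
                        (By.NAME, "save"),
                        (By.XPATH, "//button[text()='Save']")]:
            found = driver.find_elements(*locator)
            if found:
                return found[0]
        raise AssertionError("Save button not found with any known locator")

    def new_customer_name():
        # Dynamic test data: a unique value per run keeps tests independent
        # of data created (or left behind) by other tests.
        return f"customer-{uuid.uuid4().hex[:8]}"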



Objective

Background

Objective

  • Automation Assessment Approach

    • Process

    • Infrastructure

    • Tools

    • Framework

    • Operating Model

    • Best Practices


Assessment Scope

Focus Areas

Process

  • Automation Prioritization
  • Planning & Strategy
  • Approach
  • Estimation Model
  • Documents
  • Guidelines
  • Review procedures & Checklists
  • Metrics Collection
  • Configuration Management
  • Audits/Assessments
  • Change Management
  • Environment Management Process
  • Defect Management
  • Maintenance Approach

Infrastructure

  • Environment Management
  • Test bed creation and maintenance

Framework

  • Architecture & Type
  • Test Data Management
  • Reporting Mechanism
  • Error & Exception Handling
  • Folder Structure
  • Scalability
  • Reusability
  • Function Library
  • Object Repository
  • Database Testing
  • Batch Execution

Tools

  • Functional Automation
  • Test Management
  • Configuration Management
  • Defect Management
  • License Management

Best Practices

  • Scripting Standards
  • User Guides
  • Maintenance Process Handbook
  • Dynamic Script Allocation
  • Automation Review Tool
  • KR portal
  • Integration of automation scripts to test management tools

Operating Model

  • Roles & Responsibility
  • Governance Model
  • Organizational SLAs
  • Project structuring
  • Communication


Process

  • Communication and Collaboration (BAs, developers, manual testers, etc.)

  • Identification and Prioritization

  • Planning and Estimation

  • Change Management

  • Maintenance Approach


Tools

  • Functional Automation

  • Test Management

  • Configuration Management

  • Defect Management

  • Open Source


Framework

  • Architecture and Type

  • Test Data Management

  • Reporting Mechanism

  • Reusability

  • Maintainability

  • Object Repository

  • Database Testing


Infrastructure

  • Environment Management

  • Test Bed creation and management


Best Practices

  • Scripting Standards

  • User Guides

  • Maintenance Process Handbook

  • Automation Review Tool

  • KR portal


Operating Model

  • Roles and Responsibilities

  • Organizational SLAs

  • Project Structuring


Approach – Highlights

Highlights

  • Assess current automation capabilities
  • Identify the ideal automation tool
  • Definition of an automation framework
  • Customized metrics framework
  • Define governance model
  • Consolidation of automation tools
  • Set up communication model and status reporting
  • Use of reusable automation scripts

Benefits

  • Structured methodology for automation testing
  • Well-defined organization structure and governance model in place
  • Fully customized metrics framework for implementation across applications
  • Defined communication and workload processes for onsite-offshore coordination
  • Well-defined independent and peer review procedures in place


Testing Services Practice Overview

Independent Verification & Validation Service (IV&V)

[Growth chart: testing practice headcount of 75 (2001 & 2002), 170 (2003), 800 (2004), 2400 (2005), 5000 (2006), and 8500 estimated (2007)]

Enhanced service offerings such as compliance testing, package testing, and white-box testing, as well as domain/product testing (VisionPLUS, FACETS & POS)

End-to-end IV&V services provided

Brought in domain alignment (Domain Product Testing and BA/QA Offering)

Engaged with clients to setup Managed Test Centers

Commenced new client engagements with Test Consulting

Focused on Automation and Mainframe CoEs

Delivery excellence through deployment of innovative methodologies.

Expand Global footprint

Launched to provide specialized functional testing services to existing Cognizant customers

Offered as a distinct service offering to customers

Established onsite-offshore model for testing

Integrated with other value added services such as Performance testing

OUR DOMAIN FOUNDATION

Integrated BA/QA Offering in collaboration with domain practices, spanning Insurance, Communications, Manlog, Technology, Healthcare, Life Sciences, IME, BFS, and Retail

CENTER OF EXCELLENCE – Invested in focused groups around tools & frameworks to provide client value-adds

ALLIANCES – Established alliances with leading tool vendors like Mercury, Borland & IBM Rational

INDEPENDENCE – Over 70% of testing performed against code provided by client or third-party vendors

PEOPLE – Team of over 5000 dedicated SQA professionals

CLIENTS – 200+ clients with 10+; deep client engagements with over 100 people


Value Adds

CRAFT – Defines the method for scripting business functionalities that recur across test cases as reusable libraries

CRAFT 2.0 – A tool that streamlines the test execution activity during test automation; it dynamically executes test cases on multiple machines in a distributed environment

AHEAD – Bulk-uploads QTP scripts, attachments, and folder structure to Quality Center

WS Test Professional – SOA testing solution for testing business logic; it enables the client to execute data-driven web service testing without any programming knowledge

DataXpress – An automated test generation tool that streamlines the test data preparation activity


Value Adds

ROI Calculator – Return-on-investment details that give the client maximum transparency before an automation engagement

Selenium test manager – A functional automation test tool developed for web automation

QC2bugzilla – Integrates and synchronizes the defect management module of Quality Center with that of Bugzilla

Win2Pro – Converts WinRunner scripts to QTP automatically

Watir – Web Application Testing in Ruby (WATIR), an open source functional testing framework for any web application built on ASP, .NET, J2EE, or PHP



