Scalability Tools: Automated Testing (30 minutes)

Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications & Gotchas
    • engineering, QA, operations
    • production & management
  • Summary & Questions

Review: controlled tests & actionable results useful for many purposes

[Diagram: (1) repeatable tests, using N synchronized game clients, drive the Test Game; (2) high-level, actionable reports go out to many audiences: Programmer, Development Director, Executive]

Automated testing accelerates large-scale game development & helps predictability

TSO case study: developer efficiency

[Chart: % Complete vs. Time, from Project Start to Ship Date. Strong test support ("autoTest") delivers a better game, earlier; weak test support falls behind the target, ending in an "Oops" at launch]

Measurable targets & projected trends give you actionable progress metrics, early enough to react

[Chart: any test metric (e.g. # clients) vs. any point in time (e.g. Alpha). The trend from First Passing Test through Now is projected against the Target, so a projected miss ("Oops") becomes visible early]

Success stories
  • Many game teams work with automated testing
    • EA, Microsoft, any MMO, …
  • Automated testing has many highly successful applications outside of game development
  • Caveat: there are many ways to fail…
How to succeed
  • Plan for testing early
    • Non-trivial system
    • Architectural implications
  • Fast, cheap test coverage is a major change in production; be willing to adapt your processes
    • Make sure the entire team is on board
    • Deeper integration leads to greater value
  • Kearneyism: “make it easier to use than not to use”
Automated testing components

  • Scriptable Test Client(s): emulated user play session(s); multi-client synchronization; repeatable, sync'ed test I/O
  • Test Manager: test selection/setup; startup & control of any game; control N clients; RT probes
  • Report Manager: raw data collection; aggregation/summarization; alarm triggers; collection & analysis
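The three components above can be sketched as glue code: a Test Manager fans one test out across N clients and a Report Manager aggregates pass/fail results, raising an alarm past a threshold. This is a minimal illustration; all class and method names here are assumptions, not from the talk.

```python
# Illustrative sketch of Test Manager + Report Manager cooperation.
# Names (ReportManager, TestManager, collect, summarize) are hypothetical.

class ReportManager:
    """Raw data collection, aggregation/summarization, alarm triggers."""
    def __init__(self, alarm_threshold=0.1):
        self.raw = []                      # raw per-client results
        self.alarm_threshold = alarm_threshold

    def collect(self, client_id, passed):
        self.raw.append((client_id, passed))

    def summarize(self):
        fail_rate = sum(1 for _, ok in self.raw if not ok) / len(self.raw)
        return {"runs": len(self.raw),
                "fail_rate": fail_rate,
                "alarm": fail_rate > self.alarm_threshold}

class TestManager:
    """Test selection/setup and control of N clients."""
    def __init__(self, n_clients, report):
        self.n_clients, self.report = n_clients, report

    def run(self, test_fn):
        # Real setups run clients in parallel; serial here for clarity.
        for cid in range(self.n_clients):
            self.report.collect(cid, test_fn(cid))

report = ReportManager()
TestManager(8, report).run(lambda cid: cid != 3)  # one simulated failure
summary = report.summarize()
```

With one failure in eight runs, `summarize()` reports a 12.5% fail rate, which trips the 10% alarm threshold.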

Input systems for automated testing

  • Scripted
  • Algorithmic
  • Recorders

All three feed input into game code. Multiple test applications are required, but each input type differs in value per application. Scripting gives the best coverage.

Hierarchical automated testing

Levels: unit → subsystem → system

Multiple levels of testing give you:

  • Faster ways to work with each level of code
  • Incremental testing avoids noise & speeds defect isolation
Input (Scripted Test Clients)

Pseudo-code script of how users play the game, and what the game should do in response.

Command steps:
  • createAvatar [sam]
  • enterLevel 99
  • buyObject knife
  • attack [opponent]

Validation steps:
  • checkAvatar [sam exists]
  • checkLevel 99 [loaded]
  • checkInventory [knife]
  • checkDamage [opponent]
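A minimal sketch of how such command/validation steps might be interpreted, assuming a headless client with hypothetical methods (nothing here is TSO's actual API): command steps mutate client state, validation steps only record pass/fail.

```python
# Sketch: a script engine interpreting command and validation steps.
# All client/engine names are illustrative assumptions.

class NullViewClient:
    """Headless client stub; a real game would drive game logic here."""
    def __init__(self):
        self.avatars, self.level, self.inventory = set(), None, []

    def create_avatar(self, name): self.avatars.add(name)
    def enter_level(self, n): self.level = n
    def buy_object(self, item): self.inventory.append(item)

class ScriptEngine:
    def __init__(self, client):
        self.client = client
        self.failures = []                 # failed validation steps

    def run(self, steps):
        for op, *args in steps:
            if op == "createAvatar":   self.client.create_avatar(args[0])
            elif op == "enterLevel":   self.client.enter_level(args[0])
            elif op == "buyObject":    self.client.buy_object(args[0])
            # Validation steps check state instead of acting on it.
            elif op == "checkAvatar":
                self._check(args[0] in self.client.avatars, op, args)
            elif op == "checkLevel":
                self._check(self.client.level == args[0], op, args)
            elif op == "checkInventory":
                self._check(args[0] in self.client.inventory, op, args)
        return not self.failures

    def _check(self, ok, op, args):
        if not ok:
            self.failures.append((op, args))

engine = ScriptEngine(NullViewClient())
passed = engine.run([
    ("createAvatar", "sam"), ("checkAvatar", "sam"),
    ("enterLevel", 99),      ("checkLevel", 99),
    ("buyObject", "knife"),  ("checkInventory", "knife"),
])
```

The interleaving of command and validation steps mirrors the slide: every action is immediately followed by a check of what the game should have done in response.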

Scripted Players: Implementation

[Diagram: a Script Engine feeds Commands into the Presentation Layer, which drives the Game Logic and its State. The same scripts run against a Test Client (Null View) or the full Game Client with its Game GUI and State — or load both]

Test-specific input & output via a data-driven test client gives maximum flexibility

  • Input API: reusable scripts & data (load, regression)
  • Test Client
  • Output API: key game states, pass/fail, responsiveness, script-specific logs & metrics
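The output side can be sketched as a small recorder the test client writes into: key game states as timestamped events, responsiveness as per-action timings. Class and method names are assumptions for illustration only.

```python
# Sketch of a test client's Output API: key game states, pass/fail,
# and responsiveness metrics. Names are illustrative, not from the talk.
import time

class TestOutput:
    def __init__(self):
        self.events = []                   # timestamped key game states
        self.metrics = {}                  # per-action timing samples

    def log_state(self, name, value):
        self.events.append((time.time(), name, value))

    def timed(self, name, fn, *args):
        """Measure responsiveness of one game action."""
        start = time.perf_counter()
        result = fn(*args)
        self.metrics.setdefault(name, []).append(time.perf_counter() - start)
        return result

out = TestOutput()
out.timed("enter_level", lambda n: n, 99)  # stand-in for a real game call
out.log_state("level_loaded", 99)
```

Because both load and regression runs emit through the same API, the same collection and reporting pipeline serves both.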

A Presentation Layer is often unique to a game
  • Some automation scripts should read just like QA test scripts for your game
  • TSO examples (NullView Client):
    • routeAvatar, useObject
    • buyLot, enterLot
    • socialInteraction (makeFriends, chat, …)
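One way such QA-readable verbs might be layered over lower-level client primitives: the verb names (buyLot, enterLot, makeFriends) come from the slide, but the underlying `send` primitive and its protocol are assumptions for illustration.

```python
# Sketch: game-specific presentation-layer verbs built on a generic
# client primitive. The `send` protocol below is a hypothetical stand-in.

class NullViewClient:
    def __init__(self):
        self.log = []                      # record of protocol commands sent

    def send(self, cmd, *args):
        self.log.append((cmd, args))
        return True                        # a real client would await a reply

def buyLot(client, lot_id):
    return client.send("buy", "lot", lot_id)

def enterLot(client, lot_id):
    return client.send("enter", "lot", lot_id)

def makeFriends(client, avatar_a, avatar_b):
    # One QA-readable step may expand into several protocol commands.
    return (client.send("route", avatar_a, avatar_b) and
            client.send("social", "makeFriends", avatar_a, avatar_b))

c = NullViewClient()
ok = buyLot(c, 42) and enterLot(c, 42) and makeFriends(c, "sam", "max")
```

The point of the layer is exactly this: a test script calls `makeFriends`, and the game-specific expansion into protocol traffic stays hidden.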

Input (data sets)

  • Mock data: repeatable tests in development, faster load, edge conditions
    • Repeatable — debugging & benchmarking
  • Real data: unpredictable user element, finds different bugs
    • Random — edge cases & real world performance
Common Gotchas
  • Not designing for testability
    • Retrofitting is expensive
  • Blowing the implementation
    • Brittle code
    • Addressing perceived needs, not real needs
  • Using automated testing incorrectly
    • Testing the wrong thing at the wrong time
    • Not integrating with your processes
    • Poor testing methodology
Testing the wrong thing at the wrong time

Build Acceptance Tests (BAT)
  • Stabilize the critical path for your team
  • Keep people working by keeping critical things from breaking

Final Acceptance Tests (FAT)
  • Detailed tests to measure progress against milestones
  • “Is the game done yet?” tests need to be phased in

Applying detailed testing while the game design is still shifting and the code is still incomplete introduces noise and the need to keep re-writing tests.

More gotchas: poor testing methodology & tools
  • Case 1: recorders
    • Load & regression tests were needed, but the maintenance cost of recorded input was not understood
  • Case 2: completely invalid test procedures
    • Distorted view of what really worked (GIGO)
  • Case 3: poor implementation planning
    • Limited usage (nature of tests led to high test cost & programming skill required)
  • Case 4: not adapting development processes
  • Common theme: no senior engineering analysis committed to the testing problem
Automated Testing for Online Games

Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications
    • engineering, QA, operations
    • production & management
  • Summary & Questions


Automated testing: strengths

  • Repeat massive numbers of simple, easily measurable tasks
  • Mine the results
  • Do all the above, in parallel, for rapid iteration

“The difference between us and a computer is that the computer is blindingly stupid, but it is capable of being stupid many, many millions of times a second.”

Douglas Adams (1997 SCO Forum)

Semi-automated testing is best for game development

Testing requirements are split between:

Manual Testing
  • Creative bug hunting, visuals
  • Judgment calls, playability
  • Reacting to change
  • Evaluating autoTest results

Automation
  • Rote work (“does door108 still open?”)
  • Scale
  • Repeatability
  • Accuracy
  • Parallelism

Integrate the two for best impact.

Plan your attack with stakeholders (retire risk early: QA, Production, Management)
  • Tough shipping requirements (e.g.)
    • Scale, reliability
    • Regression costs
  • Development risk
    • Cost / risk of engineering & debugging
    • Impact on content creation
  • Management risk
    • Schedule predictability & visibility
Automation focus areas (Larry’s “top 5”)

  • Critical path stability: keep team going forward
  • Non-determinism: gets in the way of everything
  • Content regression: massive, recurring $$
  • Compatibility & install: improves life for you & user
  • Performance: scale is hard to get right

Yikes, that all sounds very expensive!
  • Yes, but remember: the alternatives cost more and do not always work
  • Costs of QA for a 6-player game: you need at least 6 testers at the same time
      • Testers
      • Consoles, TVs and disks & network
      • Non-determinism
  • MMO regression costs: yikes²
      • 10s to 100s of testers
      • 10 year code life cycle
      • Constant release iterations
Stability: keep the team working! (TSO use case: critical path analysis)

Test Case: Can an Avatar Sit in a Chair?

Critical path call chain: login() → create_avatar() → buy_house() → enter_house() → buy_object() → use_object()

  • Failures on the Critical Path block access to much of the game
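A sketch of why this chain matters: walking the steps in order and stopping at the first failure shows exactly which break is blocking everything downstream. The step functions and `FakeClient` are hypothetical stand-ins, not TSO code.

```python
# Sketch: walking a TSO-style critical path in order, stopping at the
# first failing step so the blocking break is obvious.

CRITICAL_PATH = ["login", "create_avatar", "buy_house",
                 "enter_house", "buy_object", "use_object"]

def run_critical_path(client):
    """Return (steps_passed, first_failing_step) for one client."""
    passed = []
    for step in CRITICAL_PATH:
        if not getattr(client, step)():    # each step reports True/False
            return passed, step
        passed.append(step)
    return passed, None

class FakeClient:
    """Stand-in client; enter_house simulates a critical-path break."""
    def login(self): return True
    def create_avatar(self): return True
    def buy_house(self): return True
    def enter_house(self): return False    # simulated break
    def buy_object(self): return True
    def use_object(self): return True

done, failed = run_critical_path(FakeClient())
```

Here `failed` is `"enter_house"`: buying and using objects never even run, which is the slide's point about one break blocking access to much of the game.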

Prevent critical path code breaks that take down your team

[Diagram: candidate code passes through a Sniff Test (pass/fail, diagnostics) before checkin; only safe code reaches the Code Repository, compilers, and Reference Servers that Development depends on]

Stability & non-determinism (monkey tests)

Continual repetition of Critical Path unit tests

AutoTest addresses non-determinism
  • Detection & reproduction of race condition defects
    • Even low probability errors are exposed with sufficient testing (random, structured, load, aging)
  • Measurability of race condition defects
    • Occurs x% of the time, over 400x test runs
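A monkey-test loop of this kind can be sketched in a few lines: repeat a flaky action many times and report its failure rate, turning a race condition into a measurable number. The action and its 2% failure rate are simulated; in practice the action would be a real critical-path unit test.

```python
# Sketch: repeating a critical-path test to measure a race condition's
# occurrence rate ("occurs x% of the time, over N test runs").
import random

def flaky_action(rng):
    """Stand-in for a game action with a low-probability race condition."""
    return rng.random() > 0.02             # ~2% simulated failure rate

def monkey_test(action, runs=400, seed=1):
    rng = random.Random(seed)              # seeded, so the run is repeatable
    failures = sum(1 for _ in range(runs) if not action(rng))
    return failures, 100.0 * failures / runs

failures, pct = monkey_test(flaky_action)
```

Seeding the generator keeps the run repeatable, so a measured failure rate can be reproduced and then re-measured after a fix.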
Content testing (areas)
  • Regression
  • Error detection
  • Balancing / tuning
  • This topic is a tutorial in and of itself
    • Content regression is a huge cost problem
    • Many ways to automate it (algorithmic, scripted & combined, …)
    • Differs wildly across game genres
Content testing (more examples)
  • Light mapping, shadow detection
  • Asset correctness / sameness
  • Compatibility testing
  • Armor / damage
  • Class balances
  • Validating against old userData
  • … (unique to each game)
Automated Testing for Online Games (One Hour)

Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications
    • engineering, QA, operations
    • production & management
  • Summary & Questions

Summary: automated testing
  • Start early & make it easy to use
    • Strongly impacts your success
  • The bigger & more complex your game, the more automated testing you need
  • You need commitment across the team
    • Engineering, QA, management, content creation
Q&A & other resources
  • My email: larry.mellon_@_emergent.net
  • More material on automated testing for games
    • http://www.maggotranch.com/mmp.html
      • Last year’s online engineering slides
      • This year’s slides
      • Talks on automated testing & scaling the development process
    • www.amazon.com: “Massively Multiplayer Game Development II”
      • Chapters on automated testing and automated metrics systems
    • www.gamasutra.com: Dag Frommhold, Fabian Röken
      • Lengthy article on applying automated testing in games
    • Microsoft: various groups & writings
  • From outside the gaming world
    • Kent Beck: anything on test-driven development
    • http://www.martinfowler.com/articles/continuousIntegration.html#id108619: Continual integration testing
    • Amazon & Google: inside & outside our industry