GDC 2006 tutorial abstract: Engineering issues for online games
  • As the size and scope of multiplayer games continue to grow, the engineering requirements of multiplayer development expand drastically. Moreover, the lifecycle demands for successful massively multiplayer games can involve more than a decade of sustained development after launch.
  • This tutorial focuses on the software engineering challenges of producing multiplayer games, including single-player versus multi-player code, testing and regressions, security in peer-to-peer and server-oriented games, and protecting consumers in an increasingly dangerous online environment. Common fail points of single-player engineering tactics in a multi-player world are addressed, as are the longer-term issues of building and maintaining server-based games when downtime means a direct and public loss of revenue and market share.
  • This slide deck contains several “background” slides, hidden in slide-show mode. Print for complete data.
Tutorial Takeaway Messages
  • Building online games with single-player game techniques is painful
  • Early changes to your development process & system architecture to accommodate online play will greatly ease the pain
  • “Lessons Learned” from our background will provide your project with ways to avoid especially painful places we’ve found ourselves…
Today’s Schedule

10:00am  Automated Testing for Online Games
Larry Mellon, Emergent Game Technologies

11:00am  Coffee break

11:15am  Crash Course in Security: What it is, and why you need it
Dave Weinstein, Microsoft

12:30pm  Lunch break

2:00pm  Integrating reliability and performance into the production process: How to survive when “five nines” is a must
Neil Kirby, Bell Labs

3:00pm  Single player woes for MP design
Gordon Walton, Bioware (Austin)

4:00pm  Snack break

4:15pm  Building continually updated technology – the MMO lifecycle
Bill Dalton, Bioware (Austin)

5:30pm  Questions & War Stories (All panelists)
Question to ponder for this session: Inherited problems. What do you do once the bad decisions have already been made, and you’re the one who has to deal with them?

6:00pm  End of tutorial

Introduction

Why Online?
  • Playing with friends
  • Exciting competition
  • New business models
  • Limited media
  • Non-linear time
  • Community

Challenges (our focus)
  • Persistence
  • Game play: fairness (mechanical, style)
  • Network distortion & multi-player input: difficult to design
  • Non-determinism & multi-process: difficult to debug
  • Scale, reliability, long lifecycle, …: difficult to get right

Automated testing supports your ability to deal with all these problems

Multi-player testing
  • Accurate, repeatable tests
  • Scale & repeatability
  • Find and reproduce hard bugs, at scale

Development & Operations
  • Speed of automation
  • Prediction, stability & focus

Automated Testing for Online Games (One Hour)
  • Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications & Gotchas
    • engineering, QA, operations
    • production & management
  • Summary & Questions

Big green “autoTest” button gives controlled tests & actionable results that help across your team
  • (1) Repeatable tests, using N synchronized game clients against the test game
  • (2) High-level, actionable reports for many audiences: programmer, development director, executive

Automated testing accelerates online game development & helps predictability

[Chart: MMP developer efficiency over time, from project start to launch, with strong test support versus weak test support]
[Chart: % complete over time against the target ship date; without autoTest the projected trend misses the target (“oops”)]

Measurable targets & projected trends give you actionable progress metrics, early enough to react

[Chart: any test metric (e.g. # clients) over time, from the first passing test to now, projected toward a target at any milestone (e.g. Alpha); the gap between trend and target (“oops”) is visible early]

Success stories
  • Many game teams work with automated testing
    • EA, Microsoft, any MMO, …
  • Automated testing has many highly successful applications outside of game development
  • Caveat: there are many ways to fail…
How to succeed
  • Plan for testing early
    • Non-trivial system
    • Architectural implications
  • Fast, cheap test coverage is a major change in production; be willing to adapt your processes
    • Make sure the entire team is on board
    • Deeper integration leads to greater value
  • Kearneyism: “make it easier to use than not to use”

Automated testing components
  • Scriptable test clients: emulated user play sessions, multi-client synchronization
  • Test manager (startup & control): test selection/setup, control of N clients, RT probes
  • Report manager (collection & analysis): raw data collection, aggregation / summarization, alarm triggers
  • Together these drive repeatable, sync’ed test I/O against any online game
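As a rough illustration of how these components can fit together, here is a minimal Python sketch; the class and method names (TestClient, TestManager, ReportManager, run_script) are hypothetical stand-ins, not an API from the tutorial.

  import time

  class TestClient:
      """Scriptable test client: emulates one user play session."""
      def __init__(self, client_id):
          self.client_id = client_id
          self.results = []

      def run_script(self, script):
          # Each step is a (command, args) pair; a real client would drive
          # the game through its presentation layer instead of recording "pass".
          for command, args in script:
              self.results.append((self.client_id, command, "pass"))

  class ReportManager:
      """Raw data collection, aggregation/summarization, alarm triggers."""
      def collect(self, all_results):
          failures = [r for r in all_results if r[2] != "pass"]
          summary = {"steps": len(all_results), "failures": len(failures)}
          if failures:                       # alarm trigger
              print("ALARM:", failures)
          return summary

  class TestManager:
      """Startup & control: test selection/setup, control of N clients."""
      def run(self, script, n_clients):
          clients = [TestClient(i) for i in range(n_clients)]
          start = time.time()
          for c in clients:                  # a real manager runs these in parallel
              c.run_script(script)
          results = [r for c in clients for r in c.results]
          report = ReportManager().collect(results)
          report["elapsed_sec"] = round(time.time() - start, 3)
          return report

  if __name__ == "__main__":
      script = [("createAvatar", ["sam"]), ("enterLevel", [99])]
      print(TestManager().run(script, n_clients=10))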

Input systems for automated testing
  • Scripted
  • Algorithmic
  • Recorders
  • Game code

Multiple test applications are required, but each input type differs in value per application. Scripting gives the best coverage.

Input (Scripted Test Clients)

Pseudo-code script of how users play the game, and what the game should do in response

Command steps
  createAvatar [sam]
  enterLevel 99
  buyObject knife
  attack [opponent]

Validation steps
  checkAvatar [sam exists]
  checkLevel 99 [loaded]
  checkInventory [knife]
  checkDamage [opponent]
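A minimal Python sketch of how such a script might be expressed and checked; the step format and the run_test helper are hypothetical, and a real test client would query live game state rather than the generic game object assumed here.

  # Hypothetical scripted test: command steps drive the game, validation
  # steps assert on the game state that should result.
  COMMANDS = [
      ("createAvatar", "sam"),
      ("enterLevel", 99),
      ("buyObject", "knife"),
      ("attack", "opponent"),
  ]

  VALIDATIONS = [
      ("checkAvatar", "sam"),       # avatar exists
      ("checkLevel", 99),           # level loaded
      ("checkInventory", "knife"),  # object is in inventory
      ("checkDamage", "opponent"),  # opponent took damage
  ]

  def run_test(game):
      """game is any object exposing the command/check calls by name."""
      for name, arg in COMMANDS:
          getattr(game, name)(arg)
      failures = [name for name, arg in VALIDATIONS
                  if not getattr(game, name)(arg)]
      return "PASS" if not failures else "FAIL: " + ", ".join(failures)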


Scripted Players: Implementation
  • Test client (null view): Script Engine → Commands → Presentation Layer → Client-Side Game Logic → State
  • Game client: Game GUI → Commands → Presentation Layer → Client-Side Game Logic → State
  • Or, load both (script engine and game GUI) in the same client
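A sketch of the idea in Python: both the GUI and the script engine issue the same commands through a shared presentation layer, so the null-view test client exercises the same client-side game logic as the real game client. All class names here are illustrative assumptions.

  class ClientSideGameLogic:
      """The code under test; identical in both client configurations."""
      def __init__(self):
          self.state = {"avatars": set()}

      def create_avatar(self, name):
          self.state["avatars"].add(name)

  class PresentationLayer:
      """Single command entry point shared by GUI and script engine."""
      def __init__(self, logic):
          self.logic = logic

      def command(self, name, *args):
          getattr(self.logic, name)(*args)

  class ScriptEngine:
      """Null-view driver: replaces the GUI with scripted commands."""
      def __init__(self, presentation):
          self.presentation = presentation

      def play(self, script):
          for name, args in script:
              self.presentation.command(name, *args)

  logic = ClientSideGameLogic()
  ScriptEngine(PresentationLayer(logic)).play([("create_avatar", ("sam",))])
  assert "sam" in logic.state["avatars"]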

Test-specific input & output via a data-driven test client gives maximum flexibility

Input API: reusable scripts & data (load, regression)

Output API: key game states, pass/fail, responsiveness, script-specific logs & metrics
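One way to picture the output side is a small, uniform result record that every test emits, so load runs and regression runs can be mined the same way; this Python dataclass is an illustrative assumption, not an API from the tutorial.

  from dataclasses import dataclass, field

  @dataclass
  class TestClientOutput:
      script_name: str
      passed: bool
      key_game_states: dict          # e.g. {"level": 99, "inventory": ["knife"]}
      responsiveness_ms: float       # time from command sent to state change seen
      logs: list = field(default_factory=list)
      metrics: dict = field(default_factory=dict)

  # Example record as a regression run might emit it:
  result = TestClientOutput(
      script_name="buy_knife_regression",
      passed=True,
      key_game_states={"inventory": ["knife"]},
      responsiveness_ms=42.0,
  )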

A Presentation Layer is often unique to a game
  • Some automation scripts should read just like QA test scripts for your game
  • TSO examples
    • routeAvatar, useObject
    • buyLot, enterLot
    • socialInteraction (makeFriends, chat, …)

[Diagram: NullView client]
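For example, a game-specific presentation layer can expose script commands that read like QA steps; the TSO-style command names below come from the slide, but the Python wrapper and the underlying client.command call are a hypothetical sketch.

  class TSOPresentationLayer:
      """Game-specific script commands layered over generic client logic."""
      def __init__(self, client):
          self.client = client            # assumed generic client-side game logic

      def routeAvatar(self, avatar, destination):
          self.client.command("route", avatar, destination)

      def useObject(self, avatar, obj):
          self.client.command("use", avatar, obj)

      def buyLot(self, avatar, lot_id):
          self.client.command("buy_lot", avatar, lot_id)

      def enterLot(self, avatar, lot_id):
          self.client.command("enter_lot", avatar, lot_id)

      def socialInteraction(self, kind, *avatars):   # makeFriends, chat, ...
          self.client.command("social", kind, *avatars)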


Input (data sets)

Mock data
  • Repeatable
  • Debugging & benchmarking
  • Repeatable tests in development, faster load, edge conditions

Real data
  • Random
  • Edge cases & real world performance
  • The unpredictable user element finds different bugs

Input (client synchronization)
  • RemoteCommand (x): ordered actions to clients
  • waitFor (time): brittle, less reproducible
  • waitUntil (localStateChange): most realistic & flexible
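A sketch of how the two waiting styles differ in a test script, assuming a simple polling client API: waiting on a local state change keeps N clients in step even when server timing varies, while a fixed sleep does not. The helper names and the trade example are illustrative.

  import time

  def wait_for(seconds):
      """Fixed delay: brittle, breaks as soon as server timing shifts."""
      time.sleep(seconds)

  def wait_until(predicate, timeout=30.0, poll=0.1):
      """Wait for a local state change: realistic and repeatable."""
      deadline = time.time() + timeout
      while time.time() < deadline:
          if predicate():
              return True
          time.sleep(poll)
      raise TimeoutError("state change never observed")

  # Hypothetical usage inside a two-client trade test:
  #   remote_command(client_a, "offerTrade", "bob")
  #   wait_until(lambda: client_b.state.get("trade_offer") == "sam")
  #   remote_command(client_b, "acceptTrade")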

Common Gotchas
  • Not designing for testability
    • Retrofitting is expensive
  • Blowing the implementation
    • Code blowout
    • Addressing perceived needs, not real needs
  • Using automated testing incorrectly
    • Testing the wrong thing @ the wrong time
    • Not integrating with your processes
    • Poor testing methodology

Testing the wrong thing at the wrong time

Build Acceptance Tests (BAT)
  • Stabilize the critical path for your team
  • Keep people working by keeping critical things from breaking

Final Acceptance Tests (FAT)
  • Detailed tests to measure progress against milestones
  • “Is the game done yet?” tests need to be phased in

Applying detailed testing while the game design is still shifting and the code is still incomplete introduces noise and the need to keep re-writing tests.
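One common way to keep the two tiers separate is to tag tests and run only the cheap build-acceptance tier on every build, phasing the detailed tier in later; this pytest-marker sketch is an assumption about tooling (including the game_client fixture), not something the tutorial prescribes.

  # test_acceptance.py: hypothetical BAT/FAT split using pytest markers.
  # Run BAT on every build:       pytest -m bat
  # Run FAT on milestone builds:  pytest -m fat
  # (register the bat/fat markers in pytest.ini to silence warnings)
  import pytest

  @pytest.mark.bat
  def test_login_reaches_character_select(game_client):
      game_client.login("test_user")
      assert game_client.state["screen"] == "character_select"

  @pytest.mark.fat
  def test_full_economy_balance_report(game_client):
      report = game_client.run_economy_simulation(days=30)
      assert report["inflation"] < 0.05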

More gotchas: poor testing methodology & tools
  • Case 1: recorders
    • Load & regression were needed; not understanding maintenance cost
  • Case 2: completely invalid test procedures
    • Distorted view of what really worked (GIGO)
  • Case 3: poor implementation planning
    • Limited usage (nature of tests led to high test cost & programming skill required)
  • Common theme: limited or no senior engineering committed to the testing problem
Automated Testing for Online Games (One Hour)
  • Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications
    • engineering, QA, operations
    • production & management
  • Summary & Questions

The strength of automated testing is the ability to repeat massive numbers of simple, easily measurable tasks and mine results

“The difference between us and a computer is that the computer is blindingly stupid, but it is capable of being stupid many, many millions of times a second.”

Douglas Adams (1997 SCO Forum)


Semi-automated testing is best for game development

Manual testing
  • Creative bug hunting, visuals
  • Judgment calls, playability
  • Reacting to change, evaluating autoTest results

Automation
  • Rote work (“does door108 still open?”)
  • Scale
  • Repeatability
  • Accuracy
  • Parallelism

Integrate the two for best impact.

Plan your attack (retire risk early)
  • Tough shipping requirements (e.g.)
    • Scale, reliability
    • Regression costs
  • Development risk
    • Cost / risk of engineering & debugging
    • Impact on content creation
  • Management risk
    • Schedule predictability & visibility

Automation focus areas (Larry’s “top 5”)
  • Build / server stability: keep the team going forward
  • Non-determinism: gets in the way of everything
  • Content regression: massive, recurring $$ for an MMO
  • Compatibility & install: saves $$ for some games
  • Load testing: scale is hard to get right

Yikes, that all sounds very expensive!
  • Yes, but remember: the alternatives cost more and do not always work
  • QA costs for a 6-player game: you need at least 6 testers at the same time
      • Testers
      • Consoles, TVs and disks
      • Network connections
  • MMO regression costs: yikes²
      • 10s to 100s of testers
      • 10-year code life cycle
      • Constant release iterations
Stability analysis (code & servers): what brings down the team?

Test case: can an avatar sit in a chair?

login () → create_avatar () → buy_house () → enter_house () → buy_object () → use_object ()

  • Failures on the critical path block access to much of the game
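A minimal sketch of a critical-path test for this case, assuming a client object exposing those call names; the point is that the steps run in order and the test stops at the first broken link, which tells you exactly where the path is blocked.

  CRITICAL_PATH = [
      "login",
      "create_avatar",
      "buy_house",
      "enter_house",
      "buy_object",
      "use_object",      # can the avatar sit in the chair?
  ]

  def test_avatar_can_sit_in_chair(client):
      """Run the critical path in order; report the first step that fails."""
      for step in CRITICAL_PATH:
          ok = getattr(client, step)()
          assert ok, f"critical path blocked at: {step}"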


Unstable builds are expensive & slow down your entire team!

Development → Checkin → Build → Smoke → Regression → Dev Servers

  • A bug introduced at checkin means feedback takes hours (or days)
  • Repeated cost of detection & validation
  • Firefighting, not going forward
  • Impact on others

Prevent critical path code breaks that take down your team

Candidate code → Sniff Test (pass / fail, diagnostics) → Checkin → safe code in Development
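A sketch of what such a gate can look like as a pre-checkin script; the test command and exit-code convention are assumptions, and the idea is simply that candidate code only reaches the shared line after the sniff suite passes.

  #!/usr/bin/env python3
  """Hypothetical pre-checkin gate: run the sniff suite, block on failure."""
  import subprocess
  import sys

  # Assumed sniff suite entry point; substitute your project's fast tests.
  SNIFF_COMMAND = ["python", "-m", "pytest", "tests/sniff", "-q"]

  def main():
      result = subprocess.run(SNIFF_COMMAND)
      if result.returncode != 0:
          print("Sniff test FAILED: candidate code is not safe to check in.")
          sys.exit(1)
      print("Sniff test passed: code is promotable to checkin.")

  if __name__ == "__main__":
      main()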


Stability & non-determinism (monkey tests)

Code Repository → Compilers → Reference Servers

  • Continual repetition of critical path unit tests against the reference servers
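A monkey test in this sense can be as simple as a loop that re-runs the critical path tests around the clock and logs every intermittent failure; the sketch below is illustrative, with run_critical_path standing in for whatever test entry point your project has.

  import time
  import traceback

  def monkey_test(run_critical_path, iterations=10_000, pause_sec=1.0):
      """Repeat the critical path tests; non-deterministic bugs show up as
      failures that occur on some iterations but not others."""
      failures = []
      for i in range(iterations):
          try:
              run_critical_path()
          except Exception:
              failures.append((i, traceback.format_exc()))
          time.sleep(pause_sec)
      rate = len(failures) / iterations
      print(f"{len(failures)} failures in {iterations} runs ({rate:.2%})")
      return failures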


Build stability & full testing: comb filtering

New code → Sniff Test, Monkey Tests ($)
  • Fast to run
  • Catch major errors
  • Keeps coders working

Promotable to full system build → Smoke Test, Server Sniff ($$)
  • Is the game playable?
  • Are the servers stable under a light load?
  • Do all key features work?

Promotable to full testing → Full Feature Regression, Full Load Test ($$$)
  • Do all test suites pass?
  • Are the servers stable under peak load conditions?

  • Cheap tests to catch gross errors early in the pipeline
  • More expensive tests only run on known functional builds
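A comb filter like this is easy to express as a short pipeline driver: each tier only runs if the cheaper tier before it passed. The tier names below mirror the slide; the runner functions are placeholders you would wire to your own test suites.

  # Hypothetical comb-filter pipeline: cheap tiers gate expensive ones.
  def run_tier(name, cost, tests):
      print(f"Running {name} ({cost}) ...")
      return all(test() for test in tests)

  def comb_filter(sniff_tests, smoke_tests, full_tests):
      if not run_tier("Sniff / monkey tests", "$", sniff_tests):
          return "rejected: gross errors in new code"
      if not run_tier("Smoke test / server sniff", "$$", smoke_tests):
          return "rejected: build not promotable to full testing"
      if not run_tier("Full regression / full load test", "$$$", full_tests):
          return "rejected: failed full testing"
      return "promotable"

  # Usage with trivial placeholder tests:
  print(comb_filter([lambda: True], [lambda: True], [lambda: True]))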
Content testing (areas)
  • Regression
  • Error detection
  • Balancing / tuning
  • This topic is a tutorial in and of itself
    • Content regression is a huge cost problem
    • Many ways to automate it (algorithmic, scripted & combined, …)
    • Differs wildly across game genres
Content testing (more examples)
  • Light mapping, shadow detection
  • Asset correctness / sameness
  • Compatibility testing
  • Armor / damage
  • Class balances
  • Validating against old userData
  • … (unique to each game)
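For the asset correctness / sameness case, one simple and widely used approach is to compare checksums of built assets against a known-good manifest; this sketch assumes a golden_manifest.json you maintain per release, and is not a specific tool from the talk.

  import hashlib
  import json
  from pathlib import Path

  def checksum(path):
      return hashlib.sha256(Path(path).read_bytes()).hexdigest()

  def asset_sameness_report(asset_dir, manifest_path="golden_manifest.json"):
      """Compare current asset checksums against the golden manifest."""
      golden = json.loads(Path(manifest_path).read_text())
      changed = []
      for rel_path, expected in golden.items():
          if checksum(Path(asset_dir) / rel_path) != expected:
              changed.append(rel_path)
      return changed   # an empty list means all assets are unchanged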
Load testing, before paying customers show up
  • Expose issues that only occur at scale
  • Establish hardware requirements
  • Establish play is acceptable at scale


Load testing catches non-scalable designs
  • Single-player (SP): all data is always available & up to date
  • Multi-player (MP): shared data, split between each client’s local data and the global data, must be packaged, transmitted, unpackaged, and constantly refreshed
  • Scalability is hard: shared data grows with #players, AI, objects, terrain, …, & more bugs!
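A load test that exposes this kind of design problem can be a thin wrapper around the scripted clients: ramp up N clients, have them all touch the shared data, and watch how per-client cost grows with N. The sketch below is a shape rather than a finished harness; client_factory and the play script are assumptions.

  import time

  def load_test(client_factory, script, client_counts=(10, 100, 1000)):
      """Ramp client count and record how responsiveness degrades."""
      results = {}
      for n in client_counts:
          clients = [client_factory() for _ in range(n)]
          start = time.time()
          for c in clients:            # a real harness runs these concurrently
              c.run_script(script)
          elapsed = time.time() - start
          results[n] = elapsed / n     # rough per-client cost at this scale
      return results                   # non-scalable designs blow up as n grows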

Load testing: find poor resource utilization

22,000,000 DS Queries! 7,000 next highest

Automated Testing for Online Games (One Hour)
  • Overview
  • Hooking up your game
    • external tools
    • internal game changes
  • Applications
    • engineering, QA, operations
    • production & management
  • Summary & Questions

Summary: automated testing
  • Start early & make it easy to use
    • Strongly impacts your success
  • The bigger & more complex your game, the more automated testing you need
  • You need commitment across the team
    • Engineering, QA, management, content creation
Resources
  • Slides are on the web at: www.emergent.net
  • More material on automated testing for games
    • http://www.maggotranch.com/mmp.html
      • Last year’s online engineering slides
      • Talks on automated testing & scaling the development process
    • www.amazon.com: “Massively Multiplayer Game Development II”
      • Chapters on automated testing and automated metrics systems
    • www.gamasutra.com: Dag Frommhold, Fabian Röken
      • Lengthy article on applying automated testing in games
    • Microsoft: various groups & writings
  • From outside the gaming world
    • Kent Beck: anything on test-driven development
    • http://www.martinfowler.com/articles/continuousIntegration.html#id108619: Continual integration testing
    • Amazon & Google: inside & outside our industry