
Tapping the Collective Creativity of Your Team


Presentation Transcript


  1. Tapping the Collective Creativity of Your Team Marcin Chady, Technical Director, Radical Entertainment (Activision Blizzard)

  2. Acknowledgment – George Mawle

  3. Paradox • We expect team members to be “passionate” about games • Not just animation or physics programming or lighting • We give them highly specialised roles • With little influence on other areas • Now, what do we actually want them to do with this passion? • When creativity and passion breach the role boundaries • “They are smart, passionate and creative. Let them figure it out.”

  4. Economics of Making a Pitch The value of a new idea (as subjectively perceived) vs Cost of pitching it (preparation, figuring out who to talk to, getting their attention) vs Risk of it being dismissed (because it’s lame, impractical, or because someone already thought of it)
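The pitch economics above can be sketched as a simple expected-value calculation. This is an illustrative model, not anything shown in the talk; the function name, parameters, and numbers are all invented to make the cost/risk trade-off concrete.

```python
# Hypothetical sketch of the slide's pitch calculus: an idea gets pitched
# only when its perceived value outweighs the cost of pitching plus the
# expected loss from being dismissed. All names and numbers are illustrative.

def worth_pitching(value, cost, dismissal_risk, loss_if_dismissed=1.0):
    """Return True if the expected payoff of pitching is positive.

    value             -- subjectively perceived value of the idea
    cost              -- effort to prepare, find the right person, get attention
    dismissal_risk    -- probability (0..1) that the idea is dismissed
    loss_if_dismissed -- perceived cost of being shot down (ego, wasted time)
    """
    expected_payoff = ((1 - dismissal_risk) * value
                       - cost
                       - dismissal_risk * loss_if_dismissed)
    return expected_payoff > 0

# A small idea with murky routing ("who do I even talk to?") never gets pitched:
print(worth_pitching(value=2.0, cost=1.5, dismissal_risk=0.6))  # False
```

The next slide's point falls out of the model: when people can't estimate `cost` or `dismissal_risk`, they inflate them, and the calculation goes negative even for good ideas.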

  5. Disenfranchisement • If we don’t give them enough information to make an informed calculation, e.g. • Who to talk to? • Has this already been discussed? • They exaggerate the cost or the risk* • The calculation comes out negative • “Not worth making a fuss over such a small detail” • “It’s not my job to worry about this, after all” * That’s human nature: see e.g. Steven Pinker, “How the Mind Works”

  6. Should We Care? • Ivory Tower Syndrome (don’t listen to the team): stay focused, but risk tunnel vision, missed opportunities, and a polished turd • Design by Committee (listen to the team): gain lateral thinking, but risk distraction and the lowest common denominator • ?

  7. The Goal • Retain independent creative decision making • Help team members make informed assessments of new ideas • A way to give and receive feedback that is: • Easy • Accessible • Public • Unintimidating • Unimposing • Impersonal • Structured • Constructive

  8. Other Goals • A home for suggestions • The awkward, unwanted bastard sibling of bugs • Rarely tracked in a bug database • Usually forgotten, unless they have a champion • Engage QA more in the game-making process • Quality is more than being bug-free • Testers are hypersensitive to issues • An indicator of game quality

  9. Prior Art • Encouragement • Q&A sessions • Suggestion boxes • Surveys • All of them are manual: • They cost time • They’re the first to go when deadlines loom

  10. General Idea • Hierarchical feature break-down • Comments • Ratings • Subscriptions • Database-backed • Ability to update one’s comment or rating • Agree/disagree option

  11. Implementation • About 5 days’ worth of work • Embedded in the build tool • As opposed to a separate tool or a website • Feature breakdown and subscriptions stored in a simple YAML file • Feature breakdown owned by QA (a major headache solved) • People can manage their own subscriptions • Version controlled • Spam control: one score and one comment per feature per person (later relaxed to one score, multiple comments per feature per person)
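The talk only says the feature breakdown and subscriptions live in "a simple YAML file" owned by QA and kept under version control; it never shows the format. A guessed shape, purely for illustration (all keys and names are invented):

```yaml
# Hypothetical layout of the feature-breakdown / subscription file.
# Hierarchical feature break-down, owned by QA:
features:
  gameplay:
    camera: {}
    targeting: {}
    gliding: {}
  missions:
    helicopter_controls: {}
# Users manage their own subscriptions (version control gives a history):
subscriptions:
  alice: [gameplay/camera, gameplay/targeting]
  bob:   [missions/helicopter_controls]
```

Keeping this in version control rather than the database means subscription edits go through the same review and history as any other change.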

  12. Scoring Scheme • 0 to 100 percent or 1 to 5 stars? • Is mediocre 50 or 70? • No clear answer • Our choice: 0–100 scale with 5-point increments
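Enforcing the chosen scale could be as simple as clamping and snapping inputs; the function below is an assumption for illustration, since the slides only state the scale itself (0–100, 5-point increments).

```python
# Sketch of input validation for the talk's chosen scoring scheme:
# a 0-100 scale with 5-point increments. Logic is illustrative only.

def snap_score(raw):
    """Clamp to [0, 100] and round to the nearest multiple of 5."""
    clamped = max(0, min(100, raw))
    return int(round(clamped / 5.0)) * 5

print(snap_score(73))   # 75
print(snap_score(-10))  # 0
```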

  13. Policy • Build trust that feedback will be read and considered • Producer role critical • Feature owners free to decide whether to act on feedback • But prodded by producers to at least respond to new concerns

  14. Roll-Out • Initial resistance • “Unnecessary distraction” • “Leave it to professionals” • “Too technocratic – what we need is a change in attitudes” • Concerns about the name: BetterCritic → Feedback Tool • Need buy-in and commitment from the top – the ratings feature! • Devolve ownership quickly • QA engagement • Users to control subscriptions

  15. Results • Between Oct 2010 and Oct 2011 we collected: • 4000 operations (adding or modifying feedback, or agreeing/disagreeing with it) – 10 per day, 30 per person • 3275 comments – 9 per day, 26 per person • 2561 scores – 7 per day, 20 per person • Every comment resulted in an email sent to 10+ people • Most were followed up by at least one reply-all email

  16. Contributions per User

  17. Contributions per Feature

  18. Goals Achieved • Most of the team has used it • Lots of positive feedback on the tool itself • Feature owners appreciated getting regular feedback • Some used it to discern patterns and trends • E.g. polarised opinions often indicate an interesting gameplay mechanic • Catalyst for brainstorming • Translates directly to the increased quality of the game • Targeting • Gliding • Camera • Story • Helicopter controls • Profanity

  19. A Life of Its Own • Ideas for the next game • Ideas for tools and tech improvements • Feedback on the studio, e.g. • Overtime food • Air conditioning

  20. Other Benefits • Water cooler effect • Catalyst effect • Critical mass effect • Calling out unsung heroes • Taking of team’s temperature • Repository of evidence • “Is this an issue? It clearly is.” • “Does this mission suck? Apparently not.”

  21. What Didn’t Work • Reading feedback • UI left much to be desired • Feature owners often relied entirely on email notifications • Many people didn’t like the scoring part • Onerous – “Why do I have to give a score?” • Unclear – “Not sure how to score” • Threatening – “Not fair that someone rated my mission 3/10” • Generally unreliable

  22. In Defence of Mandatory Ratings • Part of the package • Not meant to be absolute • Is the game score going up? • What areas are dragging it down? • Are our efforts in area X paying off? • Encourages people to give the feature full consideration • Makes feedback more readable and easier to organise • E.g. when faced with 200 comments, start with the highest- and lowest-rated ones
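The triage trick on this slide (read the extreme-scored comments first) can be sketched in a few lines. The data shape and field names are invented for illustration; only the idea comes from the talk.

```python
# Sketch of the slide's triage trick: with hundreds of comments, read the
# lowest- and highest-rated ones first. Field names are hypothetical.

def triage(comments, n=3):
    """Return the n lowest- and n highest-scored comments, in that order."""
    ranked = sorted(comments, key=lambda c: c["score"])
    return ranked[:n] + ranked[-n:]

feedback = [
    {"score": 85, "text": "Gliding feels great"},
    {"score": 20, "text": "Camera fights me in tight spaces"},
    {"score": 55, "text": "Story pacing is OK"},
    {"score": 95, "text": "Helicopter controls finally click"},
]
for c in triage(feedback, n=1):
    print(c["score"], c["text"])
# 20 Camera fights me in tight spaces
# 95 Helicopter controls finally click
```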

  23. Future Work • “Feed-forward” to close the loop, i.e. • to pre-empt suggestions which have already been dismissed • to pre-empt feedback which is already being addressed • can take the form of a special comment from the feature owner • Better UI • Data-mining tools • Charts and reports • Search • Binary ratings
