Crowdsourced Testing: A Quick Overview

  • I first heard of it (and the broader crowdsourcing topic) at the Jul 2008 TMF in a talk by James Whittaker (who said it was part of Microsoft’s vision of the future).

  • It piqued my interest, as I was then at the BBC wrestling with the extensive browser/OS coverage requirements triggered by the BBC Charter.

  • uTest.com was the only player in the game back then.

  • uTest had raised about $12m at that stage (now $37m), all of it after the credit crunch.



Crowdsourced Testing: A Quick Overview

  • Unlike crowdcasting (e.g. logo design), where the winner takes all, in testing everyone can contribute a piece of the overall solution.

  • That makes testing ideally suited to engaging a motivated crowd.

  • Go to the brilliant Zappers event run by TCL (uTest’s UK affiliate) and you’ll meet excellent crowd testers who are motivated by non-monetary factors (more on that shortly).

  • The existing maturity of test management tools has provided an excellent basis for the (essential) crowd management.



Introducing uTest.com

  • US-based company, pilot system launched in Feb ’08.

  • Circa 49,000 in the crowd (circa 3,000 in the UK). But how many are active?

  • Sales team in the US (at least some of them), handing over to TCL (uTest’s UK affiliate) to project-manage the cycles.

  • A functional management interface (but not commensurate with the funding received or the time since launch).

  • iPhone app available (but buggy as of last summer).

  • Fixed-price model for the client; testers paid per bug or artefact on the other side of the equation.

  • TCL PM drives crowd selection, but with a client steer.

  • Daily reporting (if desired) with an optional test lead to sanity check submissions at minimal cost (approx £8 per day).

  • Offers functional and non-functional testing (although I’m not sure about the repeatability of the latter). Covers mobile, web and desktop.



Alternatives To uTest

I’ve briefly looked at, but not used:

  • Bugfinders (UK company, client pays per bug).

  • Centre4Testing (leverages C4T’s candidate pool: a UK crowd, formally engaged to deliver coverage. In quoting for my case study they wanted far more information and lead time than uTest. About 10% dearer than uTest on the first cycle but about 50% cheaper for subsequent cycles, and slightly cheaper on the first cycle if you exclude uTest’s new-client discount).

    I’m aware of, but have not looked at:

  • Bugpub

  • 99tests (Seemingly not associated with 99designs.com)

    I’m sure there are (and will be) others.



Crowdsourced Testers: Some Motivations

  • Practising their skills, breaking things, trying the latest software, having fun, increasing their market exposure through rankings, networking*, etc…

  • For many (most?), money is not a material factor and they have a day job for subsistence.

    * I’ve since met the uTest team lead for my case study and would happily engage him in a full-time role.



Crowdsourced Testing: Some Pros

  • Significantly more eyeballs on the case, £ for £; compares extremely favourably with a single contractor.

  • Externalises accommodation costs (desks, PCs, etc.).

  • Ultra rapid ramp-up time (vis-à-vis recruitment) and feedback loop (potentially get bugs within minutes).

  • Flexible engagement model for reactively rolling testers on and off the project (e.g. for post-release testing).

  • Mitigates the costs of test environment provision in the face of platform proliferation.

  • Evening and weekend productivity at zero marginal cost.

  • Cheap form of usability testing if you know how to frame the survey.



Crowdsourced Testing: Some Cons

  • Lack of direct accountability (some ‘soft’ sanctions like star ratings and feedback to platform operator).

  • System knowledge vested in un-contracted resources.

  • Could be unsettling to in-house testers.

  • Could be seen as de-valuing testing within the organisation.

  • If it’s a desktop application and it gets leaked – a watermark system may be scant consolation.

  • Testers may (probably?) care less than their internal counterparts.

  • Need to provide strangers with access to your test instance.

  • Could confer unwarranted confidence; relies on stakeholders understanding these downsides.



Crowdsourced Testing: Applicability

  • Start-ups that can’t sustain an in-house test resource.

  • Organisations that aren’t exclusively reliant on the crowd (notwithstanding my point above).

  • Agile teams that want out-of-hours productivity (especially for smoke testing) and/or to mitigate the effect of having a single embedded tester working in mini-waterfalls.

  • Public facing systems that need to undergo testing out in the wild (corporate systems involving sensitive/financial data are much less appropriate).

  • Organisations where the testing workload is highly bi-modal.

  • Places that lack the time/environments to test in-scope OS/browser combinations.

  • Environments where exploratory testing has been culturally accepted.

  • Environments that may want to target users in specific geographies to enable, for example, localisation testing out in the wild.



Case Study: The Context

  • One of Europe’s largest price comparison sites with profits measured in the millions.

  • Wholly reliant on TDD (within Kanban), save for one overworked manual tester located on the continent.

  • Requirement was for post-release testing of a UK-only web proposition in public beta.

  • No time or money to recruit contract testers.

  • 3 week window of opportunity for testing but with scope for it to slip.

  • No meaningful collateral from which to derive tests (just a load of binned post-it notes).

  • Engaged by the Programme Director, who wanted to de-risk the testing and was open to suggestions. He knew about uTest but had no time or patience for working through the relevant questions.

  • I wanted crowdsourcing experience, so I offered to take this off his plate.



Case Study: The Solution

  • I became the uTest interface and clarified the commercials, legals, process, etc. You need to read the terms: uTest can quote you as a case study unless you opt out.

  • uTest sales team willing to progress paperwork over the weekend.

  • Commissioned a UK-only crowd for 3 calendar weeks of exploratory testing at a cost of $3,000, which factored in a $600 new-client discount. We paid a further $250 for a team lead (in addition to the TCL PM) to sanity-check submissions: $3,000 + $250 = $3,250, or £2,075 at the exchange rate of the day (roughly $1.57 to the pound).

  • uTest provided a fixed IP so that test traffic could be excluded from merchant billing (see the first sketch after this list).

  • Coverage: Chrome, FF and IE 7/8/9 on Windows, plus Safari on Mac (see the second sketch after this list).

  • Testers were given a minimal briefing with goals comparable in detail to a charter within session-based test management.

  • TCL PM provided lightweight daily reports tailored to our requirements (weekdays only).
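
As a minimal sketch of how that fixed-IP arrangement translates into practice, the snippet below filters a hypothetical access log so that requests from the crowd-test address never reach merchant billing. The log format and the TESTER_IP value are illustrative assumptions on my part, not uTest specifics.

```python
# A minimal sketch, not uTest's actual mechanism: strip crowd-test traffic
# from a billing feed by source IP. TESTER_IP is a placeholder (TEST-NET-3
# documentation range), and a common-log-style line format is assumed.
TESTER_IP = "203.0.113.7"

def billable_lines(log_lines):
    """Yield only lines whose source IP is not the crowd-test address.

    Assumes the source IP is the first whitespace-separated field, e.g.:
    203.0.113.7 - - [t] "GET /go?merchant=42 HTTP/1.1" 200 512
    """
    for line in log_lines:
        if line.split(" ", 1)[0] != TESTER_IP:
            yield line

if __name__ == "__main__":
    sample = [
        '203.0.113.7 - - [t] "GET /go?merchant=42 HTTP/1.1" 200 512',   # tester
        '198.51.100.9 - - [t] "GET /go?merchant=42 HTTP/1.1" 200 512',  # real user
    ]
    for line in billable_lines(sample):
        print(line)  # only the real user's click survives into billing
```

A real deployment would more likely tag or drop the traffic at the load balancer or in the analytics layer, but the principle (a single known egress IP makes the filter trivial) is the same.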

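The coverage above is an uneven matrix (three IE versions on Windows, Safari only on Mac), so a per-OS map models it more faithfully than a full cross-product. Below is a sketch of how that map could be expanded into discrete combinations for crowd assignment; the data structure is my own illustration, not a uTest artefact.

```python
# Sketch: expand the stated coverage into discrete (OS, browser)
# combinations a crowd PM could assign. The mapping mirrors the slide;
# the code layout is illustrative only.
COVERAGE = {
    "Windows": ["Chrome", "Firefox", "IE 7", "IE 8", "IE 9"],
    "Mac": ["Safari"],
}

def combinations(coverage):
    """Flatten the per-OS browser map into (os, browser) pairs."""
    return [(os_name, browser)
            for os_name, browsers in coverage.items()
            for browser in browsers]

if __name__ == "__main__":
    combos = combinations(COVERAGE)
    print(f"{len(combos)} combinations to cover:")  # 6 in this case
    for os_name, browser in combos:
        print(f"  {browser} on {os_name}")
```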


Case Study: The Outcome

  • TCL invited 35 testers, of which 17 accepted and 10 submitted defects. Around 80 defects (from memory) with a rejection rate of circa 10% (see the sketch after this list). Some useful site-review feedback was also provided at no extra cost.

  • The reporting and defect sanity checking worked well and made the internal triage process more effective.

  • Bug detection continued during weekends and evenings through to the early hours of the morning.

  • Rightly or wrongly, the client was delighted with the results.

  • The Programme Director (who is a contractor) has vowed to “try and use uTest in as many future roles as possible as it worked brilliantly”.

  • Whilst recognising that the bar had been set low for adding value (post-release, no internal testers, a non-complex web app, etc.), I also felt positive about the experience.

  • I felt it was too cheap; I wonder what time horizon the uTest backers have in mind for turning a profit.
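
To make the funnel in those figures explicit, the sketch below recomputes the headline rates from the numbers quoted above (35 invited, 17 accepted, 10 submitting, roughly 80 defects, circa 10% rejected). The source figures were “from memory”, so the derived rates are indicative only.

```python
# Sketch: derive indicative funnel rates from the approximate, from-memory
# figures quoted on this slide.
invited = 35
accepted = 17
active = 10             # testers who actually submitted defects
defects_raised = 80     # approximate
rejection_rate = 0.10   # circa

print(f"Acceptance rate: {accepted / invited:.0%}")    # ~49% of invitees joined
print(f"Active-tester rate: {active / accepted:.0%}")  # ~59% of joiners found bugs
print(f"Defects surviving triage: ~{round(defects_raised * (1 - rejection_rate))}")  # ~72
print(f"Defects per active tester: ~{defects_raised / active:.0f}")                  # ~8
```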



Conclusions

  • A rich and burgeoning topic area in which venture capitalists are quite active.

  • Still a young area with massively untapped potential, but established enough not to be dismissed out of hand.

  • uTest looks dominant but there are other options.

  • In the right hands and circumstances it can be a very powerful addition to your context-sensitive toolbox.

  • My uTest experience was positive but the bar was set low.

  • Its disadvantages will be unacceptable for some clients.

  • In my view, crowdsourced testing has legs, and its adoption will rise over the coming years.

