
WG2 Task Force “Crowdsourcing”





  1. WG2 Task Force “Crowdsourcing”
  Tobias Hossfeld, Matthias Hirth, Bruno Gardlo, Michal Ries, Sebastian Egger, Raimund Schatz, Katrien de Moor, Christian Keimel, Martin Varela, Lea Skorin-Kapov
  WG2 Mechanisms and Models

  2. Agenda
  • Report on Task Force progress by Tobias Hoßfeld
  • STSM report by Bruno Gardlo: “Improving Reliability for Crowdsourcing-Based QoE Testing”
  • “QualityCrowd for video quality assessment” by Christian Keimel
  • TF crowdsourcing discussion

  3. “Tools” for Collaboration
  • Wiki
    • Idea: a living document for discussion, collaboration, and information for all
    • E.g. collecting and storing knowledge on experiments, design of experiments, etc.
    • Currently mainly used for describing activities
  • Mailing list
    • Used for monthly reports and requests for input
    • Used for announcing crowdsourcing tests
  • STSMs
    • 3 STSMs related to crowdsourcing
    • Great! Successfully leads to joint work and joint results (by definition)
  • In summary
    • Collaboration and discussion work very well within the group of active TF members
    • Direct communication among members interested in a certain topic

  4. Efforts in 2012: Joint Studies and Experiments
  • Video quality and the impact of crowdsourcing platform and screening techniques (Bruno Gardlo, Tobias Hoßfeld)
  • Waiting times, especially for YouTube video streaming (Tobias Hoßfeld, Raimund Schatz, Sebastian Egger)
  • Crowdsourced multidimensional Web QoE test campaign: performance, visual appeal, ease of use (Martin Varela, Lea Skorin-Kapov)
  • Visual privacy (Pavel Korshunov, EPFL)

  5. Efforts in 2012: Joint Publications
  • YouTube QoE with crowdsourcing tests
    • 3 publications: Tobias Hoßfeld, Raimund Schatz, Sebastian Egger
    • Initial delay vs. stalling for Internet video streaming
    • Similar results in lab and crowdsourcing (after filtering)
  • Crowdsourcing for audio-visual QoE tests
    • 2 publications: Bruno Gardlo, Michal Ries, Tobias Hoßfeld, Raimund Schatz
    • Impact of screening technique on crowdsourcing
    • Microworkers vs. Facebook: the impact of crowdsourcing platform choice on experimental results
  • QualityCrowd: platform for video crowdsourcing tests
    • 3 publications: Christian Keimel, Julian Habigt, Clemens Horch, Klaus Diepold
    • Framework for conducting tests and own experiences
    • Details in the wiki: https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd

  6. Planning 2013
  • Joint Qualinet papers
    • Best practices for QoE testing with crowdsourcing: Tobias Hoßfeld, Bruno Gardlo, Christian Keimel, Matthias Hirth, Julian Habigt
    • QoE evaluation via crowdsourcing and comparison of lab / crowd tests: Bruno Gardlo, Katrien de Moor, Raimund Schatz, Michal Ries, Tobias Hoßfeld
    • Web QoE results: Lea Skorin-Kapov, Martin Varela
    • …
  • Further experiments
    • E.g. expectation tests via crowdsourcing in the context of authentication in social networks: Tobias Hossfeld, Markus Fiedler
    • Further web QoE tests: Lea Skorin-Kapov, Martin Varela
    • Dropbox tests: Raimund Schatz, Tobias Hoßfeld
    • …

  7. Reflecting: Goals of this Task Force
  • to identify the scientific challenges and problems of QoE assessment via crowdsourcing, but also its strengths and benefits,
  • to derive a methodology and setup for crowdsourcing in QoE assessment,
  • to challenge the crowdsourcing QoE assessment approach against established “lab” methodologies, including comparison of QoE tests,
  • to develop mechanisms and statistical approaches for identifying reliable ratings from remote crowdsourcing users,
  • to define requirements on crowdsourcing platforms for improved QoE assessment.
  • Joint activities, collaboration within Qualinet
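  One common family of such statistical screening approaches correlates each rater's scores with the leave-one-out mean opinion score (MOS) of the remaining raters, rejecting raters whose correlation falls below a threshold. The sketch below is purely illustrative, not the task force's method; the `reliable_raters` helper, the 0.75 threshold, and the example worker data are all hypothetical assumptions.

  ```python
  # Illustrative sketch: screen unreliable crowdsourcing raters by correlating
  # each rater's per-condition scores with the mean opinion score (MOS) of all
  # other raters. Threshold 0.75 is a hypothetical choice, not a standard value.
  from statistics import mean

  def pearson(xs, ys):
      """Pearson correlation coefficient of two equal-length score sequences."""
      mx, my = mean(xs), mean(ys)
      cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
      var_x = sum((x - mx) ** 2 for x in xs)
      var_y = sum((y - my) ** 2 for y in ys)
      if var_x == 0 or var_y == 0:
          return 0.0  # a constant rater carries no ranking information
      return cov / (var_x * var_y) ** 0.5

  def reliable_raters(ratings, threshold=0.75):
      """ratings: dict mapping rater id -> list of scores, one per test condition.
      Keeps raters whose scores track the leave-one-out MOS of the others."""
      kept = []
      for rater, scores in ratings.items():
          others = [s for r, s in ratings.items() if r != rater]
          loo_mos = [mean(cond) for cond in zip(*others)]  # MOS without this rater
          if pearson(scores, loo_mos) >= threshold:
              kept.append(rater)
      return kept

  # Hypothetical 5-point ACR scores for five test conditions:
  ratings = {
      "w1": [1, 2, 3, 4, 5],  # consistent with the group
      "w2": [1, 1, 3, 5, 5],
      "w3": [2, 2, 3, 4, 4],
      "w4": [3, 3, 3, 3, 3],  # clicks the same score everywhere: rejected
      "w5": [5, 4, 3, 2, 1],  # rates in reverse order: rejected
  }
  print(reliable_raters(ratings))  # → ['w1', 'w2', 'w3']
  ```

  Filtering of this kind is complementary to the incentive-design and platform-level measures discussed on the next slide: it removes unreliable ratings after the fact rather than preventing them.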

  8. Open Issues
  • Experiences with crowdsourcing
    • What are the main problems?
    • Reliability, environment monitoring, technical implementation, language problems, …?
    • Cartography of crowdsourcing use cases and mechanisms
  • Incentive design for QoE tests
    • Improves reliability, complementary to filtering techniques
    • E.g. tests designed as a game
    • E.g. different payment schemes
  • Requirements on crowdsourcing platforms for improved QoE assessment
    • Open API available soon for Microworkers
  • Database with crowdsourcing results
    • As part of WG 4? Available in the crowdsourcing wiki?
    • E.g. analyze fake user ratings
    • E.g. compare lab and crowd results for certain apps
    • E.g. impact of context factors on QoE: country, habits, …
  • Framework for crowdsourcing tests available to Qualinet?
    • E.g. using Facebook

  9. Thank you
  https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd
  cs.wg2.qualinet@listes.epfl.ch
