
Follow the Crowd: On QoE for Internet Applications


Presentation Transcript


  1. Follow the Crowd: On QoE for Internet Applications. Tobias Hoßfeld. www3.informatik.uni-wuerzburg.de | www.t-hossfeld.de

  2. What is the Internet crowd consuming? • Web and Cloud Applications • Online Video, Web Browsing, Downloads, Cloud Services, etc. • Why relevant? • Constitute dominant Internet use cases • Generate a relevant share of network traffic. Figure: Global Consumer Internet Traffic Volume (Forecast). Source: Cisco VNI 2011.

  3. YouTube QoE and Practical Guidelines

  4. QoE Issue: Waiting, Waiting, Waiting… Stalling

  5. Video Transmission over the Internet • UDP-based streaming • Unreliable transmission • Video quality affected • Artifacts may occur • Stimuli are visual degradations or artifacts • HTTP streaming • Reliable transmission • Video quality not affected • But stalling may occur • Most stimuli/impairments are of temporal nature • YouTube uses HTTP streaming • Internet technology changes quality perception
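
The shift from unreliable UDP streaming to reliable HTTP streaming is what turns bandwidth shortage into waiting time instead of visual artifacts. A minimal playout-buffer sketch of this effect follows; the bitrate and throughput values are illustrative, not measured YouTube figures.

```python
# Toy playout-buffer model: reliable HTTP delivery turns bandwidth
# shortage into stalling instead of visual artifacts.
def simulate_stalling(throughput_kbps, video_bitrate_kbps,
                      video_duration_s, rebuffer_target_s=2.0):
    """Return (number_of_stalls, total_waiting_time_s).

    The waiting time includes the initial buffering delay."""
    downloaded_s = 0.0   # seconds of video content downloaded so far
    played_s = 0.0       # seconds of video content already played out
    stalls, waiting_s = 0, 0.0
    playing = False
    dt = 0.1             # simulation step in seconds
    while played_s < video_duration_s:
        # Content is downloaded at the ratio of throughput to media bitrate.
        downloaded_s = min(video_duration_s,
                           downloaded_s + dt * throughput_kbps / video_bitrate_kbps)
        buffer_s = downloaded_s - played_s
        if playing:
            if buffer_s <= 0:        # buffer underrun -> stalling event
                playing = False
                stalls += 1
            else:
                played_s += dt
        else:
            waiting_s += dt
            if buffer_s >= rebuffer_target_s or downloaded_s >= video_duration_s:
                playing = True       # resume after (re-)buffering
    return stalls, round(waiting_s, 1)

# Example: a 500 kbit/s video over a 400 kbit/s connection.
print(simulate_stalling(400, 500, video_duration_s=30))
```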

  6. Key Influence Factors on YouTube QoE • Interesting: no significant correlation of QoE and • initial delay • video characteristics like resolution, type of content, ratio of audio/video, etc. • user preference, i.e. whether they liked the video • demographic features • Stalling frequency and stalling duration determine the user-perceived quality • Identified via support vector machines and correlation coefficients (sketch below)
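
How such a screening of candidate influence factors can look is sketched below: rank correlations between each factor and the collected ratings. The data and feature names are synthetic stand-ins, not the original study's data set.

```python
# Sketch: screening candidate QoE influence factors via rank correlation.
# Synthetic data only; the study applied SVMs and correlation coefficients
# to the actual crowdsourcing ratings.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 200
factors = {
    "stalling_events":  rng.integers(0, 7, n),   # per video
    "stalling_seconds": rng.uniform(0, 12, n),
    "initial_delay_s":  rng.uniform(0, 8, n),
    "resolution_p":     rng.choice([240, 360, 480], n),
}
# Synthetic MOS, driven mainly by stalling (illustrative only).
mos = np.clip(4.5 - 0.5 * factors["stalling_events"]
              - 0.1 * factors["stalling_seconds"]
              + rng.normal(0, 0.3, n), 1, 5)

for name, values in factors.items():
    rho, p = spearmanr(values, mos)
    flag = "significant" if p < 0.05 else "not significant"
    print(f"{name:18s} rho={rho:+.2f}  p={p:.3f}  ({flag})")
```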

  7. What is the influence of stalling on YouTube QoE? • Even a small number of interruptions strongly affects YouTube QoE • Providers (i.e. content and network providers) must avoid stalling • Total stalling time alone is not sufficient for a good QoE estimation • Monitoring of QoE requires sophisticated methods to capture the stalling pattern, e.g. using DPI or measurements directly at the end user
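
The strong impact of only a few interruptions is often captured by an exponential (IQX-style) mapping from the stalling pattern to a MOS value, which depends on both the number and the length of stalling events. The sketch below is illustrative only; the coefficients are placeholders, not the fitted values of the YouTube study.

```python
import math

def mos_from_stalling(n_events, event_length_s, a=3.5, b=0.15, c=0.19, floor=1.5):
    """Illustrative IQX-style mapping: MOS decays exponentially with the
    number and length of stalling events. Coefficients are placeholders."""
    return floor + a * math.exp(-(b * event_length_s + c) * n_events)

for n in range(5):
    print(n, "stalls of 4 s ->", round(mos_from_stalling(n, 4.0), 2))
```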

  8. Provider: Between the Devil and the Deep Blue Sea? • In case of insufficient resources, "one" has to choose between initial delays and stalling • What is worse for users? • Stalling has to be avoided, even at the cost of initial delays • Current work: Is YouTube QoE management beneficial for ISPs? • Users do "QoE management" themselves – by pausing the video to prefetch content and then consuming it without interruptions • ISPs may "invest" in capacity or sophisticated traffic management, e.g. DASH and SVC • Exponential increase of costs w.r.t. the quantile (of the video corpus) • Rule of thumb: deliver videos with about 120% of the video bitrate (see sketch below)
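
The delay-vs-stalling tradeoff can be made concrete with a simple constant-rate model: if the link is slower than the media bitrate, the smallest initial delay that still avoids stalling follows from requiring the download to finish no later than the playback. This is a sketch under that idealized assumption, not the study's cost analysis.

```python
# Delay-vs-stalling tradeoff under constant download and playback rates.
def min_initial_delay(video_duration_s, video_bitrate_kbps, throughput_kbps):
    """Smallest initial buffering delay that avoids stalling when the link
    is slower than the media bitrate (constant-rate assumption)."""
    if throughput_kbps >= video_bitrate_kbps:
        return 0.0
    return video_duration_s * (video_bitrate_kbps / throughput_kbps - 1)

# A 60 s clip encoded at 500 kbit/s:
for c in (400, 500, 600):  # link capacity in kbit/s
    print(f"throughput {c} kbit/s -> initial delay {min_initial_delay(60, 500, c):.1f} s")
```

In this idealized model any capacity above the media bitrate already removes the waiting time; provisioning about 120% of the bitrate additionally leaves headroom for the throughput fluctuations of real connections, which is the intuition behind the rule of thumb.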

  9. Crowdsourcing for QoE Testing

  10. Crowdsourcing • Crowdsourcing is a neologism composed of "crowd" and "outsourcing"; literally, it means outsourcing to a (large, anonymous) crowd • All tasks are web-based micro-jobs, typically requiring little effort to fulfill • Crowdsourcing is interesting for (QoE) user studies: • large user panel, diversity of users, international users, • user studies can be executed in a short time, • low costs in contrast to laboratory studies, • QoE tests for Internet applications with realistic settings. Quote: "Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call." – Jeff Howe, definition of crowdsourcing, "The White Paper Version"

  11. Crowdsourcing Workflow (figure: workflow steps 1–5) • Challenges due to the remote setting • Unreliable QoE results, no test moderator • Heterogeneous environment, devices, users

  12. Countermeasures: Unreliability • Proper test design and statistical methods for filtering data • Consistency tests • The "same" question is asked multiple times in a different manner. • Example: the user is asked about his country of origin at the beginning and about his continent of origin at the end. • Content questions • Simple questions about the video clip, asked after watching the video. • Example: "Which sport was shown in the clip? A) Tennis. B) Soccer. C) Skiing." • Application usage monitoring • Example: measuring the time the worker spends on the task • Example: monitoring browser events and user reactions • Utilize features of the crowdsourcing platform • Specialized crowds, which have certain skills, reliability, etc. • Conduct training sessions, two-stage tests • Payment according to quality (a combined filter is sketched below)
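
A sketch of how such checks can be combined into a single reliability filter over submitted tasks; the field names and thresholds below are hypothetical, not the original test design.

```python
# Sketch: combining consistency, content and usage checks to flag
# unreliable crowdsourcing submissions. Field names and thresholds
# are hypothetical.
CONTINENT = {"Germany": "Europe", "Vietnam": "Asia", "Brazil": "South America"}

def is_reliable(task):
    checks = [
        # Consistency test: country (asked first) must match continent (asked last).
        CONTINENT.get(task["country"]) == task["continent"],
        # Content question: the worker must know what was shown in the clip.
        task["content_answer"] == task["content_correct"],
        # Usage monitoring: task time must be plausible (at least the video length).
        task["task_time_s"] >= task["video_length_s"],
        # Rating pattern: not always the same option.
        len(set(task["ratings"])) > 1,
    ]
    return all(checks)

submission = {"country": "Vietnam", "continent": "Asia",
              "content_answer": "Soccer", "content_correct": "Soccer",
              "task_time_s": 95, "video_length_s": 30,
              "ratings": [4, 3, 5, 2]}
print(is_reliable(submission))  # True
```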

  13. Lessons Learned: Unreliable Workers • Filter level 1: wrong answers to content questions, different answers to the same questions, always selected the same option, consistency questions: specified the wrong country/continent • Filter level 2: did not notice stalling, perceived non-existent stalling • Filter level 3: did not watch all videos completely • Many user ratings rejected → use simple test instructions, avoid Java applets, take care of low Internet speed, avoid incentives for users to cheat (see Facebook results of students' friends) • User warning ("Test not done carefully") → rejection rate decreased by about 50% • Improvements possible → detailed analysis of (inter- and intra-) rater reliability revealed: filtering too strict (see sketch below). Figure: test campaigns C1–C7, Facebook, first crowdsourcing tests.
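
A less strict alternative to hard filter rules is to score raters statistically, e.g. by correlating each worker's ratings with the mean ratings of all other workers per test condition (leave-one-out). The sketch below uses synthetic ratings and a hypothetical cut-off, not the study's actual reliability analysis.

```python
# Sketch: screening workers by leave-one-out correlation with the MOS of
# all other workers per test condition. Synthetic data, hypothetical threshold.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
true_mos = np.array([4.5, 3.8, 3.0, 2.2, 1.6])           # 5 test conditions
ratings = np.clip(true_mos + rng.normal(0, 0.4, (20, 5)), 1, 5)
ratings[3] = rng.uniform(1, 5, 5)                         # one random clicker

for w in range(ratings.shape[0]):
    others = np.delete(ratings, w, axis=0).mean(axis=0)   # leave-one-out MOS
    r, _ = pearsonr(ratings[w], others)
    if r < 0.5:                                           # hypothetical cut-off
        print(f"worker {w}: correlation {r:+.2f} -> flagged as unreliable")
```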

  14. Crowdsourcing vs. Laboratory Studies • Crowdsourcing tests with Microworkers.com at Uni Würzburg • Lab studies within ACE 2.0 at FTW's i:Lab • Similar results in the laboratory and the crowdsourcing study (test condition shown: single stall event of 4 s, video duration 30 s) • Crowdsourcing appropriate for QoE tests of Internet apps • 2,035 users from more than 60 countries participated in the tests and rated 8,163 videos. Payment was below 200 Euro. • User diversity • Statistically significant results (see sketch below) • Low costs, fast conduction
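
One way to back the "similar results" claim is a significance test between the lab and crowdsourcing rating samples; the sketch below uses a Mann-Whitney U test on synthetic ratings, not the actual study data.

```python
# Sketch: comparing lab vs. crowdsourcing ratings for one test condition
# (synthetic data; the study compared the i:Lab and Microworkers panels).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
lab   = np.clip(rng.normal(3.4, 0.7, 24), 1, 5)    # 24 lab participants
crowd = np.clip(rng.normal(3.3, 0.9, 150), 1, 5)   # 150 crowd workers

stat, p = mannwhitneyu(lab, crowd, alternative="two-sided")
print(f"lab MOS {lab.mean():.2f}, crowd MOS {crowd.mean():.2f}, p = {p:.3f}")
print("no significant difference" if p >= 0.05 else "significant difference")
```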

  15. Crowdsourcing Tests for HD Live Streaming • Live video streaming investigated via Microworkers and Facebook • Joint work within the QUALINET STSM by Bruno Gardlo, "Improving Reliability for Crowdsourcing-Based QoE Testing" • Strong differences due to worse viewing conditions and smaller screen resolutions → context monitoring required, e.g. of light conditions • Critical: proper analysis of the data, considering hidden influence factors

  16. Current Activities in Qualinet

  17. Qualinet "Crowdsourcing" Task Force • Goals • Derive a methodology and setup for crowdsourcing in QoE assessment • Challenge the crowdsourcing QoE assessment approach with the usual "lab" methodologies, comparison of QoE tests • Develop mechanisms and statistical approaches for identifying reliable ratings from remote crowdsourcing users • Define requirements on crowdsourcing platforms for improved QoE assessment • Experiences with crowdsourcing • What are the main challenges? Reliability, environment/context monitoring, technical implementation, language problems, … • Cartography of crowdsourcing use cases and mechanisms • Database with crowdsourcing results, e.g. impact of context factors on QoE, country, habits, … • Framework for crowdsourcing QoE tests • Results are implemented in the framework "QualityCrowd" by TU Munich • Further information: https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd

  18. Qualinet Task Force "Web and Cloud Apps" • Technology change and service migration to clouds strongly impacts user perception and QoE • Current activities • Dropbox QoE and multi-collaboration tools • QoE-aware adaptation mechanisms for video streaming: DASH and SVC • Standardization: finalization of the model and measurement methodology for web browsing QoE • https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:webcloud. Figure: application types and examples, ordered by degree of interactivity (small → high): E-Mail (Google Mail), Instant Messaging (Facebook Chat), Customer Relationship Management (SalesForce.com), Office (MS Office Live), Desktop (EyeOS), Gaming (OnLive).

  19. Questions? www.t-hossfeld.de | hossfeld@informatik.uni-wuerzburg.de
