
Presentation Transcript


  1. >HIGHERVIEW Team: A. Sasse, J. D. McCarthy, D. Miras, J. Riegelsberger. Presentation to UCL Network Group, 3rd March 2004

  2. >Sharp or smooth? Comparing the effects of quantization vs. frame rate for streamed video. J.D. McCarthy, M. A. Sasse, D. Miras

  3. >motivation • Existing QoS policies conflict with experimental evidence. • No previous studies have manipulated frame quality in conjunction with frame rate.

  4. >motivation • IBM QoS policy (2003) recommends reducing DCT coefficients rather than frame rate for sports coverage, because “the priority for smooth video is higher than the priority for frame quality”. • Apteker et al. (1995): sports coverage is relatively insensitive to reductions in frame rate.

  5. >methodology • Continuously change video quality while users are watching. • Continuously record users’ perceptions. • Discover the relationship between signal quality and perceived quality.

  6. >which measure? • Mean Opinion Score (MOS) • 8–10 second clips • single camera angle • rate quality on a 5-point Likert scale. • Limitations • Doesn’t measure continuous quality variations. • A poor measure for streamed video quality. • Doesn’t measure acceptability.

  7. >which measure? • Single Stimulus Continuous Quality Evaluation (SSCQE) • users move a slider to indicate quality continuously. • Limitations • Too demanding for users performing real tasks. • Doesn’t measure service acceptability.

  8. >acceptability? • Is a MOS of 3.5 acceptable to users? • What about an SSCQE rating of 70? • Service dependent?

  9. >our approach • Focus on a specific service. • Ask users to say when the service is acceptable / unacceptable. • Advantages • Can be used with continuous streams • Easy for users to understand • Less disruptive • Relevant to service providers

  10. >methodology • Continuously change video quality while users are watching. • Continuously record users’ perceptions. • Discover the relationship between signal quality and perceived quality.

  11. >“method of limits” [figure: acceptability ratings over a quality axis running from high to low]

  12. >“method of limits” [figure: as above]

  13. >“method of limits” [figure: as above, with a threshold m marked on the quality axis]
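
The “method of limits” figures are easier to read alongside a concrete procedure, so here is a minimal sketch of one descending series: quality is stepped from high to low and the viewer's rating flips from acceptable to unacceptable at their personal threshold. All numbers (the quality range, the threshold distribution, the 40 simulated viewers) are hypothetical choices to make the sketch run, not values from the study.

```python
import numpy as np

def descending_series(threshold, q_high=31, q_low=1, step=2):
    """One descending series of the method of limits: step quality from
    high to low; the viewer rates each level acceptable until it falls
    below their personal threshold. Returns the flip point."""
    for q in range(q_high, q_low - 1, -step):
        if q < threshold:
            return q  # first level rated unacceptable
    return q_low

# Hypothetical viewer thresholds (illustrative, not study data).
rng = np.random.default_rng(0)
thresholds = rng.normal(loc=14, scale=3, size=40)

# Each viewer's threshold estimate is the flip point of their series;
# the group estimate is the mean across viewers.
flips = [descending_series(t) for t in thresholds]
print("estimated group threshold:", round(float(np.mean(flips)), 1))
```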

  14. >service functions [figure: Pr(acceptable) over a quality axis running from high to low]

  15. >service functions [figure: logistic function fitted to Pr(acceptable), after ITU-R BT.500-11]
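
As a sketch of how such a service function can be fitted, the snippet below fits the logistic form named on the slide to pooled binary acceptability votes. The data are simulated, and using scipy's curve_fit on 0/1 votes is a convenience (a logistic regression would be the more principled fit); none of the numbers come from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(q, m, s):
    """Logistic service function: Pr(acceptable) at quality level q,
    with midpoint m (the critical value) and spread s."""
    return 1.0 / (1.0 + np.exp(-(q - m) / s))

# Simulated 0/1 acceptability votes at each quality level
# (hypothetical stand-in for the pooled user ratings).
rng = np.random.default_rng(1)
quality = np.repeat(np.arange(1.0, 32.0, 2.0), 40)
votes = (rng.random(quality.size) < logistic(quality, 14.0, 2.0)).astype(float)

# Least-squares fit of the service function to the pooled votes.
(m_hat, s_hat), _ = curve_fit(logistic, quality, votes, p0=(15.0, 1.0))
print(f"critical quality m = {m_hat:.1f}, spread s = {s_hat:.2f}")
```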

  16. >service functions [figure: acceptability vs. frame rate; shape of the service function unknown]

  17. >service functions [figure: acceptability vs. frame quality; shape of the service function unknown]

  18. >two studies • Study 1 • CIF video viewed on a desktop. • Acceptability ratings. • Eye movements. • Study 2 • QCIF video viewed on an iPAQ. • Acceptability ratings. • Qualitative interviews.

  19. >video material • Football match • Arsenal vs Man. United (2002) • 3 source clips. • [A] Match intro and opening 3 minutes of play • [B] Highlights of Manchester United chances • [C] Highlights of Arsenal chances, final whistle and Arsenal celebration.

  20. >participants • Study 1 • 41 football fans. • 59% watched at least once a week • 88% supported a football team. • 51% supported Arsenal or Man U.

  21. >participants • Study 2 • 37 football fans. • 65% watched at least once a week • 84% supported a football team. • 34% supported Arsenal or Man U.

  22. >design [table: experimental design]

  23. >study 1 - results: fps [chart]

  24. >study 1 - results: quantization [chart]

  25. >study 1 - results: fps + quantization [chart]

  26. >study 1 - results: gaze [chart]

  27. >study 1 - summary • Acceptability insensitive to frame rate. • Acceptability sensitive to quantization. • Critical values: • Quantization = 8 • Frame rate = 6

  28. >study 2 - results: fps [chart]

  29. >study 2 - results: quantization [chart]

  30. >study 2 - results: fps + quantization [chart]

  31. >bandwidth?

  32. >bandwidth? [table: critical values (Clip B)]
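
A back-of-the-envelope sketch of why the critical values matter for bandwidth: at a fixed bit budget, the bits available per frame scale inversely with frame rate, so accepting a lower frame rate frees bits that the encoder can spend on finer quantization. The 200 kbit/s budget below is a hypothetical round number, not a figure from the study.

```python
# Illustrative arithmetic only: how a fixed bit budget divides
# between frame rate and per-frame quality.
BUDGET_KBPS = 200  # assumed streaming budget (hypothetical)

def kbits_per_frame(budget_kbps: float, fps: float) -> float:
    """At a fixed bitrate, bits per frame scale as 1/fps."""
    return budget_kbps / fps

for fps in (25, 12, 6):
    print(f"{fps:>2} fps -> {kbits_per_frame(BUDGET_KBPS, fps):.1f} kbit/frame")
```

Dropping from 25 fps to the critical 6 fps roughly quadruples the per-frame budget, which is consistent with users preferring the bits to be spent on frame quality.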

  33. >qualitative comments • 84% said recognising players was impossible. • 65% had problems following the ball. • 35% said close-up shots were fine, but long-distance shots were poor. • 21% said jerky movement was a problem.

  34. >qualitative comments “I’d rather have jerky video and better quality pictures”

  35. >study 2 - summary • Acceptability insensitive to frame rate. • Acceptability sensitive to quantization. • Critical values: • Quantization = 4 • Frame rate = 6

  36. >conclusions • Limitations • Network effects not factored in. • Substantive • High motion does not need high frame rate! • Important task relevant information is lost with poor frame quality.

  37. >conclusions • Methodological • Binary acceptability rating • continuous • easy to understand • doesn’t disrupt task • “Method of limits” produces robust replicable service functions.

  38. >Sharp or smooth? Comparing the effects of quantization vs. frame rate for streamed video. J.D. McCarthy, M. A. Sasse, D. Miras
