
Temporal and Visual Fidelity


Presentation Transcript


  1. Temporal and Visual Fidelity Benjamin Watson Dept. Computer Science Northwestern University watson@cs.nwu.edu

  2. The Visual/Temporal Tradeoff Dynamic LOD is striking a compromise: • Visual: polys, textures, lighting… • Temporal: frame rate, delay… Most work has emphasized the visual side; here we examine both in some detail

  3. Outline Temporal Fidelity Motivating questions Temporal Basics Previous Research Answers and Implications Visual Fidelity (Quickly) Future Directions

  4. Questions: A Good Mean? Frame rate: what’s “good enough?” [Figure: two alternative frame rate vs. time profiles with different mean frame rates]

  5. Questions: How Constant? Frame rate: how much change? [Figure: two alternative frame rate vs. time profiles with different amounts of variation]

  6. Questions: Pattern Effects? Frame rate: effects of frequency? [Figure: two alternative frame rate vs. time profiles varying at different frequencies]

  7. Other Questions Effects of frame rate vs. delay? display continuity vs. age How do effects vary by task? e.g. placing vs. navigating

  8. Basics: Frame Rate Def: displayed samples/sec • Not constant, that takes work Frame time: ms/frame • Nonlinear relationship to frame rate • Better predictor of human performance: linear relationship to delay
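
The reciprocal relationship between frame rate and frame time is the nonlinearity referred to above; a minimal sketch of the arithmetic (the numbers are illustrative, not from the slides):

```python
# Frame rate (frames/sec) and frame time (ms/frame) are reciprocals,
# so equal drops in frame rate add very different amounts of delay.
def frame_time_ms(frame_rate_hz):
    return 1000.0 / frame_rate_hz

for hi, lo in [(60, 50), (30, 20), (15, 5)]:
    added = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{hi} -> {lo} Hz adds {added:.1f} ms per frame")
# 60 -> 50 Hz adds 3.3 ms; 30 -> 20 Hz adds 16.7 ms; 15 -> 5 Hz adds 133.3 ms.
# Frame time, unlike frame rate, is linear in delay, which is why it
# predicts human performance better.
```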

  9. Basics: Frame Rate Refresh rate: display refreshes/sec • One frame time = a multiple of refresh time • Mean frame time may not be a multiple [Figure: timeline of renderer image generation and display refreshes; frames appear only at vertical retraces]
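
A hedged sketch of the refresh quantization just described, assuming a 60 Hz display (the refresh rate and render times are assumptions chosen for illustration):

```python
import math

REFRESH_HZ = 60                    # assumed display refresh rate
REFRESH_MS = 1000.0 / REFRESH_HZ   # ~16.7 ms between vertical retraces

def displayed_frame_time_ms(render_ms):
    """A new image becomes visible only at the next vertical retrace,
    so its displayed frame time rounds up to a multiple of the refresh time."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10.0, 17.0, 30.0, 40.0):
    print(render_ms, "->", round(displayed_frame_time_ms(render_ms), 1), "ms")
# 10 -> 16.7, 17 -> 33.3, 30 -> 33.3, 40 -> 50.0: each frame time is a
# refresh multiple, but the mean over a mix of frames generally is not.
```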

  10. Basics: System Latency Def: age of displayed sample • Part of the frame time, plus input collection time • Also varies

  11. Basics: System Responsiveness Def (SR): delay from input to display • system latency, plus delay between event and sample

  12. Basics: System Responsiveness Only one sample used per frame So delay between event and sample: • Mean: half frame time • Variation: random, range 1 frame time So frame time has great effect on SR
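
A small simulation of the event-to-sample delay just described; the frame time and latency values are assumptions chosen only for illustration:

```python
import random

FRAME_MS = 50.0     # assumed frame time (20 Hz), illustration only
LATENCY_MS = 30.0   # assumed system latency of the displayed sample

# Input events arrive at random moments, but only the sample taken for the
# next frame is displayed, so each event waits a uniformly distributed
# 0..FRAME_MS before it is even sampled.
waits = [random.uniform(0.0, FRAME_MS) for _ in range(100_000)]
mean_sr = LATENCY_MS + sum(waits) / len(waits)
print(f"mean SR ~= {mean_sr:.1f} ms")   # ~ LATENCY_MS + FRAME_MS / 2 = 55 ms
```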

  13. Not So Basic:Complex Systems VR systems may have • More than one input device • More than one output device For each I/O link • Different latency, SR

  14. Temporal Fidelity Control: Frame-Latency Both frame time and latency change (FL) • e.g. by varying 3D model complexity after input arrives • SR += 3/2 × change

  15. Temporal Fidelity Control: Frame-Only Only frame time changes (FO) • e.g. by varying simulation complexity before input arrives • SR += 1/2 × change

  16. Temporal Fidelity Control: Latency-Only Only latency changes (LO) • e.g. by varying degree of filtering before input sent • SR += change

  17. Temporal Fidelity Control: Summary Moral: not all delay or speedup is equal • Differ in frame rate/latency effects • Differ in SR effects
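
The differing SR effects of slides 14-16 reduce to simple arithmetic; a sketch of that arithmetic (illustrative only, not code from this work):

```python
def sr_change_ms(delta_ms, style):
    """Expected change in system responsiveness (SR) when processing time
    changes by delta_ms, depending on where the change occurs."""
    factors = {
        "FL": 1.5,  # after input arrives: full latency + half frame time
        "FO": 0.5,  # before input arrives: only half frame time on average
        "LO": 1.0,  # before input is sent (e.g. filtering): latency only
    }
    return factors[style] * delta_ms

for style in ("FL", "FO", "LO"):
    print(style, sr_change_ms(20.0, style), "ms of SR change for a 20 ms slowdown")
```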

  18. Outline Update Temporal Fidelity Motivating questions Temporal Basics Previous Research Answers and Implications Visual Fidelity (Quickly) Implications

  19. Research: Tasks With Prediction If tasks require prediction: • Environment is changing • Must observe, anticipate, act Therefore: • Need samples for prediction: frame time • Account for sample age: system latency

  20. Research: Tasks With Feedback If tasks require feedback: • Must act, evaluate results, act…. Therefore: • Must wait for results: responsiveness • Feedback iterations multiply SR effects

  21. Research: Catching Tasks Catching tasks: • Visually track, manually intercept • e.g. shooting, catching moving objects Prediction important, not feedback • Expect: At same SR, FL and FO more effect than LO

  22. Research: Catching Tasks Watson, Richard: • ~15 Hz “enough” (290 ms) • Examined only FL effects Watson: • SR change 100+ ms significant • Perceivable patterns had effects

  23. Research: Tracking Tasks Tracking tasks: • Manually track a moving object • Similar to navigation Both prediction and feedback crucial • Expect: At same SR, FL and FO slightly more effect than LO

  24. Research: Tracking Tasks Tharp, Bryson (small studies): • Compared mean FO, mean LO Obtained similar results: • Faster SR always helped users • Given same SR, LO better than FO • Difficulty increased SR effects

  25. Research: Placement Tasks Placement tasks: • Move to well-known location • e.g. selection, grasping static objects Feedback important, not prediction • Expect: SR is key, whether FL, FO or LO

  26. Research: Placement Tasks Bryson (small study): • Given same SR, FO and LO same effects • More SR always better Mackenzie & Ware (LO only): • SR improved performance down to 25 ms • Difficulty increased SR effects

  27. Research: Placement Tasks Ware & Balakrishnan: • Given same SR, FL/FO/LO same effects Watson (FL only): • Difficulty increased SR effects • SR change (100+ ms) had effects • Perceivable patterns had effects

  28. Research Details: Task

  29. Research Details: Placement Frame rate and delay become more important with difficulty

  30. Research Details: Placement Feedback lessens effects of delay (esp. @ high difficulty)

  31. Answers: A Good Mean? Predictive tasks: • “Enough” is quite possible • This applies to SR and frame time Feedback tasks: • Really hard to get enough SR

  32. Answers: How Constant? Not very: • Up to 100 ms change okay! Predictive tasks: • Constancy important at low frame rates Feedback tasks: • Constancy important when mean SR good

  33. Answers: Pattern Effects? Yes, there are effects: • Only when change is severe • Only when frequencies are large More research needed: • Transient change? • Asymmetric change? • trend: worsening SR is worse than improving SR

  34. Other Answers Effects of frame rate vs. SR? • Generally, SR dominates How do effects vary by task? • Predictive: frame time • Feedback: SR

  35. LOD Implications Improve SR before frame time • Easier to get “enough” frame time Only minor efforts to control change • User can tolerate quite a bit

  36. LOD Implications After change, patterns: • Control only large-scale changes • Avoid repetitive patterns of change Control should be sensitive to task! • According to predictive, feedback makeup

  37. LOD Implications If delay (speedup) is necessary (possible): • Prefer FO, LO, and FL, in that order (reverse for speedup) Control should be sensitive to difficulty: • Higher difficulty needs better temporal detail

  38. LOD Implications Feedback can compensate for delay, difficulty Improved visual feedback != more polys Especially in tasks depending on feedback Visual/temporal detail interactions?

  39. Visual vs. Temporal Information is the key: It can be spatially inaccurate (wrong place) Or, temporally inaccurate (wrong time) How to compare these inaccuracies?

  40. I/O Differencing Key observation: wrong time = wrong place • Complex vs. simple models • Input (I) vs. displayed (O) positions
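
One way to make the I/O difference concrete is a root-mean-square distance between input and displayed positions; the sketch below is an illustrative assumption, not necessarily the exact metric used in this work:

```python
import math

def io_difference(input_positions, displayed_positions):
    """Root-mean-square distance between input (I) and displayed (O)
    positions sampled at the same instants.  Lag makes O trail I in time,
    simplification displaces O in space; both inflate the same number."""
    assert len(input_positions) == len(displayed_positions)
    sq = [
        (ix - ox) ** 2 + (iy - oy) ** 2 + (iz - oz) ** 2
        for (ix, iy, iz), (ox, oy, oz) in zip(input_positions, displayed_positions)
    ]
    return math.sqrt(sum(sq) / len(sq))

# Example: a displayed trajectory that lags the input by one sample.
inp = [(float(t), 0.0, 0.0) for t in range(5)]
out = [inp[0]] + inp[:-1]
print(io_difference(inp, out))   # temporal lag shows up as spatial error
```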

  41. I/O Differencing Task

  42. I/O Differencing Results

  43. I/O Differencing Results

  44. I/O Differencing Results

  45. Visual Fidelity Control When controlling visual fidelity: • How well are we doing? • How might we do better? For answers, we need: • Experimental measures of visual fidelity • Automatic measures of visual fidelity
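
As a toy example of an automatic measure, an image-space RMS difference between renderings of the original and simplified models is sketched below; this is only an illustrative stand-in, not the measure studied here:

```python
import numpy as np

def rms_image_difference(original, simplified):
    """Toy automatic fidelity measure: RMS pixel difference between
    renderings of the original and the simplified model (arrays in [0, 1])."""
    a = np.asarray(original, dtype=np.float64)
    b = np.asarray(simplified, dtype=np.float64)
    return float(np.sqrt(np.mean((a - b) ** 2)))

img = np.random.rand(64, 64)
print(rms_image_difference(img, img))                       # 0.0: identical
print(rms_image_difference(img, np.clip(img + 0.1, 0, 1)))  # > 0: lower fidelity
```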

  46. Experimental Vis Fidelity Psychophysical experimentation: • Nice internal validity, but • External validity is a big concern! We studied 3 higher-level measures: • Forced choice • Ratings • Naming times

  47. Experimental Vis Fidelity Our independent variables: • 2 simplification algorithms: QSlim, Cluster • 3 levels of simplification: 0%, 50%, 80% • 2 groups of stimuli: animals, objects 36 experimental stimuli, 36 subjects

  48. Some Typical Stimuli Original Models

  49. Some Typical Stimuli 50% Simplified [Figure: models simplified with clustering vs. QSlim]

  50. Some Typical Stimuli 80% Simplified [Figure: models simplified with clustering vs. QSlim]
