
The User In Experimental Computer Systems Research

The User In Experimental Computer Systems Research. Peter A. Dinda, Gokhan Memik, Robert Dick, Bin Lin, Arindam Mallik, Ashish Gupta, Sam Rossoff. Department of Electrical Engineering and Computer Science, Northwestern University. http://presciencelab.org


Presentation Transcript


  1. The User In Experimental Computer Systems Research Peter A. Dinda, Gokhan Memik, Robert Dick, Bin Lin, Arindam Mallik, Ashish Gupta, Sam Rossoff. Department of Electrical Engineering and Computer Science, Northwestern University. http://presciencelab.org

  2. Experimental Computer Systems Researchers Should… • Incorporate user studies into the evaluation of systems • Incorporate direct user feedback into the design of systems

  3. Experimental Computer Systems Researchers Should… • Incorporate user studies into the evaluation of systems • No such thing as the typical user • Really measure user satisfaction • Incorporate direct user feedback into the design of systems • No such thing as the typical user • Measure and leverage user variation

  4. Outline • Prescription • Experiences with user studies and direct user feedback • User comfort with resource borrowing • User-driven scheduling of interactive VMs • User satisfaction with CPU frequency • User-driven frequency scaling • User-driven control of distributed virtualized envs. • Prospects for speculative remote display • Principles for client/server context • General advice

  5. Experiences in Detail • Concepts : ExpCS 2007 @ FCRC • Specific Projects • User comfort with resource borrowing • HPDC 2004, NWU-CS-04-28 • User-driven scheduling of interactive VMs • Grid 2004, SC 2005, VTDC 2006, NWU-EECS-06-07 • User satisfaction with CPU frequency • CAL 2006, SIGMETRICS 2007, NWU-EECS-06-11 • User-driven frequency scaling (/process-driven voltage scaling) • CAL 2006, SIGMETRICS 2007, NWU-EECS-06-11 • User-driven control of distributed virtualized envs. • Portion of Bin Lin’s thesis, see also ICAC 2007 • Prospects for speculative remote display • NWU-EECS-06-08

  6. User Comfort With Resource Borrowing • Systems that use “spare” resources on desktops for other computation • *@Home, Condor on desktops, etc. • How much can they borrow before discomforting the user? • Inverse: How much must a desktop replacement system give?

  7. User Comfort With Resource Borrowing • Developed system for controlled resource borrowing given a profile • CPU contention, disk BW contention, physical memory pages • User presses “irritation button” to stop • User study • 38 participants • Four apps • Word, Powerpoint, Web browsing, Game • Ramp, Step, Placebo profiles • Double blinded
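The ramp, step, and placebo borrowing profiles can be sketched as a single function giving the target borrowing level over time. This is a minimal sketch under assumptions: the study's actual system also contended for disk bandwidth and physical memory pages, and the exact profile shapes here are illustrative.

```python
def target_contention(profile, t, duration, level=1.0):
    """Target fraction of the CPU to borrow at time t (seconds).

    'ramp'    - borrowing rises linearly from 0 to level over duration
    'step'    - borrowing jumps to level immediately
    'placebo' - borrow nothing (measures the noise floor)
    """
    if profile == "ramp":
        return level * min(max(t, 0.0) / duration, 1.0)
    if profile == "step":
        return level
    if profile == "placebo":
        return 0.0
    raise ValueError("unknown profile: %s" % profile)
```

A run applies a profile until the user presses the irritation button; the borrowing level at that moment is that user's comfort threshold for that profile and application.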

  8. Example Result: Massive variation in user response [Figure: per-user comfort thresholds]

  9. User-driven Scheduling of Interactive VMs • Virtual machine-based desktop replacement model • VM runs on backend server • User connects with remote display • VM is scheduled according to periodic real-time model • Allows straightforward mixing of batch and interactive VMs + isolation properties • What should interactive VM’s schedule be?

  10. User-driven Scheduling of Interactive VMs • VSched scheduler on server • User interface on client • Non-centering joystick allows the user to set the schedule • $10 interface in the study; cheaper interfaces possible • Onscreen display indicates the price of the current schedule • Also indicates when a schedule cannot be admitted
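A periodic real-time schedule for a VM is a (slice, period) pair: the VM receives slice units of CPU every period. The simplest form of the admission check hinted at above is a utilization test; this is a sketch under that assumption, not necessarily VSched's exact admission algorithm.

```python
def admissible(current, new_slice, new_period):
    """Utilization-based admission test for periodic (slice, period)
    schedules: admit the new VM only if total CPU utilization,
    the sum of slice/period over all VMs, stays at or below 1."""
    util = sum(s / p for s, p in current)
    return util + new_slice / new_period <= 1.0
```

When the joystick requests a schedule that fails this test, the onscreen display reports that it cannot be admitted.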

  11. User-driven Scheduling of Interactive VMs • User study • 18 participants • 4 applications • Word, Powerpoint, Internet browsing, Game • Survey response + measurement • Deception scheme to control bias in survey response • Results • Almost all could find a setting that was comfortable • Almost all could find a setting that was comfortable and believed to be of lowest cost • Lowest cost highly variable, as expected given previous results • <1 minute convergence typical • Interface captures individual user tradeoffs • Fewer cycles for tolerant users • More cycles for others

  12. User Satisfaction With CPU Frequency • Modern processors can lower frequency to reduce power consumption • Software control: DVFS (dynamic voltage and frequency scaling), typically conservative • How satisfied are users of different applications at different clock frequencies? • User study • 8 users • 3 frequencies + Windows DVFS • 3 apps • Presentation, Animation, Game • Rate comfort on a 1 to 10 scale • Double-blinded

  13. Example Results (Presentation and Game) • Dramatic variation in user satisfaction for fixed frequencies • And for DVFS

  14. User-driven Frequency Scaling • Developed system to dynamically customize frequency to user • User presses “irritation button” as input • 2 very simple learning algorithms • User study • 20 participants • Three apps • Powerpoint, Animation, Game • Comparison with Windows DVFS • Double blinded
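One plausible shape for a "very simple learning algorithm" driven by the irritation button is sketched below: step the frequency down over time to save power, and when the user objects, step back up and raise a per-user floor. This is an illustrative assumption, not the specific algorithms from the papers.

```python
class UserDrivenScaler:
    """Sketch of an irritation-button-driven frequency controller."""

    def __init__(self, freqs):
        self.freqs = sorted(freqs, reverse=True)   # e.g. MHz, high to low
        self.idx = 0                               # start at highest frequency
        self.min_idx = len(self.freqs) - 1         # lowest level allowed so far

    def tick(self):
        """Called periodically: try to save power by stepping down."""
        if self.idx < self.min_idx:
            self.idx += 1
        return self.freqs[self.idx]

    def irritated(self):
        """User pressed the button: step back up and raise the floor."""
        self.min_idx = max(self.idx - 1, 0)
        self.idx = max(self.idx - 1, 0)
        return self.freqs[self.idx]
```

The floor is what makes this a per-user model: a tolerant user converges to a low frequency, while a sensitive user's button presses keep the floor high.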

  15. Example Results (Measured System Power) • % gain over Windows DVFS, per user, for Powerpoint and Game

  16. Outline • Prescription • Experiences with user studies and direct user feedback • User comfort with resource borrowing • User-driven scheduling of interactive VMs • User satisfaction with CPU frequency • User-driven frequency scaling • User-driven control of distributed virtualized envs. • Prospects for speculative remote display • Principles for client/server context • General advice

  17. Principles for the Client/Server Context • User variation • Considerable variation in user satisfaction with any given operating point • No such thing as a typical user • User-specified performance • Have the user tell system software how satisfied they are • No decoupling of user response from user- and OS-level measurements • Think global feedback • Thin, simple user-system interface • One bit is a lot of information compared to zero • Learning to decrease interaction rate • Model the individual user

  18. Outline • Prescription • Experiences with user studies and direct user feedback • User comfort with resource borrowing • User-driven scheduling of interactive VMs • User satisfaction with CPU frequency • User-driven frequency scaling • User-driven control of distributed virtualized envs. • Prospects for speculative remote display • Principles for client/server context • General advice

  19. General Advice for Evaluating Systems with User Studies • Consult an HCI or psychology expert • User studies are different but not impossible • At least consult the literature • Engage your IRB early • These are “social science”-based studies • Easier the second time around • Accept small study size • Parameter sweeps, hundreds of traces impossible • Internet volunteerism not especially effective • Use non-user studies to augment if possible • Robust statistics

  20. General Advice for Evaluating Systems with User Studies • Accept that random sample unlikely • Selection bias estimation, if possible • Report all your data, not just summaries • Histogram instead of curve fit • Measure the noise floor / placebo effect • Vital to determine how much of user satisfaction is actually under your control • Double-blind to greatest extent possible • Investigator bias and subject bias
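With study sizes this small, "robust statistics" in practice means things like medians with bootstrap error bars rather than means with normal-theory intervals. Below is a minimal percentile-bootstrap sketch; the papers' actual analyses may differ.

```python
import random

def bootstrap_ci(data, stat=lambda xs: sorted(xs)[len(xs) // 2],
                 n_boot=10000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a robust statistic
    (median by default): resample the small study sample with
    replacement many times and take quantiles of the statistic."""
    rng = random.Random(seed)
    boots = sorted(stat([rng.choice(data) for _ in data])
                   for _ in range(n_boot))
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Reporting the full bootstrap distribution (a histogram, per the advice above) is usually more honest than the interval alone.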

  21. General Advice for Evaluating Systems with User Studies • Correlate system-level measurements with user responses to validate the latter • Consider deception when this is impossible • Eliminate user-visible extraneous information during any study • What the user knows can hurt you • Example: disk light

  22. General Advice for Incorporating Direct User Feedback • Out-of-band devices work best • Avoid cognitive context switch • Use as little input as possible • One bit is much more information than zero • Utility of input may not be clear to user • Output as little information as possible • Minimize input rate through learning • Bridge explicit feedback to implicit feedback when possible

  23. Experimental Computer Systems Researchers Should… • Incorporate user studies into the evaluation of systems • No such thing as the typical user • Really measure user satisfaction • Incorporate direct user feedback into the design of systems • No such thing as the typical user • Measure and leverage user variation

  24. For More Information • Peter Dinda • http://pdinda.org • Prescience Lab • http://presciencelab.org

  25. User-driven Control of Distributed Virtual Environments • Area of current exploration (part of Lin’s thesis) • Idea: Can we frame these problems as games that naïve or expert users/admins can solve? • Initial results interesting, but still too early to tell • Scaling • Dimensionality • Categorical dimensions • …

  26. Typical Design Models • Optimize User Satisfaction Subject to Constraints • Systems software’s decisions have dramatic effect on user experience • But how does systems software know how well it is doing? [Diagram: individual user satisfaction with the system/app combination; interface considerations; application(s) → Core API → systems software; resource management and scheduling considerations]

  27. Typical Design Models • Optimize User Satisfaction Subject to Constraints • One option: let the application tell it! • But how does the application know? [Diagram: individual user satisfaction with the system/app combination; application(s) → Core API and Policy API → systems software; resource management and scheduling considerations]

  28. Typical Design Models • Optimize User Satisfaction Subject to Constraints • One option: let the application tell it! • Assume a typical user and apply general rules derived from him/her • And figure out how to translate to the policy API [Diagram: typical user satisfaction with the system/app combination; application(s) → Core API and Policy API → systems software, with the rule “<500 ms latency and <100 ms jitter” fed to the Policy API]

  29. Typical Design Models • Optimize User Satisfaction Subject to Constraints • One option: let the application tell it! • Or formalize tradeoffs as a utility function • And figure out how to translate to the policy API [Diagram: typical user satisfaction with the system/app combination; application(s) → Core API and Policy API → systems software, with a utility-function plot of satisfaction vs. latency]
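The satisfaction-vs-latency curve on this slide is a utility function. A common analytic shape for one is a sigmoid roll-off past a latency knee; the knee and sharpness values below are illustrative assumptions, not numbers from the talk.

```python
import math

def latency_utility(latency_ms, knee_ms=500.0, sharpness=0.02):
    """Map latency to satisfaction in [0, 1]: near 1 well below the
    knee latency, 0.5 at the knee, rolling off sigmoidally above it."""
    return 1.0 / (1.0 + math.exp(sharpness * (latency_ms - knee_ms)))
```

The design problem the slide raises is exactly that these parameters vary per user, so any single curve hard-coded behind the policy API encodes a "typical user" who may not exist.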

  30. Typical Design Models • Optimize User Satisfaction Subject to Constraints • Another option: generalize over applications and infer user experience [Diagram: typical user satisfaction with the system/app combination; application(s) → Core API → systems software; systems software infers latency and maps it to good/bad satisfaction]

  31. Typical Design Models • Optimize User Satisfaction Subject to Constraints • Another option: get the utility function right from the individual user • Assuming he/she knows it… (“What’s a utility function?”) [Diagram: individual user asked “What is your utility function?” or “Which of these profiles are you most like?” through a policy interface to the systems software]

  32. Typical Design Models • Optimize User Satisfaction Subject to Constraints • Another option: expose the system software to the user in all its gory detail (“What the…”) • Works great for us! [Diagram: individual user facing the policy interface of the systems software directly]

  33. Typical Evaluation Approaches • Workloads • User workload model/generator • How to account for user variation? • How to evaluate as closed system? • How to validate? • User traces • Context dependent • How to evaluate as closed system?

  34. Typical Evaluation Approaches • Metrics • Can system meet performance objectives given through policy interface? • What should the objectives be? • Can system optimize over some combination of utility functions? • What should the utility functions be?

  35. New Model for Characterization and Evaluation • User studies to characterize user response • Examine the range of user satisfaction for some perceivable quantity or combination of quantities • Capture the variation, not only the mean • Variation = opportunity • User studies for evaluating systems • Directly measure user satisfaction with your system

  36. New Model: Direct User Feedback • Optimize User Satisfaction Subject to Constraints • User conveys satisfaction (or dissatisfaction) through a simple user interface [Diagram: individual user → satisfaction feedback → systems software, alongside application(s) → Core API]

  37. New Model: Direct User Feedback • Optimize User Satisfaction Subject to Constraints • User has some direct control over systems-level decision making through a simple interface [Diagram: individual user → some control over decision making → systems software, alongside application(s) → Core API]

  38. User-driven Control of Distributed Virtual Environments • Virtuoso project (see virtuoso.cs.northwestern.edu) • User “rents” collection of virtual machines • Virtuoso front-end looks like computer vendor • Providers stand up resources on which VMs can run or communicate • Virtuoso provides adaptation mechanisms • VM migration • Overlay topology and routing (VNet) • CPU reservations (VSched) • Network reservations (optical with VReserve) • Transparent network services (VTL) • Virtuoso provides inference mechanisms • Application traffic and topology (VTTIF) • Network bandwidth and latency (Wren)

  39. User-driven Control of Distributed Virtual Environments • Optimization problem: Given the inferred demands and supply, choose a configuration made possible by the adaptation mechanisms that maximizes a measure of application performance within constraints • Formalizations • NP-Hard problem in general • Approximation bound is not great either • Heuristic solutions • Can the user or a system administrator solve these problems given the right interface? • Can a naïve human do it?
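As a concrete flavor of "heuristic solutions," here is a first-fit-decreasing sketch for one slice of the configuration problem: packing VM CPU demands onto hosts without overcommitting. This is purely illustrative; the real problem also spans migration, overlay topology and routing, and reservations, which is part of why it is NP-hard.

```python
def greedy_map(vm_demands, host_capacities):
    """First-fit-decreasing placement heuristic: place the most
    demanding VMs first, each onto the host with the most free
    capacity that can hold it. Returns a vm -> host map, or None
    if the heuristic finds no feasible placement."""
    placement = {}
    free = dict(host_capacities)
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        for host in sorted(free, key=lambda h: -free[h]):
            if free[host] >= demand:
                placement[vm] = host
                free[host] -= demand
                break
        else:
            return None
    return placement
```

The research question on this slide is whether a user or administrator, given the right interface, can beat heuristics like this one on the full problem.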
