The User In Experimental Computer Systems Research

Peter A. Dinda
Gokhan Memik, Robert Dick
Bin Lin, Arindam Mallik, Ashish Gupta, Sam Rossoff
Department of Electrical Engineering and Computer Science
Northwestern University
http://presciencelab.org


Experimental Computer Systems Researchers Should…

  • Incorporate user studies into the evaluation of systems

  • Incorporate direct user feedback into the design of systems


Experimental Computer Systems Researchers Should…

  • Incorporate user studies into the evaluation of systems

    • No such thing as the typical user

    • Really measure user satisfaction

  • Incorporate direct user feedback into the design of systems

    • No such thing as the typical user

    • Measure and leverage user variation


Outline

  • Prescription

  • Experiences with user studies and direct user feedback

    • User comfort with resource borrowing

    • User-driven scheduling of interactive VMs

    • User satisfaction with CPU frequency

    • User-driven frequency scaling

    • User-driven control of distributed virtualized envs.

    • Prospects for speculative remote display

  • Principles for client/server context

  • General advice


Experiences in Detail

  • Concepts: ExpCS 2007 @ FCRC

  • Specific Projects

    • User comfort with resource borrowing

      • HPDC 2004, NWU-CS-04-28

    • User-driven scheduling of interactive VMs

      • Grid 2004, SC 2005, VTDC 2006, NWU-EECS-06-07

    • User satisfaction with CPU frequency

      • CAL 2006, SIGMETRICS 2007, NWU-EECS-06-11

    • User-driven frequency scaling (/process-driven voltage scaling)

      • CAL 2006, SIGMETRICS 2007, NWU-EECS-06-11

    • User-driven control of distributed virtualized envs.

      • Portion of Bin Lin’s thesis, see also ICAC 2007

    • Prospects for speculative remote display

      • NWU-EECS-06-08


User Comfort With Resource Borrowing

  • Systems that use “spare” resources on desktops for other computation

    • *@Home, Condor on desktops, etc.

  • How much can they borrow before discomforting user?

    • Inverse: How much must desktop replacement system give?


User Comfort With Resource Borrowing

  • Developed system for controlled resource borrowing given a profile (minimal sketch below)

    • CPU contention, disk BW contention, physical memory pages

    • User presses “irritation button” to stop

  • User study

    • 38 participants

    • Four apps

      • Word, Powerpoint, Web browsing, Game

    • Ramp, Step, Placebo profiles

    • Double blinded
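
A minimal sketch of what such a profile-driven borrowing controller could look like (hypothetical Python; `ramp`, `step`, `placebo`, and `run` are illustrative names, not the study's actual tool):

    import time

    def ramp(duration_s, steps=10, start=0.1, stop=0.9):
        """A profile is a list of (seconds, target CPU fraction) pairs.
        Ramp: contention rises gradually from `start` to `stop`."""
        seg = duration_s / steps
        return [(seg, start + (stop - start) * i / (steps - 1)) for i in range(steps)]

    def step(duration_s, frac=0.5):
        """Step: a single constant level of contention."""
        return [(duration_s, frac)]

    def placebo(duration_s):
        """Placebo: no contention at all, to measure the noise floor."""
        return [(duration_s, 0.0)]

    def run(profile, irritated=lambda: False, period=0.1):
        """Burn the requested fraction of one CPU for each profile segment,
        stopping as soon as the user presses the irritation button."""
        for seconds, frac in profile:
            end = time.time() + seconds
            while time.time() < end:
                if irritated():
                    return "stopped by user"
                busy_until = time.time() + frac * period
                while time.time() < busy_until:
                    pass                                   # spin: CPU contention
                time.sleep(max(0.0, (1 - frac) * period))  # yield the rest of the period
        return "completed"

The same idea extends to the disk bandwidth and physical memory dimensions listed above.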


Example Result

[Chart: massive variation in user response.]


User-driven Scheduling of Interactive VMs

  • Virtual machine-based desktop replacement model

    • VM runs on backend server

    • User connects with remote display

    • VM is scheduled according to periodic real-time model (admission-test sketch below)

      • Allows straightforward mixing of batch and interactive VMs + isolation properties

  • What should interactive VM’s schedule be?
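
As a rough illustration of the periodic real-time model (a minimal sketch assuming an EDF-style utilization bound on one CPU and a schedule expressed as a (period, slice) pair; the numbers are made up):

    def admissible(current_vms, candidate):
        """Each VM is promised `slice_ms` of CPU every `period_ms`.
        Under an EDF-style test on a single CPU, the set is schedulable
        if total utilization (the sum of slice/period) stays at or below 1.0."""
        used = sum(s / p for (p, s) in current_vms)
        p, s = candidate
        return used + s / p <= 1.0

    # Two interactive VMs at (50 ms, 10 ms) plus a batch VM asking for (1000 ms, 500 ms):
    print(admissible([(50, 10), (50, 10)], (1000, 500)))   # True: 0.2 + 0.2 + 0.5 = 0.9

Choosing the interactive VM's own (period, slice) is exactly what the joystick interface on the next slide hands to the user.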


User-driven Scheduling of Interactive VMs

  • VSched scheduler on server

  • User interface on client

  • Non-centering joystick allows the user to set the schedule ($10 interface in the study; cheaper interfaces possible)

  • Onscreen display indicates the price of the current schedule, and when a schedule cannot be admitted


User-driven Scheduling of Interactive VMs

  • User study

    • 18 participants

    • 4 applications

      • Word, Powerpoint, Internet browsing, Game

    • Survey response + measurement

      • Deception scheme to control bias in survey response

  • Results

    • Almost all could find a setting that was comfortable

    • Almost all could find a setting that was comfortable and believed to be of lowest cost

    • Lowest cost highly variable, as expected given previous results

    • <1 minute convergence typical

  • Interface captures individual user tradeoffs

    • Fewer cycles for tolerant users

    • More cycles for others


User Satisfaction With CPU Frequency

  • Modern processors can lower frequency to reduce power consumption (standard power relation below)

    • Software control: DVFS (existing policies are conservative)

  • How satisfied are users of different applications at different clock frequencies?

  • User Study

    • 8 users

    • 3 frequencies + Windows DVFS

    • 3 apps

      • Presentation, Animation, Game

    • Rate comfort on 1 to 10 scale

    • Double-blinded
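
For background (the standard CMOS dynamic power relation, general knowledge rather than a result of this study): dynamic power grows roughly linearly with clock frequency and quadratically with supply voltage, and a lower frequency usually permits a lower voltage, which is why frequency scaling saves power:

    P_{\text{dynamic}} \approx \alpha \, C \, V_{dd}^{2} \, f

where \alpha is the switching activity factor, C the switched capacitance, V_{dd} the supply voltage, and f the clock frequency.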


Example Results

[Charts: satisfaction ratings for the Presentation and Game applications.]

  • Dramatic variation in user satisfaction for fixed frequencies

    • And for DVFS


User-driven Frequency Scaling

  • Developed system to dynamically customize frequency to user

    • User presses “irritation button” as input

    • 2 very simple learning algorithms (one possible loop is sketched below)

  • User study

    • 20 participants

    • Three apps

      • Powerpoint, Animation, Game

    • Comparison with Windows DVFS

    • Double blinded
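
One possible shape for such an algorithm (a minimal sketch only; the frequency list, step sizes, and the `set_frequency` hook are illustrative, not the algorithms evaluated in the paper):

    FREQS_MHZ = [600, 800, 1000, 1200, 1400, 1600]   # illustrative P-states

    def set_frequency(mhz):
        print(f"(would program the CPU to {mhz} MHz here)")   # placeholder for the real DVFS call

    def control_loop(button_pressed, sleep, interval_s=10):
        """Creep the frequency downward; when the user presses the irritation
        button, jump back to full speed and remember the tolerated floor."""
        idx = len(FREQS_MHZ) - 1      # start at full speed
        floor = 0                     # lowest index the user has tolerated so far
        while True:
            set_frequency(FREQS_MHZ[idx])
            sleep(interval_s)
            if button_pressed():
                floor = min(idx + 1, len(FREQS_MHZ) - 1)   # this level irritated the user
                idx = len(FREQS_MHZ) - 1                   # restore full speed right away
            elif idx > floor:
                idx -= 1              # try one notch lower during the next interval

The only input the loop needs is the one-bit irritation signal, which matches the "thin, simple user-system interface" principle later in the talk.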


Example Results (Measured System Power)

[Charts: per-user percentage power gain over Windows DVFS, one chart for Powerpoint and one for the Game.]


Outline

  • Prescription

  • Experiences with user studies and direct user feedback

    • User comfort with resource borrowing

    • User-driven scheduling of interactive VMs

    • User satisfaction with CPU frequency

    • User-driven frequency scaling

    • User-driven control of distributed virtualized envs.

    • Prospects for speculative remote display

  • Principles for client/server context

  • General advice


Principles for the Client/Server Context

  • User variation

    • Considerable variation in user satisfaction with any given operating point

    • No such thing as a typical user

  • User-specified performance

    • Have the user tell the system software how satisfied they are

    • No decoupling of user response from user and OS-level measurements

      • Think global feedback

  • Thin, simple user-system interface

    • One bit is a lot of information compared to zero

  • Learning to decrease interaction rate

    • Model the individual user


Outline

  • Prescription

  • Experiences with user studies and direct user feedback

    • User comfort with resource borrowing

    • User-driven scheduling of interactive VMs

    • User satisfaction with CPU frequency

    • User-driven frequency scaling

    • User-driven control of distributed virtualized envs.

    • Prospects for speculative remote display

  • Principles for client/server context

  • General advice


General Advice for Evaluating Systems with User Studies

  • Consult an HCI or psychology expert

    • User studies are different but not impossible

    • At least consult the literature

  • Engage your IRB early

    • These are “social science”-based studies

    • Easier the second time around

  • Accept small study size

    • Parameter sweeps, hundreds of traces impossible

    • Internet volunteerism not especially effective

    • Use non-user studies to augment if possible

    • Use robust statistics (example below)
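
For example, with a small sample it is often safer to report a median with a bootstrap confidence interval than a mean alone (illustrative sketch; the ratings list is a placeholder, not study data):

    import random
    import statistics

    def bootstrap_median_ci(samples, n_boot=10_000, alpha=0.05, seed=0):
        """Nonparametric bootstrap confidence interval for the median."""
        rng = random.Random(seed)
        medians = sorted(
            statistics.median(rng.choices(samples, k=len(samples)))
            for _ in range(n_boot)
        )
        lo = medians[int(alpha / 2 * n_boot)]
        hi = medians[int((1 - alpha / 2) * n_boot) - 1]
        return statistics.median(samples), (lo, hi)

    # Placeholder satisfaction ratings for a small study:
    ratings = [7, 4, 9, 6, 5, 8, 3, 7, 6, 8]
    med, (lo, hi) = bootstrap_median_ci(ratings)
    print(f"median {med}, 95% CI [{lo}, {hi}]")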


General Advice for Evaluating Systems with User Studies

  • Accept that random sample unlikely

    • Selection bias estimation, if possible

    • Report all your data, not just summaries

      • Histogram instead of curve fit

  • Measure the noise floor / placebo effect

    • Vital to determine how much of user satisfaction is actually under your control

  • Double-blind to greatest extent possible

    • Investigator bias and subject bias


General Advice for Evaluating Systems with User Studies

  • Correlate system-level measurements with user responses to validate the latter

    • Consider deception when this is impossible

  • Eliminate user-visible extraneous information during any study

    • What the user knows can hurt you

      • Example: disk light


General Advice for Incorporating Direct User Feedback

  • Out-of-band devices work best

    • Avoid cognitive context switch

  • Use as little input as possible

    • One bit is much more information than zero

    • Utility of input may not be clear to user

  • Output as little information as possible

  • Minimize input rate through learning

  • Bridge explicit feedback to implicit feedback when possible


Experimental Computer Systems Researchers Should…

  • Incorporate user studies into the evaluation of systems

    • No such thing as the typical user

    • Really measure user satisfaction

  • Incorporate direct user feedback into the design of systems

    • No such thing as the typical user

    • Measure and leverage user variation


For More Information

  • Peter Dinda

    • http://pdinda.org

  • Prescience Lab

    • http://presciencelab.org


User-driven Control of Distributed Virtual Environments

  • Area of current exploration (part of Lin’s thesis)

  • Idea: Can we frame these problems as games that naïve or expert users/admins can solve?

  • Initial results interesting, but still too early to tell

    • Scaling

    • Dimensionality

    • Categorical dimensions


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • Systems software’s decisions have dramatic effect on user experience

  • But how does systems software know how well it is doing?

[Diagram: the individual user's satisfaction with the system/app combination sits above the interface; the application(s) sit on the systems software's core API, with resource management and scheduling considerations below.]


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • One option: let the application tell it!

  • But how does the application know?

[Diagram: the same stack, now with a policy API alongside the core API between the application(s) and the systems software.]


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • One option: let the application tell it!

  • Assume typical user and apply general rules derived from him/her

    • And figure out how to translate to the policy API

[Diagram: the typical user's satisfaction is reduced to a fixed rule, "<500 ms latency and <100 ms jitter", which the application(s) pass to the systems software through the policy API.]


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • One option: let the application tell it!

  • Or formalize tradeoffs (one possible utility function is sketched below)

    • And figure out how to translate to the policy API

[Diagram: the typical user's preferences are captured as a utility function mapping latency to satisfaction, which the application(s) translate through the policy API.]
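
One plausible form for such a utility function (purely illustrative; the studies earlier in the talk suggest no single curve fits everyone) is a sigmoid mapping latency \ell to satisfaction:

    U(\ell) = \frac{1}{1 + e^{(\ell - \ell_{0})/s}}

where \ell_{0} is the latency at which satisfaction falls to one half and s controls how sharply it falls; the parameters \ell_{0} and s are exactly what varies widely across users.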


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • Another option: generalize over applications and infer user experience

[Diagram: below the core API, the systems software infers latency and maps it through an assumed latency-to-satisfaction curve to a good/bad judgment about the typical user's experience.]


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • Another option: Get the utility function right from the individual user

    • Assuming he/she knows it…

[Diagram: a policy interface asks the individual user “What is your utility function?” or “Which of these profiles are you most like?”; the user's reply is “What’s a utility function?”.]


Typical Design Models

  • Optimize User Satisfaction Subject to Constraints

  • Another option: Expose the system software to the user in its gory details

    • Works great for us!

[Diagram: the systems software and its resource management and scheduling considerations are exposed directly to the individual user, whose reaction is “What the…”.]


Typical Evaluation Approaches

  • Workloads

    • User workload model/generator

      • How to account for user variation?

      • How to evaluate as closed system?

      • How to validate?

    • User traces

      • Context dependent

      • How to evaluate as closed system?


Typical Evaluation Approaches

  • Metrics

    • Can system meet performance objectives given through policy interface?

      • What should the objectives be?

    • Can system optimize over some combination of utility functions?

      • What should the utility functions be?


New Model for Characterization and Evaluation

  • User studies to characterize user response

    • Examine the range of user satisfaction for some perceivable quantity or combination of quantities

    • Capture the variation, not only the mean

    • Variation = opportunity

  • User studies for evaluating systems

    • Directly measure user satisfaction with your system


New Model: Direct User Feedback

  • Optimize User Satisfaction Subject to Constraints

  • User conveys satisfaction (or dissatisfaction) through a simple user interface

[Diagram: a satisfaction feedback channel runs directly from the individual user to the systems software, alongside the usual application(s) and core API path.]


New Model: Direct User Feedback

  • Optimize User Satisfaction Subject to Constraints

  • User has some direct control over systems-level decision making through a simple interface

[Diagram: the individual user gets some direct control over the systems software's decision making, alongside the usual application(s) and core API path.]


User-driven Control of Distributed Virtual Environments

  • Virtuoso project (see virtuoso.cs.northwestern.edu)

    • User “rents” collection of virtual machines

      • Virtuoso front-end looks like computer vendor

    • Providers stand up resources on which VMs can run or communicate

    • Virtuoso provides adaptation mechanisms

      • VM migration

      • Overlay topology and routing (VNet)

      • CPU reservations (VSched)

      • Network reservations (optical with VReserve)

      • Transparent network services (VTL)

    • Virtuoso provides inference mechanisms

      • Application traffic and topology (VTTIF)

      • Network bandwidth and latency (Wren)


User-driven Control of Distributed Virtual Environments

  • Optimization problem: Given the inferred demands and supply, choose a configuration made possible by the adaptation mechanisms that maximizes a measure of application performance within constraints

    • Formalizations

    • NP-Hard problem in general

    • Approximation bound is not great either

    • Heuristic solutions (toy greedy sketch below)

  • Can the user or a system administrator solve these problems given the right interface?

    • Can a naïve human do it?
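
A toy example of the kind of heuristic meant here (hypothetical: VM demands and host capacities are reduced to single numbers, whereas the real problem also involves topology, routing, reservations, and migration costs):

    def greedy_map(vm_demands, host_capacities):
        """Place each VM (largest demand first) on the host with the most
        remaining capacity; return a vm -> host mapping, or None if some
        VM cannot fit within the constraints."""
        remaining = dict(host_capacities)
        mapping = {}
        for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
            host = max(remaining, key=remaining.get)      # host with the most free capacity
            if remaining[host] < demand:
                return None
            remaining[host] -= demand
            mapping[vm] = host
        return mapping

    # Made-up demands (CPU fraction) and capacities:
    print(greedy_map({"vm1": 0.6, "vm2": 0.5, "vm3": 0.3},
                     {"hostA": 1.0, "hostB": 0.8}))

The question on this slide is whether a person, given the right interface, could steer or outperform heuristics like this.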