
Guiding Personal Choices in a Quality Contracts Driven Query Economy




Presentation Transcript


  1. Guiding Personal Choices in a Quality Contracts Driven Query Economy Huming Qu1, Jie Xu2, Alexandros Labrinidis2 1 IBM Watson Research Center 2 University of Pittsburgh

  2. Audience Questions • Question #1: How many of you consider the time to get an airfare quote from a travel web site too long (sometimes)? • Question #2: How many of you clicked on an airfare at a travel web site, only to get a message saying “price changed”?

  3. QoS vs. QoD Trade-off [plot: QoD (worst to best) vs. QoS (worst to best)] • What if you could specify your preferences (on the trade-off between QoS and QoD)?

  4. Roadmap [plot: % of audience asleep vs. # of slides] • Motivation • Background • AQC Algorithm • Experiments • Conclusions

  5. Application Architecture [diagram: WWW Browser ↔ WWW Server → Queries → Web-database ← Updates ← Data Warehouse]

  6. Application Architecture [same diagram as slide 5] • Web-DB workloads • Read-only queries • Write-only updates • Intensive and/or bursty • QoS vs. QoD • Quality of Service: answer queries fast • Quality of Data: finish updates on time • At a trade-off with each other • User preferences can help the system with resource allocation

  7. Why scheduling? Users care about timeliness and staleness • Impact of scheduling: a simple test comparing FIFO, FIFO-UH (Update High), and FIFO-QH (Query High) shows that none is best on both dimensions • Combining performance metrics • Set a constraint on one metric and optimize the other [Kang04] • Construct a single metric based on weighted aggregation [Abadi05]

  8. Quality Contracts (QC) • Combine performance metrics • Convert incomparable QoS and QoD into a common “worth” to users • Capture user preferences • Among quality metrics • Among different queries [plot: worth vs. quality metric; e.g., response time = 30ms yields total worth ∑worth = $8]
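The idea of converting QoS and QoD into a single “worth” can be sketched in Python. This is a minimal illustration assuming a linear contract shape (the function name and the linear decay are assumptions for illustration, not necessarily the paper's exact contract form):

```python
def linear_qc_worth(rt, rt_max, qos_max, staleness, uu_max, qod_max):
    """Worth of a query result under a hypothetical linear Quality Contract.

    The QoS payment decays linearly from qos_max (instant answer) to 0 at
    the response-time deadline rt_max; the QoD payment decays from qod_max
    (fresh data) to 0 at the staleness bound uu_max. A result that misses
    both bounds is worth nothing.
    """
    qos_paid = max(0.0, qos_max * (1 - rt / rt_max))
    qod_paid = max(0.0, qod_max * (1 - staleness / uu_max))
    return qos_paid + qod_paid

# A fast but slightly stale answer still earns most of its worth:
worth = linear_qc_worth(rt=30, rt_max=100, qos_max=5,
                        staleness=2, uu_max=10, qod_max=5)
```

The key point is that response time and staleness, which are incomparable on their own, both map to dollars and can simply be added.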

  9. Related work User preferences • Grid computing • [AuYoung, et al., 2006] • [Buyya et al., 2005] • [Wolski et al., 2001] • … • Distributed databases • [Braumandl et al., 2003] • [Benatallah et al., 2002] • [Naumann et al., 1999] • … • Web-databases • [Challenger et al. 2000] • [Luo et al. 2002] • [Datta et al. 2002] • [Labrinidis et al. 2004] • [Qu et al. 2006] • [Labrinidis et al. 2006] • [Guirguis et al. 2009] • … • Real time systems • [Abbott et al., 1988] • [Sha et al., 1991] • [Haritsa et al., 1993] • [Ramamritham et al., 1994] • [Adelberg et al., 1996] • [Burns et al., 2000] • … • Stream Processing • [Carney et al., 2002] • [Das et al., 2003] • [Babcock et al., 2004] • [Sharaf et al., 2005] • [Abadi et al., 2005] • [Sharaf et al., 2008] • … • Economic Models • [Ferguson et al., 1996] • [Stonebraker et al., 1996] • …

  10. Roadmap Conclusions % of audience asleep Experiments AQC Algorithm Background Motivation # of slides

  11. User side – Problem definition • Given: total budget B, total queries N • Objective: maximize the Success Ratio (fraction of queries whose result worth is positive, i.e., Q_paid > 0) • QC setup • Known: rt_max, uu_max, qos_max/qod_max • Unknown: Q_max = qos_max + qod_max • Problem: how to adapt Q_max to maximize the Success Ratio?
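The objective on this slide is straightforward to state as code. A minimal sketch of the Success Ratio metric, assuming we have the list of per-query payments Q_paid:

```python
def success_ratio(paid_amounts):
    """Fraction of queries that succeeded, where a query succeeds
    when the server paid back positive worth (Q_paid > 0)."""
    successes = sum(1 for q in paid_amounts if q > 0)
    return successes / len(paid_amounts)

# Two of four queries returned positive worth:
ratio = success_ratio([3.0, 0.0, 1.5, 0.0])
```

The user's problem is then to choose each query's Q_max so that this ratio is maximized over all N queries within budget B.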

  12. Baseline Algorithms • Fixed (FIX): bid the fixed average • Pro: simple, set once for all • Con: ignores refunds from previous failures • Random (RAN): bid randomly around the fixed average • Pro: simple • Con: ignores refunds from previous failures • Dynamic (DYN): bid the future average • Pro: keeps an eye on the budget left and the queries left • Con: the future average keeps increasing, so money is distributed unevenly
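The three baselines can be sketched as bidding functions. This is an illustrative reading of the slide (the exact distribution RAN draws from is an assumption; here it is uniform around the fixed average):

```python
import random

def fix_bid(budget, n_queries):
    # FIX: the same fixed average for every query, set once for all.
    return budget / n_queries

def ran_bid(budget, n_queries):
    # RAN: a random bid around the fixed average (uniform here, as an
    # assumption; the slide does not specify the distribution).
    avg = budget / n_queries
    return random.uniform(0, 2 * avg)

def dyn_bid(budget_left, queries_left):
    # DYN: the "future average" -- spread whatever budget remains
    # over the queries that remain.
    return budget_left / queries_left
```

Note how DYN exposes the con from the slide: every refunded failure raises budget_left relative to queries_left, so later queries bid ever more and the budget is spread unevenly.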

  13. Example ($10 budget per query) [plots of bid over time for FIX, RAN, DYN] • Fixed average (FIX, RAN): does not fully make use of the budget • Future average (DYN): unfair distribution of the budget

  14. Adaptive Quality Contract (AQC) • Overbid: bid more than you can afford • Deposit: bid less when continuous successes occur • AQC mode selection: if failureQ.size > 0, enter Overbid mode; else if successQ.size > c, enter Deposit mode
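The mode-selection rule above can be sketched directly. The default mode name ("normal") and the threshold value c=3 are assumptions; the slide specifies only the two conditions:

```python
def select_mode(failure_q, success_q, c=3):
    """AQC mode selection (sketch).

    failure_q and success_q are queues of recent failed and
    successful queries, as on the slide.
    """
    # Any recent failure: overbid to spend the refunded worth.
    if len(failure_q) > 0:
        return "overbid"
    # A long enough run of successes (> c): save for harder times.
    if len(success_q) > c:
        return "deposit"
    return "normal"
```

Failures take priority over successes, so a user who just lost worth immediately switches to Overbid regardless of earlier wins.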

  15. Adaptive Quality Contract (AQC) • Overbid • Goal: make full use of the budget to boost the query priority • DYN: [formula not in transcript] • AQC: [formula not in transcript]; solve for …

  16. AQC – Overbid w/ Linear QC • Getting the expected payment from the QoS function S(x), with S(1) = 5 [plot: probability / percentage of returning before rt_max; empirical vs. expectation]
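The expected-payment idea on this slide can be illustrated with an empirical estimate. This sketch assumes the linear QC shape from earlier slides and uses past response times as the empirical distribution (the function name and the use of a plain history list are assumptions for illustration):

```python
def expected_qos_payment(qos_max, rt_max, rt_history):
    """Estimate the expected QoS payment of a bid under a linear QC.

    Under a linear contract, a query that returns at time rt pays
    qos_max * max(0, 1 - rt/rt_max), so the expectation is that
    payoff averaged over the empirical distribution of past
    response times.
    """
    payoffs = [max(0.0, 1 - rt / rt_max) for rt in rt_history]
    return qos_max * sum(payoffs) / len(payoffs)

# History of observed response times (ms); deadline rt_max = 100ms:
expected = expected_qos_payment(qos_max=10, rt_max=100,
                                rt_history=[0, 50, 100, 200])
```

Bidding against this expectation, rather than against the nominal qos_max, is what lets an overbidding user spend refunds without systematically losing money.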

  17. AQC – Overbid • Overbid makes use of the entire budget • But server performance is affected by the behaviors of all users [the factor in the formula is smaller than 1]

  18. AQC – Deposit • Goal: save money in a less competitive environment to be ready for a more competitive one • The decrease of … is proportional to … [example: qos_max = $10; at 20ms, qos_paid = $8 vs. qos_paid = $1]

  19. Roadmap [plot: % of audience asleep vs. # of slides] • Motivation • Background • AQC Algorithm • Experiments • Conclusions

  20. Experimental Setup • Real stock web site traces from April 24, 2000 • # queries: 120,000 • # updates: 396,000 • # stocks: 4,107 • Budget per query: $10 • Four classes of users: FIX, RAN, DYN, AQC

  21. Experiment Design • Performance • 1-class experiments: FIX, RAN, DYN, AQC • 2-class experiments: FIX-AQC, RAN-AQC, DYN-AQC • Population • Evaluate various populations of users using different algorithms • Knowledge scope • Evaluate various scopes of users’ knowledge of other users’ information

  22. Success Ratio: 1-class, 2-class • AQC beats the other strategies by up to 3X! [plots: 1-class and 2-class success ratios]

  23. Over time: 1-class • AQC makes full use of the user budget! [plots of bid over time for FIX, RAN, DYN, AQC; $10 budget per query]

  24. Population • Mixing AQC & RAN users in different proportions: • AQC 10%, RAN 90% • AQC 50%, RAN 50% • AQC 90%, RAN 10% • More competitive users decrease the overall success ratio

  25. Knowledge scope • 100,000 users; 1 to 10,000 groups • Group members share all history information • Sharing more information increases the success ratio and reduces the risk

  26. Roadmap [plot: % of audience asleep vs. # of slides] • Motivation • Background • AQC Algorithm • Experiments • Conclusions

  27. Would users go for this? • Implementation could be like cell-phone plans • Provide users with a set of predetermined options to choose from • Examples: • predetermined ratios of preference of QoS over QoD and vice-versa • predetermined budget levels for each query (to choose from)

  28. Conclusions • We believe that utilizing user preferences on QoS/QoD can help the system make better decisions under resource constraints • Presented the AQC algorithm to help users dynamically adjust the Quality Contracts of their queries • The algorithm is dynamic and adapts to changing conditions of the “economy”

  29. Questions? • More info: Advanced Data Management Technologies Lab (ADMT), http://db.cs.pitt.edu • Funding: National Science Foundation • CAREER award IIS-0746696 • ITR award ANI-0325353
