
Community Driven Adaptation


Presentation Transcript


  1. Community Driven Adaptation Iqbal Mohomed (iq@cs.toronto.edu) Eyal de Lara (delara@cs.toronto.edu) Department of Computer Science University of Toronto

  2. Challenge of Variety • A plethora of client devices access content on the World Wide Web • The trend keeps accelerating! • The majority of web pages are still developed for the typical user = the Desktop Computing Environment • No power constraint • Abundant primary/secondary storage • Large screen with high color depth • Fast processor, etc. • Modem-speed or high-speed Internet access • Input via keyboard/mouse

  3. Mobile Device Differences • Network Connectivity - Intermittent connectivity - Low bandwidth, high latency for some but not for others (e.g. CDPD: 12kbps vs. 802.11b: 11Mbps) • Energy limitations • Form Factor • Screen Size • Color Depth • I/O Interface • Numeric keypad • Mini-keyboard • Pen

  4. Adaptation • Content that is changed or changes so as to become suitable to a new or special application or situation • Goal is to save resources and/or improve usability • Use of software transcoder to modify fidelity of data objects … and hence resource footprint

  5. Important Considerations Where is it done? When to do it? How much? Who chooses?

  6. Server Adaptation • Publishers develop and make available content for several classes of devices • Disadvantages: • Burden/Cost imposed on all Content Publishers • Continuous effort to support new types of devices • Difficult to maintain synchronization of various formats under heavy utilization • You can never cover all possible versions!

  7. Proxy Based Adaptation • Proxies are heavily utilized in the current architecture of the WWW • Rather than just serving cached content, the adaptation proxy can provide content at different fidelities • Transcoding • Can be done at the outset or on the fly • Can be done at the server or at the proxy

  8. Proxy in the Middle

  9. Advantages of Having a Proxy • Server does not have to keep track of preferences and other information pertaining to individual users • Can place transcoder at proxy to dynamically generate content at fidelities that differ from the original (e.g. black and white, grayscale, 8-bit color and so on for a 32-bit color image)
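To make the transcoding idea concrete, here is a minimal sketch of what a proxy-side image transcoder could look like (hypothetical code, assuming the Pillow imaging library; the fidelity names are illustrative and not part of the system described in the slides):

```python
# Hypothetical sketch of a proxy-side image transcoder (assumes Pillow).
# Given a full-color source image, it re-encodes it at a reduced fidelity.
from io import BytesIO
from PIL import Image

def transcode(image_bytes: bytes, fidelity: str) -> bytes:
    """Return the image re-encoded at the requested fidelity level."""
    img = Image.open(BytesIO(image_bytes))
    if fidelity == "bw":
        img = img.convert("1")             # black and white
    elif fidelity == "grayscale":
        img = img.convert("L")             # drop color information
    elif fidelity == "8bit":
        img = img.convert("RGB").quantize(colors=256)  # 8-bit palette
    # any other value: serve the original, full-fidelity image
    out = BytesIO()
    img.save(out, format="PNG")
    return out.getvalue()
```

A caching proxy would typically store the transcoded variants alongside the original so repeated requests do not pay the transcoding cost again.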

  10. Who Decides How Much to Adapt? • Server • Server takes information about the user • Provides its best guess on which version of the requested content would be optimal • Client • Client/agent is told about all types of available content • It picks the fidelity it wants • Proxy • Effectively a combination of the two approaches • Proxy has information about the user • It is provided with a list of all available content versions • Proxy picks the best one to provide to the user
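For the proxy-decides case, the selection step can be sketched as a simple filter over the advertised versions (hypothetical code; the field names and device description are illustrative assumptions):

```python
# Hypothetical sketch of proxy-side version selection: given what the proxy
# knows about the client and the list of versions the server advertises,
# pick the richest version the client can actually use.
def pick_version(versions, client):
    # versions: e.g. [{"color_depth": 1, "width": 160}, {"color_depth": 8, "width": 320}]
    # client:   e.g. {"color_depth": 8, "screen_width": 320}
    usable = [v for v in versions
              if v["color_depth"] <= client["color_depth"]
              and v["width"] <= client["screen_width"]]
    if not usable:
        return None  # fall back to transcoding a fresh version
    # prefer the highest-fidelity usable version
    return max(usable, key=lambda v: (v["color_depth"], v["width"]))
```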

  11. Adaptation is tricky • How much adaptation should be performed on an object? • Too much adaptation: • User inconvenienced by frequent explicit interaction • Unusable data is transferred … which wastes resources • memory • bandwidth • time • Too little adaptation: • More data than optimally required by the user is transferred • Wastes resources again • Related to the “Who decides?” question

  12. Community Driven Adaptation • Classify the user into one of many communities • Serve content at fidelities which have been deemed appropriate by other members of that community in the past • Utilize a community-driven process to determine the optimal fidelity for objects!

  13. Hypothesis • Data which has been accessed in the past is likely to be accessed again • Users share preferences, and thus the optimal fidelity, with other members of their user community

  14. What’s the catch, eh? • When no information is available for an object, the proxy makes an educated guess to serve the first user • Based on feedback received from the initial users, the proxy modifies the fidelity at which the object is served to future users • Share the Pain and Spread the Benefit!

  15. An Example • Let a be a community of users {a1, a2, …, an} • Person a1 accesses page A • Proxy serves content at a fidelity it “guesses” to be appropriate • If a1 is not satisfied with the fidelity, feedback can be provided for real-time refinement • Proxy incorporates any feedback received from a1 when choosing the fidelity to serve page A for any member of the a community
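A minimal sketch of the proxy-side bookkeeping this example implies (hypothetical code; the integer fidelity scale and data structures are illustrative assumptions, not the authors' implementation):

```python
# Hypothetical sketch of community-driven fidelity tracking at the proxy.
# Fidelity is modeled as an integer level: 0 = lowest, larger = richer.
from collections import defaultdict

DEFAULT_GUESS = 0  # educated guess used for objects the proxy has never seen

# (community, object URL) -> fidelity currently deemed appropriate
community_fidelity = defaultdict(lambda: DEFAULT_GUESS)

def fidelity_to_serve(community: str, url: str) -> int:
    """Level the proxy serves to any member of this community."""
    return community_fidelity[(community, url)]

def record_feedback(community: str, url: str, requested_level: int) -> None:
    """A member asked for an improvement; later members start at this level."""
    key = (community, url)
    community_fidelity[key] = max(community_fidelity[key], requested_level)
```

So when a1 requests page A the proxy returns its default guess, and after a1's feedback every later member of the a community is served the improved level directly.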

  16. The End Result • Users who access the site initially have to bear the “pain” of having to provide feedback • Users who come later reap the “benefit” • The same user is unlikely to always be first to access a site • Providing feedback is • Natural … occurs in the course of performing the task • Easy … integrated into the browser • Optional … no forced user interaction

  17. Advantages of our Approach • User is not burdened with making too many choices on a frequent basis • Preferences do not have to be explicitly transmitted to the proxy … they are extrapolated from natural feedback to the system • With a transcoder installed on the proxy, the server does not have to maintain multiple versions of content • Given a reasonable set of installed transcoders, the system automatically extends to future devices with arbitrary characteristics

  18. Research Questions • How do we classify users into communities, and how does this classification change over time? • How closely can the history of user accesses be used to track fidelity preferences in the future? (i.e., given the community, how good is the prediction?) • At what granularity should user accesses be tracked? (e.g. object, page, site, etc.) • User interface usability

  19. Research Direction • Focusing on Pragmatic Issues • System operates in a world of imperfect information • Adaptation has the potential of changing user behaviour • A user study is a significant component of this research

  20. Goal of Experimentation • Design an effective user study that provides realistic data which we will analyze to define and refine system behaviour

  21. User Study • Simple Proxy • Serves content at lowest fidelity unless an improvement is requested • Extended Browser • Allow user to request improvements • The Proxy and Browser trace all user actions
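A rough sketch of this study setup's behaviour, again hypothetical (the log format and integer fidelity levels are assumptions made only for illustration):

```python
# Hypothetical sketch of the study proxy: serve the lowest fidelity by default,
# bump the level when the user asks for an improvement, and trace every action.
import json
import time

LOWEST = 0
current_level = {}  # (user, image URL) -> fidelity currently served

def log(event: str, user: str, url: str, level: int) -> None:
    entry = {"t": time.time(), "event": event, "user": user, "url": url, "level": level}
    with open("study_trace.log", "a") as f:
        f.write(json.dumps(entry) + "\n")

def serve(user: str, url: str) -> int:
    level = current_level.get((user, url), LOWEST)
    log("serve", user, url, level)
    return level  # the proxy would transcode the image down to this level

def request_improvement(user: str, url: str) -> int:
    level = current_level.get((user, url), LOWEST) + 1
    current_level[(user, url)] = level
    log("improve", user, url, level)
    return level
```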

  22. User Study (contd … 1) • This experimental setup for the study allows us to determine the minimum fidelity that satisfies a user. Since the goal is to save resources, this turns out to be the optimal fidelity for an object. • The traces from this initial study will be mined to discover the efficacy of various heuristics that may be employed by the proxy
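Mining the resulting traces for the minimum satisfying fidelity can then be as simple as the following sketch (hypothetical; it assumes the trace format from the previous sketch):

```python
# Hypothetical mining sketch: users start at the lowest fidelity and only ask
# for improvements until satisfied, so the last level recorded for a
# (user, image) pair is taken as that user's minimum satisfying fidelity.
import json
from collections import defaultdict

def minimum_satisfying_fidelities(trace_path: str):
    last_level = {}
    with open(trace_path) as f:
        for line in f:
            rec = json.loads(line)
            last_level[(rec["user"], rec["url"])] = rec["level"]
    # Group per object so community-level heuristics can be evaluated against it.
    per_object = defaultdict(list)
    for (user, url), level in last_level.items():
        per_object[url].append(level)
    return per_object
```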

  23. User Study (contd … 2) • Initial Study focuses on fidelity optimization of image objects • Easy to gauge for users • Extensive transcoder support for Progressive JPEG images • Good Browser support for regular JPEG images • Can be extrapolated to other forms of media • Sound • Video • Text

  24. User Interface Considerations • Feedback process must be • Simple • Non-disruptive • Some Good Candidates: • Mouse over an image • Right click menu options • Right click menu options under a Fidelity category

  25. Browser Extensions • We have extended MS Internet Explorer to allow our own code to execute [1] … whenever the user right-clicks on an image, they are allowed to request an improvement to the fidelity of that image • The same can be done with Mozilla and other extensible browsers

  26. Environment Control • Network bandwidth and latency • Time limit … simulating energy • Screen size

  27. Thoughts for Tasks • We have the user carry out simple tasks: 1) Freeform browsing … e.g. check out websites about Avril Lavigne. 2) Browsing for specific information … e.g. find the states that border Georgia, USA. 3) Making an e-commerce transaction … e.g. buy a book from Amazon.ca. 4) Browsing for routine information … e.g. at what price did Nortel close today?

  28. Well Designed Tasks • Too much detail in the specification is bad • Have several instances of each type of task • User learns and might perform better/worse over the course of the experiment • Randomize the order of tasks

  29. Data that we will keep track of • Pages navigated by user • Actual Content of Page • Time Spent on Page • Time breakdown on a per-task basis • Any operations canceled by user

  30. And Then … Data Crunching!

  31. References • [1] “Component-Based Adaptation” by Eyal de Lara, Yogesh Chopra, Rajnish Kumar, Nilesh Vaghela, Dan S. Wallach, and Willy Zwaenepoel • [2] “Resource-Aware Speculative Prefetching in Wireless Networks” by N. J. Tuah, M. Kumar, and S. Venkatesh

  32. Questions/ Comments Thank you for your time
