Presentation Transcript


  1. Information Analysis in a Dynamic and Data-Rich World Susan Dumais Microsoft Research http://research.microsoft.com/~sdumais

  2. Change is Everywhere in Information Systems • New documents appear all the time • Query volume changes over time • Document content changes over time • So do metadata and extracted information • User interaction changes over time • E.g., anchor text, query-click streams, page visits • What’s relevant to a query changes over time • E.g., U.S. Open 2010 (in May vs. Sept) • E.g., Hurricane Earl (in Sept 2010 vs. before/after) • Change is pervasive in IR systems … yet we’re not doing much about it!

  3. Information Dynamics • Today’s browse and search experiences ignore information dynamics [Figure: timelines (1996–2009) of content changes and of user visitation/re-visitation]

  4. Digital Dynamics • Easy to capture • But … few tools support dynamics

  5. Overview • Characterize change in digital content • Content changes over time • People re-visit and re-find over time • Improve retrieval and understanding • Data -> Models/Systems -> Decision Analysis • Examples • Desktop: Stuff I’ve Seen; Memory Landmarks; LifeBrowser • News: Analysis of novelty (e.g., NewsJunkie) • Web: Tools for understanding change (e.g., Diff-IE) • Web: IR models that leverage dynamics • Examples from our work on search and browser support … but more general

  6. Stuff I’ve Seen (SIS) [Dumais et al., SIGIR 2003] • Many silos of information • SIS: unified access to distributed, heterogeneous content (mail, files, web, tablet notes, RSS, etc.) • Index full content + metadata • Fast, flexible search • Information re-use • SIS -> Windows Desktop Search (Windows-DS)

  7. Example searches • Looking for: recent email from Fedor that contained a link to his new demo • Initiated from: Start menu • Query: from:Fedor • Looking for: the pdf of a SIGIR paper on context and ranking (not sure it used those words) that someone (don’t remember who) sent me a month ago • Initiated from: Outlook • Query: SIGIR • Looking for: meeting invite for the last intern handoff • Initiated from: Start menu • Query: intern handoff kind:appointment • Looking for: C# program I wrote a long time ago • Initiated from: Explorer pane • Query: QCluster*.*

  8. Stuff I’ve Seen: Lessons Learned • Personal stores: 5–1500k items • Retrieved items: • Age: Today (5%), Last week (21%), Last month (47%) • Need to support episodic access to memory • Information needs: • People are important – 25% queries involve names/aliases • Date by far the most common sort order, even for people who had best-match Rank as the default • Few searches for “best” matching object • Many other criteria (e.g., time, people, type), depending on task • Need to support flexible access • Abstractions important – “useful” date, people, pictures • Desktop search != Web search

  9. Beyond Stuff I’ve Seen • Better support for human memory • Memory Landmarks • LifeBrowser • Phlat • Beyond search • Proactive retrieval • Stuff I Should See (IQ) • Temporal Gadget • Using desktop index as a rich “user model” • PSearch • News Junkie • Diff-IE

  10. Memory Landmarks • Importance of episodes in human memory • Memory organized into episodes (Tulving, 1983) • People-specific events as anchors (Smith et al., 1978) • Time of events often recalled relative to other events, historical or autobiographical (Huttenlocher & Prohaska, 1997) • Identify and use landmarks to facilitate search and information management • Timeline interface, augmented w/ landmarks • Learn Bayesian models to identify memorable events • Extensions beyond search, e.g., LifeBrowser
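
The learned-landmarks idea can be sketched very simply: train a probabilistic classifier over event features and use its posterior as a memorability score. The features, toy data, and use of scikit-learn's naive Bayes below are illustrative assumptions, not the actual Memory Landmarks model.

```python
# Minimal sketch (not the actual Memory Landmarks model): score calendar events
# by how likely they are to serve as memory landmarks, using naive Bayes over
# a few hypothetical binary features.
from sklearn.naive_bayes import BernoulliNB

# Hypothetical features per event: [is_recurring, many_attendees, involves_travel, user_is_organizer]
events = [
    [1, 0, 0, 0],   # weekly status meeting
    [0, 1, 1, 1],   # offsite workshop the user organized
    [1, 0, 0, 1],   # recurring 1:1
    [0, 1, 0, 1],   # large review meeting
]
labels = [0, 1, 0, 1]  # 1 = user recalled the event as a landmark

model = BernoulliNB().fit(events, labels)

# Posterior probability that a new, non-recurring offsite meeting is a landmark:
new_event = [[0, 1, 1, 0]]
print(model.predict_proba(new_event)[0][1])
```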

  11. Memory Landmarks [Ringel et al., 2003] • Timeline interface showing the distribution of search results over time • Memory landmarks: general (world, calendar) and personal (appts, photos) • Linked to results by time

  12. Memory Landmarks: Learned models of memorability

  13. LifeBrowser [Horvitz & Koch, 2010] • Images & videos • Desktop & search activity • Appts & events • Locations • Whiteboard capture

  14. LifeBrowser: Learned models of selective memory

  15. NewsJunkie: Evolution of Context over Time [Gabrilovich et al., WWW 2004] • Personalized news using information novelty • Stream of news -> cluster groups of related articles • Characterize what a user knows • Compute the novelty of new articles, relative to this background (relevant & novel) • Novelty = KL-divergence(article || current_knowledge) • Novelty score and user preferences guide what to show, when, and how to do so
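
A minimal sketch of the novelty computation described above: build unigram term distributions for the new article and for the articles the user has already seen, and score novelty as KL(article || current_knowledge). The tokenization, add-one smoothing, and helper names are assumptions for illustration, not NewsJunkie's actual implementation.

```python
import math
from collections import Counter

def unigram_model(texts, vocab, alpha=1.0):
    """Smoothed unigram distribution over a fixed vocabulary."""
    counts = Counter()
    for text in texts:
        counts.update(text.lower().split())
    total = sum(counts[w] for w in vocab) + alpha * len(vocab)
    return {w: (counts[w] + alpha) / total for w in vocab}

def novelty_score(article, background_articles):
    """KL(article || current_knowledge): how surprising the article is
    given what the user has already read on this topic."""
    vocab = set(article.lower().split())
    for t in background_articles:
        vocab |= set(t.lower().split())
    p = unigram_model([article], vocab)            # article model
    q = unigram_model(background_articles, vocab)  # background ("current knowledge")
    return sum(p[w] * math.log(p[w] / q[w]) for w in vocab)

# Articles with higher scores are more novel relative to the cluster read so far.
```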

  16. NewsJunkie: Types of novelty, via intra-article novelty dynamics [Figure: novelty score vs. word position for an on-topic recap; an on-topic elaboration (SARS patient’s wife held under quarantine); and two offshoots (Swiss company develops SARS vaccine; SARS impact on Asian stock markets)]

  17. Characterizing Web Change [Adar et al., WSDM 2009] • Large-scale Web crawls, over time • Revisited pages • 55,000 pages crawled hourly for 18+ months • Unique users, visits/user, time between visits • Pages returned by a search engine (for ~100k queries) • 6 million pages crawled every two days for 6 months

  18. Measuring Web Page Change • Summary metrics • Number of changes • Amount of change • Time between changes • Change curves • Fixed starting point • Measure similarity over different time intervals • Within-page changes

  19. Measuring Web Page Change • Summary metrics: number of changes, amount of change, time between changes • 33% of Web pages change • 66% of visited Web pages change • 63% of these change every hr. • Avg. Dice coeff. = 0.80 • Avg. time between changes = 123 hrs. • .edu and .gov pages change infrequently, and not by much • Popular pages change more frequently, but not by much
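
The "Avg. Dice coeff." above summarizes how much a page's content overlaps with its previous version. Below is a minimal sketch of that amount-of-change metric using word sets; the study may compute it over other units (e.g., shingles).

```python
def dice_similarity(old_text, new_text):
    """Dice coefficient between the term sets of two page snapshots.
    1.0 = identical vocabulary, 0.0 = no terms in common."""
    old_terms = set(old_text.lower().split())
    new_terms = set(new_text.lower().split())
    if not old_terms and not new_terms:
        return 1.0
    return 2 * len(old_terms & new_terms) / (len(old_terms) + len(new_terms))
```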

  20. Measuring Web Page Change • Change curves: fix a starting point and measure similarity over increasing time intervals [Figure: change curves, with a knot point, plotted against time from the starting point] • Summary metrics: number of changes, amount of change, time between changes

  21. Measuring Within-Page Change • DOM-level changes • Term-level changes • Divergence from the page’s norm (e.g., cookbooks, salads, cheese, ingredient, bbq, …) • “Staying power” of terms in the page [Figure: within-page changes, Sep.–Dec.]
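
One simple way to operationalize a term's "staying power" within a page: the fraction of the page's crawled snapshots that contain the term. This definition and the function name are assumptions for illustration; the actual analysis also considers DOM-level changes and divergence from the page's norm.

```python
from collections import Counter

def staying_power(snapshots):
    """Map each term to the fraction of page snapshots it appears in.
    Terms near 1.0 are 'always on the page'; terms near 0 are transient."""
    doc_freq = Counter()
    for snap in snapshots:
        doc_freq.update(set(snap.lower().split()))
    n = len(snapshots)
    return {term: df / n for term, df in doc_freq.items()}
```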

  22. Example Term Longevity Graphs

  23. Revisitation on the Web [Adar et al., CHI 2009] • Revisitation patterns • Log analyses • Toolbar logs for revisitation • Query logs for re-finding • User survey to understand intent in revisitations • What was the last Web page you visited? Why did you visit (re-visit) the page?

  24. Measuring Revisitation • 60–80% of the Web pages you visit, you’ve visited before • Many motivations for revisits • Summary metrics: unique visitors, visits/user, time between visits • Revisitation curves: histogram of revisit intervals, normalized
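
A sketch of the revisitation-curve computation: for one page, histogram the intervals between successive visits and normalize. The bin edges and representation are illustrative assumptions.

```python
from collections import Counter

def revisitation_curve(visit_times, bin_edges=(1, 24, 168, 720)):
    """Normalized histogram of revisit intervals for one page.
    visit_times: sorted visit timestamps in hours; bin_edges: upper bounds
    (here: within an hour, a day, a week, a month, or longer)."""
    intervals = [t2 - t1 for t1, t2 in zip(visit_times, visit_times[1:])]
    hist = Counter()
    for dt in intervals:
        for i, edge in enumerate(bin_edges):
            if dt <= edge:
                hist[i] += 1
                break
        else:
            hist[len(bin_edges)] += 1  # longer than the largest bin
    total = sum(hist.values()) or 1
    return [hist[i] / total for i in range(len(bin_edges) + 1)]

# Example: visits at hours 0, 2, 26, 27 -> revisit intervals of 2, 24, and 1 hours.
print(revisitation_curve([0, 2, 26, 27]))
```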

  25. Revisitation and Search (Re-Finding) [Teevan et al., SIGIR 2007] [Tyler et al., WSDM 2010] • Repeat query (33%) • Q: microsoft research • Repeat click (39%) • http://research.microsoft.com • Q: microsoft research, msr … • Big opportunity (43%) • 24% “navigational revisits”
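
A sketch of how repeat-query and repeat-click rates like those above could be computed from a query-click log; the log format (user, query, clicked URL) is an assumption.

```python
def refinding_rates(log):
    """log: list of (user, query, clicked_url) tuples in time order.
    Returns the fraction of queries that repeat an earlier query by the same
    user, and the fraction of clicks that repeat an earlier click by the same user."""
    seen_queries, seen_clicks = set(), set()
    repeat_q = repeat_c = 0
    for user, query, url in log:
        if (user, query.lower()) in seen_queries:
            repeat_q += 1
        if (user, url) in seen_clicks:
            repeat_c += 1
        seen_queries.add((user, query.lower()))
        seen_clicks.add((user, url))
    n = len(log) or 1
    return repeat_q / n, repeat_c / n
```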

  26. Building Support for Web Dynamics: Temporal IR and Diff-IE

  27. Diff-IE [Teevan et al., UIST 2009] [Teevan et al., CHI 2010] • Diff-IE toolbar • Changes to the page since your last visit

  28. Interesting Features of Diff-IE • New to you • Always on • Non-intrusive • In-situ

  29. Examples of Diff-IE in Action

  30. Expected New Content

  31. Monitor

  32. Unexpected Important Content

  33. Serendipitous Encounters

  34. Understand Page Dynamics

  35. [Diagram: Diff-IE usage scenarios arranged from expected to unexpected change: Expected New Content, Monitor, Edit, Attend to Activity, Understand Page Dynamics, Unexpected Important Content, Unexpected Unimportant Content, Serendipitous Encounter]

  36. Studying Diff-IE • In-situ, representative, longitudinal study of experience • Feedback buttons • Survey • Prior to installation • After a month of use • Logging • URLs visited • Amount of change when revisited • Experience interview

  37. People Revisit More • Perception of revisitation remains constant • How often do you revisit? • How often are revisits to view new content? • Actual revisitation increases (+14%) • First week: 39.4% of visits are revisits • Last week: 45.0% of visits are revisits • Why are people revisiting more with Diff-IE?

  38. Revisited Pages Change More • Perception of change increases • What proportion of pages change regularly? • How often do you notice unexpected change? • Amount of change seen increases • First week: 21.5% revisits, changed by 6.2% • Last week: 32.4% revisits, changed by 9.5% • Diff-IE is driving visits to changed pages • It supports people in understanding change

  39. Other Examples of Dynamics and User Experience • Content changes • Diff-IE • Zoetrope (Adar et al., 2008) • Diffamation (Chevalier et al., 2010) • Temporal summaries and snippets … • Interaction changes • Explicit annotations, ratings, wikis, etc. • Implicit interest via interaction patterns • Edit wear and read wear (Hill et al., 1992)

  40. Leveraging Dynamics for Retrieval: Temporal IR

  41. Temporal Retrieval Models [Elsas et al., WSDM 2010] • Current IR algorithms look only at a single snapshot of a page • But Web pages change over time • Can we leverage this to improve retrieval? • Pages have different rates of change • Different priors (using change vs. link structure) • Terms have different longevity (staying power) • Some are always on the page; some are transient • Language modeling approach to ranking, with a change prior and term longevity
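
A simplified sketch of the two ideas on this slide: a document prior driven by the page's rate of change, and a document language model that mixes term distributions estimated from content with different staying power. The grouping of snapshots, mixture weights, smoothing, and the form of the prior are illustrative assumptions, not the estimation procedure from Elsas et al.

```python
import math
from collections import Counter

def term_dist(texts):
    """Maximum-likelihood unigram distribution over a group of page snapshots."""
    counts = Counter()
    for t in texts:
        counts.update(t.lower().split())
    total = sum(counts.values()) or 1
    return {w: c / total for w, c in counts.items()}

def score(query, snapshot_groups, change_rate, collection_dist,
          weights=(0.6, 0.3, 0.1), lam=0.1):
    """log P(D) + sum_t log P(t | D).
    snapshot_groups: page content grouped by staying power (long, medium, short);
    change_rate: fraction of crawls in which the page changed (document prior);
    collection_dist: background term distribution used for smoothing."""
    group_models = [term_dist(g) for g in snapshot_groups]
    log_score = math.log(change_rate + 1e-6)   # change-based prior (illustrative form)
    for t in query.lower().split():
        p_doc = sum(w * m.get(t, 0.0) for w, m in zip(weights, group_models))
        p = (1 - lam) * p_doc + lam * collection_dist.get(t, 1e-6)
        log_score += math.log(p)
    return log_score
```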

  42. Relevance and Page Change • Page change is related to relevance judgments • Human relevance judgments on a 5-point scale (Perfect/Excellent/Good/Fair/Bad) • Rate of change: 60% for Perfect pages vs. 30% for Bad pages • Use change rate as a document prior (vs. priors based on link structure like PageRank) • Shingle prints to measure change
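
A sketch of shingle-print change measurement: fingerprint each page version with the smallest hashes of its word shingles, then estimate how much changed from the overlap of the fingerprints. The shingle width, sketch size, and hash choice are illustrative assumptions.

```python
import hashlib

def shingle_print(text, w=8, k=64):
    """Fingerprint a page as the k smallest hashes of its w-word shingles."""
    words = text.lower().split()
    shingles = {" ".join(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}
    hashes = sorted(int(hashlib.md5(s.encode()).hexdigest(), 16) for s in shingles)
    return set(hashes[:k])

def change_between(old_text, new_text):
    """Estimated fraction of content changed: 1 - Jaccard overlap of fingerprints."""
    a, b = shingle_print(old_text), shingle_print(new_text)
    return 1.0 - len(a & b) / len(a | b)
```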

  43. Relevance and Term Change • Term patterns vary over time • Represent a document as a mixture of terms with different “staying power” • Long, medium, short

  44. Evaluation: Queries & Documents • 18K Queries, 2.5M Judged Documents • 5-level relevance judgment (Perfect … Bad) • 2.5M Documents crawled weekly for 10 wks • Navigational queries • 2k queries identified with a “Perfect” judgment • Assume these relevance judgments are consistent over time

  45. Experimental Results [Figure: retrieval effectiveness of Baseline Static Model, Change Prior, Dynamic Model, and Dynamic Model + Change Prior]

  46. [Kulkarni et al., WSDM 2011] Temporal Retrieval, Ongoing Work • Initial evaluation/model • Focused on navigational queries • Assumed their relevance is “static” over time • But, there are many other cases … • E.g., US Open 2010 (in June vs. Sept) • E.g., World Cup Results (in 2010 vs. 2006) • Ongoing evaluation • Collecting explicit relevance judgments, interaction data, page content, and query frequency over time

  47. Relevance over Time [Figure: normalized relevance judgments over time (weekly, Mar 25 – May 28, 2010) for the query “march madness” (event: Mar 15 – Apr 4, 2010)]

  48. Other Examples of Dynamics and IR Models/Systems • Temporal retrieval models • Elsas & Dumais (2010); Liu & Croft (2004); Efron (2010); Aji et al. (2010) • Document dynamics, for crawling and indexing • Efficient crawling and updates • Query dynamics • Jones & Diaz (2004); Kulkarni et al. (2011); Kotov et al. (2010) • Integration of news into Web results • Diaz (2009) • Extraction of temporal entities within documents • Protocols for retrieving versions over time • E.g., Memento (Van de Sompel et al., 2010)

  49. Summary • Characterize change in information systems • Content changes over time • People re-visit and re-find over time • Develop new algorithms and systems to improve retrieval and understanding of dynamic collections • Desktop: Stuff I’ve Seen; Memory Landmarks; LifeBrowser • News: Analysis of novelty (e.g., NewsJunkie) • Web: Tools for understanding change (e.g., Diff-IE) • Web: Retrieval models that leverage dynamics

  50. Key Themes • Dynamics are pervasive in information systems • Data-driven approaches increasingly important • Data -> Models/Systems -> Decision Analysis • Large volumes of data are available • Data scarcity is being replaced by data surfeit • Data exhaust is a critical resource in the internet economy, and can be more broadly leveraged • Machine learning and probabilistic models are replacing hand-coded rules • Broadly applicable • Search … plus: machine translation, spam filtering, traffic flow, evidence-based health care, skill assessment, etc.
