ASSOCIATIVE BROWSING


Presentation Transcript


  1. ASSOCIATIVE BROWSING Evaluating for Personal Information Jinyoung Kim / W. Bruce Croft / David Smith

  2. What do you remember about your documents? Use search if you recall keywords! [Example documents shown: "Registration", "James"]

  3. What if keyword search is not enough? Search first, then browse through documents! [Example document shown: "Registration"]

  4. But I don’t remember any keywords! You might remember a related concept! [Example concept shown: "William James"] *Concept: entities and terms of interest to the user

  5. How can we build associations? Manually? Participants wouldn’t create associations beyond simple tagging operations (Sauermann et al., 2005). Automatically? How would it match the user’s preferences?

  6. Building the Associative Browsing Model: 1. Document Collection, 2. Concept Extraction, 3. Link Extraction (term similarity, temporal similarity, co-occurrence), 4. Link Refinement (click-based training)
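The slides contain no code, so the following is a minimal Python sketch of how such a pipeline could fit together, not the authors' implementation: extract_concepts, term_similarity, build_links, and the uniform weights are hypothetical stand-ins for the real concept extractor, the term/temporal/co-occurrence features, and the click-trained weights.

```python
# Minimal sketch (not the authors' system) of the slide-6 pipeline.
# extract_concepts, term_similarity and the uniform weights are toy stand-ins.
from itertools import combinations

def extract_concepts(doc_text):
    """Concept extraction: capitalized terms as a crude stand-in for the
    entities and terms of interest described on slide 4."""
    return {w.strip(".,") for w in doc_text.split() if w[:1].isupper()}

def term_similarity(c1, c2):
    """Toy character-overlap feature; the real model also uses temporal
    similarity and co-occurrence (slide 6)."""
    a, b = set(c1.lower()), set(c2.lower())
    return len(a & b) / len(a | b) if a | b else 0.0

def build_links(concepts, features, weights):
    """Link extraction: score every concept pair with a weighted sum of
    link-type features; the weights are later refined from clicks."""
    return {(c1, c2): sum(w * f(c1, c2) for w, f in zip(weights, features))
            for c1, c2 in combinations(sorted(concepts), 2)}

docs = ["Registration form for William James", "Meeting notes with James"]
concepts = set().union(*(extract_concepts(d) for d in docs))
links = build_links(concepts, [term_similarity], [1.0])
print(sorted(links.items(), key=lambda kv: -kv[1])[:3])
```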

  7. Link Extraction and Refinement • Link scoring: a linear combination of link type scores, S(c1,c2) = Σ_i w_i × Link_i(c1,c2) • Link presentation: a ranked list of suggested items that users click on for browsing • Link refinement (training the w_i): maximize click-based relevance, via Grid Search (maximize retrieval effectiveness, MRR) or RankSVM (minimize error in pairwise preferences) [Example concept shown: "Search Engine"]
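Below is a hedged sketch of the Grid Search variant of link refinement: searching the weights w_i of S(c1,c2) = Σ_i w_i × Link_i(c1,c2) so that clicked suggestions are ranked as high as possible (MRR). The click-log format, the grid values, and the toy feature vectors are assumptions for illustration; the RankSVM variant is not shown.

```python
# Hedged sketch of click-based link refinement via grid search (slide 7).
# The session/click-log format and the weight grid are assumptions.
from itertools import product

def mrr(ranked_sessions):
    """Mean Reciprocal Rank; each session is a list of (is_clicked, score)."""
    total = 0.0
    for items in ranked_sessions:
        ranked = sorted(items, key=lambda x: -x[1])
        rank = next((i + 1 for i, (clicked, _) in enumerate(ranked) if clicked), None)
        total += 1.0 / rank if rank else 0.0
    return total / len(ranked_sessions)

def grid_search(click_sessions, grid=(0.0, 0.5, 1.0)):
    """click_sessions: lists of (is_clicked, feature_vector) candidates.
    Returns the weight vector w maximizing MRR on the click data."""
    n_feats = len(click_sessions[0][0][1])
    best = (None, -1.0)
    for w in product(grid, repeat=n_feats):
        scored = [[(c, sum(wi * fi for wi, fi in zip(w, f))) for c, f in sess]
                  for sess in click_sessions]
        m = mrr(scored)
        if m > best[1]:
            best = (w, m)
    return best

# toy click log with two features (e.g. term and temporal similarity)
sessions = [[(True, [0.9, 0.1]), (False, [0.2, 0.8]), (False, [0.1, 0.3])],
            [(False, [0.4, 0.2]), (True, [0.3, 0.9])]]
print(grid_search(sessions))
```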

  8. Evaluation

  9. Evaluation based on Known-item Finding • Data collection: public documents from the UMass CS department; CS department members competed in known-item finding tasks; 30 participants and 53 search sessions in total, over two rounds of user study • Metrics: value of browsing (% of sessions in which browsing was used; % of sessions in which browsing was used and led to success) and quality of browsing suggestions (Mean Reciprocal Rank using clicks as judgments, with 10-fold cross-validation over the collected click data)
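The two "value of browsing" numbers are simple session-level ratios. The sketch below computes them over a hypothetical session log; the field names and the denominator for the second ratio are assumptions, since the slide does not spell them out. MRR itself, and the click-based weight tuning it evaluates, are sketched above under slide 7.

```python
# Hedged sketch of the "value of browsing" metrics on slide 9,
# computed from a hypothetical session log; field names are assumptions.
sessions = [
    {"used_browsing": True,  "success": True},
    {"used_browsing": True,  "success": False},
    {"used_browsing": False, "success": True},
]

n = len(sessions)
used = [s for s in sessions if s["used_browsing"]]

pct_used = len(used) / n                                    # % of sessions in which browsing was used
pct_used_and_success = sum(s["success"] for s in used) / n  # used AND led to success, over all sessions

print(f"browsing used in {pct_used:.0%} of sessions; "
      f"used and successful in {pct_used_and_success:.0%} of sessions")
```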

  10. DocTrack [Kim & Croft 10] [Screenshot of the DocTrack game interface, showing the concept "Computer Architecture" and its relevant documents]

  11. Evaluation Results • Comparison with simulation results [Kim, Croft & Smith 11]: roughly matches in terms of overall browsing usage and success ratio • The value of browsing: browsing was used in 30% of all sessions and led to success in 75% of the sessions in which it was used [Chart compares the Document Only and Document + Concept conditions]

  12. Quality of Browsing Suggestions – CS Collection • Concept Browsing (MRR) • Document Browsing (MRR)

  13. Summary • Associative Browsing Model for Personal Information • Evaluation based on User Study • Any Questions?

  14. Evaluation by Simulated User Model [KCS11] • Query generation model [Kim & Croft 09]: select terms from a target document • State transition model: use browsing when the result looks marginally relevant • Link selection model: click on browsing suggestions based on perceived relevance
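As a rough illustration of how these three components interact, here is a hypothetical simulated session in Python. The toy retrieval function, the rank-based trigger for browsing, and the term-overlap notion of perceived relevance are assumptions for illustration, not the actual models of [KCS11].

```python
# Rough, hypothetical sketch of a simulated known-item finding session (slide 14).
import random

def generate_query(target, n_terms=2):
    """Query generation model: sample terms from the target document."""
    terms = sorted(target["terms"])
    return random.sample(terms, k=min(n_terms, len(terms)))

def search(query, docs):
    """Toy retrieval: rank documents by query-term overlap."""
    return sorted(docs, key=lambda d: -len(set(query) & d["terms"]))

def simulate_session(target, docs, links):
    results = search(generate_query(target), docs)
    if results[0] is target:
        return "found by search"
    # State transition model: the top result looks only marginally
    # relevant, so the simulated user switches to browsing from it.
    start = results[0]
    suggestions = links.get(start["name"], [])
    if suggestions:
        # Link selection model: click the suggestion with the highest
        # perceived relevance (here, term overlap with the target).
        clicked = max(suggestions, key=lambda d: len(d["terms"] & target["terms"]))
        if clicked is target:
            return "found by browsing"
    return "not found"

docs = [{"name": "registration", "terms": {"registration", "form", "william", "james"}},
        {"name": "notes", "terms": {"william", "james"}}]
links = {"registration": [docs[1]]}   # hypothetical document-to-document links
print(simulate_session(docs[1], docs, links))
```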

  15. Community Efforts based on the Datasets

  16. Future Work • User Interface: concept map visualization, query-based concept generation, combining with faceted search • Learning Method: active learning, more features • Evaluation: exploratory search, large-scale user study

  17. Optional Slides

  18. Role of Personalization Using one person’s click data for training results in much higher learning effectiveness

  19. Quality of Browsing Suggestions – Person 1/2 • Concept Browsing (MRR) • Document Browsing (MRR)

  20. Building the Associative Browsing Model • Link Types • Links between concepts • Links between documents

  21. Quality of Browsing Suggestions (optional) • For Concept Browsing • For Document Browsing (Using the CS Collection, Measured in MRR)

  22. Evaluation Results (optional) • Success Ratio of Browsing • Lengths of Successful Sessions
