Click Evidence Signals and Tasks

Vishwa Vinay

Microsoft Research, Cambridge

Introduction
  • Signals
    • Explicit vs Implicit
  • Evidence
    • Of what?
    • From where?
    • Used how?
  • Tasks
    • Ranking, Evaluation & many more things in search
Clicks as Input
  • Task = Relevance Ranking
    • Feature in relevance ranking function
  • Signal
    • SELECT URL, COUNT(*) AS DocFeature
      FROM Historical_Clicks GROUP BY URL
    • SELECT Query, URL, COUNT(*) AS QueryDocFeature
      FROM Historical_Clicks GROUP BY Query, URL

Clicks as Input
  • Feature in relevance ranking function
    • Static feature (popularity)
    • Dynamic feature (for this query-doc pair)
  • “Query Expansion using Associated Queries”, Billerbeck et al, CIKM 2003
  • “Improving Web Search Ranking by Incorporating User Behaviour”, Agichtein et al, SIGIR 2006
  • ‘Document Expansion’: the click signal bleeds to similar queries (see the sketch below)
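
A minimal sketch of the document-expansion idea, on a toy in-memory corpus (doc_text and clicked_queries are hypothetical): queries that earned clicks on a document are appended to its indexed text, which is how the signal carries over to similar future queries.

    # Click-based document expansion on toy data: fold clicked queries
    # into the document's representation.
    from collections import defaultdict

    doc_text = {"searchsolutions.org": "programme venue speakers"}
    clicked_queries = defaultdict(list)
    clicked_queries["searchsolutions.org"] += ["search solutions 2010", "irsg event"]

    expanded = {url: text + " " + " ".join(clicked_queries[url])
                for url, text in doc_text.items()}
    print(expanded["searchsolutions.org"])
    # programme venue speakers search solutions 2010 irsg event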
Clicks as Output
  • Task = Relevance Ranking
    • Result Page = Ranked list of documents
    • Ranked list = Documents sorted based on Score
    • Score = Probability that this result will be clicked (see the sketch below)
  • Signal
    • Did my prediction agree with the user’s action?
    • “Web-Scale Bayesian Click-through rate Prediction for Sponsored Search Advertising in Microsoft’s Bing Search Engine”, Graepel et al, ICML 2010
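
The Graepel et al. paper uses a Bayesian probit model; the sketch below is only a minimal stand-in, a smoothed click-through-rate estimate with a Beta prior over hypothetical counts, to make “Score = probability of click” concrete.

    # Posterior-mean CTR under a Beta(alpha, beta) prior; not adPredictor.
    def p_click(clicks, impressions, alpha=1.0, beta=1.0):
        return (clicks + alpha) / (impressions + alpha + beta)

    results = {"doc1": (40, 100), "doc2": (5, 100)}  # hypothetical (clicks, impressions)
    ranked = sorted(results, key=lambda d: p_click(*results[d]), reverse=True)
    print(ranked)  # ['doc1', 'doc2']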
Clicks as Output
  • Calibration: merging results from different sources requires comparable scores (see the sketch below)
    • “Adaptation of Offline Vertical Selection Predictions in the Presence of User Feedback”, Diaz et al, SIGIR 2009
  • Onsite Adaptation of ranking function
    • “A Decision Theoretic Framework for Ranking using Implicit Feedback”, Zoeter et al, SIGIR 2008
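
One simple way to make scores from different sources comparable is to calibrate each source's raw scores against its historical click rates. The sketch below uses histogram binning over hypothetical data; it is not the method of the papers cited above.

    # Histogram-binning calibration: map raw scores in [0, 1] to the
    # empirical click rate observed in each score bin.
    def calibrate(scored_clicks, n_bins=5):
        bins = [[0, 0] for _ in range(n_bins)]       # [clicks, impressions] per bin
        for score, clicked in scored_clicks:
            b = min(int(score * n_bins), n_bins - 1)
            bins[b][0] += clicked
            bins[b][1] += 1
        return [c / n if n else None for c, n in bins]

    history = [(0.1, 0), (0.15, 0), (0.5, 1), (0.55, 0), (0.9, 1), (0.95, 1)]
    print(calibrate(history))  # [0.0, None, 0.5, None, 1.0]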
Clicks for Training
  • Task = Learning a ranking function
  • Signal
    • Query = “Search Solutions 2010”
    • Absolute: Relevant = {Doc1, Doc3}, NotRelevant = {Doc2}
    • Preferences: {Doc2 ≺ Doc1}, {Doc2 ≺ Doc3} (see the sketch below)
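
A minimal sketch of how the absolute judgments above become the preference pairs that a pairwise learning-to-rank method consumes:

    # Every non-relevant document is 'worse than' every relevant one.
    relevant = ["Doc1", "Doc3"]
    not_relevant = ["Doc2"]
    preferences = [(worse, better) for worse in not_relevant for better in relevant]
    print(preferences)  # [('Doc2', 'Doc1'), ('Doc2', 'Doc3')]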

Clicks for Training
  • Preferences from Query → {URL, Click} events (see the skip-above sketch below)
    • Rank bias & Lock-in
      • Randomisation & Exploration
    • “Accurately Interpreting Clickthrough Data as Implicit Feedback”, Joachims et al, SIGIR 2005
  • Preference Observations into Relevance Labels
    • “Generating Labels from Clicks”, Agrawal et al, WSDM 2010
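
A minimal sketch in the spirit of Joachims et al.'s skip-above rule, on a hypothetical session: a clicked document is preferred over the unclicked documents ranked above it, which is exactly why rank bias has to be accounted for.

    def skip_above_pairs(ranking, clicked):
        """Return (worse, better) pairs: unclicked docs above a clicked doc."""
        pairs = []
        for i, doc in enumerate(ranking):
            if doc in clicked:
                pairs += [(above, doc) for above in ranking[:i] if above not in clicked]
        return pairs

    # Hypothetical session: Doc1 and Doc3 clicked, Doc2 examined but skipped.
    print(skip_above_pairs(["Doc1", "Doc2", "Doc3"], {"Doc1", "Doc3"}))
    # [('Doc2', 'Doc3')]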
Clicks for Evaluation
  • Task = Evaluating a ranking function
  • Signal
    • Engagement and usage metrics
    • Query = “Search Solutions 2010”
  • Controlled experiments for A/B testing (see the sketch below)
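
A minimal sketch of the A/B comparison, assuming the engagement metric is click-through rate and using a two-proportion z-test; all counts are hypothetical.

    from math import sqrt

    def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
        """z-statistic for the difference between two click-through rates."""
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        pooled = (clicks_a + clicks_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        return (p_b - p_a) / se

    print(two_proportion_z(4200, 10000, 4400, 10000))  # ~2.86: B's lift looks real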

Clicks for Evaluation
  • Disentangling relevance from other effects
    • “An experimental comparison of click position-bias models”, Craswell et al, WSDM 2008
  • Label-free evaluation of retrieval systems (‘Interleaving’; see the sketch below)
    • “How Does Clickthrough Data Reflect Retrieval Quality?”, Radlinski et al, CIKM 2008
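
A minimal sketch of team-draft interleaving, in the spirit of Radlinski et al.: merge the two rankings while remembering which ranker contributed each slot, then credit clicks to the contributing “team”. Both rankings are assumed to cover the same documents.

    import random

    def team_draft(a, b, rng=random.Random(0)):      # fixed seed for a reproducible sketch
        interleaved, team = [], {}
        picks = {"A": 0, "B": 0}
        while len(interleaved) < len(a):             # assumes a and b hold the same docs
            tie = picks["A"] == picks["B"]
            a_turn = picks["A"] < picks["B"] or (tie and rng.random() < 0.5)
            source, side = (a, "A") if a_turn else (b, "B")
            doc = next(d for d in source if d not in team)
            team[doc] = side
            interleaved.append(doc)
            picks[side] += 1
        return interleaved, team

    interleaved, team = team_draft(["d1", "d2", "d3"], ["d2", "d1", "d3"])
    clicked = {"d1"}                                 # hypothetical session
    credit = {s: sum(team[d] == s for d in clicked) for s in "AB"}
    print(interleaved, credit)                       # the click credits ranker A here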
Personalisation with Clicks
  • Task = Separating individual preferences out from the aggregates
  • Signal: {User, Query, URL, Click} tuples
    • Query = “Search Solutions 2010”

Personalisation with Clicks
  • Click event as a rating
    • “Matchbox: Large Scale Bayesian Recommendations”, Stern et al, WWW 2009
  • Sparsity (see the sketch below)
    • Collapse using user groups (‘groupisation’)
      • “Discovering and Using Groups to Improve Personalized Search”, Teevan et al, WSDM 2009
    • Collapse using document structure
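
A minimal sketch of the group back-off idea (‘groupisation’), with a hypothetical user-to-group mapping: when a user's own click counts are too sparse, fall back to counts aggregated over their group.

    from collections import Counter

    user_group = {"alice": "sales", "bob": "sales"}  # hypothetical grouping
    click_log = [("alice", "crm", "crm.internal"), ("bob", "crm", "crm.internal")]

    user_counts = Counter(click_log)
    group_counts = Counter((user_group[u], q, d) for u, q, d in click_log)

    def personal_feature(user, query, doc, min_support=2):
        personal = user_counts[(user, query, doc)]
        if personal >= min_support:
            return personal
        return group_counts[(user_group[user], query, doc)]  # back off to the group

    print(personal_feature("alice", "crm", "crm.internal"))  # 2, via the group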

Miscellaneous
  • Using co-clicking for query suggestions (see the sketch below)
    • “Random Walks on the Click Graph”, Craswell et al, SIGIR 2007
  • User behaviour models for
    • Ranked lists: “Click chain model in Web Search”, Guo et al, WWW 2009
    • Whole page: “Inferring Search Behaviors Using Partially Observable Markov Model”, Wang et al, WSDM 2010
  • User activity away from the result page
    • “BrowseRank: Letting Web Users Vote for Page Importance”, Liu et al, SIGIR 2008
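
A minimal sketch of co-click query suggestion, in the spirit of Craswell et al.'s click-graph walks: take two steps on the bipartite query-URL graph (query → clicked URL → other queries that clicked it). The click records are hypothetical.

    from collections import Counter, defaultdict

    clicks = [("ss2010", "searchsolutions.org"),
              ("search solutions", "searchsolutions.org"),
              ("search solutions", "bcs.org"),
              ("bcs events", "bcs.org")]

    q2u, u2q = defaultdict(set), defaultdict(set)
    for q, u in clicks:
        q2u[q].add(u)
        u2q[u].add(q)

    def suggest(query):
        related = Counter()
        for url in q2u[query]:            # step 1: query -> clicked URL
            for other in u2q[url]:        # step 2: URL -> co-clicking queries
                if other != query:
                    related[other] += 1
        return related.most_common()

    print(suggest("ss2010"))  # [('search solutions', 1)]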
Additional Thoughts
  • Impressions & Examinations
  • Raw click counts versus normalised ratios
  • Query = “Search Solutions 2010”
  • All clicks are not created equal (see the sketch below)
    • Skip ≺ Click ≺ LastClick ≺ OnlyClick
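
A minimal sketch of both points at once: normalise clicks by impressions, and weight click types unequally. The weights are hypothetical; the only claim is the ordering, that an only-click says more than a skip.

    # Hypothetical weights reflecting Skip ≺ Click ≺ LastClick ≺ OnlyClick.
    CLICK_WEIGHTS = {"skip": 0.0, "click": 1.0, "last_click": 1.5, "only_click": 2.0}

    def weighted_ctr(events, impressions):
        """events: click-type labels observed for one query-doc pair."""
        return sum(CLICK_WEIGHTS[e] for e in events) / impressions

    print(weighted_ctr(["only_click", "skip"], impressions=10))  # 0.2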
Clicks and Enterprise Search
  • Relying on the click signal
    • Machine learning and non-click features
    • Performance Out-Of-the-Box
    • Shipping a shrink-wrapped product
  • The self-aware adapting system
    • Good OOB
    • Gets better with use
    • Knows when things go wrong
Thank you

vvinay@microsoft.com