
Relevance feedback using query-logs


Presentation Transcript


  1. Relevance feedback using query-logs. Gaurav Pandey. Supervisors: Prof. Gerhard Weikum, Julia Luxenburger

  2. Motivation. Query → Search Engine → Results: every user gets the same results for the same query, "one size fits all".

  3. Motivation. User info + Query → Search Engine → Results: the engine also takes information about the individual user into account.

  4. Motivation. Example: the query "python" on its own does not reveal what the user is after.

  5. Motivation. The same query "python", seen together with the user's earlier queries "CGI code", "debugging" and "programming", points to the Python programming language.

  6. Usage of Query Logs. Clickthrough data: • past queries • documents clicked

  7. Usage of Query Logs. A history instance consists of a past query together with the documents the user clicked for it.
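
A minimal sketch of what one such clickthrough record could look like as a data structure; the class and field names are illustrative assumptions, not the project's actual schema:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class HistoryInstance:
        # One entry of the query log: a past query plus the documents
        # the user clicked for it (field names are illustrative).
        query: str
        clicked_documents: List[str]

    history = [
        HistoryInstance("CGI code", ["CGI examples", "program code"]),
        HistoryInstance("debugging", ["bug removal"]),
    ]
    current_query = "python information"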

  8. Query Reformulation. Result: appending history terms yields the query "python information CGI code examples program code debugging bug removal programming". But how important is each term: p(python | query) = ? p(CGI | query) = ? p(code | query) = ? …

  9. Language Model. Normally (without using history), with w: term, d: document, q: query, the model captures the importance of term w in the current query. It considers only the current query, not the history instances.
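
The formula on this slide was an image in the original deck and is not recoverable from the transcript; as an assumption, a standard query-likelihood formulation estimates the query model by maximum likelihood and scores a document by how well its (smoothed) language model matches it:

    p(w \mid \theta_q) = \frac{c(w, q)}{|q|}

    \mathrm{score}(d, q) \propto \sum_{w} p(w \mid \theta_q) \, \log p(w \mid \theta_d)

where c(w, q) is the count of term w in query q, |q| is the query length, and \theta_d is the (smoothed) document language model.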

  10. Language Model. Now, using history: how should the importance of term w be defined over the current query plus the history instances? (w: term, d: document, q: query, as before.)
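
Again hedging, since the slide's formula is not in the transcript: a common way to bring in the history is to interpolate the current-query model with a history model using a weight \lambda:

    p(w \mid \theta_{q,H}) = (1 - \lambda) \, p(w \mid \theta_q) + \lambda \, p(w \mid \theta_H), \quad \lambda \in [0, 1]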

  11. Language Model + History. Example history instance: query "CGI code" with clicked documents "CGI examples" and "program code". The importance of term w in the history combines the importance of w at each individual history instance.
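
One consistent reading of this slide (an assumption, since the formula itself is missing): the history model is a weighted mixture of per-instance models, where each \theta_{H_i} is estimated from the i-th history query and its clicked documents:

    p(w \mid \theta_H) = \frac{\sum_{i=1}^{n} \lambda_i \, p(w \mid \theta_{H_i})}{\sum_{i=1}^{n} \lambda_i}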

  12. Equal Weighting. Every history instance gets the same weight. This works, but can be improved.
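
Under the mixture above, equal weighting simply assigns every history instance the same constant weight, for example

    \lambda_i = \frac{1}{n} \quad \text{for } i = 1, \dots, n

(or, equivalently up to normalisation, \lambda_i = 1 for all i).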

  13. Discriminative Weighting. Choose a different λ for every history instance. How?

  14. Overlap. If a history query has common terms with the current query then λ_i = 1, else (no common term) λ_i = 0. Example: current query "python information"; history query "python code" → λ_i = 1; history query "world cup" → λ_i = 0.

  15. Soft overlap. If a history query has common terms with the current query then λ_i = a, else λ_i = b, with a > b. Example (a = 8, b = 2): current query "python information"; history query "python code" → λ_i = 8; history query "world cup" → λ_i = 2.
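
A sketch of the two overlap schemes from slides 14 and 15, assuming simple whitespace tokenisation (the function name and defaults are illustrative); hard overlap is the special case a = 1, b = 0:

    def soft_overlap_weight(current_query: str, history_query: str,
                            a: float = 8.0, b: float = 2.0) -> float:
        # lambda_i = a if the history query shares a term with the
        # current query, b otherwise (a > b); a = 1, b = 0 gives the
        # hard overlap scheme of slide 14.
        current_terms = set(current_query.lower().split())
        history_terms = set(history_query.lower().split())
        return a if current_terms & history_terms else b

    soft_overlap_weight("python information", "python code")  # -> 8.0
    soft_overlap_weight("python information", "world cup")    # -> 2.0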

  16. Decrease with time. Use uniformly decreasing λ values. If there are n history instances: λ_1 = n, λ_2 = n-1, λ_3 = n-2, …, λ_{n-1} = 2, λ_n = 1.

  17. Decrease with time. Use geometrically decreasing λ values. If there are n history instances: λ_1 = n, λ_2 = n/2, λ_3 = n/3, …, λ_{n-1} = n/(n-1), λ_n = 1.
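
A sketch of the two time-decay schemes from slides 16 and 17, assuming history instances are ordered from most recent (i = 1) to oldest (i = n), which is one reading of "decrease with time":

    def uniform_decay_weights(n: int) -> list:
        # Slide 16: lambda_i = n - i + 1, i.e. n, n-1, ..., 2, 1.
        return [n - i + 1 for i in range(1, n + 1)]

    def geometric_decay_weights(n: int) -> list:
        # Slide 17: lambda_i = n / i, i.e. n, n/2, n/3, ..., 1.
        return [n / i for i in range(1, n + 1)]

    uniform_decay_weights(4)    # [4, 3, 2, 1]
    geometric_decay_weights(4)  # [4.0, 2.0, 1.333..., 1.0]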

  18. Experiment. Comparison of: • the 4 discriminative weighting techniques • equal weighting • the basic model (without history). Use similar techniques for: • the probabilistic model • the vector space model.

  19. Thanks
