
AIRS 2013 Report (AIRS: Asia Information Retrieval Societies Conference)


Presentation Transcript


  1. AIRS 2013 Report (AIRS: Asia Information Retrieval Societies Conference) Kazunari Sugiyama, WING meeting, 19th December, 2013

  2. Outline of AIRS 2013
  • Venue
  • Rendezvous Grand Hotel, Singapore
  (Source: “Conference Venue” http://www.colips.org/conference/airs2013/index.php/conferencevenue)

  3. Outline of AIRS 2013
  • Review Process
  • For each submission, (1) three reviewers read and rate the paper, and (2) accepted papers are then decided based on the reviewers’ scores and a discussion among the area chairs and program chairs
  • Acceptance rate
  • 41.2% [45 / 109] (27 oral and 18 poster presentations)
  (Source: “Acceptance Statistics” http://www.colips.org/conference/airs2013/index.php/acceptance-statistics)

  4. Outline of AIRS 2013
  • Acceptance rates at past AIRS conferences
  * I do not have the statistics for AIRS 2012, as I did not serve on its committee.

  5. Candidate Venues for AIRS 2014
  • Sarawak, Malaysia
  • Xi’an, China

  6. Best Papers
  “BibRank: a Language-Based Model for Co-ranking Entities in Bibliographic Networks” (JCDL’12)
  • Best paper
  • Laure Soulier, Lynda Tamine, and Wahiba Bahsoun: “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Best poster paper
  • Rajendra Prasath, Aidan Duane, and Philip O’Reilly: “Topic Assisted Fusion to Re-rank Texts for Multi-faceted Information Retrieval”
  • Qianli Xing, Yiqun Liu, Min Zhang, Shaoping Ma, and Kuo Zhang: “Characterizing Expertise of Search Engine Users”
  • Best oral presentation
  • Alistair Moffat: “Seven Numeric Properties of Effectiveness Metrics”
  How about U. Bhandari, K. Sugiyama, A. Datta, and R. Jindal: “Serendipitous Recommendation for Mobile Apps Using Item-Item Similarity Graph”? It was nominated for both the Best Paper and Best Poster Paper awards.

  7. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Goal of this work
  • To solve complex information retrieval tasks that involve a multi-faceted information need
  • Take a group of collaborative users (i.e., experts) into account by addressing the different query facets
  • Propose a collaborative document ranking model by
  - Mining query facets
  - Building a collaborative document ranking

  8. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Multi-faceted search
  • How to infer the different query facets
  • How to exploit them jointly to select relevant results
  • Example of a multi-faceted query
  • “Hubble Telescope Achievements”
  • “Focus of camera”
  • “Age of the universe space telescope”
  • “Cube pictures”

  9. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Mining Query Facets
  • Extract a subset of documents D* that is relevant to the query topic Q while ensuring broad topical coverage (Maximal Marginal Relevance, MMR)
  • Apply LDA to D* to identify latent topics
  • Each latent topic is treated as a facet of Q (see the sketch below)
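For concreteness, here is a minimal Python sketch of this facet-mining step, assuming TF-IDF/bag-of-words document representations and scikit-learn. The function names, the MMR trade-off lam, the subset size k, and the number of facets are illustrative assumptions, not the settings used by Soulier et al.

```python
# Minimal sketch (illustrative assumptions, not the authors' implementation):
# select a topically diverse subset D* with Maximal Marginal Relevance (MMR),
# then run LDA on D* and treat each latent topic as one facet of the query Q.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

def mmr_select(query, docs, k=50, lam=0.7):
    """Greedy MMR: trade off relevance to the query against redundancy."""
    tfidf = TfidfVectorizer().fit(docs + [query])
    d_vecs = tfidf.transform(docs)
    q_vec = tfidf.transform([query])
    rel = cosine_similarity(d_vecs, q_vec).ravel()   # relevance to the query
    sim = cosine_similarity(d_vecs)                  # document-document similarity
    selected, candidates = [], list(range(len(docs)))
    while candidates and len(selected) < k:
        def mmr_score(i):
            redundancy = max(sim[i][j] for j in selected) if selected else 0.0
            return lam * rel[i] - (1 - lam) * redundancy
        best = max(candidates, key=mmr_score)
        selected.append(best)
        candidates.remove(best)
    return [docs[i] for i in selected]

def mine_facets(query, docs, n_facets=4):
    """Apply LDA to the MMR-selected subset D*; each topic stands for a query facet."""
    d_star = mmr_select(query, docs)
    counts = CountVectorizer(stop_words="english").fit_transform(d_star)
    lda = LatentDirichletAllocation(n_components=n_facets, random_state=0)
    lda.fit(counts)
    return lda  # lda.components_ holds one word distribution per facet
```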

  10. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Building Collaborative Document Ranking
  • Expert-based Document Scoring
  - Rerank documents with respect to each expert’s domain towards the query facets
  • Expert-based Document Allocation
  - Allocate documents to the most suitable experts
  • Both steps alternate in an EM-based algorithm (see the sketch below)
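The slide only names the two steps, so the following is a simplified, hedged sketch of how such an EM-style alternation could look: documents are scored against per-expert facet profiles, each document is allocated to its best-scoring expert, and the profiles are re-estimated from the allocated documents until the allocation stabilises. The matrix names and the smoothing parameter alpha are assumptions; this is a stand-in, not the authors' exact formulation.

```python
# Illustrative EM-style alternation between expert-based document scoring and
# expert-based document allocation (simplified stand-in, not the exact model).
import numpy as np

def collaborative_ranking(doc_facet, expert_facet, n_iter=20, alpha=0.5):
    """
    doc_facet:    (n_docs, n_facets) document-facet affinity matrix
    expert_facet: (n_experts, n_facets) initial expert-facet profiles
    Returns the document-to-expert allocation and the per-expert scores.
    """
    profiles = expert_facet.astype(float)
    allocation = np.full(doc_facet.shape[0], -1)
    for _ in range(n_iter):
        # Scoring step: affinity of every document to every expert profile.
        scores = doc_facet @ profiles.T                  # (n_docs, n_experts)
        # Allocation step: each document goes to its best-scoring expert.
        new_allocation = scores.argmax(axis=1)
        if np.array_equal(new_allocation, allocation):
            break                                        # allocation has stabilised
        allocation = new_allocation
        # Profile update: blend the initial expertise with the allocated documents.
        for e in range(profiles.shape[0]):
            assigned = doc_facet[allocation == e]
            if len(assigned):
                profiles[e] = alpha * expert_facet[e] + (1 - alpha) * assigned.mean(axis=0)
    return allocation, scores

# Toy usage: 5 documents, 2 experts, 3 facets (random affinities).
rng = np.random.default_rng(0)
alloc, scores = collaborative_ranking(rng.random((5, 3)), rng.random((2, 3)))
```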

  11. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Experiments
  • Dataset
  - TREC Financial Times London 1991-1994 Collection: 210,158 articles, 20 TREC initial topics
  • Evaluation Measures
  - Coverage ratio at rank R (measures the diversity of search results)
  - Relevant coverage ratio at rank R
  - Average precision at rank R
  (A sketch of these measures is shown below.)
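To make the measure names concrete, here is a hedged sketch of one plausible reading of them; the exact definitions and normalisations used in the paper may differ. Coverage at R is taken as the fraction of query facets touched by the top-R results, relevant coverage at R restricts this to documents judged relevant, and average precision at R averages the precision values at the ranks (within the top R) where relevant documents occur.

```python
# Hedged sketch of the evaluation measures named on the slide; the exact
# definitions used in the paper may differ from these plausible readings.

def coverage_at_r(ranked_docs, doc_facets, n_facets, r):
    """Fraction of query facets covered by at least one of the top-R documents."""
    covered = set()
    for d in ranked_docs[:r]:
        covered.update(doc_facets.get(d, set()))
    return len(covered) / n_facets

def relevant_coverage_at_r(ranked_docs, doc_facets, relevant, n_facets, r):
    """Coverage at R computed only over documents judged relevant."""
    covered = set()
    for d in ranked_docs[:r]:
        if d in relevant:
            covered.update(doc_facets.get(d, set()))
    return len(covered) / n_facets

def average_precision_at_r(ranked_docs, relevant, r):
    """Mean of precision values at each rank (within the top R) holding a relevant document."""
    hits, precisions = 0, []
    for i, d in enumerate(ranked_docs[:r], start=1):
        if d in relevant:
            hits += 1
            precisions.append(hits / i)
    return sum(precisions) / len(precisions) if precisions else 0.0

# Toy usage: 4 facets, a ranking of 5 documents, 2 of them relevant.
ranking = ["d1", "d2", "d3", "d4", "d5"]
facets = {"d1": {0, 1}, "d2": {1}, "d4": {2, 3}}
relevant = {"d1", "d4"}
print(coverage_at_r(ranking, facets, 4, 3))                      # 0.5
print(relevant_coverage_at_r(ranking, facets, relevant, 4, 5))   # 1.0
print(average_precision_at_r(ranking, relevant, 5))              # 0.75
```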

  12. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Experimental Results
  [Results table comparing some variants of the EM algorithm]

  13. Soulier et al., “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Experimental Results
  • The higher the agreement level is, the fewer documents are assessed as relevant.
  • The improvements of the proposed model remain stable even as the group size increases.

  14. Additional Information: Awarded Papers
  • Best paper
  • Laure Soulier, Lynda Tamine, and Wahiba Bahsoun: “A Collaborative Document Ranking Model for a Multi-faceted Search”
  • Best poster paper
  • Rajendra Prasath, Aidan Duane, and Philip O’Reilly: “Topic Assisted Fusion to Re-rank Texts for Multi-faceted Information Retrieval”
  • Qianli Xing, Yiqun Liu, Min Zhang, Shaoping Ma, and Kuo Zhang: “Characterizing Expertise of Search Engine Users”
  • Best oral presentation
  • Alistair Moffat: “Seven Numeric Properties of Effectiveness Metrics”

  15. Other Recent Papers on “Multi-faceted IR”
  • S. Lee, S.-I. Song, M. Kahng, D. Lee, and S.-G. Lee: “Random Walk based Entity Ranking on Graph for Multidimensional Recommendation” (RecSys’11)
  • A. Kotov, P. N. Bennett, R. W. White, S. T. Dumais, and J. Teevan: “Modeling and Analysis of Cross-Session Search Tasks” (SIGIR’11)
  • W. Kong and J. Allan: “Extracting Query Facets from Search Results” (SIGIR’13)
