
Ranking Multimedia Databases via Relevance Feedback with History and Foresight Support


Presentation Transcript


  1. Ranking Multimedia Databases via Relevance Feedback with History and Foresight Support. DBRank 08, April 12th 2008, Cancún, Mexico. Marc Wichterich, Christian Beecks, Thomas Seidl

  2. Outline • Motivation • Ranking DB according to Earth Mover’s Distance • Search for suitable ground distance via user interaction • Relevance Feedback • The MindReader approach • Challenges in multimedia context • History – Change of user preferences over time • Foresight – Fast exploration • Conclusion and Outlook

  3. Motivation: Ranking according to Earth Mover’s Distance • Transform object features to match those of the other object • Minimum work for transformation: EMD [1] • Feature signatures: {(center1, weight1), (c2, w2), …} [Figure: EMD weight assignment between the signature of object 1 and the signature of object 2] [1] Rubner, Tomasi, Guibas, “A metric for distributions with applications to image databases,” IEEE ICCV 1998.
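
A minimal sketch of how the EMD between two such signatures can be computed as a transportation problem. The Euclidean ground distance and the use of SciPy's linear programming solver are illustrative assumptions, not prescribed by the slides.

```python
import numpy as np
from scipy.optimize import linprog

def emd(sig1, sig2, ground_distance=lambda a, b: float(np.linalg.norm(np.asarray(a) - np.asarray(b)))):
    """sig1, sig2: lists of (center, weight) pairs; returns the normalized minimum work."""
    c1, w1 = zip(*sig1)
    c2, w2 = zip(*sig2)
    n, m = len(c1), len(c2)
    # Cost vector: ground distance between every pair of cluster centers.
    cost = np.array([ground_distance(a, b) for a in c1 for b in c2])
    A_ub, b_ub = [], []
    # Flow leaving cluster i of sig1 must not exceed its weight w1[i].
    for i in range(n):
        row = np.zeros(n * m)
        row[i * m:(i + 1) * m] = 1.0
        A_ub.append(row)
        b_ub.append(w1[i])
    # Flow entering cluster j of sig2 must not exceed its weight w2[j].
    for j in range(m):
        col = np.zeros(n * m)
        col[j::m] = 1.0
        A_ub.append(col)
        b_ub.append(w2[j])
    # The total flow must match the smaller of the two total weights.
    total = min(sum(w1), sum(w2))
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  A_eq=[np.ones(n * m)], b_eq=[total], method="highs")
    return res.fun / total

# Example: two tiny signatures over a 2D feature space.
sig_a = [(np.array([0.2, 0.8]), 0.6), (np.array([0.7, 0.3]), 0.4)]
sig_b = [(np.array([0.25, 0.75]), 0.5), (np.array([0.8, 0.2]), 0.5)]
print(emd(sig_a, sig_b))
```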

  4. Motivation: Ranking according to Earth Mover’s Distance • Requires ground distance gd in feature space • gd(“blue/left”, “purple/right”) vs. gd(“blue/left”, “red/middle”)? • Possibly complex gd: “Blue may move horizontally at low cost if at top of image (sky)” • Idea: Find gd according to user preferences
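
A toy sketch (my own illustration, not from the paper) of the kind of position-aware ground distance the "sky" example alludes to: features are (x, y, r, g, b) cluster descriptors, and horizontal movement is discounted when both clusters lie in the upper band of the image. The parameters sky_band and horizontal_discount are hypothetical.

```python
import numpy as np

def ground_distance(f1, f2, sky_band=0.25, horizontal_discount=0.5):
    """gd between two feature clusters described by (x, y, r, g, b) tuples."""
    f1, f2 = np.asarray(f1, float), np.asarray(f2, float)
    dx, dy = abs(f1[0] - f2[0]), abs(f1[1] - f2[1])
    color = np.linalg.norm(f1[2:] - f2[2:])
    # Cheap horizontal moves if both clusters sit in the upper "sky" band.
    if f1[1] < sky_band and f2[1] < sky_band:
        dx *= horizontal_discount
    return np.sqrt(dx ** 2 + dy ** 2) + color

# The same horizontal shift is cheaper near the top of the image than lower down.
print(ground_distance((0.1, 0.1, 0, 0, 1), (0.6, 0.1, 0, 0, 1)))  # 0.25
print(ground_distance((0.1, 0.6, 0, 0, 1), (0.6, 0.6, 0, 0, 1)))  # 0.5
```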

  5. Motivation: Ranking according to Earth Mover’s Distance • Collect preference information on the feature space • Utilize a histogram-based Relevance Feedback system • Histogram dimensions correspond to points in feature space • System has to deliver information on histogram dimension pairs • Define gd on the feature space • Rank DB according to EMD with ground distance gd on signatures [Figure: correspondence between feature space and histogram]

  6. Relevance Feedback: MindReader Approach [2] • (1) MindReader shows candidate objects • (2) User rates relevant objects • (3) MindReader determines: new query point q and similarity matrix S for an ellipsoid-shaped distance • (4) Goto (1) • Similarity matrix S is the (pseudo) inverse covariance matrix • S reflects user preferences w.r.t. histogram dimensions [2] Ishikawa, Subramanya, Faloutsos, “MindReader: Querying databases through multiple examples,” VLDB 1998.
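
A minimal numpy sketch of the update step as described on this slide: the new query point is the rating-weighted mean of the relevant objects, and S is the (pseudo-)inverse of their weighted covariance matrix, which induces an ellipsoid-shaped quadratic-form distance. Constant scaling factors from the original MindReader derivation that do not change the ranking are omitted here.

```python
import numpy as np

def mindreader_update(X, v):
    """X: (n, d) histograms of the rated objects, v: (n,) positive relevance scores."""
    X = np.asarray(X, float)
    v = np.asarray(v, float)
    v = v / v.sum()
    q = v @ X                              # new query point: rating-weighted mean
    D = X - q                              # deviations from the new query point
    C = D.T @ (D * v[:, None])             # rating-weighted covariance matrix
    S = np.linalg.pinv(C)                  # pseudo-inverse copes with singular C
    return q, S

def ellipsoid_distance(x, q, S):
    """Quadratic-form distance (x - q)^T S (x - q) defining the ellipsoid query."""
    d = np.asarray(x, float) - q
    return float(d @ S @ d)
```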

  7. MindReader: Challenges in multimedia context • Multimedia object histograms usually high-dimensional • Number of rated candidates << histogram dimensionality • Pseudo inverse results in open ellipsoid search region • MindReader implicitly assumes: no info from user → maximum preference • Solution: close the query ellipsoid • Ask user for many more object ratings • Replace assumption: no info from user → as preferred as least preferred direction [3] • Avoid assumptions by tackling “no info from user” [3] Ye, Xu, “Similarity measure learning for image retrieval using feature subspace analysis,” ICCIMA 2003.
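
A quick numerical illustration (my own, not from the slides) of why the pseudo-inverse leads to an open search region when far fewer candidates are rated than there are histogram dimensions: the covariance matrix is rank-deficient, so directions outside the span of the rated objects receive (near) zero quadratic-form distance and the query ellipsoid is unbounded along them.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 64                                   # few rated candidates, many dimensions
X = rng.random((n, d))                         # histograms of the rated objects
q = X.mean(axis=0)                             # query point (uniform ratings)
C = np.cov(X, rowvar=False, bias=True)         # covariance has rank at most n - 1
S = np.linalg.pinv(C)                          # MindReader-style similarity matrix

print("rank of covariance:", np.linalg.matrix_rank(C), "of", d)

# A direction orthogonal to the span of the rated objects gets (near) zero
# distance: a point arbitrarily far along it still looks maximally preferred.
_, _, Vt = np.linalg.svd(X - q)
far_point = q + 1e6 * Vt[-1]
diff = far_point - q
print("distance along a null-space direction:", diff @ S @ diff)
```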

  8. Relevance Feedback with History (1) • “No information” only true within a single iteration • Idea: save information from previous rounds [Figure: iteration k-1 + iteration k = result] • Technique: • Incrementally compute weighted covariance matrix • Exponential aging for ratings of previous iterations • Include relevant points from all previous iterations
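
A hedged sketch of how such an incremental, exponentially aged weighted covariance could look: only aggregated statistics (total weight, weighted sum, weighted sum of outer products) are kept across iterations, old aggregates are rescaled by an aging factor whenever a new round arrives, and the query point and covariance matrix are recovered from the aggregates. The class name, the parameter alpha, and the exact rescaling rule are assumptions for illustration; the slides only state that a parameter sets the aggregated weight of previous rounds.

```python
import numpy as np

class FeedbackHistory:
    """Aggregated relevance feedback across iterations (illustrative sketch)."""

    def __init__(self, dim, alpha=0.3):
        self.alpha = alpha                    # aggregated weight of all past rounds
        self.w = 0.0                          # aggregated rating weight
        self.s = np.zeros(dim)                # aggregated weighted sum of objects
        self.M = np.zeros((dim, dim))         # aggregated weighted outer products

    def add_iteration(self, X, v):
        """X: (n, dim) relevant objects of this round, v: (n,) positive ratings."""
        X, v = np.asarray(X, float), np.asarray(v, float)
        # Exponential aging: rescale earlier aggregates so that all previous
        # rounds together carry alpha times the weight of the current round.
        scale = self.alpha * v.sum() / self.w if self.w > 0 else 0.0
        self.w = scale * self.w + v.sum()
        self.s = scale * self.s + v @ X
        self.M = scale * self.M + X.T @ (X * v[:, None])

    def query_and_similarity(self):
        """Query point and similarity matrix from the aggregates alone."""
        q = self.s / self.w
        C = self.M / self.w - np.outer(q, q)  # weighted covariance from aggregates
        return q, np.linalg.pinv(C)
```

Because only the aggregates w, s and M cross the iteration boundary, old objects and their weights never need to be stored or accessed again, which is the efficiency property summarized on slide 10.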

  9. Relevance Feedback with History (2) [Figure: resulting query regions for two values of the aging parameter, 0.1 and 0.3]

  10. Relevance Feedback with History (Summary) • Feedback information crosses iteration boundaries • The aging parameter sets the aggregated weight for previous rounds • Weighted covariance matrix is computed incrementally • No need to store or access old objects and weights • Efficiently computable from aggregated information • Benefits: • Guarantees closed query ellipsoids • Suitable for high-dimensional multimedia data

  11. Relevance Feedback with Foresight (1) • Framework can be reused to tackle another challenge • Exploratory search: user navigates through the DB • User picks objects to move the query point into a preferred direction • New search region might be oriented contrary to the intended movement • Slow or no advancement • Idea: Introduce a heuristic direction matrix

  12. Relevance Feedback with Foresight (2) • Orientation of matrix D depends on the direction of query point movement • Influence of D as a function of the magnitude of movement • Adjusts seamlessly to phases of exploration and stationary refinement
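
A heuristic sketch of one way such a direction matrix could be combined with the learned similarity matrix (my own interpretation of these two slides, not the authors' formulas): the query movement defines a matrix D whose ellipsoid is elongated along the movement direction, and its blending weight grows with the magnitude of the movement, so the search region stretches ahead during exploration and falls back to S when the query is stationary. The names stretch and saturation and the saturating weight function are assumptions.

```python
import numpy as np

def foresight_matrix(S, q_old, q_new, stretch=10.0, saturation=1.0):
    """Blend the learned similarity matrix S with a movement-based direction matrix."""
    m = np.asarray(q_new, float) - np.asarray(q_old, float)
    magnitude = np.linalg.norm(m)
    if magnitude == 0.0:
        return S                               # stationary refinement: keep S as is
    u = m / magnitude
    d = len(u)
    # Direction matrix: unit cost orthogonal to the movement, reduced cost
    # (i.e. a longer ellipsoid axis) along the movement direction.
    D = np.eye(d) - (1.0 - 1.0 / stretch) * np.outer(u, u)
    # Influence grows smoothly with the magnitude of the query movement.
    influence = magnitude / (magnitude + saturation)
    return (1.0 - influence) * S + influence * D
```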

  13. Observations and Outlook • Preliminary results • Implemented prototype Relevance Feedback system • History approach successfully extends MindReader to high dimensions • Foresight promising, but naïve influence functions sometimes showed too rapid or too slow a change in influence • Work in progress: • Suitable function for the Foresight influence parameter • Heuristics for aggregating Relevance Feedback results into gd • Find gd using signature-based Relevance Feedback
