
ValuePick : Towards a Value-Oriented Dual-Goal Recommender System


Presentation Transcript


  1. ValuePick: Towards a Value-Oriented Dual-Goal Recommender System Leman Akoglu, Christos Faloutsos OEDM, in conjunction with ICDM 2010, Sydney, Australia

  2. Recommender Systems Traditional recommender systems try to achieve high user satisfaction

  3. Dual-goal Recommender Systems Trade-off: user satisfaction vs. vendor profit. Dual-goal recommender systems try to achieve (1) high user satisfaction as well as (2) high-“value” vendor gain.

  4. Dual-goal Recommender Systems [figure: a query vertex and candidate vertices ranked by proximity (v253, v162, v261, v327, ...)]

  5. Dual-goal Recommender Systems [same figure, with each candidate annotated with its network-“value”]

  6. Dual-goal Recommender Systems [same figure] Trade-off: user satisfaction vs. network connectivity

  7. Dual-goal Recommender Systems • Main concerns: • We cannot simply make the highest-“value” recommendations • Recommendations should still reflect users’ likes relatively well [figure: balancing vendor and user]

  8. ValuePick: Main idea • Carefully perturb (change the order of) the proximity-ranked list of recommendations • Controlled by a tolerance ζ for each user [figure: ζ balancing vendor and user]

  9. ValuePick Optimization Framework DETAILS Objective: total expected gain, combining proximity and “value” (assuming proximity ≈ acceptance probability). Constraint: governed by a tolerance ζ ∈ [0,1] relative to the average proximity score of the original top-k.
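A minimal reconstruction of the optimization implied by these labels, with assumed symbols (p_i = proximity of candidate i, v_i = its “value”, x_i ∈ {0,1} = selection variable, p̄_k = average proximity of the original top-k); consult the paper for the exact formulation:

    \max_{x \in \{0,1\}^n} \; \sum_{i=1}^{n} x_i \, p_i \, v_i
    \quad \text{s.t.} \quad \sum_{i=1}^{n} x_i = k,
    \qquad \frac{1}{k} \sum_{i=1}^{n} x_i \, p_i \;\ge\; (1-\zeta)\, \bar{p}_k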

  10. ValuePick ~ 0-1 Knapsack DETAILS The selection maps onto a 0-1 knapsack: each item i has a value and a weight, and a maximum total weight W is allowed. We use CPLEX to solve our integer programming optimization problem.
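The slides use CPLEX; the following is only a rough sketch of the same kind of 0-1 integer program, written with the open-source PuLP/CBC solver, using the formulation from the sketch above and placeholder inputs (p, v, k, zeta, avg_topk) rather than names from the paper:

    import pulp

    def valuepick(p, v, k, zeta, avg_topk):
        """Pick k candidates maximizing expected gain p[i]*v[i], while keeping
        the average proximity within (1 - zeta) of the original top-k average."""
        n = len(p)
        prob = pulp.LpProblem("ValuePick", pulp.LpMaximize)
        x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n)]
        prob += pulp.lpSum(x[i] * p[i] * v[i] for i in range(n))       # total expected gain
        prob += pulp.lpSum(x[i] for i in range(n)) == k                # recommend exactly k items
        prob += pulp.lpSum(x[i] * p[i] for i in range(n)) >= (1 - zeta) * k * avg_topk  # satisfaction constraint
        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        return [i for i in range(n) if x[i].value() > 0.5]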

  11. Pros and Cons of ValuePick Cons: In marketing, it is hard to predict the effect of an intervention in the marketing scheme, i.e., it is not clear how users will respond to ‘adjustments’. Pros: • Tolerance ζ can flexibly (and even dynamically) control the ‘level-of-adjustment’ • Users rate the same item differently at different times, i.e., users have natural variability in their decisions.

  12. Experimental Setup I • Two real networks • Netscience – collaboration network • DBLP – co-authorship network • Four recommendation schemes: • No Gain Optimization (ζ = 0) • ValuePick (ζ = 0.01, ζ = 0.02) • Max Gain Optimization (ζ = 1) • Random • “value” is centrality

  13. Experimental Setup II Simulation steps, given a recommendation scheme s: • At each step T • For each node i • Make a set K of recommendations to node i using s • Node i links to node j ∈ K with prob. proximity(i, j) • Re-compute proximity and centrality scores. We use k = 5 and T = 30.
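A rough, illustrative sketch of one simulation step under these assumptions: proximity is personalized PageRank (matching the PPR mention later in the slides), “value” is a centrality score, and the recommendation scheme is passed in as a function; networkx and all names here are hypothetical, not the authors’ code:

    import random
    import networkx as nx

    def simulate_step(G, scheme, k=5):
        """One step: recommend k non-neighbors to every node; links form with prob. ~ proximity."""
        centrality = nx.closeness_centrality(G)              # "value" of each node (one choice of centrality)
        for i in list(G.nodes()):
            ppr = nx.pagerank(G, personalization={i: 1.0})   # proximity of every node to i
            candidates = {j: s for j, s in ppr.items()
                          if j != i and not G.has_edge(i, j)}
            K = scheme(candidates, centrality, k)             # pick k recommendations for node i
            for j in K:
                if random.random() < candidates[j]:           # accept with prob. proximity(i, j)
                    G.add_edge(i, j)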

  14. Comparison of schemes EXPERIMENTS ValuePick provides a balance between user satisfaction (high E) and vendor gain (small diameter).

  15. Recommend by heuristic EXPERIMENTS Simple perturbation heuristics do not balance user satisfaction and vendor gain properly.

  16. Computational complexity EXPERIMENTS Making k ValuePick recommendations to a given node involves: 1 - finding PPR scores: O(#edges) 2 - solving the ValuePick optimization with CPLEX: ~1/10 sec. among the top 1K nodes
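For context, the O(#edges) cost corresponds to one sweep of power iteration for personalized PageRank (PPR), where each sweep touches every edge once. A minimal sketch, where the adjacency list adj, restart probability alpha, and iteration count are illustrative assumptions:

    def personalized_pagerank(adj, source, alpha=0.15, iters=50):
        """Power iteration for PPR from `source`; each sweep costs O(#edges)."""
        n = len(adj)
        p = [0.0] * n
        p[source] = 1.0
        for _ in range(iters):
            nxt = [0.0] * n
            nxt[source] = alpha * sum(p)             # restart mass flows back to the source
            for u in range(n):
                if adj[u]:
                    share = (1 - alpha) * p[u] / len(adj[u])
                    for v in adj[u]:                 # one pass over all edges per sweep
                        nxt[v] += share
            p = nxt
        return p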

  17. Conclusions • Problem formulation: incorporate the “value” of recommendations into the system • Design of ValuePick: • parsimonious: a single parameter ζ • flexible: ζ can be adjusted for each user dynamically • general: any “value” metric can be used • Performance study: • experiments show a proper trade of user acceptance in exchange for higher gain • fast solutions with CPLEX

  18. THANK YOU www.cs.cmu.edu/~lakoglu lakoglu@cs.cmu.edu
