
Hybrid Recommendation Technologies


Presentation Transcript


  1. Hybrid Recommendation Technologies Francesco Ricci eCommerce and Tourism Research Laboratory ITC-irst Trento – Italy ricci@itc.it http://ectrl.itc.it

  2. Content • Recommender systems • Collaborative-based filtering (CF) • Limitations of CF • Motivations for the proposed research • Case Based Reasoning and Interactive Query Management: • Case/Session Model • Similarity for tree-based models • Query relaxation • Empirical evaluation • Discussion

  3. Recommender Systems • A recommender system helps users make choices without sufficient personal experience of the alternatives • It can suggest products to customers - PUSH • It can provide consumers with information that helps them decide which products to purchase - PULL • Some examples found on the Web: • Amazon.com – looks at the user's past buying history and recommends products bought by users with similar buying behavior • Tripadvisor.com – quotes product reviews from a community of users • Activebuyersguide.com – asks questions about the benefits the user is looking for, to reduce the number of candidate products • They are based on a number of technologies: information filtering, machine learning, adaptive and personalized systems, user modeling, …

  4. Recommendation “Core” Techniques [Burke, 2002] • U is a set of users, I is a set of items/products • [Table of the core recommendation techniques defined over U and I]

  5. Collaborative-Based Filtering • User Model = interaction history • [Figure: a users × items matrix of rates, where 1 = Like, 0 = Dislike, ? = Unknown; the current user's rate vector is compared with every other user's vector via the Hamming distance, and the nearest neighbors are used to predict the current user's unknown rates]
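
The nearest-neighbour idea on this slide can be written down in a few lines. The sketch below is illustrative, not the deck's own code: it assumes rates are stored as 1 (Like), 0 (Dislike) and None (unknown), and all names are invented for the example.

```python
# Illustrative only: rates are 1 (Like), 0 (Dislike) or None (unknown),
# and users are compared with a Hamming-style distance over co-rated items.
from typing import List, Optional

Rating = Optional[int]  # 1 = Like, 0 = Dislike, None = unknown

def hamming_distance(u: List[Rating], v: List[Rating]) -> int:
    """Count the items that both users rated and on which they disagree."""
    return sum(1 for a, b in zip(u, v)
               if a is not None and b is not None and a != b)

def nearest_neighbours(current: List[Rating],
                       users: List[List[Rating]], k: int = 2) -> List[int]:
    """Indices of the k users closest to the current user."""
    return sorted(range(len(users)),
                  key=lambda i: hamming_distance(current, users[i]))[:k]

# Toy example: three users rated four items; the current user has one unknown rate.
current = [1, 0, None, 1]
users = [[1, 0, 1, 1], [0, 1, 0, 0], [1, 1, 0, 1]]
print(nearest_neighbours(current, users))  # -> [0, 2]
```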

  6. Collaborative-Based Filtering • A collection of users ui, i = 1, …, n and a collection of products pj, j = 1, …, m • An n × m matrix of rates vij, with vij = ? if user i did not rate product j • The prediction of the rate of user i for product j is computed as pij = vi + K Σk uik (vkj - vk), where vi is the average rate of user i and K is a normalization factor such that the sum of the uik is 1 • The similarity uik of users i and k is the correlation of their rate vectors: uik = Σj (vij - vi)(vkj - vk) / [Σj (vij - vi)² Σj (vkj - vk)²]^(1/2), where the sums are over the j such that vij and vkj are not “?”.
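
For concreteness, here is a minimal Python sketch of the prediction formula above, assuming a dense n × m matrix with None standing for “?”; reading the normalization factor K as 1 over the sum of the absolute similarities is my assumption, and the function names are illustrative.

```python
# Illustrative sketch of the prediction formula; None stands for "?".
# Treating K as 1 / sum of |u_ik| is an assumption about the normalization.
from math import sqrt
from typing import List, Optional

Matrix = List[List[Optional[float]]]  # n users x m products

def mean_rate(v: Matrix, i: int) -> float:
    rates = [r for r in v[i] if r is not None]
    return sum(rates) / len(rates)

def user_similarity(v: Matrix, i: int, k: int) -> float:
    """Correlation u_ik of users i and k over the items rated by both."""
    common = [j for j in range(len(v[i]))
              if v[i][j] is not None and v[k][j] is not None]
    if not common:
        return 0.0
    vi, vk = mean_rate(v, i), mean_rate(v, k)
    num = sum((v[i][j] - vi) * (v[k][j] - vk) for j in common)
    den = sqrt(sum((v[i][j] - vi) ** 2 for j in common) *
               sum((v[k][j] - vk) ** 2 for j in common))
    return num / den if den else 0.0

def predict(v: Matrix, i: int, j: int) -> float:
    """p_ij = v_i + K * sum_k u_ik * (v_kj - v_k), over users k who rated j."""
    neighbours = [k for k in range(len(v)) if k != i and v[k][j] is not None]
    sims = {k: user_similarity(v, i, k) for k in neighbours}
    norm = sum(abs(s) for s in sims.values())  # K = 1 / norm
    if norm == 0:
        return mean_rate(v, i)
    return mean_rate(v, i) + sum(
        s * (v[k][j] - mean_rate(v, k)) for k, s in sims.items()) / norm
```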

  7. Collaborative-Based Filtering • Pros: requires minimal knowledge engineering effort (knowledge poor) • Users and products are symbols without any internal structure or characteristics • Cons: • Requires a large number of explicit “rates” to bootstrap • Requires products to be standardized (users should have bought exactly the same product) • Assumes that prior behavior determines current behavior without taking into account “contextual” (session-level) knowledge • Does not provide information about products or explanations for the recommendations • Does not support sequential decision making or the recommendation of a “good bundling”, e.g., a travel package.

  8. Requirements and Issues • Recommendation Process • Recommendation requires information search – not only filtering • Human/computer dialogues should be supported – e.g. the user criticizes a suggested product or refines a query definition • Input/Output • Products and services may have complex structures • The final recommendation is a bundling of elementary components • Allow system bootstrapping without an initial memory of rates/interactions • Generalize the definition of rates (implicit rates) • Users • Both short-term (goal-oriented) preferences and long-term (stable) preferences must influence the recommendation • Unregistered users should be allowed to get recommendations • Account for user variability in preferred decision style • The structure/language of users' needs and wants may not match that of the products.

  9. Hybrid Case-Based/Collaborative Ranking • [Figure: given the input query Q and the current case, the system 1. searches the catalogue of locations (travel components), 2. searches similar cases in the case base, 3. outputs a reference set of cases, and 4. sorts the locations loci by similarity to the locations in the reference cases, producing the ranked items as output; it can also suggest changes to Q]

  10. Case/Session Model

  11. Tree-based Case Representation • A case is a rooted tree and each node has a: • node-type (nt): similarity between two nodes in two cases is defined only for nodes with the same node-type • metric-type (mt): the node content structure - how to measure the node's similarity with the corresponding node in a second case • [Figure: a case c1 (nt: case, mt: vector) containing cart1 (nt: cart, mt: vector), clf1 and cnq1; cart1 contains dests1 (nt: destinations, mt: set), accs1 and acts1; dests1 contains the destination ITEMs dest1 and dest2 (nt: destination, mt: vector), whose features X1 (nt: location, mt: hierarchical), X2, X3, X4 are the leaves]
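
A possible in-memory encoding of this tree model is sketched below; the class and field names are invented, and the example instance mirrors the node labels shown in the figure (c1, cart1, dests1, dest1). The node-type label "feature" for the non-hierarchical leaves is an assumption.

```python
# Illustrative encoding of the case tree; class and field names are invented.
from dataclasses import dataclass, field
from typing import Any, List

@dataclass
class CaseNode:
    node_type: str                      # nt: e.g. "case", "cart", "destination"
    metric_type: str                    # mt: e.g. "vector", "set", "hierarchical"
    content: Any = None                 # leaf payload, e.g. a feature value
    children: List["CaseNode"] = field(default_factory=list)

# A case c1 whose cart holds one destination item with features X1..X4.
dest1 = CaseNode("destination", "vector", children=[
    CaseNode("location", "hierarchical", ("Italy", "Trentino", "Fassa", "Canazei")),
    CaseNode("feature", "vector", (1, 1, 1)),     # "feature" nt is assumed
    CaseNode("feature", "vector", 1400),
    CaseNode("feature", "vector", (0, 1, 0)),
])
dests1 = CaseNode("destinations", "set", children=[dest1])
cart1 = CaseNode("cart", "vector", children=[dests1])
c1 = CaseNode("case", "vector", children=[cart1])
```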

  12. Item Representation • A TRAVELDESTINATION item is the feature vector (X1, X2, X3, X4) • Example (dest1): X1 = (Italy, Trentino, Fassa, Canazei), X2 = (1, 1, 1), X3 = 1400, X4 = (0, 1, 0)

  13. Item Query Language • For querying purposes items x are represented as simple feature vectors x = (x1, …, xn); e.g. dest1 becomes (Italy, Trentino, Fassa, Canazei, 1, 1, 1, 1400, 0, 1, 0) • A query is a conjunction of constraints over features: q = c1 ∧ c2 ∧ … ∧ cm, where m ≤ n and each constraint ck is: xi = true if xi is boolean; xi = vk if xi is nominal; lk ≤ xi ≤ uk if xi is numerical
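
A small Python sketch of how such a conjunctive query could be evaluated against a flattened item vector; the constraint encoding (a boolean, a nominal value, or a numeric (l, u) range keyed by feature index) and the example query are assumptions made for illustration.

```python
# Illustrative constraint encoding: True for a boolean constraint, a string for
# a nominal value, or a (lower, upper) pair for a numeric range, keyed by the
# index of the feature in the flattened item vector.
from typing import Dict, Tuple, Union

Item = Tuple
Constraint = Union[bool, str, Tuple[float, float]]

def satisfies(item: Item, query: Dict[int, Constraint]) -> bool:
    """True iff the item meets every constraint of the conjunctive query."""
    for i, c in query.items():
        x = item[i]
        if isinstance(c, bool):            # boolean feature: x_i must be true
            if x != c:
                return False
        elif isinstance(c, tuple):         # numerical feature: l_k <= x_i <= u_k
            if not (c[0] <= x <= c[1]):
                return False
        elif x != c:                       # nominal feature: x_i = v_k
            return False
    return True

# Example: destinations in Trentino with altitude between 1000 and 1500 m.
dest1 = ("Italy", "Trentino", "Fassa", "Canazei", 1, 1, 1, 1400, 0, 1, 0)
print(satisfies(dest1, {1: "Trentino", 7: (1000, 1500)}))  # -> True
```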

  14. Query Relaxation • The goal of the relaxation process is to solve the empty result set problem by finding “maximal” succeeding sub-queries • For boolean queries: one succeeding sub-query can be found in quadratic time; finding all of them requires exponential time • Our approach: • Look for all relaxed sub-queries that change one single constraint and produce some results • Present all these relaxed queries to the user without sorting them • If two or more constraints must be relaxed, this is done only if they belong to the same Abstraction Hierarchy (i.e. they refer to the same concept from the user's point of view)
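
The single-constraint relaxation step can be sketched as follows; it reuses a `satisfies` predicate like the one sketched for slide 13 and is an illustration of the idea, not the actual implementation.

```python
# Illustrative sketch: drop one constraint at a time and keep the sub-queries
# that return a non-empty result set; `satisfies` is any conjunctive-query
# predicate such as the one sketched for slide 13.
from typing import Callable, Dict, List, Tuple

def relax_one_constraint(query: Dict[int, object],
                         catalogue: List[Tuple],
                         satisfies: Callable[[Tuple, Dict[int, object]], bool]):
    """Return (relaxed feature index, sub-query, result count) triples."""
    repaired = []
    for i in query:
        sub_query = {j: c for j, c in query.items() if j != i}
        n_results = sum(1 for item in catalogue if satisfies(item, sub_query))
        if n_results > 0:
            repaired.append((i, sub_query, n_results))
    return repaired
```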

  15. Relaxation Applicability: Accommodation Search • [Chart comparing, for the accommodation search, the number of all queries, of failing queries, and of failing queries repaired by the algorithm]

  16. Example: Scoring Two Destinations • Score(Di) = Maxj {Sim(CC, Cj) * Sim(Di, CDj)} • [Figure: destinations D1 and D2 match the user's query; the current case CC is compared with the similar cases C1 and C2 in the case base, which contain the destinations CD1 and CD2] • Score(D1) = Max{0.2*0.4, 0.6*0.7} = 0.42 • Score(D2) = Max{0.2*0.5, 0.6*0.3} = 0.18

  17. Scoring • A collection of case sessions si, i = 1, …, n and a collection of items/products pj, j = 1, …, m • An n × n sessions similarity matrix S = {sij} and an m × m items/products similarity matrix P = {pij} • An n × m session vs. product incidence matrix A = {aij}, where aij = 1 (aij = 0) if session i does (not) include product j • Score(si, pj) = MAXk,l {sik akl plj} • A product pj gets a high score if it is very similar (plj close to 1) to a product pl that is contained in a session sk (akl = 1) that is very similar to the target session (sik close to 1) • A particular product can be scored (high) even if it is not already present in other sessions/cases, provided that it is similar to other products contained in other similar sessions.
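
A compact sketch of this scoring rule with plain nested lists; the matrices below encode one consistent reading of the worked example of slide 16 (sessions CC, C1, C2; products D1, D2, CD1, CD2), and any entries not fixed by that example are illustrative placeholders.

```python
# Score(s_i, p_j) = MAX_{k,l} { s_ik * a_kl * p_lj } with plain nested lists.
# S, A, P below encode one reading of the slide-16 example
# (sessions: CC, C1, C2; products: D1, D2, CD1, CD2); unspecified entries are placeholders.
from typing import List

def score(S: List[List[float]], A: List[List[int]], P: List[List[float]],
          i: int, j: int) -> float:
    """Score of product j for target session i."""
    return max(S[i][k] * A[k][l] * P[l][j]
               for k in range(len(A)) for l in range(len(A[k])))

S = [[1.0, 0.2, 0.6],        # session similarities: CC vs CC, C1, C2
     [0.2, 1.0, 0.0],
     [0.6, 0.0, 1.0]]
A = [[0, 0, 0, 0],           # CC does not yet contain a catalogue product
     [0, 0, 1, 0],           # C1 contains CD1
     [0, 0, 0, 1]]           # C2 contains CD2
P = [[1.0, 0.0, 0.4, 0.7],   # product similarities: D1 vs D1, D2, CD1, CD2
     [0.0, 1.0, 0.5, 0.3],   # D2 vs D1, D2, CD1, CD2
     [0.4, 0.5, 1.0, 0.0],
     [0.7, 0.3, 0.0, 1.0]]

print(score(S, A, P, 0, 0))  # D1: max(0.2*0.4, 0.6*0.7) ≈ 0.42
print(score(S, A, P, 0, 1))  # D2: max(0.2*0.5, 0.6*0.3) ≈ 0.18
```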

  18. Item Similarity • If X and Y are two items with the same node-type: d(X, Y) = (1/Σi=1..n wi)^(1/2) [Σi=1..n wi di(Xi, Yi)²]^(1/2), where 0 ≤ wi ≤ 1 • The local distance di(Xi, Yi) is: 1 if Xi or Yi is unknown; overlap(Xi, Yi) if Xi is symbolic; |Xi - Yi| / rangei if Xi is a finite integer or real; Jaccard(Xi, Yi) if Xi is an array of Booleans; Hierarchical(Xi, Yi) if Xi is a hierarchy; Modulo(Xi, Yi) if Xi is a circular feature (e.g. month); Date(Xi, Yi) if Xi is a date • Sim(X, Y) = 1 - d(X, Y)
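
A minimal sketch of this weighted distance, implementing only three of the local distances listed above (unknown values, symbolic overlap, and the normalized numeric difference); the remaining metric types (Jaccard, Hierarchical, Modulo, Date) would plug in the same way, and the example features are invented.

```python
# Illustrative: only three local distances are implemented here (unknown,
# symbolic overlap, normalized numeric difference); the example features
# are invented.
from math import sqrt
from typing import List

def local_distance(x, y, kind: str, value_range: float = 1.0) -> float:
    if x is None or y is None:        # unknown value
        return 1.0
    if kind == "symbolic":            # overlap: 0 if equal, 1 otherwise
        return 0.0 if x == y else 1.0
    if kind == "numeric":             # |x - y| / range of the feature
        return abs(x - y) / value_range
    raise ValueError(f"unsupported metric type: {kind}")

def distance(X: List, Y: List, kinds: List[str],
             weights: List[float], ranges: List[float]) -> float:
    """d(X, Y) = sqrt( sum_i w_i * d_i(X_i, Y_i)^2 / sum_i w_i )."""
    s = sum(w * local_distance(x, y, k, r) ** 2
            for x, y, k, w, r in zip(X, Y, kinds, weights, ranges))
    return sqrt(s / sum(weights))

def similarity(X, Y, kinds, weights, ranges) -> float:
    return 1.0 - distance(X, Y, kinds, weights, ranges)

# Example: two symbolic features and one numeric feature with range 1000 m.
X = ["Canazei", "hotel", 1400]
Y = ["Canazei", "b&b", 1200]
print(similarity(X, Y, ["symbolic", "symbolic", "numeric"],
                 [1.0, 1.0, 1.0], [1.0, 1.0, 1000.0]))  # ≈ 0.41
```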

  19. Item Similarity Example • dest1: X1 = (I, TN, Fassa, Canazei), X2 = (1, 1, 1), X3 = 1400, X4 = (0, 1, 0) • dest2: Y1 = (I, TN, Fassa, ?), Y2 = (1, 0, 1), Y3 = 1200, Y4 = (1, 1, 0)

  20. Case Distance • [Figure: the two case trees side by side - c1 (nt: case) with cart1, clf1, cnq1, where cart1 contains dests1 = {dest1, dest2} (nt: destinations, mt: set), accs1, acts1; and c2 with cart2, clf12, cnq2, where cart2 contains dests2 = {dest3, dest4, dest5}, accs2, acts2 - compared node by node between nodes with the same node-type]

  21. Case Distance • [Figure: comparison at the case level (nt: case, mt: vector) - c1 with children cart1, clf1, cnq1 vs. c2 with children cart2, clf12, cnq2]

  22. Case Distance • [Figure: comparison at the cart level (nt: cart, mt: vector) - cart1 with children dests1, accs1, acts1 vs. cart2 with children dests2, accs2, acts2]

  23. Case Distance • [Figure: comparison at the destinations level (nt: destinations, mt: set) - dests1 = {dest1, dest2} vs. dests2 = {dest3, dest4, dest5}]

  24. Empirical Evaluation • [Results table; bold face means significantly different (t-test, p < 0.05)]

  25. Research Areas Intersecting with RS • User Modeling: product rates; user-dependent product classifiers; product preferences; etc. • Information Retrieval: RS may be evaluated in terms of precision and recall; a RS retrieves content that is relevant to the user's information needs • Personalization and Adaptive Hypermedia: recommendations are one-to-one and the presentation is adapted to the user (e.g. explanations) • Mixed Initiative and Conversational Systems: system suggestions or questions interleave with user input and information browsing • Decision Making: the ultimate goal is to support a purchase decision; utility-based ranking has been exploited.

  26. Contribution • User Model: a collection of cases (the recommendation sessions of the user); attribute weights • Information Retrieval: interactive query management; query relaxation; query tightening • Personalization: query refinement suggestions; product ranking; explanations • Conversational: the user initiates the interaction; the system suggests ways to escape from interaction dead-ends • Decision Making: model derived from the literature on consumer behavior.

  27. Thank you!
