
Social Comparisons and Contribution to Online Communities: A Field Experiment on MovieLens



  1. Social Comparisons and Contribution to Online Communities: A Field Experiment on MovieLens • Yan Chen, Maxwell Harper, Joseph Konstan, Xin Sherry Li • June 8, 2007 (www.communitylab.org)

  2. Outline • Online Communities • MovieLens (www.movielens.org) • Recommender systems • Social Comparison Theory • Experimental Design • Results • Discussions

  3. Online Communities • Online communities: groups of people who meet online to • Share information: e.g. cancer support groups • Produce information goods: e.g. open source, Wikipedia • Play games: e.g. the ESP Game (labeling all images on the web) • Carry out business: e.g. Xerox service engineers • Opportunities to create new social capital • Problems: nonparticipation and under-contribution

  4. Under-contribution: Solutions • Incentive-compatible mechanisms for public goods provision • Tax-subsidy schemes • Online communities • Rarely use monetary transfers • Voluntary participation • Voluntary contribution • Lots of information about users • Social information as a non-pecuniary mechanism: • Social comparison theory

  5. Social Comparison Theory • Festinger (1954): people evaluate themselves by comparison with other people • Social comparisons affect behavior (Suls, Martin and Wheeler 02) • Information for the right behavior • Ambiguous situations • Conformity theory: behavior • Akerlof (82), Jones (84) • Bernheim (94) • Inequality aversion: outcome • Fehr and Schmidt (99), Bolton and Ockenfels (00) • Interdependent preferences • Utility depends on average level of consumption: Duesenberry (49), Pollak (76) • Utility depends on ordinal rank: Frank (85), Robson (92), Hopkins and Kornienko (04), Samuelson (04)

  6. Social Comparison in the Lab and Field • Lab experiments • Dictator games: Cason and Mui (98), Krupka and Weber (05), Duffy and Kornienko (07) • Ultimatum bargaining games: Knez and Camerer (95), Duffy and Feltovich (99), Bohnet and Zeckhauser (04) • Coordination games: Eckel and Wilson (06) • Field experiments • Frey and Meier (04): mail fundraising • Shang and Croson (05): on-air fund drive

  7. movielens.org • An active and successful online community • 100,000 users, 15,000 active within the past year • 13 million ratings of 9,043 movies • Activities • Rate movies • Receive recommendations • Collaborative filtering technology • 22% of movies have fewer than 40 ratings • For these, the software can't make accurate predictions

  8. K-Nearest Neighbor Collaborative Filtering [Diagram: the target user's unknown rating ("?") is predicted as a weighted sum of the ratings given by the k most similar users; a minimal sketch follows]
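
Slide 8 was a diagram; only its labels survive in the transcript. As a hedged, minimal sketch of the technique those labels name (not MovieLens's actual implementation; all function and variable names below are invented for illustration), a k-nearest-neighbor collaborative filter predicts the target user's missing rating as a similarity-weighted sum of the ratings of the k most similar users:

    import math

    def cosine_sim(a, b):
        # Cosine similarity over the movies both users have rated.
        common = set(a) & set(b)
        if not common:
            return 0.0
        dot = sum(a[m] * b[m] for m in common)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    def predict(ratings, target, movie, k=5):
        # Weighted sum over the k most similar users who rated the movie.
        neighbors = [(cosine_sim(ratings[target], r), r[movie])
                     for user, r in ratings.items()
                     if user != target and movie in r]
        top = sorted(neighbors, reverse=True)[:k]
        denom = sum(sim for sim, _ in top)
        if denom == 0:
            return None  # too few co-raters: no accurate prediction possible
        return sum(sim * r for sim, r in top) / denom

    # Toy data: each user maps movie id -> rating on a 1-5 star scale.
    ratings = {"u1": {"m1": 5, "m2": 3, "m3": 4},
               "u2": {"m1": 4, "m2": 2, "m3": 5},
               "u3": {"m1": 1, "m3": 2}}
    print(predict(ratings, "u3", "m2"))

The denom == 0 branch is exactly the situation slide 7 describes: movies with too few ratings cannot be predicted accurately, which is why rare-movie ratings are valuable to other users.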

  9. [Placeholder in the original deck: slide on new-user incentives]

  10. Experimental Design • [Timeline: pre-survey, then a six-week personalized-newsletter period with RatingInfo, NetBenefit, and Control conditions] • Stage 1: Pre-experiment survey (398 of 1,966 responded) • Time to search for and rate ten movies • Willingness to pay for a list of top-ten movies • Number of ratings: perceived position • Net benefit: perceived position in the distribution • Stage 2: Experimental newsletter • RatingInfo treatment: 134 users • NetBenefit treatment: 130 • Control: 134

  11. Experimental Design • [Timeline: pre-survey, six weeks of personalized newsletters (RatingInfo, NetBenefit, Control), then post-survey] • Stage 3: Post-experiment survey • MovieLens-related questions • General social survey • Personality • Demographics • Survey response rate: 78%

  12. Subject Pool • Active in the past year • At least 30 ratings • Completed pre-survey

  13. Stage 2: Experiment • Newsletter • RatingInfo treatment: 134 users • NetBenefit treatment: 130 • Control: 134 • Five shortcuts • Rate popular movies: increases own benefit, easy • Rate rare movies: costly, but helps other users • Update the database: costly, but helps other users • Invite a buddy: increases own benefit, easy • Just visit the MovieLens homepage • Users followed for one month to collect behavioral data

  14. Creating Peer Groups
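
Slide 14 was presumably a diagram; only the title survives. As a hedged sketch of what peer groups feed into, the RatingInfo newsletter reported each user's rating count relative to the median of a peer group of similar users. The function name and message wording below are hypothetical, not the study's actual newsletter text:

    from statistics import median

    def rating_info_message(user_count, peer_counts):
        # Compose a RatingInfo-style comparison line for one user's
        # newsletter; hypothetical wording, invented for illustration.
        med = median(peer_counts)
        if user_count < med:
            position = "below"
        elif user_count > med:
            position = "above"
        else:
            position = "equal to"
        return (f"Last month you rated {user_count} movies, {position} "
                f"the median of {med:g} for users similar to you.")

    # Toy peer group of monthly rating counts.
    peers = [2, 5, 8, 12, 40]
    print(rating_info_message(3, peers))   # a below-median user
    print(rating_info_message(20, peers))  # an above-median user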

  15. A Theoretical Framework: A Neoclassical Model (Harper, Li, Chen and Konstan 2005) • User benefit • Recommendation quality • Rating fun • Non-rating fun • User's neoclassical utility function [formula shown as an image in the original slide; a hedged reconstruction follows] • Parameterization • Cobb-Douglas production function • Linear fun and cost functions • Rating: both a private and a public good
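
The slide's formulas were images and did not survive the transcript. A plausible LaTeX reconstruction consistent with the bullets, using illustrative symbols that are not necessarily the paper's notation: recommendation quality is produced Cobb-Douglas-style from own and others' ratings (so a rating is both a private and a public good), while rating fun and rating cost are linear in the user's own rating count:

    \[
      U_i = \underbrace{A\,x_i^{\alpha} X_{-i}^{\beta}}_{\text{recommendation quality}}
          + \underbrace{a\,x_i}_{\text{rating fun}}
          + n_i
          - \underbrace{c\,x_i}_{\text{rating cost}}
    \]

where x_i is user i's number of ratings, X_{-i} the ratings contributed by everyone else, and n_i non-rating fun.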

  16. Solution and Model Estimation • Solution: the equilibrium amount of rating is inefficient (ratings are a public good, so they are under-provided relative to the social optimum) • Model estimation • Explains 34% of the variance in rating behavior

  17. Extension to a Two-Period Model • t: the month before the pre-survey • t+1: the month after the newsletter • X_i: user i's lifetime number of ratings • x_i: user i's monthly number of ratings • d_i: user i's number of database entries • Without social information: the neoclassical model • With social information, two candidate specifications (illustrated below): • Conformity • Difference aversion
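
The transcript does not preserve the functional forms behind "conformity" and "difference aversion". The LaTeX below is a standard illustration, not necessarily the authors' exact specification; \lambda, \alpha, and \beta are illustrative weights, and X_m stands for the median peer statistic shown in the newsletter:

    \[
      \text{Conformity:}\quad
      U_i^{t+1} = u\!\left(x_i^{t+1}\right) - \lambda\,\bigl|X_i - X_m\bigr|
    \]
    \[
      \text{Difference aversion (Fehr--Schmidt style):}\quad
      U_i^{t+1} = u\!\left(x_i^{t+1}\right)
        - \alpha \max\!\left(X_m - X_i,\,0\right)
        - \beta \max\!\left(X_i - X_m,\,0\right)
    \]

Under conformity, users on both sides of the median move toward it; under difference aversion with \alpha > \beta, being behind hurts more than being ahead, which can generate the competitive pattern reported on slide 20.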

  18. Rating Information Treatment

  19. Net Benefit Treatment

  20. RatingInfo Results: Conformity • Below median > median: • Overall: p = 0.02 • New users: p = 0.01 • Above-median users decreased toward the median: • New: no (p = 0.02) • Old: yes • Mid: yes • Median group ≈ Control: • New: yes • Mid: yes • Old: yes
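
The transcript does not say which test produced the pairwise p-values on slides 20 and 22-24. A Wilcoxon rank-sum (Mann-Whitney U) test is one standard nonparametric choice for skewed count data such as numbers of ratings; here is a sketch with invented numbers, not the study's data:

    from scipy.stats import mannwhitneyu

    # Hypothetical post-newsletter monthly rating counts for two
    # RatingInfo subgroups; the figures are invented for illustration.
    below_median = [12, 30, 7, 45, 22, 18, 60, 9]
    at_median = [5, 8, 6, 10, 7, 9, 11, 6]

    stat, p = mannwhitneyu(below_median, at_median, alternative="two-sided")
    print(f"U = {stat}, p = {p:.3f}")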

  21. Conformity and Competition

  22. Rating Info: Other Behavior • Update database • Control > Median (*) • Invite a buddy • Not enough observations • No pairwise comparisons significant

  23. Net Benefit Behavior: Above Average • Rate rare movies • Above > Below (p = 0.01) • Average > Above (p = 0.03) • Update database • Above > Below (p < 0.001) • Above > Average (p < 0.004)

  24. Net Benefit Behavior: Below Average • Rate popular movies • Above > Below (p = 0.068) • Average > Below (p = 0.028) • Invite a buddy • Not enough observations • No pairwise comparisons significant

  25. Altruism and Database Entries • More altruistic users updated more database entries.

  26. Net Benefit Score • Change in net benefit score: below average > about average > above average

  27. Red Queen Effect • The Red Queen said, “It takes all the running you can do, to keep in the same place.” – Lewis Carroll’s Through the Looking-Glass • Rating Info: relative rankings of total movie ratings remain the same • Net Benefit: relative rankings of net benefit scores remain the same
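
The transcript does not state how "relative rankings remain the same" was tested. One simple check is a Spearman rank correlation between each user's pre- and post-newsletter totals: a coefficient near 1 means the ordering was preserved even though everyone's level moved. A sketch with invented data, not the study's:

    from scipy.stats import spearmanr

    # Hypothetical lifetime rating totals for six users before and
    # after the newsletter; invented numbers for illustration only.
    before = [40, 120, 15, 300, 75, 60]
    after = [55, 150, 30, 340, 95, 80]

    rho, p = spearmanr(before, after)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # rho near 1: ranks preserved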

  28. Rating Information vs. Net Benefit

  29. Discussion • Social comparison significantly influences behavior • Rating Information • Below median: increased number of ratings by 530% • Above median: mid and old users decreased number of ratings by 62% • Conformity vs. competitive preferences • Net Benefit • Below average: rated more popular movies • Above average: rated more rare movies; updated more database entries • About average: rated more rare and more popular movies • Effects of altruism

  30. Discussion • Number of ratings • Easy to understand • Mapping from actions to outcome: transparent • Design: effective for users below the median • Net benefit • A difficult concept • Mapping from actions to outcome: not transparent • Design: • Above average: maintain the database • About average: rate rare and popular movies

  31. Future Work • Other forms of social information • Leaderboard: ESP Game • Other rewards • Promotion: Slashdot • Barnstars: Wikipedia • Work-oriented online communities • SourceForge
