Stay Out of My Forum! Evaluating Firm Involvement in Online Rating Communities


Presentation Transcript


  1. Stay Out of My Forum! Evaluating Firm Involvement in Online Rating Communities • Neveen Awad, Wayne State University, Detroit, MI • Hila Etzion, University of Michigan, Ann Arbor, MI

  2. Growth of online word of mouth • A growing number of consumers are contributing content to online product review forums, discussion boards, weblogs (blogs), video sharing communities, etc. • An even larger number of consumers reference online word of mouth (Source: Pew Internet 2006)

  3. Amazon.com’s bet • Amazon.com eliminated its entire budget for television and general-purpose print advertising • “Word of mouth is important because on the Web you can reach so many more people beyond your circle of friends” (Bill Curry, Amazon.com spokesperson)

  4. Example

  5. Comparison of Reviews Example

  6. Should firms be involved with online word of mouth? • “Some retailers are struggling with how they should handle a flood of submissions, and in particular, negative reviews that could make it difficult to sell a product. Many sites simply use automated filters to check reviews for profanity and then publish a majority of them. Others, like Newegg, have employees closely vet each submission and reject a greater percentage of reviews.” (WSJ, 2005)

  7. Lots of questions… few concrete answers • Are online reviews and ratings related to online sales? • What are the consequences of firm intervention activities on the information value of online forums? • How do consumers assess information provided through online reviews?

  8. Literature Review: Effect of Online Word of Mouth on Sales • Senecal and Nantel (2004), Chevalier and Mayzlin (2006): Online product reviews influence consumer purchase decisions • Godes and Mayzlin (2004): Dispersion of conversations among different Usenet groups is significantly related to Nielsen (viewership) ratings of TV shows, but volume is not • Liu (2006), Duan et al. (2005): The volume of Yahoo! Movies discussions has a significant impact on motion picture box office revenues, but the valence does not • Dellarocas, Awad, and Zhang (2005): Early volume of online movie reviews is a proxy for early sales; valence is a significant predictor of word of mouth and of the rate of decay of external publicity

  9. Literature Review: Online Reviews – Bias? • Li and Hitt (2004): Book ratings decline over time, showing a positive bias in reviews written by early buyers • Dellarocas (2003): Under certain conditions, manipulation of online ratings can increase the informativeness of the forum • Chen and Xie (2004): It is not always beneficial for the seller to support a review system • Mayzlin (2006): If third-party signals are sufficiently noisy, consumers listen to promotional chat strategically posted by firms

  10. Research Questions • How do online reviews affect the shape of the demand functions for imperfect substitutes? • Do consumers reference different metrics of online word of mouth depending on the nature and number of reviews? • Does firm filtering of online reviews affect the impact of these reviews on online transactions? • Should online retailers filter bad reviews?

  11. The Model • Retailer selling 2 imperfect substitutes • Retailer selling 2 imperfect substitutes and an online review system • Assumption: w_i = T_i / (T_i + T_j) • E_M is the a priori expected value of the summary statistic M
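
The weighting assumption can be made concrete with a minimal sketch. Here T_i is read as the total number of reviews posted for product i; that reading is an assumption, since the transcript does not define T:

    # Demand weight w_i = T_i / (T_i + T_j) from the model assumption.
    # Assumption: T_i is product i's total review count; the transcript
    # does not define T, so this reading is illustrative only.
    def weight(t_i: float, t_j: float) -> float:
        return t_i / (t_i + t_j)

    # Example: 23 reviews for product 1 and 25 for product 2.
    print(weight(23, 25))  # ~0.479
    print(weight(25, 23))  # ~0.521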

  12. The Model, Ratings {-1, 0, 1} • Retailer selling 2 imperfect substitutes and an online review system • Assumption: consumers expect each rating to be submitted with the same probability • G_i: number of good reviews (+1) for product i • N_i: number of neutral reviews (0) for product i • B_i: number of bad reviews (-1) for product i
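
The later figures plot two summary statistics, the average rating (“Avg”) and the fraction of good reviews (“Fraction of G's”). A minimal sketch of both, computed from these counts; the paper's exact definitions are not reproduced in the transcript, so these are the natural readings:

    # Summary statistics for a product with ratings in {-1, 0, 1}.
    # g, n, b are the counts of good (+1), neutral (0), and bad (-1) reviews.
    def average_rating(g: int, n: int, b: int) -> float:
        # Mean rating: (+1)*g + 0*n + (-1)*b over all submitted reviews.
        return (g - b) / (g + n + b)

    def fraction_good(g: int, n: int, b: int) -> float:
        # Share of reviews that are good (+1).
        return g / (g + n + b)

    # Example with the slide-13 counts for product 1: G1=15, N1=5, B1=3.
    print(average_rating(15, 5, 3))  # ~0.522
    print(fraction_good(15, 5, 3))   # ~0.652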

  13. Should the seller filter? • Parameters: G1=15, G2=20, B1=3, N1=5, N2=3, a=1000, b=1, d=0.2, c1=c2=80, p1=100, p2=90, …=5, …=2.5 • [Figure: Avg, Avg w. bias, Fraction of G's, and Fraction of G's w. bias plotted against B2]

  14. Should the seller filter? • Parameters: G1=15, G2=20, B1=3, N1=5, N2=3, a=1000, b=1, d=0.2, c1=c2=80, p1=90, p2=100, …=3, …=3 • [Figure: Avg, Avg w. bias, Fraction of G's, and Fraction of G's w. bias plotted against B2]
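
A rough sketch of the comparison the two figures above trace out, assuming that “w. bias” means the statistic is recomputed after product 2's bad reviews are filtered out; the demand and cost parameters (a, b, d, c_i, p_i) enter the full profit comparison in the paper but are not reconstructed here:

    # Product 2's summary statistics as its bad-review count B2 grows,
    # with and without filtering. Assumption: filtering simply removes the
    # bad reviews before the statistic is computed; the profit comparison
    # in the original figures also uses a, b, d, c_i, p_i, which the
    # transcript does not fully preserve.
    G2, N2 = 20, 3  # counts from the slides

    for b2 in range(0, 11):
        avg_unfiltered  = (G2 - b2) / (G2 + N2 + b2)
        frac_unfiltered = G2 / (G2 + N2 + b2)
        avg_filtered  = G2 / (G2 + N2)   # all bad reviews removed
        frac_filtered = G2 / (G2 + N2)
        print(b2, round(avg_unfiltered, 3), round(frac_unfiltered, 3),
              round(avg_filtered, 3), round(frac_filtered, 3))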

  15. How do consumers choose M? • If C_M is increasing in B2, then C_M favors product 1; if C_M is decreasing in B2, then it favors product 2 • [Left figure: G1=30, G2=20, B2=0, N1=1, N2=1; regions labeled "Pg & S favor 2", "M=Pg: Pg favors 2, S favors 1", "M=S: Pg & S favor 1"] • [Right figure: G1=50, G2=1, B2=0, N1=0, N2=1; RHS: consumers suspect filtering, LHS: they don't; red = average, blue = percentage of good]

  16. Hypothesis 1: Biased Filtering • When consumers become aware of biased reviews, their usage of online review information will change (White, 1999; Mayzlin, 2006) • Hypothesis 1: When firms implement an intervention strategy aimed at filtering out negative reviews, the percentage of positive reviews will be significantly and positively associated with online transaction amount.

  17. Hypothesis 2: Noise Reduction • Information overload is one of the biggest issues online (Berghel, 1997) • Firm manipulation of online forums can either increase or decrease the informational value of the forum (Dellarocas, 2004). • Hypothesis 2: When firms implement an intervention strategy aimed at “noise reduction”, valence will be significantly associated with online sales.

  18. Data • Collected from a large online retailer • Dates range from April 16, 1999 to February 2, 2006 • The firm changed its review filtering method on March 3, 2005 • The user review data consist of an optional text review of the product together with an integer numerical rating ranging from 1 (worst) to 5 (best) • All reviews are first marked as pending • As the review team goes through the reviews, they either approve or reject them

  19. Summary Statistics

  20. Data Summary

  21. Transactions Per Month

  22. Model • Dependent variable: Log(Online Sales)
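
The transcript gives only the dependent variable, so the sketch below shows the general shape of such a regression. The regressors (average valence, percentage of 5-star reviews, review volume) and the synthetic data are illustrative assumptions, not the paper's exact specification:

    # Illustrative regression of log(online sales) on review metrics.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "valence": rng.uniform(1, 5, n),         # average star rating (1-5)
        "pct_five_star": rng.uniform(0, 1, n),   # share of 5-star reviews
        "n_reviews": rng.integers(1, 200, n),    # review volume
    })
    # Synthetic outcome so the example runs end to end.
    df["log_sales"] = (0.3 * df["valence"] + 0.5 * df["pct_five_star"]
                       + 0.01 * df["n_reviews"] + rng.normal(0, 0.5, n))

    X = sm.add_constant(df[["valence", "pct_five_star", "n_reviews"]])
    print(sm.OLS(df["log_sales"], X).fit().summary())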

  23. Results: Hypothesis 1: Bias Filtering • Hypothesis 1: When firms implement an intervention strategy aimed at filtering out all negative reviews, positive reviews will be significantly and positively associated with online transaction amount • Supported

  24. Results: Hypothesis 2: Noise Reduction • Hypothesis 2: When firms implement an intervention strategy aimed at “noise reduction”, review valence will be significantly and positively associated with online transaction amount • Supported

  25. Conclusion • Before the change, the percentage of ‘1’ ratings was not significantly associated with the number of purchases, but the percentage of ‘5’ ratings was • After the filtering strategy change, the average valence does significantly affect the number of purchases per product • Firms should filter reviews in certain situations

  26. Questions

  27. Online Word of Mouth- Bias?

  28. Effect of Online Word of Mouth on Sales

  29. Should the seller filter if consumers do not expect bias? • When M = average rating: where L = 2G1 + G2 + N1 + N2 and k = 2G2 + G1 + N1 + N2 • If the products have the same margins: when (p1 - c1) = (p2 - c2), then Δ_M ≥ 0, with equality when … = … • Reviews only transfer demand from one product to the other • When … = …: if B1(2G1 + N1 + N2) > B2(2G2 + N1 + N2), then Δ_M ≥ 0 iff (p2 - c2) ≤ (p1 - c1); if B1(2G1 + N1 + N2) < B2(2G2 + N1 + N2), then Δ_M ≥ 0 iff (p2 - c2) > (p1 - c1)
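
A small sketch of the two comparisons stated above. Two assumptions are forced by symbols lost in extraction: the quantity of interest is taken to be the change in the seller's profit from filtering, and the dropped relation is taken to be "greater than or equal to zero":

    # Apply the slide's stated conditions for M = average rating.
    # Assumptions: Delta_M is the profit change from filtering and the
    # lost relation is ">= 0"; both are readings of a garbled slide.
    def filtering_weakly_helps(g1, n1, b1, g2, n2, b2, margin1, margin2):
        lhs = b1 * (2 * g1 + n1 + n2)
        rhs = b2 * (2 * g2 + n1 + n2)
        if lhs > rhs:
            return margin2 <= margin1
        if lhs < rhs:
            return margin2 > margin1
        return None  # knife-edge case: the stated conditions do not apply

    # Example with the slide-13 counts and margins p1-c1=20, p2-c2=10.
    print(filtering_weakly_helps(15, 5, 3, 20, 3, 2, margin1=20, margin2=10))  # True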

  30. Data • Collected from a large online retailer • Dates range from April 16, 1999 to February 2, 2006 • The firm changed its review filtering method on March 3, 2005 • First period: April 16, 1999 through March 3, 2005 • Second period: March 4, 2005 through February 2, 2006 • The user review data consist of an optional text review of the product together with an integer numerical rating ranging from 1 (worst) to 5 (best) • All reviews are first marked as pending • As the review team goes through the reviews, they either approve or reject them
