
Social Media Intelligence: Measuring Brand Sentiment from Online Conversations





Presentation Transcript


  1. Social Media Intelligence: Measuring Brand Sentiment from Online Conversations David A. Schweidel Goizueta Business School Emory University October 2012

  2. What’s Trending on Social Media?

  3. Agenda • Social Dynamics in Social Media Behavior • Why do people post a product opinion? What influences their posting behavior? (Moe and Schweidel 2012) • What is the impact of these social dynamics on product sales? (Moe and Trusov 2011) • Social Media Intelligence (work in progress with W. Moe) • What factors influence social media metrics? • How can we adjust our metrics for different sources of data?

  4. Why do people post? [Diagram: pre-purchase evaluation E[uij] feeds a latent experience model; the purchase decision and product experience yield a post-purchase evaluation Vij = f(uij, E[uij]); incidence and evaluation models capture a selection effect and an adjustment effect on posted product ratings] • Opinion formation versus opinion expression (Berinsky 2005) • Opinion formation • Pre/post purchase (Kuksov and Xie 2008) • Customer satisfaction and word-of-mouth (Anderson and Sullivan 1993, Anderson 1998) • Opinion expression • Opinion dynamics (Godes and Silva 2009, Li and Hitt 2008, Schlosser 2005) • Opinion polls and voter turnout (see, for example, McAllister and Studlar 1991)

  5. Selection Effects: What influences participation? • Extremely dissatisfied customers are more likely to engage in offline word-of-mouth (Anderson 1998) • Online word-of-mouth is predominantly positive (e.g. Chevalier and Mayzlin 2006, Dellarocas and Narayan 2006) • Subject to the opinions of others • Bandwagon effects (McAllister and Studlar 1991, Marsh 1984) • Underdog effects (Gartner 1976, Straffin 1977) • Effect of consensus (Epstein and Strom 1981, Dubois 1983, Jackson 1983, Delli Carpini 1984, Sudman 1986)

  6. Adjustment effects:What influences posted ratings? • Empirical evidence of opinion dynamics • Online opinions decline as product matures (Li and Hitt 2008) • Online opinions decline with ordinality of rating (Godes and Silva 2009) • Other behavioral explanations of opinion dynamics • “Experts” differentiate from the crowd by being more negative (Schlosser 2005) • “Multiple audience” effects when opinion variance is high (Fleming et al 1990)

  7. Modeling Overview • Online product ratings for bath, fragrance and home retailer over 6 months in 2007 (sample of 200 products with 3681 ratings) • Two component model structure (Ying, Feinberg and Wedel 2006): • Incidence (Probit) • Evaluation (Ordered Probit) • Product utility links incidence and evaluation models (non-linear) • Bandwagon versus differentiation effects • Covariates of ratings environment • Separate but correlated effects in each model component • Product and individual heterogeneity
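The two-component structure above can be sketched in code: a probit for the incidence (post or not) decision and an ordered probit for the evaluation (rating) decision, both driven by the same product utility. This is an illustrative sketch of the model form, not the authors' estimation code; the coefficient names (beta0, beta1) and cutpoint values are hypothetical.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def incidence_prob(utility, beta0, beta1):
    """Probit incidence: P(post) = Phi(beta0 + beta1 * utility).
    beta0 captures an individual's baseline posting propensity."""
    return norm_cdf(beta0 + beta1 * utility)

def rating_probs(utility, cutpoints):
    """Ordered probit over a 5-point rating scale.
    cutpoints: 4 increasing thresholds partitioning the latent scale."""
    cdfs = [norm_cdf(c - utility) for c in cutpoints]
    probs = [cdfs[0]]
    probs += [cdfs[k] - cdfs[k - 1] for k in range(1, len(cutpoints))]
    probs.append(1.0 - cdfs[-1])
    return probs
```

In the full model the two components share correlated covariate effects and heterogeneity; this sketch shows only how utility enters each decision.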

  8. Role of post-purchase evaluation (Vij) in rating incidence

  9. Classifying Opinion Contributors • Groups based on frequency of posts (β0)

  10. Empirical Trends • Posted ratings • Average rating decreases over time • Variance increases over time • Poster composition • Community-builders are over-represented in the posting population • As the forum evolves, participation from community-builders increases while that of LI and BW posters decreases

  11. Effect of Opinion Variance • Customer bases have the same mean but different variance in opinions • With a polarized customer base, ratings exhibit: • A lower average with a significant decreasing trend • Greater variance • Negative ratings do not necessarily signal a lower average opinion among customers [Figure: posted ratings for a median customer base versus a polarized customer base]
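The polarized-versus-median comparison can be illustrated with a small simulation in which extreme opinions are more likely to be posted (a simple selection effect). The posting-probability rule and the two example customer bases below are illustrative assumptions, not the paper's model.

```python
import random
import statistics

def simulate_ratings(opinions, n=10000, seed=1):
    """Draw n customers from a base of true opinions (1-5 scale);
    customers with more extreme opinions are more likely to post."""
    rng = random.Random(seed)
    posted = []
    for _ in range(n):
        u = rng.choice(opinions)          # a customer's true opinion
        p_post = 0.2 + 0.15 * abs(u - 3)  # extremes post more often (assumed rule)
        if rng.random() < p_post:
            posted.append(u)
    return statistics.mean(posted), statistics.pvariance(posted)

# Two hypothetical customer bases with the same mean opinion (3.4)
median_base = [3, 3, 3, 4, 4]        # opinions clustered near the mean
polarized_base = [1, 2, 5, 4, 5]     # same mean, higher variance
```

Running both bases through the simulation shows the polarized base producing posted ratings with greater variance and a lower average, even though the underlying mean opinion is identical.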

  12. Conclusions: Individual-level Analysis • Empirical findings • Heterogeneity in posting incidence and evaluation • Incidence and evaluation behavior are related and can result in a systematic evolution of the posting population • General trends in the evolution of product forums: • Dominated by “activists” • Participation by activists tends to increase as the forum evolves, while participation by low-involvement individuals tends to decrease • Implications: • The ratings environment does not necessarily reflect the opinions of the entire customer base, or even the socially unbiased opinions of the posters • Posting behavior is subject to venue effects…

  13. Posting Decisions [Diagram: an opinion/brand evaluation drives three decisions: Do I post? Where do I post? (venue, format) What do I post? (sentiment toward a product, domain, or attribute). Together these decisions produce social media metrics.]

  14. The value of social media as a research tool • Does it matter where (i.e., blogs, microblogs, forums, ratings/reviews, etc.) we listen? • Product/topic differences across venues? • Systematic sentiment differences across venues? • Venue specific trends and dynamics? • How do social media metrics compare to other available measures?

  15. Social media monitoring IN PRACTICE • Early warning system: Kraft removed trans fats from Oreos in 2003 after monitoring blogs • Customer feedback: Land of Nod monitors reviews to help with product modifications and redesigns • Measuring sentiment: social media listening platforms collect comments across venues IN RESEARCH • Twitter to predict stock prices (Bollen, Mao and Zeng 2011) • Twitter to predict movie sales (Rui, Whinston and Winkler 2009) • Discussion forums to predict TV ratings (Godes and Mayzlin 2004) • Ratings and reviews to predict sales (Chevalier and Mayzlin 2006)

  16. Does the source of data matter? • Online venues (e.g., blogs, forums, social networks, micro-blogs) differ in: • Extent of social interaction • Amount of information • Audience attracted • Focal product/attribute • Venue is a choice • Consumers seek out brand communities (Muniz and O’Guinn 2001) • Venue depends on posting motivation (Chen and Kirmani 2012) • Social dynamics affect posting (Moe and Schweidel 2012, Moe and Trusov 2011)

  17. Research Objective • Assess online social media as a listening tool • Disentangle the following factors that can systematically influence posted sentiment • Venue differences • Product and attribute differences • Within venue trends and dynamics • Examine differences across different venue types • Sentiment • Product and attribute differences • Implications for social media monitoring and metrics

  18. Social Media Data • Provided by Converseon (leading online social media listening platform and agency) • Sample of approximately 500 postings per month pertaining to target brand • Comments manually coded for: • Sentiment (positive, neutral, negative) • Venue and venue format • Focal product/attribute • Categories: (1) enterprise software, (2) telecommunications, (3) credit card services, and (4) automobile manufacturing

  19. Data for Enterprise Software Brand • 140 products within the brand portfolio • 59 brand attributes (e.g., compatibility, price, service, etc.) • Social Media data spanned a 15 month period • June 2009 – August 2010 • 7565 posted comments • Across 800+ domains

  20. Modeling Social Media Sentiment • Comments coded as “negative”, “neutral”, or “positive” • Ordered probit regression in which latent sentiment combines a venue-specific brand sentiment, a product effect, and an attribute effect
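The three-category ordered probit can be sketched as follows: latent sentiment is the sum of the venue-specific brand sentiment and the product and attribute effects, and two cutpoints map it to negative/neutral/positive. The cutpoint values here are hypothetical placeholders, not estimates from the paper.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def sentiment_probs(venue_effect, product_effect, attribute_effect,
                    c1=-0.5, c2=0.5):
    """Ordered probit over {negative, neutral, positive}.
    Latent sentiment = venue-specific brand sentiment
                       + product effect + attribute effect."""
    mu = venue_effect + product_effect + attribute_effect
    p_neg = norm_cdf(c1 - mu)
    p_neu = norm_cdf(c2 - mu) - p_neg
    p_pos = 1.0 - norm_cdf(c2 - mu)
    return {"negative": p_neg, "neutral": p_neu, "positive": p_pos}
```

Raising any one component shifts probability mass from "negative" toward "positive", which is how the model separates venue, product, and attribute contributions to observed sentiment.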

  21. What affects venue-specific brand sentiment? • General brand impression (GBI) • Domain and venue effects (including dynamics): a domain effect, venue-specific dynamics, and a venue-format effect

  22. Venue Attractiveness • Model venue format as a choice made by the poster • Multinomial logit model capturing the effect of content on venue choice through product and attribute effects
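A minimal sketch of the multinomial logit venue choice: each venue format gets a utility (intercept plus content effects), and choice probabilities are the softmax of those utilities. The venue names, intercepts, and content effects in the usage example are hypothetical.

```python
import math

def venue_choice_probs(venue_intercepts, content_effects):
    """Multinomial logit: P(venue v) is proportional to
    exp(intercept_v + content effect_v)."""
    utils = {v: venue_intercepts[v] + content_effects.get(v, 0.0)
             for v in venue_intercepts}
    m = max(utils.values())                      # subtract max for stability
    expu = {v: math.exp(u - m) for v, u in utils.items()}
    z = sum(expu.values())
    return {v: e / z for v, e in expu.items()}
```

For example, `venue_choice_probs({"blog": 0.0, "forum": 0.5, "microblog": 0.3}, {"forum": 0.2})` gives the forum the highest choice probability, reflecting content that raises its utility.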

  23. Model Comparisons • Baseline model • Independent sentiment and venue decisions • Controlling for product, topic and domain effects

  24. Sentiment metrics vary depending on what you are measuring. [Table: correlations among sentiment metrics]

  25. Sentiment Differences across Venues (bv(i))

  26. Divergent Trends across Venues (fv(i),t(i)) • Sentiment varies across venues • Venue-specific sentiment is subject to venue-specific dynamics * Blogs, forums and microblogs are the 3 most common venues

  27. Attribute Effects on Sentiment (aa(i),1) • Provides attribute-specific sentiment metrics • Empirical measures are problematic due to data sparsity • Correlation between model-based effects and observed attribute-sentiment metrics = -.276

  28. Venue Attractiveness Results Posters with positive sentiments toward the brand are attracted to forums and microblogs. Forums attract comments that focus on different products and attributes than microblogs.

  29. Attribute effects on venue choice (aa(i),2) [Tables: top 10 and bottom 10 attribute effects] * For products with >5 mentions only

  30. Predictive Value of GBI • Offline brand tracking survey • Satisfaction surveys conducted from Nov 2009 to Aug 2010 in waves (overlapping with the last 10 months of social media data) • Approximately 100 surveys conducted per month • 7 questions regarding overall sentiment toward the brand • Company stock price • Weekly and monthly closing prices for the firm • Weekly and monthly S&P closing values • June 2009 to September 2010 (extra month for lag)

  31. GBI vs. Offline Survey • Potential for GBI as a lead indicator • Correlation with survey • GBI(t) = .375 [.277,.469]* • GBI(t-1) = .875 [.824,.919]* • Avg sentiment = -.0346 • Blogs = .678 • Forums = .00410 • Microblogs = .751
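The lead-indicator comparison above amounts to correlating lagged GBI with the contemporaneous survey score. A small sketch of that computation (the function names and the lag-1 alignment are illustrative, not the paper's exact procedure):

```python
import statistics

def pearson(x, y):
    """Pearson correlation of two equal-length series."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y) * len(x))

def lead_correlation(gbi, survey, lag=1):
    """Correlate GBI at time t-lag with the survey score at time t,
    i.e., test GBI as a leading indicator."""
    if lag == 0:
        return pearson(gbi, survey)
    return pearson(gbi[:-lag], survey[lag:])
```

A lagged correlation that exceeds the contemporaneous one, as with the .875 versus .375 figures reported above, is what motivates treating GBI as a lead indicator.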

  32. GBI and Stock Price (DV = monthly close) [Table: posterior means] * Closing price in month

  33. Observed Social Media Metrics (DV = monthly close) * Closing price in month

  34. Conclusions • Social media behavior varies across venue formats → need to account for the source of social media data • Potential to use social media as market research → an adjusted measure (GBI) can serve as a lead indicator • Implications for academic research that uses social media measures and for practitioners who monitor social media sentiment • Next steps • Additional data sets • Simulations of different GBI scenarios and resulting metrics
