
Marketing Research

Marketing Research. Aaker, Kumar, Day, Seventh Edition. Instructor's Presentation Slides. Chapter Nineteen: Correlation Analysis and Regression Analysis.

Presentation Transcript


  1. Marketing Research Aaker, Kumar, Day Seventh Edition Instructor’s Presentation Slides

  2. Chapter Nineteen Correlation Analysis and Regression Analysis

  3. Correlation Analysis
  • Measures the strength of the relationship between two or more variables
  • Correlation measures the degree of association between two intervally scaled variables
  Aaker, Kumar, Day

  4. Correlation Analysis (Contd.)
  Positive Correlation
  • Tendency for a high value of one variable to be associated with a high value of the second
  Population Correlation (ρ)
  • Database includes the entire population

  5. Correlation Analysis (Contd.)
  Sample Correlation (r)
  • Measure is based on a sample
  • Reflects the tendency for points to cluster systematically about a straight line rising or falling from left to right on a scatter diagram
  • r lies between −1 and +1 (−1 ≤ r ≤ +1)
  • r = 0 indicates an absence of linear association

  6. Simple Correlation Coefficient
  • Plot the points on a scatter diagram
  • The covariance between the two variables gives a measure of association: cov(x, y) = Σ(xi − x̄)(yi − ȳ)
  • Divide the association expression by the sample size, n, so that the measure does not increase simply by increasing the sample size

  7. Simple Correlation Coefficient (Contd.)
  • Divide the measure by the sample standard deviations of x and y: r = cov(x, y) / (sx sy)
  • The sample coefficient of correlation, r, is independent of sample size and units of measurement
  • It lies between −1 and +1
  • It does not imply any causal relationship between the variables
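The construction described on slides 6–7 can be sketched numerically. A minimal Python example, using made-up data: compute the covariance of the deviations, divide by the two standard deviations, and obtain a unit-free correlation lying in [−1, +1].

```python
# Minimal sketch of the sample correlation coefficient; the data are
# invented for illustration only.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]  # roughly linear in x

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Covariance: average cross-product of deviations (divided by n so the
# measure does not grow with the sample size, as the slide notes)
cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / n

# Standard deviations of x and y, in the same n-divided form as the covariance
sd_x = math.sqrt(sum((xi - mean_x) ** 2 for xi in x) / n)
sd_y = math.sqrt(sum((yi - mean_y) ** 2 for yi in y) / n)

# Dividing by both standard deviations makes r unit-free and bounded
r = cov_xy / (sd_x * sd_y)
print(round(r, 4))
```

Because the points here lie nearly on a rising straight line, r comes out close to +1.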

  8. Testing the Significance of the Correlation Coefficient
  • Null hypothesis: H0: ρ = 0
  • Alternative hypothesis: Ha: ρ ≠ 0
  • Test statistic: t = r √(n − 2) / √(1 − r²), with n − 2 degrees of freedom
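The test statistic is a one-line computation. In this sketch, the observed r and the sample size n are hypothetical values, not figures from the text.

```python
import math

# Hypothetical inputs: correlation observed in a sample of n pairs
r = 0.6
n = 30

# Test statistic from the slide: t = r * sqrt(n - 2) / sqrt(1 - r^2),
# referred to a t distribution with n - 2 degrees of freedom
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
print(round(t, 3))
```

The resulting t is then compared with the critical t value for n − 2 degrees of freedom at the chosen significance level.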

  9. Partial Correlation Coefficient
  • Provides a measure of association between two variables after controlling for the effects of one or more additional variables
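The slide gives no formula, but one standard way to compute a first-order partial correlation (x and y, controlling for a single variable z) is from the three pairwise correlations. The three r values below are hypothetical.

```python
import math

# Hypothetical pairwise correlations among x, y, and the control variable z
r_xy, r_xz, r_yz = 0.70, 0.50, 0.60

# First-order partial correlation of x and y, controlling for z:
# the association left between x and y after removing what each shares with z
r_xy_z = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz ** 2) * (1 - r_yz ** 2))
print(round(r_xy_z, 4))
```

Here the partial correlation (≈0.58) is smaller than the raw r_xy of 0.70, because part of the x–y association is accounted for by z.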

  10. Regression Analysis
  • Used to understand the nature of the relationship between two or more variables
  • A dependent or response variable (Y) is related to one or more independent or predictor variables (Xs)
  • The object is to build a regression model relating the dependent variable to one or more independent variables
  • The model can be used to describe, predict, and control the variable of interest on the basis of the independent variables

  11. Simple Linear Regression
  Yi = β0 + β1Xi + εi, where
  • Y: dependent variable
  • X: independent variable
  • β0: model parameter; the mean value of the dependent variable (Y) when the independent variable (X) is zero

  12. Simple Linear Regression (Contd.)
  • β1: model parameter; the slope that measures the change in the mean value of the dependent variable associated with a one-unit increase in the independent variable
  • εi: error term that describes the effects on Yi of all factors other than the value of Xi

  13. Assumptions of the Regression Model
  • The error term is normally distributed (normality assumption)
  • The mean of the error term is zero (E{εi} = 0)
  • The variance of the error term is constant and independent of the values of X (constant variance assumption)
  • The error terms are independent of each other (independence assumption)
  • The values of the independent variable X are fixed (non-stochastic X)

  14. Estimating the Model Parameters
  • Calculate point estimates b0 and b1 of the unknown parameters β0 and β1
  • Obtain a random sample and use the information from the sample to estimate β0 and β1
  • Obtain the line of best "fit" for the sample data points, the least squares line: ŷi = b0 + b1xi

  15. Values of the Least Squares Estimates b0 and b1
  b1 = [n Σxiyi − (Σxi)(Σyi)] / [n Σxi² − (Σxi)²]
  b0 = ȳ − b1x̄
  where ȳ = Σyi / n and x̄ = Σxi / n
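The estimate formulas translate directly into code. A small sketch with invented data:

```python
# Least-squares estimates b0 and b1 from the slide formulas;
# the data are illustrative only.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

# b1 = [n*Σxy − (Σx)(Σy)] / [n*Σx² − (Σx)²]
b1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)

# b0 = ȳ − b1*x̄
b0 = sum_y / n - b1 * (sum_x / n)
print(round(b0, 3), round(b1, 3))
```

For these numbers the fitted line is ŷ = 0.14 + 1.96x.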

  16. Residual Value
  • The difference between the actual and predicted values
  • An estimate of the error in the population: ei = yi − ŷi = yi − (b0 + b1xi)
  • b0 and b1 minimize the residual or error sum of squares (SSE): SSE = Σei² = Σ(yi − ŷi)² = Σ[yi − (b0 + b1xi)]²
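Residuals and SSE can be sketched with a small data set; the coefficients b0 = 0.14 and b1 = 1.96 hard-coded below are the least-squares estimates for these (invented) data.

```python
# Residuals and error sum of squares for an illustrative fit.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]

# Least-squares coefficients for these data
b0, b1 = 0.14, 1.96

# Residual: actual value minus predicted value, e_i = y_i - (b0 + b1*x_i)
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]

# Error sum of squares, the quantity the least-squares line minimizes
sse = sum(e ** 2 for e in residuals)
print(round(sse, 4))
```

A useful check: because the fit includes an intercept, the residuals of a least-squares line sum to zero.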

  17. Testing the Significance of the Independent Variables
  Null hypothesis
  • There is no linear relationship between the independent and dependent variables
  Alternative hypothesis
  • There is a linear relationship between the independent and dependent variables

  18. Testing the Significance of the Independent Variables (Contd.)
  • Test statistic: t = (b1 − β1) / sb1
  • Degrees of freedom: ν = n − 2
  • Hypotheses tested: H0: β1 = 0 versus Ha: β1 ≠ 0
  • Decision rule: reject H0: β1 = 0 if α > p-value
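The t test for the slope can be sketched with the same style of illustrative data; b1 = 1.96 and SSE = 0.092 below are the least-squares results for these invented numbers, and the standard error formula sb1 = s / √Sxx (with s² = SSE / (n − 2)) is the standard one, not shown on the slide.

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)
b1, sse = 1.96, 0.092  # slope and SSE of the least-squares fit of these data

# Standard error of b1: s / sqrt(Sxx), where s^2 = SSE / (n - 2)
mean_x = sum(x) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
s_b1 = math.sqrt(sse / (n - 2)) / math.sqrt(sxx)

# Test statistic under H0: beta1 = 0, with n - 2 degrees of freedom
t = b1 / s_b1
print(round(t, 2))
```

The very large t here simply reflects how tightly these illustrative points hug the fitted line.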

  19. Predicting the Dependent Variable
  ŷi = b0 + b1xi
  • The error of prediction is yi − ŷi
  • Σ(yi − ȳ)² = Σ(ŷi − ȳ)² + Σ(yi − ŷi)²
  • Total variation (SST) = explained variation (SSM) + unexplained variation (SSE)

  20. Predicting the Dependent Variable (Contd.)
  SST
  • The sum of squared prediction errors that would be obtained if we did not use x to predict y
  SSE
  • The sum of squared prediction errors obtained when we use x to predict y
  SSM
  • The reduction in the sum of squared prediction errors accomplished by using x to predict y

  21. Coefficient of Determination (R²)
  • A measure of the regression model's ability to predict
  R² = (SST − SSE) / SST = SSM / SST = explained variation / total variation
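The variation decomposition and R² can be computed directly. The data below are illustrative, and b0 = 0.14, b1 = 1.96 are the least-squares estimates for these particular numbers.

```python
# SST / SSE / SSM decomposition and the coefficient of determination.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
b0, b1 = 0.14, 1.96  # least-squares fit for these data

mean_y = sum(y) / len(y)
y_hat = [b0 + b1 * xi for xi in x]

sst = sum((yi - mean_y) ** 2 for yi in y)              # total variation
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))  # unexplained variation
ssm = sst - sse                                        # explained variation

# R^2 = (SST - SSE) / SST = SSM / SST
r2 = ssm / sst
print(round(r2, 4))
```

For a simple linear regression, this R² equals the square of the sample correlation r between x and y.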

  22. Multiple Linear Regression
  • A linear combination of predictor factors is used to predict the outcome or response factor
  • Involves computation of a multiple linear regression equation
  • More than one independent variable is included in a single linear regression model

  23. Evaluating the Importance of Independent Variables
  • Which of the independent variables has the greatest influence on the dependent variable?
  • Consider the t-value for each βi: H0: βi = 0
  • If the null hypothesis is true, bi (a non-zero estimate) was simply a sampling phenomenon

  24. Examine the Size of the Regression Coefficients
  • Use beta coefficients when the independent variables are in different units of measurement:
  Standardized βi = bi × (standard deviation of xi) / (standard deviation of Y)
  • Compare the β coefficients; the largest absolute value represents the variable with the strongest impact on the dependent variable
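Standardization makes coefficients comparable across units. In this sketch the variable names, raw coefficients, and standard deviations are all hypothetical.

```python
# Standardized beta coefficients: beta_i = b_i * sd(x_i) / sd(Y).
# All values below are hypothetical, for illustration only.
b = {"price": -0.8, "ad_spend": 0.05}       # raw regression coefficients
sd_x = {"price": 1.2, "ad_spend": 40.0}     # predictor standard deviations
sd_y = 5.0                                  # standard deviation of Y

# Rescaling puts each predictor's effect in comparable, unit-free terms
beta = {name: b[name] * sd_x[name] / sd_y for name in b}
print({k: round(v, 3) for k, v in beta.items()})
```

Note how the ranking flips: ad_spend has the smaller raw coefficient but the larger standardized one, because its measurement scale is much coarser.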

  25. Multicollinearity
  • Correlations among the predictor variables
  • Discovered by examining the correlations among the X variables
  Selecting Predictor Variables
  • Include only those variables that account for most of the variation in the dependent variable
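Examining the correlations among the X variables can be sketched as follows; the three predictor columns are invented, with x2 deliberately constructed to be nearly a linear function of x1.

```python
import math

def corr(a, b):
    """Sample correlation between two equal-length lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    sa = math.sqrt(sum((ai - ma) ** 2 for ai in a))
    sb = math.sqrt(sum((bi - mb) ** 2 for bi in b))
    return cov / (sa * sb)

# Hypothetical predictors: x2 is almost a linear function of x1
x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [2.1, 4.0, 6.1, 7.9, 10.0]
x3 = [5.0, 1.0, 4.0, 2.0, 3.0]

# A near-1 correlation between two predictors signals multicollinearity
print(round(corr(x1, x2), 4), round(corr(x1, x3), 4))
```

Here the x1–x2 correlation is close to 1 (a multicollinearity warning), while x1–x3 shows only a mild association.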

  26. Stepwise Regression
  • Predictor variables enter or are removed from the regression equation one at a time
  Forward Addition
  • Start with no predictor variables in the regression equation, i.e., y = β0 + ε
  • Add variables one at a time if they meet certain criteria in terms of the F-ratio

  27. Stepwise Regression (Contd.)
  Backward Elimination
  • Start with the full regression equation, i.e., y = β0 + β1x1 + β2x2 + ... + βrxr + ε
  • Remove predictors based on the F-ratio
  Stepwise Method
  • The forward addition method is combined with removal of predictors that no longer meet the specified criteria at each step
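Forward addition can be sketched end to end. This is a simplified, dependency-free sketch, not the textbook procedure: it uses the gain in R² as the entry criterion in place of the formal F-ratio test, and the data, threshold, and variable names are all hypothetical.

```python
def fit_r2(predictor_cols, y):
    """R^2 of a least-squares fit of y on the given columns plus an
    intercept, via the normal equations and Gaussian elimination."""
    n = len(y)
    X = [[1.0] + [col[i] for col in predictor_cols] for i in range(n)]
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]                                   # X'X
    c = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]  # X'y
    for p in range(k):                                        # elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for s in range(k):
                A[q][s] -= f * A[p][s]
            c[q] -= f * c[p]
    b = [0.0] * k
    for p in range(k - 1, -1, -1):                            # back-substitute
        b[p] = (c[p] - sum(A[p][q] * b[q] for q in range(p + 1, k))) / A[p][p]
    y_hat = [sum(b[q] * X[i][q] for q in range(k)) for i in range(n)]
    y_bar = sum(y) / n
    sst = sum((yi - y_bar) ** 2 for yi in y)
    sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
    return 1.0 - sse / sst

y = [3.0, 5.1, 7.2, 8.8, 11.1, 13.0]
cols = {
    "x1": [1.0, 2.0, 3.0, 4.0, 5.0, 6.0],  # strongly related to y
    "x2": [2.0, 1.0, 4.0, 3.0, 6.0, 5.0],  # weakly related to y
}

selected, current_r2 = [], 0.0
remaining = list(cols)
while remaining:
    # Try each remaining predictor alongside those already selected
    trials = {nm: fit_r2([cols[s] for s in selected] + [cols[nm]], y)
              for nm in remaining}
    best = max(trials, key=trials.get)
    if trials[best] - current_r2 < 0.01:   # hypothetical entry threshold
        break
    selected.append(best)
    remaining.remove(best)
    current_r2 = trials[best]

print(selected)
```

With these numbers, x1 enters first and already explains almost all of the variation, so x2 never clears the entry threshold; backward elimination would run the same loop in reverse, starting from the full equation and dropping predictors.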
