
Chapter 10



  1. Chapter 10: Two Quantitative Variables. Topics: Inference for Correlation (Simulation-Based Methods); Least-Squares Regression; Inference for Correlation and Regression (Theory-Based Methods).

  2. Sections 10.1 and 10.2: Summarizing Two Quantitative Variables and Inference for Correlation (Simulation). I was interested in the relationship between the time it takes a student to finish a test and the resulting score, so I collected some data …

  3. Scatterplot • Put the explanatory variable on the horizontal axis. • Put the response variable on the vertical axis.
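
A minimal plotting sketch in Python; the arrays below are hypothetical stand-ins, not the actual class data.

```python
# Sketch of a scatterplot: explanatory variable on x, response on y.
# The values are hypothetical stand-ins for the time/score data.
import matplotlib.pyplot as plt

time_minutes = [30, 35, 42, 48, 55]   # explanatory: time to finish the test
test_scores = [93, 90, 88, 84, 80]    # response: resulting score

plt.scatter(time_minutes, test_scores)
plt.xlabel("Time to finish test (minutes)")   # horizontal axis: explanatory
plt.ylabel("Test score")                      # vertical axis: response
plt.show()
```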

  4. Describing Scatterplots • When we describe data in a scatterplot, we describe the • Direction (positive or negative) • Form (linear or not) • Strength (strong, moderate, or weak; we will let correlation help us decide) • How would you describe the time and test score scatterplot?

  5. Correlation • Correlation measures the strength and direction of a linear association between two quantitative variables. • Correlation is a number between -1 and 1. • With positive correlation, one variable increases, on average, as the other increases. • With negative correlation, one variable decreases, on average, as the other increases. • The closer the correlation is to -1 or 1, the more closely the points follow a line. • The correlation for the test data is -0.56.
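
A correlation can be computed directly; here is a minimal sketch using NumPy on hypothetical data (not the actual test data).

```python
# Sketch: computing the correlation coefficient r with NumPy.
# These arrays are hypothetical, not the instructor's test data.
import numpy as np

time_minutes = np.array([30, 35, 42, 48, 55])
test_scores = np.array([93, 90, 88, 84, 80])

r = np.corrcoef(time_minutes, test_scores)[0, 1]   # always between -1 and 1
print(round(r, 2))
```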

  6. Correlation Guidelines

  7. Back to the test data • I was not completely honest. There were three more test scores. The last three people to finish the test had scores of 93, 93, and 97. What does this do to the correlation?

  8. Influential Observations • The correlation changed from -0.56 (a fairly strong negative correlation) to -0.12 (a fairly weak negative correlation). • Points that are far to the left or right and do not follow the overall direction of the scatterplot can greatly change the correlation; such points are called influential observations.

  9. Correlation • Correlation measures the strength and direction of a linear association between two quantitative variables. • -1 ≤ r ≤ 1 • Correlation makes no distinction between explanatory and response variables. • Correlation has no units. • Correlation is not a resistant measure. • Correlation based on averages is usually higher than that for individuals. • Correlation game

  10. Inference for Correlation with Simulation 1. Compute the observed statistic. (Correlation) 2. Scramble the response variable, compute the simulated statistic, and repeat this process many times. 3. Reject the null hypothesis if the observed statistic is in the tail of the null distribution.
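
A sketch of these three steps in Python; the one-sided direction (large positive correlations count as extreme) and the data arrays passed in are assumptions for illustration.

```python
# Simulation-based (permutation) test for a correlation, following the
# three steps above. x and y stand in for the explanatory and response data.
import numpy as np

rng = np.random.default_rng()

def permutation_test(x, y, reps=10_000):
    observed_r = np.corrcoef(x, y)[0, 1]            # Step 1: observed statistic
    simulated_rs = np.empty(reps)
    for i in range(reps):
        shuffled_y = rng.permutation(y)             # Step 2: scramble the response
        simulated_rs[i] = np.corrcoef(x, shuffled_y)[0, 1]
    # Step 3: p-value = proportion of shuffles at least as large as observed
    p_value = np.mean(simulated_rs >= observed_r)   # one-sided, for rho > 0
    return observed_r, p_value

# Usage (hypothetical): permutation_test(temperatures, heart_rates)
```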

  11. Temperature and Heart Rate Hypotheses • Null: There is no association between heart rate and body temperature. (ρ = 0) • Alternative: There is a positive linear association between heart rate and body temperature. (ρ > 0) • ρ is the Greek letter rho, denoting the population correlation.

  12. Temperature and Heart Rate Collect the Data

  13. Temperature and Heart Rate Explore the Data • The sample correlation is r = 0.378.

  14. Temperature and Heart Rate • If there were no association between heart rate and body temperature, what is the probability we would get a correlation as high as 0.378 just by chance? • If there is no association, we can break apart the temperatures and their corresponding heart rates. We will do this by shuffling one of the variables. (The applet shuffles the response.)

  15. Three shuffles of the heart rates: shuffled r = -0.216, shuffled r = 0.212, shuffled r = -0.097

      Temp    HR (shuffle 1)   HR (shuffle 2)   HR (shuffle 3)
      98.30   81               86               58
      98.20   73               65               86
      98.70   57               69               79
      98.50   86               71               73
      97      79               73               71
      98.80   72               58               82
      98.50   71               81               57
      98.70   68               72               80
      99.30   58               57               71
      97.80   89               82               72
      98.20   82               71               81
      99.90   69               80               82
      98.60   71               68               89
      98.60   72               79               62
      97.80   62               84               84
      98.40   84               89               72
      98.70   82               68               68
      97.40   80               82               68
      96.70   65               62               69
      98      68               72               65

  16. Temperature and Heart Rate • In three shuffles, we obtained correlations of -0.216, 0.212, and -0.097. These are all a bit closer to 0 than our sample correlation of 0.378. • Let's scramble 10,000 times and compute 10,000 simulated correlations.

  17. Temperature and Heart Rate • Notice our null distribution is centered at 0 and roughly symmetric. • In 530 of the 10,000 shuffles, the simulated correlation was greater than or equal to 0.378, giving a p-value of 530/10,000 = 0.053.

  18. Temperature and Heart Rate • With a p-value of 0.053, we don't have strong evidence of a positive linear association between body temperature and heart rate. However, we do have moderate evidence of such an association, and perhaps a larger sample would give a smaller p-value. • If this result were significant, to what population could we make our inference?

  19. Temperature and Heart Rate • Let’s look at a different data set comparing temperature and heart rate (Example 10.5A) using the Analyzing Two Quantitative Variables applet. • Pay attention to which variable is explanatory and which is response. • The default statistic used is not correlation, so we need to change that.

  20. Work on exploration 10.2 to see if there is an association between birth date and the 1970 draft lottery number.

  21. Least Squares Regression Section 10.3

  22. Introduction • If we decide an association is linear, it's helpful to develop a mathematical model of that association. • This helps us make predictions about the response variable. • The least-squares regression line is the most common way of doing this.

  23. Introduction • Unless the points form a perfect line, there won't be a single line that goes through every point. • We want a line that gets as close as possible to all the points.

  24. Introduction • We want a line that minimizes the vertical distances between the line and the points. • These distances are called residuals. • The line we will find actually minimizes the sum of the squares of the residuals. • This is called the least-squares regression line (see the sketch below).
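
A short sketch of what "least squares" means, fitting a line with NumPy and computing the residuals it minimizes; x and y here are hypothetical, not the dinner-plate data.

```python
# Sketch: fit a least-squares line and compute the residuals it minimizes.
import numpy as np

x = np.array([1950, 1965, 1980, 1995, 2010], dtype=float)  # hypothetical years
y = np.array([9.8, 10.1, 10.3, 10.6, 10.9])                # hypothetical diameters

b, a = np.polyfit(x, y, deg=1)     # slope b and intercept a of y-hat = a + b*x
residuals = y - (a + b * x)        # vertical distances from points to the line
sse = np.sum(residuals ** 2)       # sum of squared residuals (what is minimized)
print(a, b, sse)
```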

  25. Are Dinner Plates Getting Larger? Example 10.3

  26. Growing Plates? • There are many recent articles and TV reports about the obesity problem. • One reason some have given is that the size of dinner plates is increasing. • Are these black circles the same size, or is one larger than the other?

  27. Growing Plates? • To many people they appear to be the same size, but the circle on the right is about 20% larger than the one on the left. • This suggests that people will put more food on larger dinner plates without knowing it.

  28. Portion Distortion • There is a name for this phenomenon: the Delboeuf illusion.

  29. Growing Plates? • Researchers gathered data to investigate the claim that dinner plates are growing. • They examined American dinner plates sold on eBay on March 30, 2010 (Van Ittersum and Wansink, 2011). • For each plate they recorded its size and the year it was manufactured.

  30. Growing Plates? • Both year (explanatory variable) and size (response variable) are quantitative. • Each dot represents one plate in this scatterplot. • Describe the association here.

  31. Growing Plates? • The association appears to be roughly linear. • The least-squares regression line has been added to the scatterplot. • How can we describe this line?

  32. Growing Plates? The regression equation is ŷ = a + bx: • a is the y-intercept • b is the slope • x is a value of the explanatory variable • ŷ is the predicted value of the response variable • For a specific value of x, the residual is the vertical distance between the observed response y and the predicted response ŷ.

  33. Growing Plates? • The least-squares line for the dinner plate data is ŷ = -14.8 + 0.0128x, or predicted diameter = -14.8 + 0.0128(year). • This allows us to predict plate diameter for a particular year. • Substituting 2000 for year, we predict a diameter of -14.8 + 0.0128(2000) = 10.8 in.
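
The prediction is just substitution into the fitted equation; a small sketch using the coefficients from the slide.

```python
# Predicted plate diameter (inches) from the fitted line on the slide.
def predicted_diameter(year):
    return -14.8 + 0.0128 * year   # y-hat = -14.8 + 0.0128 * year

print(round(predicted_diameter(2000), 1))   # 10.8 inches, as on the slide
```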

  34. Growing Plates? • What is the predicted diameter for a plate manufactured in the year 2001? • -14.8 + 0.0128(2001) = 10.8128 in. • How does this compare to our prediction for the year 2000 (10.8 in.)? • It is 0.0128 inches larger. • The slope b = 0.0128 means that plate diameters are predicted to increase by 0.0128 inches per year, on average.

  35. Growing Plates? • The slope is the predicted change in the response variable for a one-unit change in the explanatory variable. • Both the slope and the correlation coefficient for this study were positive. • The slope and correlation coefficient will always have the same sign.

  36. Growing Plates? • The y-intercept is where the regression line crosses the y-axis, or the predicted response when the explanatory variable equals 0. • We had a y-intercept of -14.8 in the dinner plate equation. What does this tell us about our dinner plate example? • It predicts that dinner plates in year 0 had a diameter of -14.8 inches. • How can it be negative? • The equation works well within the range of values observed for the explanatory variable, but can fail outside that range. • Our equation should only be used to predict the size of dinner plates from about 1950 to 2010.

  37. Growing Plates? • Using the line to predict the response variable for values of the explanatory variable outside the range of the original data is called extrapolation.

  38. Predicting Height from Footprints Exploration 10.3

  39. Inference for the Regression Slope: Theory-Based Approach Section 10.5

  40. Beta vs Rho • Testing the slope of the regression line is equivalent to testing the correlation, so these hypotheses are equivalent: • Ho: β = 0, Ha: β > 0 (slope) • Ho: ρ = 0, Ha: ρ > 0 (correlation) • The sample slope b estimates the population slope β (beta); the sample correlation r estimates the population correlation ρ (rho). • When we do the theory-based test, we will use a t-statistic, which can be calculated from either the slope or the correlation.
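
One standard form of that t-statistic, computed from the sample correlation, is t = r·sqrt(n - 2)/sqrt(1 - r^2), compared to a t-distribution with n - 2 degrees of freedom. A sketch follows, using r = 0.378 from the example; n = 20 is an assumption based on the data shown on the earlier shuffling slide.

```python
# Sketch: t-statistic and one-sided p-value computed from a sample correlation.
from math import sqrt
from scipy import stats

def t_from_correlation(r, n):
    t = r * sqrt(n - 2) / sqrt(1 - r ** 2)     # standard t for testing rho = 0
    p_one_sided = stats.t.sf(t, df=n - 2)      # upper tail, for Ha: rho > 0
    return t, p_one_sided

print(t_from_correlation(0.378, 20))           # n = 20 is an assumption here
```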

  41. Introduction • We have seen in this chapter that our null distributions are again bell-shaped and centered at 0 (for either correlation or slope as our statistic). This is also true if we use the t-statistic.

  42. Validity Conditions • Under certain conditions, theory-based inference for the correlation or the slope of the regression line uses a t-distribution. • We are going to use the theory-based methods for the slope of the regression line. (The applet allows us to do confidence intervals for the slope.) • We would get the same p-value if we used the correlation as our statistic.
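
Outside the applet, one way to obtain the same theory-based results is scipy.stats.linregress; a sketch with placeholder data (not the study data).

```python
# Sketch: theory-based slope inference with scipy.stats.linregress.
import numpy as np
from scipy import stats

temps = np.array([97.0, 97.8, 98.2, 98.6, 99.3])   # hypothetical temperatures
rates = np.array([68, 72, 74, 77, 80])             # hypothetical heart rates

result = stats.linregress(temps, rates)
print(result.slope, result.stderr)   # sample slope b and its standard error
print(result.pvalue)                 # two-sided p-value; halve it for a one-sided
                                     # test when b is in the direction of Ha
```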

  43. Validity Conditions • Validity Conditions for theory-based inference for slope of the regression equation. • The values of the response variable at each possible value of the explanatory variable must have a normal distribution in the population from which the sample was drawn. • Each normal distribution at each x-value must have the same standard deviation.

  44. Validity Conditions • Suppose the graph represents a scatterplot for an entire population where the explanatory variable is heart rate and the response is body temperature. • (Figure annotations: a normal distribution of temperatures at each heart rate value, each with the same standard deviation.)

  45. Validity Conditions • How realistic are these conditions? • In practice, you can check these conditions by examining your scatterplot. • Let’s look at some scatterplots that do not meet the requirements.

  46. Smoking and Drinking • The relationship between number of drinks and cigarettes per week for a random sample of students at Hope College. • The dot at (0, 0) represents 524 students. • Are the conditions met?

  47. Smoking and Drinking • Since these conditions aren't met, we shouldn't apply theory-based inference.

  48. When checking conditions for traditional tests, use the following logic: • Assume that the condition is met in your data. • Examine the data in the appropriate manner. • If there is strong evidence that the condition is not met, then don't use the test.

  49. Validity Conditions • What do you do when validity conditions aren't met for theory-based inference? • Use the simulation-based approach. • Another strategy is to "transform" the data to a different scale so the conditions are met. • The logarithmic scale is a common choice.
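
A minimal sketch of the log-transform idea, on hypothetical data that grows multiplicatively.

```python
# Sketch: re-express the response on a log scale, then refit the line.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 8.8, 17.5, 36.0])   # hypothetical, multiplicative growth

log_y = np.log(y)                  # transform the response
b, a = np.polyfit(x, log_y, deg=1) # fit the least-squares line on the log scale
print(a, b)
```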

  50. Predicting Heart Rate from Body Temperature Example 10.5A
