
Turning Point


Presentation Transcript


  1. Turning Point • At the beginning of the course, we discussed three ways in which mathematics and statistics can be used to facilitate psychological science • quantification and measurement • theoretical modeling • evaluating theoretical models • Up to this point, we have focused on quantification and measurement

  2. Turning Point • For the second part of the course, we are going to focus on ways in which statistics can be used to model psychological processes • We will begin with some simple models and work our way up to more complex ones • modeling the influence of one variable on an outcome (simple regression) • modeling the influence of many variables on an outcome (multiple regression)

  3. Example • Let’s say we wish to model the relationship between coffee consumption and happiness

  4. Some Possible Functions

  5. Lines • Linear relationships • Y = a + bX • a = Y-intercept (the value of Y when X = 0) • b = slope (the “rise over the run”, the steepness of the line); a weight (Figure: the example line Y = 1 + 2X.)
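
To make slide 5's equation concrete, here is a minimal sketch (not part of the original slides) that evaluates the example line Y = 1 + 2X for a few hypothetical coffee values, using Python/NumPy:

```python
import numpy as np

# Example line from the slide: Y = 1 + 2X
a, b = 1, 2                       # intercept and slope
coffee = np.array([0, 1, 2, 3])   # hypothetical X values (cups of coffee)
happiness = a + b * coffee        # implied Y values
print(happiness)                  # [1 3 5 7]
```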

  6. Lines and intercepts • Y = a + 2X • Notice that the implied values of Y go up as we increase a. • By changing a, we are changing the elevation of the line. (Figure: the lines Y = 1 + 2X, Y = 3 + 2X, and Y = 5 + 2X.)

  7. Lines and slopes • Slope as “rise over run”: how much Y changes given a 1-unit increase in X. • As we move up 1 unit on X, we go up 2 units on Y • 2/1 = 2 (the slope) (Figure: the line Y = 1 + 2X, annotated with a run from X = 0 to X = 1 and the corresponding rise from Y = 1 to Y = 3, a 2-unit change.)
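
A small illustrative sketch (not from the slides) of the same rise-over-run calculation, using two points on the line Y = 1 + 2X:

```python
# Two points on the example line Y = 1 + 2X, one run apart on X
x1, x2 = 0, 1
y1, y2 = 1 + 2 * x1, 1 + 2 * x2   # the points (0, 1) and (1, 3)

slope = (y2 - y1) / (x2 - x1)     # rise over run
print(slope)                      # 2.0
```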

  8. Lines and slopes • Notice that as we increase the slope, b, we increase the steepness of the line. (Figure: the lines Y = 1 + 2X and Y = 1 + 4X, plotted with COFFEE on the x-axis and HAPPINESS on the y-axis.)

  9. Lines and slopes • We can also have negative slopes and slopes of zero. • When the slope is zero, the predicted values of Y are equal to a: Y = a + 0X = a. (Figure: lines with slopes b = -4, -2, 0, 2, and 4, plotted with COFFEE on the x-axis and HAPPINESS on the y-axis.)
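
A sketch of how different slopes change the implied values of Y. The intercept of 1 below is an assumption carried over from the earlier examples; the slide labels its lines only by b:

```python
import numpy as np

coffee = np.array([-4, -2, 0, 2, 4])   # the X range shown in the figure
for b in (-4, -2, 0, 2, 4):
    happiness = 1 + b * coffee         # Y = a + bX with a = 1 (assumed intercept)
    print(f"b = {b:+d}: {happiness}")
# b = 0 gives a flat line (Y = a everywhere); negative b values tilt the line downward
```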

  10. Other functions • Quadratic function • Y = a + bX² • a still represents the intercept (the value of Y when X = 0) • b still represents a weight, and influences the magnitude of the squaring function
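
A minimal sketch of the quadratic model; the default parameter values below are illustrative (the following slides use a = 0 or 5 and b = ±1 or ±5):

```python
import numpy as np

def quadratic(x, a=0, b=1):
    """Predicted Y for the quadratic model Y = a + b * X**2."""
    return a + b * x**2

coffee = np.array([-4, -2, 0, 2, 4])
print(quadratic(coffee))          # Y = 0 + 1X^2: values 16, 4, 0, 4, 16
print(quadratic(coffee, a=5))     # raising a lifts the whole curve by 5
print(quadratic(coffee, b=-1))    # a negative b flips the curve upside-down
```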

  11. Quadratic and intercepts • As we increase a, the elevation of the curve increases. (Figure: the curves Y = 0 + 1X² and Y = 5 + 1X², plotted with COFFEE on the x-axis and HAPPINESS on the y-axis.)

  12. Quadratic and Weight • When we increase the weight, b, the quadratic effect is accentuated. (Figure: the curves Y = 0 + 1X² and Y = 0 + 5X², plotted with COFFEE on the x-axis and HAPPINESS on the y-axis.)

  13. Quadratic and Weight • As before, we can have negative weights for quadratic functions. • In this case, negative values of b flip the curve upside-down. • As before, when b = 0, the value of Y = a for all values of X. (Figure: the curves Y = 0 + 5X², Y = 0 + 1X², Y = 0 + 0X², Y = 0 − 1X², and Y = 0 − 5X², plotted with COFFEE on the x-axis and HAPPINESS on the y-axis.)

  14. Linear & Quadratic Combinations • When linear and quadratic terms are present in the same equation, one can derive J-shaped curves • Y = a + b1X + b2X² (Figure: a 3×3 grid of example curves crossing a negative, zero, or positive linear weight with a positive, zero, or negative quadratic weight.)
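
A sketch of the combined model with illustrative weights (the slide does not give specific values for b1 and b2):

```python
import numpy as np

a, b1, b2 = 0, 1.0, 0.5                        # illustrative parameter values
coffee = np.linspace(-4, 4, 9)
happiness = a + b1 * coffee + b2 * coffee**2   # Y = a + b1*X + b2*X^2
print(np.round(happiness, 1))
# Flipping the signs of b1 and b2 produces the other curve shapes in the figure's grid
```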

  15. Some terminology • When the relation between variables is expressed in this manner, we call the relevant equation(s) mathematical models • The intercept and weight values are called parameters of the model. • Although one can describe the relationship between two variables in the way we have done here, from now on we’ll assume that our models are causal models, such that the variable on the left-hand side of the equation is assumed to be caused by the variable(s) on the right-hand side.

  16. Terminology • The values of Y in these models are often called predicted values, sometimes abbreviated as Y-hat or Ŷ. Why? They are the values of Y that are implied by the specific parameters of the model.

  17. Estimation • Up to this point, we have assumed that our models are correct. • There are two important issues we need to deal with, however: • Is the gist of the model correct? That is, is a linear, as opposed to a quadratic, model the appropriate model for characterizing the relationship between variables? • Assuming the model is correct, what are the correct parameters for the model?

  18. Estimation • For the next few weeks we will assume that the basic model (i.e., whether it is linear, whether the right variables are included) is correct. In the third part of the course, we will deal with methods for addressing this issue and comparing alternative models. • The process of obtaining the correct parameter values (assuming we are working with the right model) is called parameter estimation.
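
As a preview of parameter estimation, here is one common approach for the simple linear model: an ordinary least-squares sketch on simulated data. Both the estimation method and the data are assumptions for illustration; the course has not yet introduced a specific method.

```python
import numpy as np

# Simulate data from a known line, Y = 1 + 2X, plus random noise
rng = np.random.default_rng(0)
coffee = np.linspace(0, 4, 20)
happiness = 1 + 2 * coffee + rng.normal(0, 0.5, coffee.size)

# Ordinary least-squares estimates of the intercept (a) and slope (b)
X = np.column_stack([np.ones_like(coffee), coffee])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, happiness, rcond=None)

y_hat = a_hat + b_hat * coffee    # predicted values (Y-hat) implied by the estimates
print(a_hat, b_hat)               # estimates should land close to 1 and 2
```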
