Examining and Quantifying Relationships Among Variables
  • Contingency Tables (categorical variables)
  • Correlations (linear relationships)
  • Other measures of association (eta, omega)
  • Multiple regression (more than two variables at a time)

Contingency Tables

When all of your variables are categorical, you can use contingency tables to see if your variables are related.

• A contingency table is a table displaying information in cells formed by the intersection of two or more categorical variables.

• A contingency (relationship) occurs when there is a pattern between the data in the rows and the data in the columns.
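A minimal sketch of how this could look in practice, assuming pandas and SciPy are available; the variable names and values below are hypothetical, not from the slides:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical categorical data (illustrative only)
df = pd.DataFrame({
    "gender":     ["M", "F", "F", "M", "F", "M", "F", "M"],
    "preference": ["A", "B", "B", "A", "B", "B", "A", "A"],
})

# Contingency table: each cell counts the cases at one row/column intersection
table = pd.crosstab(df["gender"], df["preference"])
print(table)

# Chi-square test of independence: a small p-value suggests the
# row and column variables are related (a contingency exists)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```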

Examining and Quantifying Relationships Among Variables
  • Measures of association
    • Correlation coefficient, r
    • Other strength of association measures, eta (η), omega (ω)
    • Percentage of Variance Explained (PVE):

PVE = (association measure)² × 100

    • For example, if r = .70, then PVE = (.70)² × 100 = 49%
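As a quick check of the arithmetic, here is a one-line helper (hypothetical, not from the slides) that applies the PVE formula:

```python
def pve(association: float) -> float:
    """Percentage of Variance Explained: (association measure)^2 x 100."""
    return association ** 2 * 100

print(round(pve(0.7), 2))  # 49.0 -- matches the slide's example (r = .70)
```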

Pearson’s correlation coefficient
  • Correlation – measure of the linear relationship between two variables
  • Coefficient can range from −1.00 to +1.00
  • Size of number indicates strength or magnitude of relationship
  • Sign indicates direction of relationship
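A small illustration, assuming SciPy is available; the paired values below are hypothetical:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements on two quantitative variables
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r, p_value = pearsonr(x, y)
print(f"r = {r:.2f}")        # size gives strength, sign gives direction
print(f"r^2 = {r**2:.2f}")   # proportion of shared (explained) variance
```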
Illustration of Strength of Relationship Using Venn-like Diagrams

[Venn-like diagrams: three pairs of circles X and Y with progressively greater overlap, labeled r² = 0.0, r² = 0.30, and r² = 0.95]

Correlation vs. Regression
  • Correlation investigates relationships between variables
    • Variables have equal status or role
  • Regression examines the effect of predictor variable(s) on an outcome variable
    • Variables have different roles or status; the researcher's interest is directional
    • Predictor variables (X’s) predict or explain the outcome variable (Y)

[Diagram: correlation is bidirectional (X ↔ Y); regression is unidirectional, X influences Y (X → Y)]
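One way to see this difference numerically (a sketch with hypothetical data, using NumPy): the correlation is the same whichever variable is listed first, while the regression slope changes when predictor and outcome swap roles.

```python
import numpy as np

# Hypothetical data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.5, 3.1, 4.4, 6.2, 7.4])

# Correlation is symmetric (bidirectional): corr(x, y) == corr(y, x)
print(np.corrcoef(x, y)[0, 1], np.corrcoef(y, x)[0, 1])

# Regression is directional: the slope of y on x is not
# the same number as the slope of x on y
slope_y_on_x = np.polyfit(x, y, 1)[0]   # y treated as outcome
slope_x_on_y = np.polyfit(y, x, 1)[0]   # x treated as outcome
print(slope_y_on_x, slope_x_on_y)
```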


Regression Analysis

  • Regression analysis: used to explain or predict the values of a quantitative dependent variable based on the values of one or more predictor variables.
    • Simple regression: one predictor variable.
    • Multiple regression: two or more predictor variables.
  • Here is the simple regression equation showing the relationship between starting salary (Y or your dependent variable) and GPA (X or your independent variable):
  • Y = 9,234.56 + 7,638.85 (X)

• The 9,234.56 is the Y intercept (on a plot of this regression line, the line crosses the Y axis a little below $10,000; specifically, it crosses the Y axis at $9,234.56).

• The 7,638.85 is the simple regression coefficient, which tells you the average amount of increase in starting salary that occurs when GPA increases by one unit. (It is also the slope or the rise over the run).

• Now, you can plug in a value for X (i.e., GPA) and easily get the predicted starting salary.

• If you put in a 3.00 for GPA in the above equation and solve it, you will see that the predicted starting salary is $32,151.11.

• Now plug in another number within the range of the data (how about a 3.5) and see what the predicted starting salary is. (Check on your work: it is $35,970.54)
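The same plug-in calculation, written out as a small Python sketch (the equation and check values come straight from the slides; the function name is just for illustration):

```python
def predicted_salary(gpa: float) -> float:
    """Simple regression equation from the slides: Y = 9,234.56 + 7,638.85 * X."""
    return 9234.56 + 7638.85 * gpa

# GPA of 3.00 -> about $32,151.11; GPA of 3.5 -> about $35,970.54 (as on the slides)
print(predicted_salary(3.00))
print(predicted_salary(3.5))
```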


Multiple Regression: examining more than one association at the same time

  • The main difference between simple and multiple regression is that multiple regression looks at the complex relationships among several predictors at the same time. The regression coefficient is now called a partial regression coefficient; it gives the predicted change in the dependent variable for a one-unit change in that predictor, controlling for the other predictor variables in the equation. In other words, you can use multiple regression to control for other variables (i.e., statistical control). A sketch follows this list.
  • Kinds of coefficients
    • Raw score betas (unique relationship controlling for other predictors)
    • Standardized betas (unique relationship controlling for other predictors)
    • R and R² (all predictors together)
  • Multicollinearity
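A minimal multiple-regression sketch, assuming the statsmodels package is available; the data are simulated purely for illustration. It prints the raw (partial) coefficients and R², and shows one common way to obtain standardized betas (refit after z-scoring every variable):

```python
import numpy as np
import statsmodels.api as sm

# Simulated data: outcome y with two predictors, x1 and x2 (x2 overlaps with x1)
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.5 * x1 + rng.normal(size=100)
y = 2.0 + 1.5 * x1 + 0.8 * x2 + rng.normal(size=100)

X = sm.add_constant(np.column_stack([x1, x2]))
fit = sm.OLS(y, X).fit()

# Raw-score partial regression coefficients: predicted change in y for a
# one-unit change in each predictor, controlling for the other predictor
print(fit.params)     # intercept, b1, b2
print(fit.rsquared)   # R^2: variance in y explained by all predictors together

# Standardized betas: z-score every variable, then refit
z = lambda a: (a - a.mean()) / a.std()
Xz = sm.add_constant(np.column_stack([z(x1), z(x2)]))
print(sm.OLS(z(y), Xz).fit().params)
```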

[Venn diagram: the unique overlap of an individual predictor (X2) with Y is its standardized beta; all of X2's overlap with Y is Pearson's r²]


[Venn diagram of Y, X1, and X2: multicollinearity is the area the two predictors have in common, i.e., the extent to which the two predictors correlate with each other]
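A quick way to gauge that overlap is simply to correlate the predictors with each other; the values below are hypothetical:

```python
import numpy as np

# Hypothetical predictor values; x2 tracks x1 closely
x1 = np.array([2.0, 3.0, 5.0, 7.0, 9.0, 11.0])
x2 = np.array([1.9, 3.2, 4.8, 7.1, 8.7, 11.3])

# A high correlation between predictors signals multicollinearity:
# the two predictors carry much of the same information
r_predictors = np.corrcoef(x1, x2)[0, 1]
print(f"r(x1, x2) = {r_predictors:.2f}, shared variance = {r_predictors**2:.0%}")
```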
