Examining and quantifying relationships among variables
Examining and Quantifying Relationships Among Variables

  • Contingency Tables (categorical variables)

  • Correlations (linear relationships)

  • Other measures of association (eta, omega)

  • Multiple regression (more than two variables at a time)



Contingency Tables

When all of your variables are categorical, you can use contingency tables to see whether your variables are related.

• A contingency table is a table displaying counts in cells formed by the intersection of two or more categorical variables.

• A contingency, or relationship, occurs when there is a pattern between the data in the rows and the data in the columns.
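As a sketch of the idea, a contingency table can be built directly from paired categorical observations. The data and category names below are made up for illustration:

```python
from collections import Counter

# Hypothetical data: each pair is (gender, preferred_format) for one respondent.
pairs = [
    ("male", "online"), ("male", "online"), ("male", "print"),
    ("female", "print"), ("female", "print"), ("female", "online"),
    ("female", "print"), ("male", "online"),
]

# Count how many respondents fall in each (row, column) cell.
cells = Counter(pairs)

rows = sorted({r for r, _ in pairs})
cols = sorted({c for _, c in pairs})

# Print the table: one cell per row/column intersection.
print("\t" + "\t".join(cols))
for r in rows:
    print(r + "\t" + "\t".join(str(cells[(r, c)]) for c in cols))
```

Here a pattern is visible in the cell counts (males cluster in "online", females in "print"), which is what the slide means by a contingency.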



Examining and Quantifying Relationships Among Variables

  • Measures of association

    • Correlation coefficient, r

    • Other strength of association measures, eta (η), omega (ω)

    • Percentage of Variance Explained (PVE):

      PVE = (association measure)² × 100

    • For example, if r = .70,

      PVE = (.70)² × 100 = 49%
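The PVE calculation is simple enough to script; a minimal sketch (the helper name `pve` is mine, not the slide's):

```python
def pve(assoc: float) -> float:
    """Percentage of Variance Explained: square the association measure
    (e.g., r, eta, omega) and multiply by 100."""
    return assoc ** 2 * 100

print(round(pve(0.7), 2))  # the slide's example: r = .70 gives 49.0%
```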


Pearson's Correlation Coefficient

  • Correlation – measure of the linear relationship between two variables

  • Coefficient ranges from -1.00 to +1.00

  • Size of number indicates strength or magnitude of relationship

  • Sign indicates direction of relationship
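Pearson's r can be computed from its definition (the covariance of the two variables divided by the product of their standard deviations). A stdlib-only sketch, with made-up data:

```python
import math

def pearson_r(x, y):
    """Pearson's r: covariance of x and y divided by the product of
    their standard deviations (via sums of squared deviations)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfect increasing linear relationship gives r = +1;
# reversing the direction flips the sign but not the magnitude.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))
```

This illustrates both bullets above: the size of the number is the strength, and the sign is the direction.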



Illustration of Strength of Relationship Using Venn-like Diagrams

[Three Venn-like diagrams of circles X and Y: no overlap (r² = 0.00), partial overlap (r² = 0.30), and near-complete overlap (r² = 0.95). The greater the overlap, the stronger the relationship.]

Correlation vs. Regression

  • Correlation investigates relationships between variables

    • Variables have equal status or role

  • Regression examines the relationship of predictor variable(s) to an outcome variable

    • Variables have different roles or statuses; the researcher's interest is directional

    • Predictor variables (X's) predict or explain the outcome variable (Y)



Correlation: bidirectional (X ↔ Y)

Regression: X influences Y, unidirectional (X → Y)



  • Regression Analysis

  • Regression analysis: used to explain or predict the values of a quantitative dependent variable based on the values of one or more predictor variables.

    • Simple regression, one predictor variable.

    • Multiple regression, two or more predictor variables.

  • Here is the simple regression equation showing the relationship between starting salary (Y or your dependent variable) and GPA (X or your independent variable):

  • Y = 9,234.56 + 7,638.85 (X)



• The 9,234.56 is the Y intercept (on the regression line plotted on the slide, the line crosses the Y axis a little below $10,000; specifically, it crosses at $9,234.56).

• The 7,638.85 is the simple regression coefficient, which tells you the average amount of increase in starting salary that occurs when GPA increases by one unit. (It is also the slope or the rise over the run).

• Now, you can plug in a value for X (i.e., GPA) and easily get the predicted starting salary.

• If you put in a 3.00 for GPA in the above equation and solve it, you will see that the predicted starting salary is $32,151.11

• Now plug in another number within the range of the data (how about a 3.5) and see what the predicted starting salary is. (Check your work: it is $35,970.54.)
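The worked example above can be checked by putting the slide's equation into a short function (the name `predict_salary` is mine):

```python
def predict_salary(gpa: float) -> float:
    """Predicted starting salary from the slide's equation:
    Y = 9,234.56 + 7,638.85 * X, where X is GPA."""
    return 9234.56 + 7638.85 * gpa

# The two GPA values worked through on the slide.
for gpa in (3.0, 3.5):
    print(f"GPA {gpa}: ${predict_salary(gpa):,.2f}")
```

The loop reproduces the slide's two predictions (about $32,151 for a 3.0 GPA and about $35,971 for a 3.5).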




Multiple Regression: examining more than one association at the same time

  • The main difference between simple and multiple regression is that multiple regression looks at the relationships among several predictors at the same time. The regression coefficient is now called a partial regression coefficient. This coefficient gives the predicted change in the dependent variable for a one-unit change in the predictor variable, controlling for the other predictor variables in the equation. In other words, you can use multiple regression to control for other variables (i.e., statistical control).

  • Kinds of coefficients

    • Raw-score coefficients, b (unique relationship controlling for other predictors)

    • Standardized coefficients, betas (unique relationship controlling for other predictors)

    • R and R² (all predictors together)

  • Multicollinearity
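A sketch of these ideas using NumPy's least-squares solver, with hypothetical data in which salary depends exactly on two correlated predictors. The partial (raw-score) coefficients are recovered, and the high correlation between the predictors illustrates multicollinearity:

```python
import numpy as np

# Hypothetical data: salary predicted from GPA (X1) and internship years (X2).
gpa = np.array([2.0, 2.5, 3.0, 3.5, 4.0])
intern = np.array([0.0, 1.0, 1.0, 2.0, 2.0])
salary = 9000 + 7500 * gpa + 1200 * intern  # exact linear relation for the demo

# Design matrix with an intercept column; least squares recovers the
# partial regression coefficients (each one controls for the other predictor).
X = np.column_stack([np.ones_like(gpa), gpa, intern])
coef, *_ = np.linalg.lstsq(X, salary, rcond=None)
print(coef)  # intercept ~9000, partial slopes ~7500 and ~1200

# Multicollinearity check: correlation between the two predictors.
print(np.corrcoef(gpa, intern)[0, 1])  # high, so the predictors overlap
```

Because the predictors correlate strongly, each partial coefficient describes only the unique contribution of its predictor, which is the point of statistical control.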



Shared Variance in Multiple Regression

All overlap = R²



Unique overlap of an individual predictor (X2) with Y is the standardized beta

All overlap of an individual predictor (X2) with Y is Pearson's r²



[Venn diagram: outcome Y overlapped by predictors X1 and X2, which also overlap each other.]

Multicollinearity is the area the two predictors have in common, the extent to which the two predictors correlate with each other