Presentation Transcript
Multiple Regression
  • Multiple regression extends simple linear regression to allow for two or more independent variables.
  • There is still only one dependent (criterion) variable.
  • We can think of the independent variables as ‘predictors’ of the dependent variable.
  • The main complication in multiple regression arises when the predictors are not statistically independent of one another; a minimal fitting sketch follows below.
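As a concrete illustration (my own addition, not from the slides), here is a minimal Python sketch that fits a two-predictor regression by ordinary least squares; the variable names anticipate Example 1 below, and the data values are invented.

    import numpy as np

    # Hypothetical data: predict income from age and hours worked (values are invented).
    age    = np.array([25, 32, 47, 51, 38, 29, 60, 44], dtype=float)
    hours  = np.array([40, 35, 50, 45, 38, 42, 30, 48], dtype=float)
    income = np.array([30, 38, 62, 58, 45, 36, 50, 61], dtype=float)  # in $1000s

    # Design matrix: intercept column plus the two predictors.
    X = np.column_stack([np.ones_like(age), age, hours])

    # Ordinary least squares fit: b = (intercept, weight for age, weight for hours).
    b, *_ = np.linalg.lstsq(X, income, rcond=None)
    predicted = X @ b
    print("coefficients (intercept, age, hours):", b)

The same arrays are reused in the later sketches, so they behave as one running example.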


Example 1: Predicting Income

[Diagram: Age and Hours Worked enter a multiple regression that predicts Income.]

Example 2: Predicting Final Exam Grades

[Diagram: Assignments and Midterm enter a multiple regression that predicts the Final exam grade.]

Coefficient of Multiple Determination
  • The proportion of variance explained by all of the independent variables together is called the coefficient of multiple determination (R²).
  • R is called the multiple correlation coefficient.
  • R measures the correlation between the predictions and the actual values of the dependent variable.
  • The correlation r_iY of predictor i with the criterion (dependent variable) Y is called the validity of predictor i; a worked sketch follows below.
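Continuing the running sketch above (again my own illustration), R² and the multiple correlation R can be computed directly from the fitted values:

    # Reuses income, predicted, age, hours from the first sketch above.
    ss_res = np.sum((income - predicted) ** 2)      # residual sum of squares
    ss_tot = np.sum((income - income.mean()) ** 2)  # total sum of squares
    r_squared = 1 - ss_res / ss_tot                 # coefficient of multiple determination
    R = np.corrcoef(predicted, income)[0, 1]        # correlation of predictions with actual Y
    print("R^2 =", round(r_squared, 3), "R =", round(R, 3))  # R**2 matches r_squared

    # Validities: the simple correlation of each predictor with the criterion.
    print("validity of age:  ", round(np.corrcoef(age, income)[0, 1], 3))
    print("validity of hours:", round(np.corrcoef(hours, income)[0, 1], 3))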
Uncorrelated Predictors

[Diagram: the variance in the final exam explained by assignments and the variance explained by the midterm, drawn as non-overlapping regions.]

Uncorrelated Predictors
  • Recall the regression formula for a single predictor (first equation below).
  • If the predictors were not correlated, we could easily generalize this formula (second equation below).
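The equations for this slide are not present in the transcript; the following is a standard reconstruction in standardized (z-score) form, consistent with the later discussion of validities, but it is an assumption rather than the slide's literal content.

    % Single predictor: in standardized form the slope is just the correlation r_XY.
    \hat{z}_Y = r_{XY}\, z_X

    % k mutually uncorrelated predictors: each weight is that predictor's
    % validity, i.e. its correlation with the criterion.
    \hat{z}_Y = r_{1Y}\, z_1 + r_{2Y}\, z_2 + \dots + r_{kY}\, z_k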


Example 1. Predicting Income

Correlations (SPSS output; N = 3975 for every cell). "HOURS WORKED" abbreviates
"HOURS WORKED FOR PAY OR IN SELF-EMPLOYMENT - in Reference Week".

                                       AGE      HOURS WORKED   TOTAL INCOME
AGE            Pearson Correlation     1        .040*          .229**
               Sig. (2-tailed)                  .012           .000
HOURS WORKED   Pearson Correlation     .040*    1              .187**
               Sig. (2-tailed)         .012                    .000
TOTAL INCOME   Pearson Correlation     .229**   .187**         1
               Sig. (2-tailed)         .000     .000

*  Correlation is significant at the 0.05 level (2-tailed).
** Correlation is significant at the 0.01 level (2-tailed).
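To reproduce a correlation matrix like this outside SPSS, a small sketch is shown below (my own addition; the column names and data values are hypothetical stand-ins for the income data).

    from itertools import combinations
    import pandas as pd
    from scipy.stats import pearsonr

    # Hypothetical mini-dataset standing in for the income data (values are invented).
    df = pd.DataFrame({
        "AGE":          [25, 32, 47, 51, 38, 29, 60, 44],
        "HOURS_WORKED": [40, 35, 50, 45, 38, 42, 30, 48],
        "TOTAL_INCOME": [30, 38, 62, 58, 45, 36, 50, 61],
    })

    # Pairwise Pearson correlations with 2-tailed p-values, mirroring the SPSS table.
    for a, b in combinations(df.columns, 2):
        r, p = pearsonr(df[a], df[b])
        print(f"{a} vs {b}: r = {r:.3f}, p = {p:.3f}, N = {len(df)}")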

Correlated Predictors

[Diagram: the variance in the final exam explained by assignments and the variance explained by the midterm, drawn as overlapping regions.]

Correlated Predictors
  • Due to the correlation between the predictors, the optimal regression weights must be reduced relative to the validities (a reconstruction of the weights is given below).
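The slide's formula is missing from the transcript; for two standardized predictors the usual result (an assumption here, not necessarily the slide's exact notation) is:

    % Standardized (beta) weights for two correlated predictors 1 and 2:
    \beta_1 = \frac{r_{1Y} - r_{2Y}\, r_{12}}{1 - r_{12}^{2}}, \qquad
    \beta_2 = \frac{r_{2Y} - r_{1Y}\, r_{12}}{1 - r_{12}^{2}}

    % When r_{12} = 0 these reduce to the validities r_{1Y} and r_{2Y},
    % recovering the uncorrelated case above.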


Semipartial (Part) Correlations
  • The semipartial (part) correlation of a predictor is the correlation between the criterion and the part of that predictor that is independent of the other predictors.
  • In this way, the effects of correlations among the predictors are removed from that predictor.
  • The semipartial correlations are therefore typically smaller in magnitude than the validities.


Calculating Semipartial Correlations
  • One way to calculate the semipartial correlation for a predictor (say Predictor 1) is to partial out the effects of all other predictors on Predictor 1, and then calculate the correlation between the residual of Predictor 1 and the criterion.
  • For example, we could partial out the effect of age on hours worked, and then measure the correlation between income and the residual hours worked (sketched in code below).
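A minimal sketch of this residual approach, reusing the hypothetical age/hours/income arrays from the first code block (my own illustration, not from the slides):

    # Partial age out of hours worked: regress hours on age and keep the residuals.
    A = np.column_stack([np.ones_like(age), age])
    coef, *_ = np.linalg.lstsq(A, hours, rcond=None)
    hours_residual = hours - A @ coef

    # Semipartial correlation of hours with income (age removed from hours only,
    # not from income).
    sr_hours = np.corrcoef(hours_residual, income)[0, 1]
    print("semipartial correlation of hours worked:", round(sr_hours, 3))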


Calculating Semipartial Correlations
  • A more straightforward method is to compute it directly from the pairwise correlations (formula reconstructed below).
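The formula itself did not survive the transcript; for two predictors the standard expression (a reconstruction, not necessarily the slide's exact notation) is:

    % Semipartial correlation of Predictor 1 with Y, where Predictor 2 is
    % partialled out of Predictor 1 only:
    sr_1 = \frac{r_{1Y} - r_{2Y}\, r_{12}}{\sqrt{1 - r_{12}^{2}}}

Plugging in the income example from the correlation table (r_1Y = .229 for age, r_2Y = .187 for hours worked, r_12 = .040) gives sr_age ≈ (.229 − .187 × .040) / √(1 − .040²) ≈ .222, only slightly below the validity of .229 because the two predictors are nearly uncorrelated.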


Example 2: Predicting Final Exam Grades

[Diagram (repeated): Assignments and Midterm enter a multiple regression that predicts the Final exam grade.]

SPSS Output

[SPSS regression output table; not captured in the transcript.]

Example 3. 2006-07 6130 Grades
  • Try doing the calculations on this dataset for practice.
