Stat 155, Section 2, Last Time

  • Relations between variables

    • Scatterplots – useful visualization

    • Aspects: Form, Direction, Strength

  • Correlation

    • Numerical summary of Strength and Direction

  • Linear Regression

    • Fit a line to data



Reading In Textbook

Approximate Reading for Today’s Material:

Pages 132-145, 151-163, 192-196, 198-210

Approximate Reading for Next Class:

Pages 218-225, 231-240



Section 2.3: Linear Regression

Idea:

Fit a line to data in a scatterplot

Reasons:

  • To learn about “basic structure”

  • To “model data”

  • To provide “prediction of new values”



Linear Regression - Approach

Given a line, y = a + b x, “indexed” by the intercept a and slope b

Define “residuals” = “data Y” – “Y on line”

residual_i = y_i – (a + b x_i)

Now choose a and b to make these “small”



Linear Regression - Approach

Make residuals ≥ 0 by squaring them

Least Squares: adjust a and b to

Minimize the “Sum of Squared Errors”:  SSE = Σ residual_i² = Σ (y_i – (a + b x_i))²



Least Squares

Can Show: (math beyond this course)

Least Squares Fit Line:

  • Passes through the point of averages (x-bar, y-bar)

  • Has Slope:  b = r · (s_y / s_x)

    (the correlation r, times the “correction factor” s_y / s_x)
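
These formulas can be checked outside Excel. Here is a minimal Python sketch (Python and numpy are assumptions of this transcript, not part of the course, which uses Excel): it computes the slope b = r · (s_y / s_x) and the intercept a = y-bar – b · x-bar, which makes the line pass through the point of averages. Later sketches below reuse this function.

    import numpy as np

    def least_squares_fit(x, y):
        """Return (a, b) for the least squares line y = a + b*x."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        r = np.corrcoef(x, y)[0, 1]                  # correlation between x and y
        b = r * (y.std(ddof=1) / x.std(ddof=1))      # slope = r * (s_y / s_x)
        a = y.mean() - b * x.mean()                  # so the line passes through (x-bar, y-bar)
        return a, b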



Least Squares in Excel

First explore basics, from first principles

(later will do summaries & more)

Worked out example:

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg14.xls

  1. Construct Toy Data Set

    • Fixed x’s (-3, -2, …, 3) (A4:A10)

    • Random Errors: “eps” (B4:B10)

    • Data Y’s = 1 + 0.3 * x’s + eps (C4:C10)
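
The toy data set can be mimicked in a few lines of Python (a sketch; the seed and the noise scale 0.2 are arbitrary choices, since the workbook’s actual “eps” values are not reproduced here). Later sketches below reuse x and y.

    import numpy as np

    rng = np.random.default_rng(0)                      # fixed seed, so the example is reproducible
    x = np.arange(-3, 4)                                # fixed x's: -3, -2, ..., 3
    eps = rng.normal(loc=0.0, scale=0.2, size=x.size)   # random errors "eps"
    y = 1 + 0.3 * x + eps                               # data Y's = 1 + 0.3 * x's + eps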



Least Squares in Excel

  2. First Attempt: just try some values of a and b

    • Arbitrarily choose a and b (B37 & B38)

    • Find points on that line (A41:A47)

    • Overlay Fit Line

      (very clumsily done with “double plot”)

    • Residuals (A51:A57) & Squares (B51:B57)

    • Get SSE (B59) = 11.6, “pretty big”
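
The “just try some” step amounts to evaluating the sum of squared errors for a candidate pair (a, b). A minimal sketch, continuing the hypothetical toy data above (the guessed values 0.0 and 1.0 are arbitrary, as in the workbook):

    import numpy as np

    def sse(a, b, x, y):
        """Sum of squared errors for the candidate line y = a + b*x."""
        residuals = y - (a + b * x)
        return np.sum(residuals ** 2)

    print(sse(0.0, 1.0, x, y))   # an arbitrary first guess; expect a "pretty big" SSE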



Least Squares in Excel

3. Second Attempt: Choose a and b

  • To make the line pass through the point of averages (x-bar, y-bar)

  • By adjusting a, since a = y-bar – b · x-bar when the line passes through (x-bar, y-bar)

  • Recompute overlay fit line

  • Recompute residuals & SSE

  • Note now SSE = 9.91

    (smaller than previous 11.6, i.e. better fit)



Least Squares in Excel

4. Third Attempt: Choose a and b

  • To also get the slope right

  • By making:  b = r · (s_y / s_x)

  • Recompute line, residuals & SSE

  • Note now SSE = 0.041

    (much smaller than previous, 9.91,

    and now good visual fit)



Least Squares in Excel

5. When you do this:

  • Use EXCEL summaries of these operations

  • INTERCEPT (computes y-intercept a)

  • SLOPE (computes slope b)

  • Much simpler than the step-by-step operations above

  • To draw the line, right-click the data & choose “Add Trendline”

    HW: 2.47a
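
Outside Excel, the same two summaries are one function call away. A minimal sketch with scipy (an assumption; the course itself uses Excel’s INTERCEPT and SLOPE), continuing the toy data above:

    from scipy import stats

    fit = stats.linregress(x, y)       # x, y: the toy data from the earlier sketch
    print(fit.intercept, fit.slope)    # analogues of Excel's INTERCEPT and SLOPE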



Next time

Add slide(s) about difference between:

Regression of Y on X

And

Regression of X on Y



Effect of a Single Data Point

Nice Webster West Example:

http://www.stat.sc.edu/~west/javahtml/Regression.html

  • Illustrates effect of adding a single new point

  • Points nearby don’t change line much

  • Far away points create “strong leverage”



Effect of a Single Data Point

HW: 2.71



Least Squares Prediction

Idea: After finding a & b (i.e. the fit line)

For a new x, predict the new value of y

Using the value on the line:  b x + a

I.e. “predict by the point on the line”
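
A minimal sketch of this prediction step, continuing the earlier hypothetical sketches (x_new = 1.5 is an arbitrary example value):

    a, b = least_squares_fit(x, y)     # x, y and least_squares_fit from the sketches above
    x_new = 1.5                        # a new x inside the range of the data
    y_hat = b * x_new + a              # "predict by the point on the line"
    print(y_hat)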



Least Squares Prediction

EXCEL Prediction: revisit example

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg14.xls

EXCEL offers two functions:

  • TREND

  • FORECAST

    They work similarly, input raw x’s and y’s

    (careful about order!)



Least Squares Prediction

Caution: prediction outside range of data is called “extrapolation”

Dangerous, since small errors are magnified
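
A quick way to see the magnification: holding the intercept fixed, an error in the estimated slope shifts the prediction by that error times the new x, so the shift grows the farther the new x lies from the data. A minimal sketch continuing the earlier ones (the slope error 0.05 and the x values are arbitrary illustrations):

    a, b = least_squares_fit(x, y)
    slope_error = 0.05                     # suppose the estimated slope is off by this much
    for x_new in (1.0, 30.0):              # inside vs. far outside the x range -3..3
        shift = slope_error * x_new        # resulting change in the prediction b*x_new + a
        print(x_new, b * x_new + a, shift)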



Least Squares Prediction

HW:

2.47b, 2.49,

2.55 (hint: use the Least Squares formula above, since you don’t have the raw data)



Interpretation of r squared

Recall the correlation r measures the

“strength of linear relationship”

r² is the “fraction of variation explained by the line”

r² ≈ 1 for a “good fit”

r² ≈ 0 for a “very poor fit”

r² measures the “signal to noise ratio”
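
One way to see the “fraction of variation explained” reading: r² equals both the squared correlation and 1 – SSE/SST, where SST is the total variation around y-bar. A minimal check, continuing the earlier hypothetical sketches:

    import numpy as np

    a, b = least_squares_fit(x, y)
    sse_fit = np.sum((y - (a + b * x)) ** 2)        # variation left over around the fitted line
    sst = np.sum((y - y.mean()) ** 2)               # total variation around y-bar
    r_squared = 1 - sse_fit / sst
    print(r_squared, np.corrcoef(x, y)[0, 1] ** 2)  # the two values agree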



Interpretation of r squared

Revisit

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg13.xls

  • (a, c, d) “data near line”: high signal to noise ratio

  • “noisier data”: low signal to noise ratio

  • “almost pure noise”: nearly no signal



Interpretation of r squared

HW:

2.47c



And Now for Something Completely Different

Recall the distribution of majors of students in this course.

Anna Miller: What about Statistics?



And Now for Something Completely Different

Joke from Anna Miller:

Three professors (a physicist, a chemist, and a statistician) are called in to see their dean.

As they arrive, the dean is called out of his office, leaving the three professors alone.

The professors see with alarm that there is a fire in the wastebasket.



And Now for Something Completely Different

The physicist says, "I know what to do! We must cool down the materials until their temperature is lower than the ignition temperature and then the fire will go out."

The chemist says, "No! No! I know what to do! We must cut off the supply of oxygen so that the fire will go out due to lack of one of the reactants."



And Now for Something Completely Different

While the physicist and chemist debate what course to take, they both are alarmed to see the statistician running around the room starting other fires.

They both scream, "What are you doing?"

To which the statistician replies, "Trying to get an adequate sample size."



And Now for Something Completely Different

This was a variation on another old joke:

An engineer, a physicist, and a mathematician were taking a long car trip, and stopped for the night at a hotel.

All 3 went to bed, and were smoking when they fell asleep.



And Now for Something Completely Different

The 3 cigarettes fell to the carpet, and started a fire.

The engineer smelled the smoke, jumped out of bed, ran to the bathroom, grabbed a glass, filled it with water, ran back, and doused the fire.



And Now for Something Completely Different

The physicist smelled the smoke, jumped out of bed, made a careful estimate of the size of the fire, looked for the bathroom, found it and went in, found a glass, carefully calculated how much water would be needed to put out the fire, put that much water in the glass, went to the fire, and doused it.



And Now for Something Completely Different

The mathematician smelled the smoke, jumped out of bed, went to the bathroom, found the glass, carefully examined it, to be sure it would hold water, turned on the faucet to be sure that water would come out when that was done, and…



And Now for Something Completely Different

went back to bed, satisfied that a solution to the problem existed!



Diagnostic for Linear Regression

Recall Normal Quantile plot shows “how well normal curve fits a data set”

Useful visual assessment of how well the regression line fits data is:

Residual Plot

Just Plot of Residuals (on Y axis),

versus X’s (on X axis)
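
Once the residuals are computed, the residual plot is a one-liner. A minimal matplotlib sketch (an assumption; the course does this in Excel), continuing the earlier hypothetical sketches:

    import matplotlib.pyplot as plt

    a, b = least_squares_fit(x, y)
    residuals = y - (a + b * x)
    plt.scatter(x, residuals)        # residuals on the Y axis, x's on the X axis
    plt.axhline(0, linewidth=1)      # reference line at zero
    plt.show()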



Residual Diagnostic Plot

Toy Examples:

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg15.xls

  • Generate Data to follow a line

    • Residuals seem to be randomly distributed

    • No apparent structure

    • Residuals seem “random”

    • Suggests linear fit is a good model for data



Residual Diagnostic Plot

Toy Examples:

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg15.xls

  • Generate Data to follow a Parabola

    • Shows systematic structure

    • Pos. – Neg. – Pos. suggests data follow a curve (not linear)

    • Suggests that line is a poor fit
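
To reproduce the Pos. – Neg. – Pos. pattern, fit a line to data that actually follow a parabola. A minimal sketch continuing the earlier ones (the parabola and the noise level are arbitrary illustrations):

    import matplotlib.pyplot as plt

    x2 = np.linspace(-3, 3, 25)
    y2 = x2 ** 2 + rng.normal(scale=0.3, size=x2.size)   # data follow a parabola
    a2, b2 = least_squares_fit(x2, y2)
    plt.scatter(x2, y2 - (a2 + b2 * x2))   # residuals: positive, then negative, then positive
    plt.axhline(0, linewidth=1)
    plt.show()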



Residual Diagnostic Plot

Example from text: problem 2.74

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg15.xls

Study (for runners) of how Stride Rate depends on Running Speed

(to run faster, need faster strides)

a. & b. Scatterplot & Fit line

c. & d. Residual Plot & Comment



Residual Diagnostic Plot E.g.

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg15.xls

a. & b. Scatterplot & Fit line

  • Linear fit looks very good

  • Backed up by correlation ≈ 1

  • “Low noise” because data are averaged

    (over 21 runners)



Residual Diagnostic Plot E.g.

http://stat-or.unc.edu/webspace/postscript/marron/Teaching/stor155-2007/Stor155Eg15.xls

c. & d. Residual Plot & Comment

  • Systematic structure: Pos. – Neg. – Pos.

  • Not random, but systematic pattern

  • Suggests line can be improved

    (as a model for these data)

  • Residual plot provides “zoomed in view”

    (can’t see this in raw data)



Residual Diagnostic Plot

HW: 2.73, 2.63



Chapter 3: Producing Data

(how this is done is critical to conclusions)

Section 3.1: Statistical Settings

2 Main Types:

  • Observational Study

    Simply “see what happens, no intervention”

    (to individuals or variables of interest)

    e.g. Political Polls, Supermarket Scanners



Producing Data

2 Main Types:

  • Observational Study

  • Experiment

    (Make Changes, & Study Effect)

    Apply “treatment” to individuals & measure “responses”

    e.g. Clinical trials for drugs, agricultural trials

    (safe? effective?) (max yield?)



Producing Data

2 Main Types:

  • Observational Study

  • Experiment

    (common sense)

    Caution: Thinking is required for each,

    both when you do statistics yourself & when you need to understand somebody else’s results



Helpful Distinctions

(Critical Issue of “Good” vs. “Bad”)

  • Observational Studies:

    A. Anecdotal Evidence

      Idea: Study just a few cases

      Problem: may not be representative

      (or worse: only considered for this reason)

      e.g. Cures for hiccups

      Key Question: how were data chosen?

      (early medicine: this gave crazy attempts at cures)



Helpful Distinctions

  • Observational Studies:

    B. Sampling

    Idea: Seek sample representative of population

    HW:

    3.1, 3.3, 3.5, 3.7

    Challenge: How to sample?

    (turns out: not easy)



How to sample?

History of Presidential Election Polls

During campaigns, we constantly hear “polls say …” in the news. How good are these polls? Why?

  • Landon vs. Roosevelt (1936)

    Literary Digest Poll: 43% for R

    Result: 62% for R

    What happened?

    Sample size not big enough? 2.4 million

    Biggest Poll ever done (before or since)



Bias in Sampling

Bias: Systematically favoring one outcome

(need to think carefully)

Selection Bias: Addresses from L. D. readers, phone books, club memberships

(representative of population?)

Non-Response Bias: Return-mail survey

(who had time?)



How to sample?

  • Presidential Election (cont.)

    Interesting Alternative Poll:

    Gallup: 56% for R (sample size ~ 50,000)

    Gallup poll of L. D. readers: 44% for R (sample size ~ 3,000)

    Predicted both correct result (62% for R),

    and L. D. error (43% for R)!

    (what was better?)



Improved Sampling

Gallup’s Improvements:

  (i) Personal Interviews

    (attacks non-response bias)

  (ii) Quota Sampling

    (attacks selection bias)



Quota Sampling

Idea: make “sample like population”

So surveyor chooses people to give:

  • Right % male

  • Right % “young”

  • Right % “blue collar”

  • This worked well, until …



How to sample?

  • 1948: Dewey vs. Truman

    Poll       Dewey   Truman   Sample size
    Crossley   50%     45%      -
    Gallup     50%     44%      50,000
    Roper      53%     38%      15,000
    Actual     45%     50%      -

    Note: Embarrassing for the polls; famous photo of Truman holding the headline “Dewey Defeats Truman”



What went wrong?

Problem: Unintentional Bias

(surveyors understood bias,

but still made choices)

Lesson: Human Choice cannot give a Representative Sample

Surprising Improvement: Random Sampling

Now called “scientific sampling”

Random = Scientific???



Random Sampling

Key Idea: “random error” is smaller than “unintentional bias”, for large enough sample sizes

How large?

Current sample sizes: ~1,000 - 3,000

Note: now << 50,000 used in 1948.

So surveys are much cheaper

(thus many more done now….)

