
Part 2: Evaluating your program






Presentation Transcript


  1. Part 2: Evaluating your program
  • You can download this presentation at: http://faculty.smcm.edu/acjohnson/PREP/

  2. Evaluating your program
  • What are your objectives for your program?
  • What data would let you know you’re meeting those objectives?
  • What data would convince your administration to keep funding the program?

  3. Evaluating your program
  • Informal assessments: to let you know that the program is working, and to fine-tune it as you go along
  • Observations of student progress, conversations with students, informal surveys

  4. Evaluating your program
  • Formal assessments
  • Of the first year
  • Longer-term

  5. Evaluating your program
  • The basics:
  • Comparison groups
  • Independent variables
  • Dependent variables

  6. Comparison groups
  • Historical: ESP participants vs. similar students before ESP existed
  • Comparable: ESP participants vs. similar students not in ESP
  • To the norm: ESP participants vs. all non-participants
  • To decliners: ESP participants vs. students who declined an invitation to ESP

  7. Independent variables
  • ESP participation
  • Race
  • Gender
  • Academic preparation (SAT scores; CCI pre-test, where CCI is the Calculus Concept Inventory)
  • Financial need
  • Motivation

  8. Dependent variables
  • CCI post-test scores
  • CCI growth scores
  • Calculus grades: raw data, % receiving an A or B, % failing
  • Enrollment and grades in Calculus II
  • Declaring a SEM major
  • Graduating at all
  • Graduating with a SEM major

  9. Analyzing the data
  • Descriptive statistics: simply compare the performances of the relevant groups
  • Are differences in grades or scores significant? Use independent-samples t-tests.
  • Are differences in the percentage of students doing something (earning As and Bs, graduating) significant? Use a chi-square test.
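The two significance tests named on this slide can be sketched in a few lines of Python with SciPy. All numbers below are invented for illustration; they are not real ESP results.

```python
# Illustration of the two tests from the slide: an independent-samples t-test
# for group differences in grades, and a chi-square test for differences in
# proportions. The data here is made up purely for demonstration.
from scipy.stats import ttest_ind, chi2_contingency

# Hypothetical calculus grades (4-point scale): ESP vs. a comparison group.
esp_grades = [3.7, 3.3, 4.0, 3.0, 3.7, 3.3, 2.7, 3.7]
comparison_grades = [2.7, 3.0, 2.3, 3.3, 2.0, 3.0, 2.7, 2.3]

# Independent-samples t-test: are the mean grades significantly different?
t_stat, t_p = ttest_ind(esp_grades, comparison_grades)
print(f"t = {t_stat:.2f}, p = {t_p:.4f}")

# Chi-square test on a 2x2 table of counts.
# Rows: ESP vs. comparison; columns: graduated vs. did not graduate.
table = [[38, 12],   # ESP: 38 graduated, 12 did not (hypothetical)
         [51, 49]]   # comparison: 51 graduated, 49 did not (hypothetical)
chi2, chi_p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {chi_p:.4f}, dof = {dof}")
```

A low p-value in either test suggests the difference between groups is unlikely to be due to chance; the same tests are available in SPSS, which the later slides mention.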

  10. Analyzing the data
  • Controlling for preparation: divide the data into groups according to some measure of preparation, then compare ESP and non-ESP students within each group
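The stratification idea above can be sketched with standard-library Python alone. The bin edges, student records, and grade scale here are all invented for illustration.

```python
# Sketch of controlling for preparation by stratifying: bin students by
# math SAT score, then compare mean calculus grades of ESP vs. non-ESP
# students within each bin. All data and cut points are hypothetical.
from statistics import mean

students = [
    # (math_sat, esp_participant, calc_grade) -- invented records
    (520, True, 3.0), (540, False, 2.3), (580, True, 3.3),
    (600, False, 2.7), (640, True, 3.7), (660, False, 3.0),
    (700, True, 4.0), (720, False, 3.3),
]

def prep_group(sat):
    """Assign a preparation stratum from the math SAT score (cut points assumed)."""
    if sat < 600:
        return "low"
    elif sat < 680:
        return "medium"
    return "high"

# Collect grades per (stratum, ESP status), then compare means within strata.
groups = {}
for sat, esp, grade in students:
    groups.setdefault((prep_group(sat), esp), []).append(grade)

for stratum in ("low", "medium", "high"):
    esp_mean = mean(groups[(stratum, True)])
    other_mean = mean(groups[(stratum, False)])
    print(f"{stratum}: ESP {esp_mean:.2f} vs. non-ESP {other_mean:.2f}")
```

Comparing within strata keeps a well-prepared ESP cohort from being credited for gains that preparation alone would predict.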

  11. Analyzing the data
  • Controlling for preparation: construct a regression equation using all your available independent variables, and see whether ESP participation is a significant predictor of the dependent variable of interest
  • For a continuous dependent variable, use OLS regression; for a binary one, use logistic regression
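A minimal sketch of the OLS case, using NumPy's least-squares solver on invented data. A real analysis would use a statistics package (e.g. SPSS, as mentioned later, or a library that reports standard errors and p-values); this only shows the mechanics of reading off the ESP coefficient while holding preparation constant. The logistic-regression case for binary outcomes such as graduation is analogous but needs an iterative fitter.

```python
# OLS sketch: predict calculus grade from ESP participation (0/1) and
# math SAT (rescaled to hundreds). All data is hypothetical.
import numpy as np

# Design matrix columns: intercept, ESP participation, math SAT / 100.
X = np.array([
    [1, 1, 5.2], [1, 0, 5.4], [1, 1, 5.8], [1, 0, 6.0],
    [1, 1, 6.4], [1, 0, 6.6], [1, 1, 7.0], [1, 0, 7.2],
])
y = np.array([3.0, 2.3, 3.3, 2.7, 3.7, 3.0, 4.0, 3.3])  # calculus grades

# Least-squares fit: beta holds the intercept, ESP, and SAT coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, esp_coef, sat_coef = beta
print(f"ESP coefficient: {esp_coef:.2f} grade points, holding SAT constant")
```

A positive ESP coefficient would mean participants outperform comparable non-participants with the same SAT score; whether it is *significant* requires the standard errors a full stats package provides.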

  12. Calculus Concept Inventory
  • Pros:
  • This is the gold standard: did the students learn calculus? Did they learn more than other students?
  • This approach (at least the descriptive statistics) can be used for small n
  • Cons:
  • The limited number of test items means the test might not be reliable or valid enough for your comfort
  • Requires access to all calculus students, not just ESP students

  13. Calculus grades, SAT, and GPA data
  • Pros:
  • Lets you control for preparation
  • Administrators like statistical analyses
  • Cons:
  • Someone has to like statistics; you might need SPSS
  • You have to find someone in institutional research willing to give you the data
  • Requires a substantial n

  14. Examples
  • Fullilove & Treisman (1990)
  • Comparison groups: historical (pre-MWP African American students); African American accepters vs. decliners
  • Preparation measures: special admission status; math SAT scores
  • Dependent variables: calculus performance, graduation

  15. Examples
  • Johnson (2007a)
  • Comparison groups (all with a first major in science): White/Asian vs. Black/Latino/American Indian
  • Independent variables: financial need, predicted GPA
  • Dependent variables: graduation with a science/math major, graduating GPA

  16. Expanding your program
  • Evidence that matriculation-to-graduation programs produce even bigger benefits:
  • Johnson (2007a)
  • Maton, Hrabowski & Schmitt (2000)
  • Maton & Hrabowski (2004)
  • Gándara (1999)

