Using Action Research to Ensure Relevance and Excellence

Presentation Transcript


  1. Using Action Research to Ensure Relevance and Excellence Student Success Symposium Dan Friedman, Ph.D. Director, University 101 Programs University of South Carolina

  2. Does it work? • Academic achievement – GPA and hours earned • Retention/persistence to second year • Utilization of campus resources (Paul Fidler)

  3. A movement created: 87% of institutions have a FYS. Padgett & Keup (2011), 2009 National Survey of First-Year Seminars.

  4. High-Impact Educational Practices (AAC&U)

  5. My Focus for Today Using assessment/action research to demonstrate the value of our programs, and to continually improve what we do by understanding why our programs work and for whom.

  6. FAITH-BASED? • “Estimates of college quality are essentially ‘faith-based,’ insofar as we have little direct evidence of how any given school contributes to students' learning.” • Richard Hersh (2005). What does college teach? Atlantic Monthly.

  7. Assessment Cycle: 1) Identify Outcomes → 2) Gather Evidence → 3) Interpret Evidence → 4) Implement Change. Maki, P. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus Publishing.

  8. Easy Stuff!

  9. Friedman, D. (2012). Assessing the first-year seminar.

  10. Two Types of Assessment 1) Summative – used to make a judgment about the efficacy of a program 2) Formative – used to provide feedback in order to foster improvement.

  11. The Prescription • Relevance (doing the right things) • Excellence (doing things right)

  12. Astin’s Value-Added I-E-O Model: Inputs (Students) → Environments (College) → Outputs. “Outputs must always be evaluated in terms of inputs.” Astin, A. (1991).
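A minimal sketch of what the I-E-O adjustment looks like in practice (not Astin's own analysis): assuming a hypothetical student-level file with an input measure (predicted_gpa), an environment flag (took_fys), and an output (outcome_gpa), the environment effect is read off only after the input is controlled.

```python
# I-E-O sketch: estimate the environment (E) effect on an output (O)
# only after adjusting for inputs (I). All file and column names here
# are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("students.csv")  # hypothetical student data file

# Naive E-O comparison (one of the "common mistakes" below):
# compares raw means and ignores who enrolled in the seminar.
print(df.groupby("took_fys")["outcome_gpa"].mean())

# Value-added model: the took_fys coefficient is the seminar effect
# net of what the input (predicted GPA) already explains.
model = smf.ols("outcome_gpa ~ predicted_gpa + took_fys", data=df).fit()
print(model.params["took_fys"], model.pvalues["took_fys"])
```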

  13. Common Mistakes: Just looking at inputs. [I-E-O diagram]

  14. Common Mistakes: Just looking at the environment. [I-E-O diagram]

  15. Common Mistakes: Just looking at outcomes. [I-E-O diagram]

  16. Common Mistakes: E-O only (no control for inputs). [I-E-O diagram]

  17. Astin’s Value-Added I-E-O Model: Inputs (Students) → Environments (College) → Outputs. “Outputs must always be evaluated in terms of inputs.” Astin, A. (1991).

  18. Disaggregating the Inputs

  19. What does this tell us?

  20. What does this tell us?

  21. Positive Impact on Graduation

  22. Need to Disaggregate • Disaggregate data by input variables • Predicted GPA • SAT/ACT • High School grades • Race/Ethnicity • Family Income (Pell eligible) • First Generation
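A short disaggregation sketch (hypothetical pandas code with made-up column names such as retained, first_gen, and pell_eligible): the point is that a single overall rate can mask very different results across the subgroups listed above.

```python
# Disaggregation sketch: break one overall outcome rate out by the
# input variables of interest. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("students.csv")  # hypothetical student data file

# Overall retention rate: one number, no story.
print(df["retained"].mean())

# Disaggregated rates: same outcome, broken out by each input variable.
for col in ["first_gen", "pell_eligible", "race_ethnicity"]:
    print(df.groupby(col)["retained"].agg(["mean", "count"]))
```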

  23. Disaggregating Environmental Variables: High-Impact Practice → How to do it WELL

  24. Which factors predict persistence? • Used the FYI (First-Year Initiative survey) data set and included variables from the student data file (persistence and GPA) • 2,014 responses (72% response rate) • A series of logistic regressions were conducted • Controlled for gender, race, and high school grades • A standard deviation increase in Sense of Belonging & Acceptance increased the odds of persisting into the second year by 38% (p < .001), holding all other variables constant.
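A sketch of the kind of logistic regression the slide describes, with hypothetical file and column names; the reported effect is the exponentiated coefficient on a standardized predictor, e.g. exp(0.32) ≈ 1.38, i.e. 38% higher odds of persisting per standard deviation of Sense of Belonging.

```python
# Logistic-regression sketch (hypothetical data and column names):
# persistence (0/1) regressed on a standardized survey factor,
# controlling for gender, race, and high school grades.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fyi_merged.csv")  # hypothetical FYI + student-file merge

# Standardize the factor so its coefficient reads "per SD increase".
df["belonging_z"] = (df["belonging"] - df["belonging"].mean()) / df["belonging"].std()

res = smf.logit("persisted ~ belonging_z + C(gender) + C(race) + hs_gpa",
                data=df).fit()

# Odds ratio per SD = exp(beta); e.g. exp(0.32) ~= 1.38 -> 38% higher odds.
print(np.exp(res.params["belonging_z"]))
```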

  25. Assessing Educational Methods Compare methods to determine if one approach is better than another

  26. Continual Improvement • Identifying and replicating best practices

  27. Structural Variables: Fact or Crap? [FACT/CRAP cards]

  28. Fact or Crap? Student Affairs professionals had higher ratings on overall course effectiveness than other instructors.

  29. No statistically significant differences were found on any of the fifteen FYI factors or course evaluation factors for Division of Student Affairs employees versus non-division employees. CRAP

  30. Fact or Crap? Sections that met 3 days a week (MWF) had significantly higher course effectiveness ratings than sections that met twice a week.

  31. CRAP. [Chart: Overall Course Satisfaction by Days Per Week; data from fall 2011 course evaluations]

  32. Fact or Crap? Sections with a Peer Leader had significantly higher course effectiveness ratings than sections without a Peer Leader.

  33. FACT. [Chart: Overall Course Satisfaction by Peer Leader Status; data from fall 2011 course evaluations]
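The peer-leader comparison could be checked with a simple two-group test; a sketch assuming a hypothetical course-evaluation export with a per-section satisfaction score and a peer-leader flag (not the actual fall 2011 analysis).

```python
# Two-group comparison sketch (hypothetical file and column names):
# is mean course satisfaction different for sections with a peer leader?
import pandas as pd
from scipy import stats

evals = pd.read_csv("fall2011_evals.csv")  # hypothetical evaluation export

with_pl = evals.loc[evals["has_peer_leader"] == 1, "overall_satisfaction"]
without_pl = evals.loc[evals["has_peer_leader"] == 0, "overall_satisfaction"]

# Welch's t-test: does not assume equal variances across section types.
t, p = stats.ttest_ind(with_pl, without_pl, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```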

  34. “You can’t fatten a pig without weighing it.” Need to use assessment data to drive continual improvement.

  35. Contact Information Dan Friedman friedman@sc.edu University 101 Programs 1728 College Street Columbia, South Carolina 29208 (803) 777-6029 www.sc.edu/univ101
