
Analyzing CCSSE Data Over Time


Presentation Transcript


  1. Analyzing CCSSE Data Over Time • TAIR 2013, Galveston, TX • February 11, 2013 • E. Michael Bohlig, Ph.D. • Sr. Research Associate • CCCSE

  2. CCSSE – A brief history of the survey • Center established in 2001 • Survey item development and revisions through 2004 • Main survey has not changed since 2005 • Special-focus items addressing issues of current importance available since 2006 • Since 2011, a special item set focusing on Promising Practices

  3. Source of information for evidence-based decision-making to improve student engagement and outcomes • Once decisions are made and programs are in place, what next? • Have your efforts paid off? • Using CCSSE to help answer this question

  4. Measuring Change – before you start • Implementation – early considerations • Are goals achievable within a reasonable timeframe? • Are they measurable? • Focus on things the college and staff can directly impact • Patience • Implementation dip (Fullan, 2001) • It can take up to 3 years to see a positive impact • Implementation Fidelity • CCSSE can't measure this, but observed changes could be spurious if implementation is not monitored

  5. Planning Your Longitudinal Analysis

  6. The Data – Where To Begin • At least three administrations since 2005 • Benchmarks • CCSSE website: standardized benchmark scores are not appropriate for longitudinal analysis • Standardized scores are re-standardized every year based on the 3-year cohort • Raw benchmark scores are available in the downloaded data file • Scale: 0 to 1

  7. Review: How Benchmarks Are Created • 1. Convert benchmark items to a 0-1 scale • Items used in benchmarks do not all have the same scale • Subtract 1 and divide by the number of response categories minus 1 (e.g., a response of 4 on a 4-point item becomes [4 – 1]/3 = 1) • 2. Calculate the average of the 0-1 scaled variables in a benchmark • This is the raw benchmark score • 3. Using the full 3-year cohort, standardized scores with a mean of 50 and standard deviation of 25 are created for each student
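
A minimal sketch of steps 1 and 2 in pandas, using made-up item names, scales, and responses (actual CCSSE variable names come from the downloaded data file):

import pandas as pd

# Hypothetical respondent-level data; item names and response values are invented.
df = pd.DataFrame({
    "item_a": [1, 2, 3, 4],   # 4-point item (e.g., 1 = Never ... 4 = Very often)
    "item_b": [1, 2, 2, 3],   # 3-point item
})

# Step 1: rescale each item to 0-1: (response - 1) / (number of categories - 1)
scaled = pd.DataFrame({
    "item_a": (df["item_a"] - 1) / (4 - 1),
    "item_b": (df["item_b"] - 1) / (3 - 1),
})

# Step 2: the raw benchmark score is the mean of a respondent's scaled items
raw_benchmark = scaled.mean(axis=1, skipna=True)
print(raw_benchmark)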

  8. Standardized vs. Raw Benchmark Scores

  9. Standardized vs. Raw Benchmark Scores
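
To see why the two score types can tell different stories over time: assuming the standardization in step 3 is the usual linear transformation to a mean of 50 and standard deviation of 25 against each year's 3-year cohort, the same raw score maps to different standardized values whenever the cohort's mean or spread shifts. The cohort statistics below are invented for illustration:

# Illustrative only: the cohort means and SDs are made up, and the 50/25
# linear transformation is assumed from the description in step 3 above.

def standardize(raw, cohort_mean, cohort_sd):
    """Map a raw (0-1) benchmark score onto the mean-50 / SD-25 scale."""
    return 50 + 25 * (raw - cohort_mean) / cohort_sd

raw_score = 0.55  # a college's raw benchmark score, identical in both years

print(standardize(raw_score, cohort_mean=0.52, cohort_sd=0.10))  # about 57.5
print(standardize(raw_score, cohort_mean=0.55, cohort_sd=0.09))  # 50.0

The raw scores, by contrast, stay on the same 0-1 scale in every administration, which is why they are the ones used when looking at change over time.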

  10. Driving Benchmark Scores - Means

  11. Driving Benchmark Scores - Frequencies • Q9 items: percent reporting Some or Very little • Q13 items: percent reporting Rarely/Never
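
A sketch of that frequency view for a single hypothetical Q9-style item; the column name and the 1-to-4 coding are assumptions, not actual CCSSE codes:

import pandas as pd

# Hypothetical responses to one item (1 = Very little ... 4 = Very much).
df = pd.DataFrame({"q9a": [1, 2, 2, 3, 4, 1, 3, 2]})

low = df["q9a"].isin([1, 2])        # "Very little" or "Some"
pct_low = 100 * low.mean()          # mean of a boolean = proportion of respondents
print(f"{pct_low:.1f}% reported Some or Very little")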

  12. Longitudinal Analysis - Benchmarks
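
One plausible benchmark-level comparison, assuming student-level raw (0-1) benchmark scores pooled from two administrations; the column names and values are hypothetical, and Welch's t-test is only one reasonable choice:

import pandas as pd
from scipy import stats

# Hypothetical student-level raw benchmark scores from two administrations.
df = pd.DataFrame({
    "year":      [2008] * 4 + [2012] * 4,
    "raw_bench": [0.48, 0.52, 0.50, 0.55, 0.56, 0.58, 0.53, 0.60],
})

# Descriptives by administration year
print(df.groupby("year")["raw_bench"].agg(["mean", "std", "count"]))

# Welch's t-test between the earliest and latest administrations
a = df.loc[df["year"] == 2008, "raw_bench"]
b = df.loc[df["year"] == 2012, "raw_bench"]
t, p = stats.ttest_ind(a, b, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")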

  13. Longitudinal Analysis: Item-level

  14. Longitudinal Analysis: Item-level
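
At the item level, one option is to compare the full response distribution on an item across administrations with a chi-square test; the data and column names below are made up for illustration:

import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical responses to one item across two administrations.
df = pd.DataFrame({
    "year": [2008] * 6 + [2012] * 6,
    "q9a":  [1, 1, 2, 2, 3, 4, 2, 3, 3, 4, 4, 1],
})

table = pd.crosstab(df["year"], df["q9a"])      # years x response categories
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")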

  15. Extending this work

  16. Additional work with CCSSE data • AIR Forum 2013 – an extension of this presentation in a discussion forum • Center work • Initiative on Promising Practices • Strengthening the Role of Part-Time Faculty • Improving outcomes for Men of Color • Validation of the SENSE survey • Revalidation of the CCSSE survey • What can you do at your colleges?

  17. TAIR 2013, Galveston, TX • E. Michael Bohlig • Sr. Research Associate • Center for Community College Student Engagement • bohlig@cccse.org
