
Assessing a First-Year Seminar


Presentation Transcript


  1. Assessing a First-Year Seminar FYS Leadership Institute Columbia, SC April 2012

  2. FAITH-BASED? • “Estimates of college quality are essentially 'faith-based,' insofar as we have little direct evidence of how any given school contributes to students' learning.” • Richard Hersch (2005), Atlantic Monthly

  3. Assessment Defined • Any effort to gather, analyze, and interpret evidence which describes program effectiveness. • Upcraft and Schuh, 1996 • An ongoing process aimed at understanding and improving _______. • Thomas Angelo

  4. Assessment Cycle: 1) Identify Outcomes → 2) Gather Evidence → 3) Interpret Evidence → 4) Implement Change. Maki, P. (2004).

  5. Easy Stuff!

  6. Friedman, D. (2012).

  7. Two Types of Assessment 1) Summative – used to make a judgment about the efficacy of a program. 2) Formative – used to provide feedback in order to foster improvement.

  8. Word of Caution Assessment only allows us to make inferences about our programs, not to draw absolute truths.

  9. Multiple Lenses of Assessment • Criterion referenced • Peer referenced • Longitudinal • Value added Suskie, L. (2004)

  10. Hypothetical Scenario • The average score on an end-of-course writing assessment was 80. • How’d we do? • NEED MORE INFORMATION! • Need a lens to help us make a judgment.

  11. Lens 1: Criterion Referenced (e.g., placement tests, exit exams, externally mandated assessments) Key Question: How did students do against a pre-determined standard? Example: • 80% of students scored above 80, the minimum threshold for proficiency.

  12. Lens 2: Peer Referenced (benchmarking) Key Question: How do we compare with our peers? • Gives a sense of relative standing. Example: • 80% of students were proficient. • 90% were proficient in our peer group.

  13. Lens 3: Longitudinal Key Question: Are we getting better? Example: • 80% of students were proficient • But 3 years ago, only 60% were proficient • Showed great improvement. • Is that due to our efforts? • Maybe we just admitted better students!

  14. Lens 4: Value Added Key Question: Are our students improving? Example: • Proficiency level of the freshman class during the first week was 60%. At the end of the semester, 80% of the same cohort were proficient.
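
  The four lenses can be made concrete with a small worked sketch in Python. Everything in it is hypothetical and not part of the presentation: the score list, the proficiency cutoff of 80, the 70% criterion standard, the 90% peer benchmark, and the prior-year and week-one rates are invented to echo the numbers used on the slides.

```python
# Hypothetical sketch: applying the four assessment lenses to one proficiency rate.
# All numbers are invented to mirror the slide examples; nothing here is real data.

def proficiency_rate(scores, cutoff=80):
    """Share of students scoring at or above the proficiency cutoff."""
    return sum(s >= cutoff for s in scores) / len(scores)

# Hypothetical end-of-course writing assessment scores for this year's cohort.
scores = [85, 92, 78, 81, 88, 84, 90, 83, 95, 79]
rate = proficiency_rate(scores)  # 0.80 with these invented scores

# Lens 1: Criterion referenced -- compare against a pre-determined standard.
print(f"Criterion: {rate:.0%} proficient (hypothetical standard: 70% must be proficient)")

# Lens 2: Peer referenced -- compare against a (hypothetical) peer-group benchmark.
peer_rate = 0.90
print(f"Peer: us {rate:.0%} vs. peers {peer_rate:.0%}")

# Lens 3: Longitudinal -- compare against our own (hypothetical) rate three years ago.
rate_three_years_ago = 0.60
print(f"Longitudinal: {rate_three_years_ago:.0%} three years ago -> {rate:.0%} now")

# Lens 4: Value added -- compare the same cohort at the start and end of the semester.
week_one_rate = 0.60
print(f"Value added: {week_one_rate:.0%} in week one -> {rate:.0%} at semester's end")
```

  The point of the sketch is that the same 80% proficiency rate supports four different judgments depending on which lens supplies the comparison.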

  15. Astin’s Value-Added I-E-O Model: Inputs (I) → Environments (E) → Outcomes (O). “outputs must always be evaluated in terms of inputs” Astin, A. (1991)

  16. Common Mistakes • Just looking at inputs (I only)

  17. Common Mistakes • Just looking at environment (E only)

  18. Common Mistakes • Just looking at outcomes (O only)

  19. Common Mistakes • E-O only (no control for inputs)

  20. Summary of Value Added • Outputs must always be evaluated in terms of inputs • Only way to “know” the impact an environment (treatment) had on an outcome.
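
  Slides 15-20 can also be illustrated numerically. The simulation below is a hypothetical sketch, not part of the presentation: it invents pretest (Input), seminar participation (Environment), and posttest (Outcome) data in which stronger students are more likely to enroll in the seminar, then contrasts the E-O-only mistake with a regression that controls for the input.

```python
# Hypothetical I-E-O sketch (not from the presentation): simulate students with a
# pretest (Input), seminar participation (Environment), and posttest (Outcome),
# then contrast an E-O-only comparison with an estimate that controls for inputs.
import numpy as np

rng = np.random.default_rng(0)
n = 500

pretest = rng.normal(60, 10, n)                                   # Input: incoming ability
# Suppose stronger students are more likely to enroll in the seminar (selection).
seminar = (pretest + rng.normal(0, 10, n) > 60).astype(float)     # Environment
true_effect = 3.0
posttest = pretest + true_effect * seminar + rng.normal(0, 5, n)  # Outcome

# Mistake: E-O only (no control for inputs) -- raw difference in outcomes.
raw_diff = posttest[seminar == 1].mean() - posttest[seminar == 0].mean()

# I-E-O: regress the outcome on both the input and the environment.
X = np.column_stack([np.ones(n), pretest, seminar])
coef, *_ = np.linalg.lstsq(X, posttest, rcond=None)
adjusted_effect = coef[2]

print(f"E-O only estimate of seminar effect: {raw_diff:.1f} points (inflated by selection)")
print(f"Input-adjusted estimate:             {adjusted_effect:.1f} points (near the simulated 3.0)")
```

  Because enrollment is correlated with incoming ability in this simulation, the raw E-O gap overstates the seminar's contribution; only the input-adjusted estimate lands near the simulated effect of 3 points.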

  21. Methods to Assess Outcomes • Indirect • Proxy measures that stand for the construct of interest; Self-reported data • Direct • Demonstration of abilities, information, knowledge, etc. as the result of participation in a program or utilization of a service

  22. Indirect Measure • An indirect measure is something a student might tell you he or she has gained, learned, experienced, etc. • Aka: self-reported data • Ex: surveys, interviews, focus groups, etc.

  23. Indirect Assessment Methods • Examples • Satisfaction measures • Program evaluations • Self-ratings of skills • Self-assessment of change • Agreement with statements • Inventories • Informal peer-to-peer conversations • Use existing data to the greatest extent possible

  24. Survey Examples for Indirect Measures • End-of-course evaluation (local) • College Student Experiences Questionnaire (CSEQ) • National Survey of Student Engagement (NSSE) • Community College Survey of Student Engagement (CCSSE) • Your First College Year (YFCY) • First-Year Initiative Survey (FYI) http://nrc.fye.sc.edu/resources/survey/search/index.php

  25. Qualitative Examples for Indirect Measures • Interviews • Focus groups • Secret shopper • Advisory council

  26. Direct Measures • A direct measure is tangible evidence about a student’s ability, performance, experience, etc. • Ex: performances (papers), common assignments, tests, etc.

  27. Ways to assess direct measures • Course embedded (essays, assignments) • Portfolios (electronic or hard copy) • Writing sample at beginning of course vs. end of course • Pre- and post-testing on locally developed tests (of knowledge or skills) • National tests • http://www.sc.edu/fye/resources/assessment/typology.html

  28. Rubrics
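
  One common way to turn course-embedded work into a direct measure is to score it with a rubric and aggregate the results. The sketch below is purely illustrative and assumes a hypothetical four-criterion rubric on a 1-4 scale with an invented proficiency rule; none of it comes from the presentation.

```python
# Hypothetical rubric-scoring sketch (criteria, scale, and ratings are invented).
# Each essay is rated 1-4 on each criterion; a student counts as proficient
# if their average rubric score is at least 3.

CRITERIA = ["thesis", "evidence", "organization", "mechanics"]

# Ratings for a few hypothetical students: criterion -> score on a 1-4 scale.
ratings = {
    "student_01": {"thesis": 3, "evidence": 4, "organization": 3, "mechanics": 2},
    "student_02": {"thesis": 2, "evidence": 2, "organization": 3, "mechanics": 3},
    "student_03": {"thesis": 4, "evidence": 3, "organization": 4, "mechanics": 4},
}

def average_score(scores):
    """Mean rubric score across all criteria for one student."""
    return sum(scores[c] for c in CRITERIA) / len(CRITERIA)

proficient = [s for s, scores in ratings.items() if average_score(scores) >= 3]
rate = len(proficient) / len(ratings)

print(f"Proficient students: {proficient}")
print(f"Proficiency rate: {rate:.0%}")
```

  The same aggregation feeds any of the four lenses: the resulting proficiency rate can be judged against a criterion, a peer benchmark, prior years, or a week-one baseline.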

  29. Caution! • Be careful not to overextend by directly measuring multiple outcomes each year. • Indirectly assess all outcomes each year, but limit direct measures to 1-3 per year.

  30. Factors to consider when deciding which outcome to directly measure • What matters most to overall effectiveness? • Which outcomes would position us well politically? • What’s doable? • What’s usable? • Which outcomes will set us up for a successful first effort?

  31. Challenges with the Value-Added Approach • Motivation (for direct measures): How do we ensure students take assessment seriously? Is there a hook? • Is growth due to our interventions? How do you control for all the variables that could influence the outcomes?

  32. Best Practices
