
Assessing Student Learning Outcomes in Student Development – Part II



Presentation Transcript


  1. Assessing Student Learning Outcomes in Student Development – Part II Student Development Division Retreat, SUNY Oneonta, May 27, 2008

  2. Presenter Patty Francis, Associate Provost for Institutional Assessment and Effectiveness

  3. Topics/Activities for Today • Continued themes from May 9 division meeting • Developing a unit assessment plan • Writing good outcome statements • Measuring student learning outcomes • Hands-on opportunity to develop outcome statements/measures for department

  4. Continued Themes from May 9 Meeting

  5. Assessment Basics and Rationale • Assessment as an ongoing, iterative process that must culminate in “closing the loop” • Importance of congruence in the assessment process • Assessment as an opportunity for ongoing dialogue, professional development and, most important, improving student services

  6. Developing a Unit Assessment Plan

  7. Overall Unit Objectives Would Reflect: • Institutional effectiveness performance indicators • Documentation of all services and programs offered • Tracking of use of services (and by whom) • Student satisfaction with services/programs • Direct impact of services/programs on students

  8. Writing Good Outcome Statements

  9. Issues to Address at the Outset • Consistency between college, division, and unit mission statements (and, ultimately, outcome statements) • Do all staff have the opportunity to provide input? • Who are all your constituents? • What results do you expect your programs and services to produce?

  10. Components of a Good Outcome Statement • Who is the target? • What domain of student development is the target? • What change is expected?

  11. But Other Things to Keep in Mind • Do you have a program/activity in place to bring about the outcome? • Can the desired change in students be measured? • How will you know you were successful? • Do external standards apply (e.g., in the case of external accreditation/certification)?

  12. Outcome Writing Rule #1: Focus on Student, Not Program Good example: “Student workers will identify, provide, and implement technical equipment that is appropriate for specific union events.” Poor example: “College union work study program will teach student workers how to select and set up equipment for union events.”

  13. Outcome Writing Rule #2: State Outcomes Using Concrete Language and Action Verbs Good example: “Students will negotiate necessary academic accommodations with faculty members.” Poor example: “Our objective is to enhance students’ independence and self-confidence.”

  14. Outcome Writing Rule #3: Focus on Results, Not Process Good example: “Students will demonstrate increased job search skills (e.g., letter and resume writing, interviewing, employer research).” Poor example: “Students will participate in three 2-hour sessions on letter writing, interviewing, and employer research.”

  15. Measuring Student Learning Outcomes

  16. Basic Principles • Measurement techniques must be rigorous, since unreliable data are of minimal value • Best to use a variety of quantitative and qualitative measures • Quantitative measures are easier to collect, but often not as rich • Qualitative measures are often more informative, but require a check on scoring reliability (e.g., rubrics) • Rely as much as possible on existing data sources

  17. Types of Information to Include • Survey data: SUNY SOS, NSSE, local surveys of student satisfaction/perceptions • National benchmarking data (e.g., ACUHO/EBI, ACUI/EBI) • Performance-based data • Focus groups Draw on: 1. Normative/benchmark information whenever possible (external sources, own performance over time) 2. Locally collected data

  18. General Types of Measures (Maki, 2004) • Direct measures – students actually demonstrate learning so that evaluators can match results to expectations; includes standardized tests and authentic, performance-based measures embedded in students’ actual educational context • Indirect measures – students’ perceptions of learning; should never be used as the sole indicator

  19. From Measures to Criteria • Assessment criteria reflect your expectations about student performance (i.e., how you will know you were successful) • Set criteria at reasonable but challenging levels • Criteria often take these forms: “90% of students will…” or “80% of students will score at least 70% on…” (a worked sketch follows below)
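
  A minimal sketch, with invented data, of how a unit might check the second type of criterion above. The Python below and the hypothetical score list are illustrative only, not part of the presentation:

      # Check the criterion "80% of students will score at least 70%."
      # Scores are invented for illustration.
      scores = [85, 72, 64, 90, 78, 55, 70, 88, 93, 67]

      passing = [s for s in scores if s >= 70]    # students meeting the 70% cutoff
      attainment = len(passing) / len(scores)     # proportion meeting it

      print(f"Attainment: {attainment:.0%} (criterion: 80%)")
      print("Criterion met" if attainment >= 0.80 else "Criterion not met")

  Here 7 of 10 students meet the 70% cutoff, so attainment is 70% and the 80% criterion is not met.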

  20. Authentic, Performance-Based Assessment: The Value of Rubrics • Rubrics provide a reliable means of rating student performance in a more qualitative way • Steps in developing rubrics: • Use the entire staff to help develop the rubric, and be as specific as possible in differentiating between performance levels • Use existing rubrics as a guide • Pilot test to ensure scoring is reliable (see the sketch below)
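
  A minimal sketch of the pilot-test step, assuming two hypothetical raters who scored the same eight student performances on a 1–4 rubric scale (ratings invented; Python used for illustration). Simple percent agreement is shown; a more formal statistic such as Cohen's kappa could be substituted:

      # Pilot-test reliability check: percent agreement between two raters
      # applying the same rubric. Ratings are invented for illustration.
      rater_a = [3, 4, 2, 3, 1, 4, 3, 2]
      rater_b = [3, 4, 2, 2, 1, 4, 3, 3]

      matches = sum(a == b for a, b in zip(rater_a, rater_b))
      agreement = matches / len(rater_a)

      print(f"Exact agreement: {agreement:.0%}")  # here, 6 of 8 = 75%

  Low agreement would suggest making the rubric's performance-level descriptions more specific before full use.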

  21. Rubric Example #1: Recreation & Sport Services (Univ. of W. Florida)

  22. Rubric Example #2: Career Services (Interviewing) (Univ. of W. Florida)

  23. Congruence in the Assessment Process: A Detailed Example Handout #1

  24. Your Turn! Handout #2
