
College of Health Sciences Assessment Workshop

Presentation Transcript


  1. College of Health Sciences Assessment Workshop
     Dr. Marsha Watson, Director of Assessment
     Dr. Kenny Royal, Assistant Director of Measurement & Analysis
     Dr. Julie F. Johnson, Assessment Specialist

  2. The Provost’s LEARNING Initiative: Dual Track Implementation Strategy

  3. Six Fundamental Questions
  • How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students?
  • What evidence do you have that students achieve your stated learning outcomes?
  • In what ways do you analyze and use evidence of student learning?
  • How do you ensure shared responsibility for student learning and for assessment of student learning?
  • How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning?
  • In what ways do you inform the public and other stakeholders about what and how well your students are learning?

  4. Review of Assessment Basics
  • University assessment
    • Campus-wide assessment of student learning at the program level (e.g., General Education)
    • University assessment is the primary charge of the Office of Assessment
  • University assessment is separate and distinct from evaluation of teaching effectiveness
    • Evaluation of teaching effectiveness is the responsibility of departments/colleges
  • Assessment data are analyzed and reported only in the aggregate (see the sketch below)
  • You can’t assess everything all the time!
    • Plan for assessment that is practical, given current time and resource constraints
    • Assess 1 or 2 outcomes per year
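To make “reported only in the aggregate” concrete, here is a minimal, hypothetical sketch of aggregate-level reporting: per-outcome means and proficiency rates, with no student or instructor identifiers attached. The outcome names, scores, and the 3-or-higher proficiency threshold are invented for illustration and are not part of the workshop materials.

```python
from statistics import mean

# Hypothetical rubric scores (1-4) keyed by learning outcome; student
# identifiers are never attached to anything that is reported.
scores = {
    "Critical thinking": [3, 4, 2, 3, 4, 3],
    "Written communication": [2, 3, 3, 2, 4, 3],
}

for outcome, vals in scores.items():
    # Proportion of artifacts scoring at or above the (assumed) proficiency bar of 3.
    pct_proficient = 100 * sum(v >= 3 for v in vals) / len(vals)
    print(f"{outcome}: mean {mean(vals):.2f}, {pct_proficient:.0f}% scoring 3 or higher")
```

Because only outcome-level summaries leave the script, individual students and instructors cannot be singled out in the resulting report.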

  5. Review of Assessment Basics
  • Assessment vs. evaluation
    • Assessment requires us to “take a step back” from the interaction between student and teacher
    • Grades are evaluations and are generally not used for assessment
  • Team approach to evaluation
    • Essentially a juried assessment, in that more than one individual scores/evaluates the work
    • A periodic, objective validation process of some kind is required to ensure validity and reliability (one such check is sketched below)
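As one illustration of a periodic validation step for juried scoring, the sketch below computes Cohen’s kappa, a standard chance-corrected measure of agreement between two raters scoring the same artifacts. This is a minimal example under the assumption of two raters and a categorical rubric scale; the scores are hypothetical, and the workshop itself does not prescribe this particular statistic.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same artifacts."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of artifacts given the same rubric score.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's score distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[s] * freq_b[s] for s in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical rubric scores (1-4) from two faculty raters for ten portfolios.
rater_a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
rater_b = [3, 4, 2, 2, 3, 1, 4, 3, 3, 4]
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")  # ~0.71 here
```

A kappa near 1 indicates agreement well beyond chance; a low kappa is a signal to recalibrate the rubric or retrain raters before trusting the juried scores.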

  6. Review of Assessment Basics
  • Three levels of assessment
    • Course
    • Program
      • Undergraduate majors/programs
      • General education program
      • Graduate majors/programs
    • Institutional
  • Course, program, and institutional outcomes should be aligned, but are not identical

  7. Review: Program-Level Assessment
  • Focused on curricular and environmental improvement
  • Formative and summative, direct and indirect methods
  • Curriculum mapping, program improvement

  8. Review: Program Outcomes
  • Focus on broad skills developed over time
    • Not restricted to a single course or learning experience
  • Demonstrate acquisition of specific disciplinary/professional knowledge and skills necessary after graduation
    • Ask: “What makes a graduate of the program able to function and learn in a specific discipline/profession after the degree?”
  • Measurable
  • Confirmable through evidence

  9. Measuring Learning Outcomes
  • Measures must be appropriate to outcomes
  • Avoid cumbersome data-gathering
  • Use both direct and indirect methods
    • Indirect methods measure a proxy for student learning
    • Direct methods measure actual student learning
  • “Learning” = what students know (content knowledge) + what they can do with what they know

  10. Defining Evidence
  • Information that tells you something, directly or indirectly, about the topic of interest
  • Evidence is neutral: neither “good” nor “bad”
    • Requires context to be meaningful
  • Two types of assessment evidence
    • Direct (“authentic”) and indirect
  • Best practice calls for multiple methods

  11. Direct Evidence
  • Students show achievement of learning goals through performance of knowledge and skills:
    • Scores and pass rates on licensure/certification exams
    • Capstone experiences
    • Individual research projects, presentations, performances
    • Collaborative (group) projects/papers that tackle complex problems
    • Score gains between entry and exit
    • Ratings of skills provided by internship/clinical supervisors
    • Substantial course assignments that require performance of learning
    • Portfolios

  12. Indirect Evidence
  • Indirect methods measure proxies for learning
    • Data from which you can make inferences about learning but that do not demonstrate actual learning, such as perception or comparison data
  • Surveys
    • Student opinion/engagement surveys
    • Student ratings of their own knowledge and skills
    • Employer and alumni surveys, national and local
  • Focus groups/exit interviews
  • Course grades
  • Institutional performance indicators
    • Enrollment data
    • Retention rates, placement data
    • Graduate/professional school acceptance rates

  13. Mapping Outcomes for Program-Level Assessment
  • Create a visual map:
    • Lay out program courses and learning outcomes (competencies) on a grid
    • Refer to the examples in the handouts
  • Identify the courses in which each competency is:
    • Introduced
    • Reinforced
    • Emphasized

  14. Basic Program Map Template
  • I = Outcome is introduced; baseline, formative assessment
  • R = Outcome is reinforced; formative assessment
  • E = Outcome is emphasized; summative assessment
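The map template itself was distributed as a handout and does not survive in this transcript. As a stand-in, the sketch below encodes a small, entirely hypothetical program map and checks two properties the I/R/E convention implies: each outcome should be introduced before it is reinforced or emphasized, and each should be emphasized (summatively assessed) somewhere in the sequence. The course numbers and outcome count are invented.

```python
# Hypothetical program map: courses in curricular order; each cell marks
# where an outcome is Introduced (I), Reinforced (R), or Emphasized (E).
PROGRAM_MAP = {
    # course    Outcome 1  Outcome 2  Outcome 3
    "HSC 101": ("I",       "I",       None),
    "HSC 210": ("R",       "R",       "I"),
    "HSC 320": ("R",       None,      "R"),
    "HSC 490": ("E",       "E",       "E"),  # capstone: summative assessment
}

def check_map(program_map):
    """Warn when an outcome is reinforced/emphasized before being introduced,
    or is never emphasized (i.e., never summatively assessed)."""
    n_outcomes = len(next(iter(program_map.values())))
    for i in range(n_outcomes):
        marks = [row[i] for row in program_map.values() if row[i] is not None]
        if not marks or marks[0] != "I":
            print(f"Outcome {i + 1}: appears before it is introduced")
        if "E" not in marks:
            print(f"Outcome {i + 1}: is never emphasized (no summative assessment)")

check_map(PROGRAM_MAP)  # prints nothing when the map is well-formed
```

Filling in a grid like this quickly exposes gaps: an outcome that is introduced but never emphasized, for example, has no summative evidence behind it.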

  15. Finding Evidence: An Evidence Inventory
  • Lets you discover the evidence you already have, such as:
    • Institutional Research data
    • Student Life data
    • Exit surveys (seniors)
    • Alumni surveys
  • Start with the obvious … but don’t stop there

  16. Finding Evidence: Perils and Pitfalls
  • Institutional history
    • “We’ve already done that, and it didn’t tell us anything!”
  • Territory; politics
    • Fighting for scant resources
  • Institutional policy/culture about sharing information
    • “I don’t want somebody ‘policing’ my classrooms!”

  17. Fundamental Question #1: Appropriate Evidence
  • Does the evidence address student learning issues appropriate to the institution?
  • Does the evidence tell you something about how well the institution is accomplishing its mission and goals?
  • The questions you have about student learning should guide your choice of appropriate existing evidence and identify gaps where a new type of evidence might be needed

  18. Assisting Academic Departments: Addressing Common Barriers
  • “This is a lot of work!”
    • Use some sort of evidence inventory to help faculty understand how existing academic practices yield evidence
    • Keep expectations reasonable, given limited time and resources
    • Remember: it is not necessary to gather all the evidence all of the time

  19. Assisting Academic Departments: Addressing Common Barriers
  • “How do I know you won’t use this against me?”
    • Be consistent and firm in the message that assessment is not faculty evaluation and that results will be reported only in the aggregate
    • Remember: assessment results will link to the allocation of resources, ideally through the strategic planning process

  20. Completing an Assessment Cycle
  • Assessment is only a means to an end
  • The purpose of assessment is continuous improvement of student learning
  • The assessment cycle is complete when assessment results have been used successfully for evidence-based decision making

  21. The Assessment Cycle
  • Articulate expectations in the form of student learning outcomes
  • Measure achievement of expectations
  • Collect and analyze data
  • Use evidence to improve learning
  • Assess the effectiveness of the improvement

  22. Unit Assessment Plans
  • Unit Assessment Plan Template (handout)
  • Use this template as a foundation for your unit assessment plan, revising and reshaping as necessary
