
WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment






Presentation Transcript


  1. WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment 2013-2014 Institutional Research & Effectiveness Neil M. Patel, Ph.D. Juan Ramirez, Ph.D.

  2. Meeting Roadmap • The goals are to understand • Why assessment needs to take place • Who should be involved in assessment • What needs to be assessed • How to assess the learning outcomes • When assessment reports are due

  3. Assessment Overview • Why assess? • Accountability • To measure learning • To identify challenges related to instruction, curriculum, or assignments. • To improve learning • Methods must be in place to properly assess • Information should be shared widely and used to inform decision-making • Key Players • Deans, Faculty, Curriculum committees, Assessment committees, Assessment Specialists, Preceptors

  4. What needs to be assessed?

  5. What needs to be assessed? (cont.):We cannot assess everything! • Direct assessment of Signature Assignments • Signature assignments have the potential to help us know whether student learning reflects “the ways of thinking and doing of disciplinary experts” • Course-embedded assessment • Aligned with LO’s • Authentic in terms of process/content, “real world application” • Indirect assessment, i.e., Student perceptions • First year survey • Graduating survey • Alumni surveys • Student evaluation of course

  6. ILO Assessment Template

  7. Assessment Template • Timeline • Section I: Progress Report • Section II: Learning Outcome Alignment • Section III.1: Methodology • Section IV.1: Results • Section V.1: Discussion & Implications • Section III.2: Methodology • Section IV.2: Results • Section V.2: Discussion & Implications

  8. Assessment Template • Timeline • Section I: Progress Report • Section II: Learning Outcome Alignment • Section III.1: Methodology • Section IV.1: Results • Section V.1: Discussion & Implications • Section III.2: Methodology • Section IV.2: Results • Section V.2: Discussion & Implications

  9. Section I: Progress Report • Goal: To document what occurred as a result of 2012-2013 assessment.

  10. Section II: Learning Outcome Alignment • Goal: To determine which PLOs align with the ILO and, over time, which PLOs are not being assessed.

  11. Section III: Methodology • It will be necessary to copy and paste Sections III-V if more than two assessments are completed. • Every ILO report needs to include one direct and one indirect assessment; multiple assessments may be necessary to cover ALL PLOs.

  12. Section III: Methodology

  13. Section III: Methodology

  14. Section III: Methodology • Note: The Participation section refers to participation in the assessment process, not to participation in developing the student work.

  15. Section IV: Results • Analytical approach • Should align with the assessment goal • To determine how many students are achieving at a specific level/score: frequency distribution • To determine whether scores differ between two or more groups: chi-square, t-test, or ANOVA • To determine whether scores on one assignment predict scores on another: regression • Sample size: number of students assessed • Statistical results: frequency table, p value, etc.
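The frequency-distribution approach above can be sketched in a few lines of Python. This is a minimal illustration using made-up rubric scores (the levels, values, and proficiency threshold are assumptions, not program data):

```python
from collections import Counter

# Hypothetical rubric scores (levels 0-4) for 20 students;
# the values below are illustrative only.
scores = [3, 4, 2, 3, 4, 4, 3, 2, 1, 3, 4, 3, 3, 2, 4, 3, 4, 2, 3, 4]

# Frequency distribution: how many students achieved each level.
freq = Counter(scores)
n = len(scores)
for level in sorted(freq):
    print(f"Level {level}: {freq[level]} students ({freq[level] / n:.0%})")

# Share of students at or above a target level
# (here, level 3 is assumed to mean "proficient").
at_or_above = sum(1 for s in scores if s >= 3) / n
print(f"At or above level 3: {at_or_above:.0%}")
```

For group comparisons (chi-square, t-test, ANOVA) or regression, a statistics package such as `scipy.stats` would typically be used on the same kind of score lists.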

  16. Section V: Discussion & Implications

  17. Example

  18. Example Scenario: Following a discussion among the faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Critical Thinking would be assessed using 4th year preceptor evaluations. Question: What do we need to do?

  19. Example: 4th year preceptor evaluations to assess Critical Thinking • Things to consider: • Which PLO(s) are assessed? • How is the assessment scored? • Who has the data? • What is/are the quantifiable assessment goals? • Standards of success • How do we analyze the data?

  20. Example: 4th year preceptor evaluations to assess Critical Thinking • Assessment: The preceptor evaluation of students occurs at various time points within the 4th year rotations. For the purpose of assessment, the program has decided to use each student’s entire set of 4th year preceptor evaluations (eight evaluations in total). The preceptors are asked to indicate, using a Yes/No format, whether a student has been observed demonstrating certain skills or displaying certain knowledge elements; there are 20 items in total on the evaluation form. These elements are commonly displayed within the profession. The data are sent directly to the 4th year Director. To assess Critical Thinking, a single item within the checklist is used: “The student utilizes and displays critical thinking.”

  21. Example: 4th year preceptor evaluations to assess Critical Thinking • Assessment Goal: 90% of students will demonstrate critical thinking skills. • Why 90%? • A peer or aspirational college has a similar standard • The professional community suggests such a standard • Our own data have set the standard • The assessment goal is different from grading • For grading, passing = 70% (14 of 20 items; “Yes” = 1 point) • It is possible for all students to pass while scoring 0 on the Critical Thinking item.

  22. Averaged data of 4th year preceptor evaluations assessing Critical Thinking, per student. CT Score: 0 = no, 1 = yes
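The per-student averaging described above can be sketched as follows. Student IDs, ratings, and the 0.5 "demonstrated" threshold are illustrative assumptions, not the program's actual data or rule:

```python
# Hypothetical per-student CT ratings across eight 4th-year
# preceptor evaluations (1 = yes, 0 = no).
ct_ratings = {
    "S01": [1, 1, 1, 1, 1, 1, 1, 1],
    "S02": [1, 0, 1, 1, 1, 1, 1, 1],
    "S03": [0, 0, 1, 0, 1, 0, 0, 1],
    "S04": [1, 1, 1, 1, 0, 1, 1, 1],
}

# Average the eight evaluations into one CT score per student.
avg = {sid: sum(r) / len(r) for sid, r in ct_ratings.items()}

# Count a student as demonstrating CT if observed on most evaluations
# (the 0.5 cutoff here is an assumption for illustration).
demonstrated = [sid for sid, a in avg.items() if a >= 0.5]
pct = len(demonstrated) / len(ct_ratings)
print(f"{pct:.0%} of students demonstrated CT (goal: 90%)")
```

Comparing `pct` against the 90% goal is what Section IV's results table would report.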

  23. Example: Section III.1 Methodology

  24. Example: Section III.1 Methodology

  25. Example: Section IV.1 Results

  26. Example: Section V.1 Discussion & Implications

  27. “You can see a lot by just looking.” (Yogi Berra) CT Score: 0 = no, 1 = yes; Gender: 1 = male, 2 = female

  28. Group Activity

  29. Timeline

  30. CAPE Workshops Spring 2014 • Measurable Student Learning Outcomes • Tuesday, January 14 at 12pm • Curricular Mapping • Tuesday, February 11 at 12pm • Operationalizing and assessing WesternU ILOs • Tuesday, March 4 at 12pm • Developing Valid and Reliable Rubrics • Tuesday, April 8 at 12pm • Basic Techniques in Presenting Data • Tuesday, May 6 at 12pm • Closing the Loop • Tuesday June 10 at 12pm

  31. Questions? Concerns? Institutional Learning Outcomes Assessment information can be found on the IRE website: http://www.westernu.edu/ire-assessment-home
