
How to Effectively and Efficiently Conduct and Use Annual Assessment






Presentation Transcript


  1. How to Effectively and Efficiently Conduct and Use Annual Assessment
     Director, Office of Academic Program Assessment (OAPA): Dr. Amy Liu, Professor of Sociology
     Assessment Consultants, OAPA: Dr. Jacqueline Brooks, Professor of Sociology; Dr. Chia-Jung Chung, Professor of Education; Dr. Milica Markovic, Professor of Engineering
     http://www.csus.edu/programassessment/
     California State University, Sacramento, Spring 2019

  2. Outline
     1. The Annual Assessment Process, OAPA, and Your Program
     2. 2017-2018 Feedback Report and Appendices as Resources: How to Use the Feedback Reports to Improve Learning & Success
     3. 2018-2019 Assessment Template: Simple, Clear, and Useful; Report One Outcome in Detail: How to Answer the Open-Ended Assessment Questions

  3. Office of Academic Program Assessment (OAPA)
     • Run by faculty to provide guidance and feedback on program assessment
     • Helps Academic Affairs with program review, WASC, and academic strategic planning and resource allocation
     • Helps programs make annual assessment simple, clear, and useful (high quality), connect to the university strategic plan, and assess the program strategic plan

  4. How does OAPA use annual assessment?
     • Provides feedback to programs/departments to improve student learning and success, annual assessment, program review, and strategic planning and resource allocation
     • Supports successful completion of university and department accreditations (WASC, ABET, AACSB, or CCTC)
     • Informs professional development workshops and FLCs
     • Assesses the strategic plan and uses the action plan for improvement

  5. Outline
     1. The Annual Assessment Process, OAPA, and Your Program
     2. 2017-2018 Feedback Report and Appendices as Resources: How to Use the Feedback Reports to Improve Learning & Success
     3. 2018-2019 Assessment Template: Simple, Clear, and Useful; Report One Outcome in Detail: How to Answer the Open-Ended Assessment Questions

  6. Assessment Principles: "Backward Design" for Assessment (Outcomes & Definitions) (Q2.2)
     Assessment is an ongoing, interactive process using backward design and five basic principles:
     • Define learning goals: outcomes & definitions; curriculum map
     • Decide on assessments: evidence of learning outcomes
     • Design instruction: help students achieve outcomes

  7. Appendices 8-9: Baccalaureate Learning Goals, Graduate Learning Goals, and AAC&U VALUE Rubrics (Simple)
     The table below shows how the BLGs and GLGs relate to each other and to the AAC&U VALUE rubrics.

  8. Baccalaureate Learning Goals, Graduate Learning Goals, AAC&U VALUE Rubrics, DQP, and Bloom's Taxonomy (Detailed)

  9. Appendix 3: WSCUC "Rubric for Assessing the Quality of Academic Program Learning Outcomes" http://www.wascsenior.org/search/site/Rubrics%20combined

  10. Outline
     1. The Annual Assessment Process, OAPA, and Your Program
     2. 2017-2018 Feedback Report and Appendices as Resources: How to Use the Feedback Reports to Improve Learning & Success
     3. 2018-2019 Assessment Template: Simple, Clear, and Useful; Report One Outcome in Detail: How to Answer the Open-Ended Assessment Questions

  11. Make 2018-2019 Assessment Simple, Clear, and Useful

  12. Section 1 – Report all outcomes assessed in 2018-2019
     What do you want your students to know, to do, and to value?

  13. Baccalaureate Learning Goals, Graduate Learning Goals, AAC&U VALUE Rubrics, DQP, and Bloom’s Taxonomy (Detailed)

  14. Think-Pair-Share Circle all the program learning outcomes assessed in your program. Where does each PLO belong in the BLG and GLG? Where does each PLO belong in the detailed curriculum map?

  15. Think-Pair-Share How is that outcome practiced in your detailed curriculum map? Are your students struggling with the thesis or critical thinking? If yes, what do you plan to do?

  16. Table 5: Sociology Evidence Map at the Program Level: Other than GPA, What Data Are Used to Measure PLOs

  17. An Example: The Curriculum Map for the Sociology Graduate Program: Aligning (Linking) Graduate Program Learning Outcomes to Each Course in the Curriculum. "I" stands for "Introduced", "D" for "Developed", and "M" for "Mastered".
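  As an illustration only, a curriculum map like this can be represented as a simple outcome-by-course table. The PLO labels and course numbers below are hypothetical placeholders, not the program's actual map:

  ```python
  # Hypothetical curriculum map: rows are PLOs, columns are courses,
  # cells are "I" (Introduced), "D" (Developed), or "M" (Mastered).
  curriculum_map = {
      "PLO 1": {"SOC 200A": "I", "SOC 200B": "D", "SOC 500": "M"},
      "PLO 6": {"SOC 200A": "I", "SOC 200B": "D", "SOC 500": "M"},
  }

  # In which course(s) is PLO 6 mastered?
  mastered_in = [c for c, code in curriculum_map["PLO 6"].items() if code == "M"]
  print(mastered_in)  # ['SOC 500']
  ```

  Keeping the map as data like this makes it easy to answer alignment questions (where is each PLO introduced, developed, and mastered?) when filling in the detailed curriculum map.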

  18. The Detailed Curriculum/Evidence Map for the Sociology Graduate Program: Aligning (Linking) Graduate Program Learning Outcomes to Key Assignments in Each Course in the Curriculum

  19. An Example: The Curriculum Map for a Graduate Program: Aligning (Linking) Graduate Program Learning Outcomes to Each Course in the Curriculum. "I" stands for "Introduced", "D" for "Developed", and "M" for "Mastered".

  20. Section 2 – Report one learning outcome in detail

  21. Outline for one PLO:
     1. The PLO and Its Definition (Q2.1 and Q2.1.1)
     2. The Rubric(s) and Standard(s) of Performance/Expectations (Q2.2a)
     3. The Direct Measure(s) (Q3.3.2)
     4. The Data Table(s) (Q4.1)
     5. The Assessment Plan = Action Plan (Q5.1.1-Q5.1.2)

  22. PLO 6: Critical Thinking in the Sociology Graduate Program
     Sociology graduate students will demonstrate "a habit of systematically exploring social issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion" (Learning Goal/Outcome). They will (PLO 6: Critical Thinking, adapted from the VALUE rubric):
     6.1: Clearly identify and state the social issue/problem that needs to be considered critically, comprehensively describe it, and deliver all relevant information necessary for a full understanding of the issue/problem (Explanation of issues);
     6.2: Thoroughly interpret and evaluate sociological perspectives, theories, social methods/statistics, and any other current, credible, and relevant information (research, knowledge, tools, and/or views) to complete a thorough review of the relevant literature and problems and to develop a comprehensive analysis or synthesis (Evidence);
     6.3: Thoroughly analyze their own and others' assumptions and carefully evaluate the relevance of contexts when presenting a position (Influence of context and assumptions);
     6.4: Consider the complexities (all sides) of a social issue; limits of their position and of others' points of view are acknowledged and synthesized within the position (Student's position);
     6.5: Form conclusions, consequences, and implications that are logical and reflect the student's informed evaluation and ability to place the evidence and perspectives discussed in priority order (Conclusions and related outcomes).

  23. Outline for one PLO:
     1. The PLO and Its Definition (Q2.1 and Q2.1.1)
     2. The Rubric(s) and Standard(s) of Performance/Expectations (Q2.2a)
     3. The Direct Measure(s) (Q3.3.2)
     4. The Data Table(s) (Q4.1)
     5. The Assessment Plan = Action Plan (Q5.1.1-Q5.1.2)

  24. The VALUE Rubric for the Critical Thinking Skill (continued on next slide)

  25. The VALUE Rubric for the Critical Thinking Skill
     An example of the program standard of performance for the Critical Thinking PLO: Eighty percent (80%) of our graduate students should achieve a score of at least 3 in all dimensions of the above rubric (e.g., in Soc. 200B) and 3.5 by the time of graduation (Soc. 500). The program standard of performance helps programs identify how well students perform within and across the program learning outcome (PLO).

  26. Define Standard of Performance (Plan step 3, Expectations: define the standard of performance)
     Answer to Q2.2a (Standards of Performance/Expectations): Eighty percent (80%) of our students will score 3.0 or above in all five dimensions of the VALUE rubric (by the time they graduate).

  27. Outline for one PLO:
     1. The PLO and Its Definition (Q2.1 and Q2.1.1)
     2. The Rubric(s) and Standard(s) of Performance/Expectations (Q2.2a)
     3. The Direct Measure(s) (Q3.3.2)
     4. The Data Table(s) (Q4.1)
     5. The Assessment Plan = Action Plan (Q5.1.1-Q5.1.2)

  28. Attach the assignment instructions that students received (Plan step 4, Methods and Measures: assignments, tests, projects)
     Questions from the 2018-19 Annual Assessment Report Template:
     Q3.3.1. Which of the following direct measures (key assignments, projects, portfolios, course work, student tests, etc.) were used? [Check all that apply]
     1. Capstone projects (including theses, senior theses), courses, or experiences
     2. Key assignments from required classes in the program
     3. Key assignments from elective classes
     4. Classroom-based performance assessments such as simulations, comprehensive exams, critiques
     5. External performance assessments such as internships or other community-based projects
     6. E-Portfolios
     7. Other portfolios
     8. Other measure. Specify:
     Q3.3.2. Please attach the assignment instructions that the students received to complete the assignment (Appendix I).

  29. Example assignment description
     Answer to Q3.3.2: The key assignment for the iMET program assessment is the Action Research Report. iMET used this Action Research Report (Master's Thesis), included in an ePortfolio, as its direct measure to assess its critical thinking program learning outcome. This culminating experience report (the master's thesis) includes the following tasks:
     1. Designing and implementing a study using data collection tools that will allow the students to "show" the reader what happened during and as a result of the intervention.
     2. Sorting through the findings after collecting the data, looking for data that reveal information pertinent to the study.
     3. Looking for relationships (patterns) between the data. These patterns emerge from a variety of sources, such as things that have happened, things that students have observed, things that people have said, and things that students have measured. These are the findings (conclusions) of the study.

  30. Is the annual assessment useful to you? What do you want your students to know, to do, and to value? Are your students struggling with the thesis or critical thinking? If yes, what do you plan to do?

  31. Outline for one PLO:
     1. The PLO and Its Definition (Q2.1 and Q2.1.1)
     2. The Rubric(s) and Standard(s) of Performance/Expectations (Q2.2a)
     3. The Direct Measure(s) (Q3.3.2)
     4. The Data Table(s) (Q4.1)
     5. The Assessment Plan = Action Plan (Q5.1.1-Q5.1.2)

  32. Think-Pair-Share: Data Presentation What does the data table look like? What should be included in the data table?

  33. Describe results (Report step 1, Results: data tables, findings, conclusions)
     Questions from the 2018-19 Annual Assessment Report Template:
     Q4.1. Please provide tables and/or graphs to summarize the assessment data, findings, and conclusions for the selected PLO in Q2.1 (see Appendix 12 in our Feedback Packet Example). Please do NOT include student names and other confidential information. This is going to be a PUBLIC document.
     Q4.3. For the selected PLO, the student performance:
     1. Exceeded expectations/standards
     2. Met expectations/standards
     3. Partially met expectations/standards
     4. Did not meet expectations/standards
     5. No expectations or standards have been specified
     6. Don't know

  34. Example Data Table (Report step 1, Results: data tables, findings, conclusions)

  35. Example Data Table
     6.1: 38 + 54 = 92% achieving 3.0 or higher.
     6.2: 15 + 40 = 55% not achieving 3.0 or higher.
     6.3: 15 + 41 = 56% not achieving 3.0 or higher.
     6.4: 23 + 54 = 77% achieving 3.0 or higher.
     6.5: 15 + 55 = 70% achieving 3.0 or higher.

  36. Q2: Standards of Performance/Expectations: Seventy percent (70%) of our students will score 3.0 or above in all five dimensions of the VALUE rubric (by the time they graduate from the four-semester program).

  37. Results: Numerical Conclusions (Report step 1, Results: data tables, findings, conclusions)
     Q2 Standard of Performance/Expectations: Seventy percent (70%) of our students will score 3.0 or above in all five dimensions of the VALUE rubric (by the time they graduate from the four-semester program).
     Summary conclusion for Q4.2: Students meet the standard on dimensions 6.1 (92%), 6.4 (77%), and 6.5 (70%). Students do not meet the standard on 6.2 (61%) and 6.3 (61%). The two areas needing improvement: 6.2 Evidence (61%) and 6.3 Influence of context and assumptions (61%).
     Student Performance (Q4.3): 3. Partially met the standards.

  38. Data Collection Sheet for Each Student
     Q2: Standards of Performance/Expectations: Seventy percent (70%) of our students will score 3.0 or above in all five dimensions of the VALUE rubric (by the time they graduate from the four-semester program).

  39. Question 4 (Q4): An Example of a Data Collection Sheet for Each Student/Assignment
     Reference: Your data tables are based on the rubric and the data collection sheet. The moment your rubric is developed, you also have a data collection sheet! The following table is an example of a data collection sheet for a student:
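  To make the step from collection sheets to data table concrete, here is a minimal sketch in Python. The student scores below are made up for illustration; only the five dimension labels and the 3.0 threshold come from the rubric and standard above:

  ```python
  # Hypothetical rubric scores (1-4) for five students on each of the five
  # Critical Thinking dimensions (6.1-6.5), as gathered from per-student
  # data collection sheets. These numbers are illustrative, not real data.
  scores = {
      "6.1": [4, 3, 3, 4, 2],
      "6.2": [3, 2, 2, 3, 2],
      "6.3": [2, 3, 2, 2, 3],
      "6.4": [4, 3, 3, 2, 3],
      "6.5": [3, 3, 4, 2, 3],
  }

  def percent_meeting(dim_scores, threshold=3.0):
      """Percent of students scoring at or above the threshold."""
      meeting = sum(1 for s in dim_scores if s >= threshold)
      return round(100 * meeting / len(dim_scores))

  # The data table for Q4.1: percent at 3.0 or higher per dimension.
  data_table = {dim: percent_meeting(vals) for dim, vals in scores.items()}
  print(data_table)  # {'6.1': 80, '6.2': 40, '6.3': 40, '6.4': 80, '6.5': 80}
  ```

  The point of the sketch is the slide's claim: once the rubric (and thus the collection sheet) exists, the data table is a mechanical tally over it.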

  40. Results: Conclusions (Report step 1, Results: data tables, findings, conclusions)
     Answer for Q4.2: Based on the assessment of our selected Critical Thinking PLO and our identified program standard of performance (70% of students should achieve a score of 3 or higher in all dimensions of the Critical Thinking Rubric), the table above shows that students meet the criteria for 6.1 (92%), 6.4 (77%), and 6.5 (70%), but do not meet the criteria for 6.2 (61%) and 6.3 (61%). Because students meet only some of our program standards for the critical thinking skill, they "Partially Met Program Standards." Two areas need improvement: 1) Criterion 6.2: Evidence (61%), and 2) Criterion 6.3: Influence of context and assumptions (61%).
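  The comparison against the standard can be sketched the same way. This minimal illustration hard-codes the per-dimension percentages reported above and the 70% standard, then maps the result onto the Q4.3 response categories:

  ```python
  # Per-dimension percentages from the example data table above, and the
  # program standard: 70% of students at 3.0 or above in all five dimensions.
  results = {"6.1": 92, "6.2": 61, "6.3": 61, "6.4": 77, "6.5": 70}
  STANDARD = 70

  met = {d for d, pct in results.items() if pct >= STANDARD}
  not_met = set(results) - met

  # Map onto the Q4.3 categories: all dimensions met -> "Met", some met ->
  # "Partially met", none met -> "Did not meet".
  if not not_met:
      verdict = "Met expectations/standards"
  elif met:
      verdict = "Partially met expectations/standards"
  else:
      verdict = "Did not meet expectations/standards"

  print(sorted(met), sorted(not_met), verdict)
  # ['6.1', '6.4', '6.5'] ['6.2', '6.3'] Partially met expectations/standards
  ```

  This reproduces the slide's conclusion: 6.1, 6.4, and 6.5 meet the standard, 6.2 and 6.3 do not, so the PLO is "Partially met."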

  41. Outline for one PLO:
     1. The PLO and Its Definition (Q2.1 and Q2.1.1)
     2. The Rubric(s) and Standard(s) of Performance/Expectations (Q2.2a)
     3. The Direct Measure(s) (Q3.3.2)
     4. The Data Table(s) (Q4.1)
     5. The Assessment Plan = Action Plan (Q5.1.1-Q5.1.2)

  42. Think-Pair-Share: Data Analysis and Presentation
     • How useful are the data?
     • Can you use the data to improve student learning and success?
     • What issues are you struggling with?
     • Are you collecting data to address those issues?
     • Have you used the data to promote your program?

  43. Question 5.2: Use of Assessment Data (Report step 2, Updated Assessment Action Plan: using assessment data, closing the loop)

  44. Question 5.1.1: Use of Assessment Data (Report step 2, Updated Assessment Action Plan: using assessment data, closing the loop)
     Question from the 2018-19 Annual Assessment Report Template:
     Q5.1.1. Please describe what changes you plan to make in your program as a result of your assessment of this PLO. Include a description of how you plan to assess the impact of these changes.
     Answer to Q5.1.1: To help students in our program successfully become critical thinking researchers, we will 1) design more classroom activities and assignments on re-examining evidence (6.2) and context and assumptions (6.3) in research, and 2) require students to apply these skills as they compose comprehensive responses for all their assignments.

  45. Thank you!

  46. A Simple Example 1 Educational Technology (iMET), MA (Example of a graduate-level Intellectual skills PLO with multiple dimensions)

  47. A Simple Example 2 Chemistry, BS/BA (Example of an Undergraduate-level Disciplinary Competence PLO)

  48. Table 5: Sociology Evidence Map at the Program Level: Other than GPA, What Data Are Used to Measure PLOs

  49. How Programs Have Used and/or Can Use Assessment Data (Q5.1, Q1.1)
     The Office of Academic Program Assessment, Preliminary Data, 2014
