
Periodic Program Review


Presentation Transcript


  1. Periodic Program Review Guiding Programs in Today’s Assessment Climate LaMont Rouse Executive Director of Assessment, Accreditation & Compliance

  2. Goals of the Presentation • Describe the periodic program review process; • Focus on PPR and Middle States; • Review core elements of the PPR process; • Discuss the new era of accountability in higher education; • Define what “good” assessment is; • Document the assessment of student learning; • Close the loop: use data to inform decisions and resource allocation; • Q&A

  3. Purpose of Program Review Strengthen and Validate Programs • What are the intended learning outcomes? • Do students know the intended learning outcomes of the program? Does the public? • Are students achieving these outcomes? • How do we prove it? • How can we strengthen and support programs?

  4. Purpose of Program Review Align Resources to Data and Outcomes • What does the data tell us? • How can we improve student learning? • How can we improve teaching effectiveness? • How do we take the program to the next level? • How does the program support Cedar Crest’s mission?

  5. Purpose of Program Review Middle States Commission on Higher Education • Demonstrates a model for institutional effectiveness; • MSCHE considers program review part of the new era of accountability; • Colleges must begin to report regularly to their various constituents (students, alumni, employees, employers, the federal government, the state, etc.) • The demands for accountability are growing stronger, not weaker.

  6. New Era of Accountability The Big Picture • PPR will strengthen programs; • PPR will lead to sustained enrollment growth at the college; • PPR will help align the College with the best practices in higher education; • PPR will allow Cedar Crest to tell its story in a more cohesive fashion.

  7. Key Features of PPR Helps programs continuously update their: • Mission statements • Student learning outcomes • Curriculum maps

  8. Key Features of PPR Provides the following: data on student learning outcomes; a quantitative record of accomplishments; and findings that ultimately suggest the next steps in the maturation of the program.

  9. Key Features of PPR Provides the following: • Strengthens programs by seeking alignment with the best practices in the field. • Helps align resources to stated needs. • Helps faculty and administrators see trends in a more coherent fashion.

  10. The Role of Chairs • Assign duties as needed to complete the program review report. • Work with the Office of Assessment and the Assessment Committee to ensure the PPR schedule is maintained. • Notify the Office of Assessment when issues arise.

  11. The Role of Chairs • Best Practices • Review with faculty the program’s mission statement and student learning outcomes before the full review begins. • Read through prior years’ assessment reports. Ensure that all SLOs have been assessed within the last 3 years. • Create a central location to collect data.

  12. The Role of Chairs • Best Practices • Build trust • Open communication • Stay on schedule

  13. The Role of Chairs/Timeline • Phase 1 (September-October) • Attend the program review orientation sponsored by the Office of Assessment. • Gather assessment reports from the last 3 to 5 years (as appropriate). • Assign appropriate roles for the completion of the task.

  14. The Role of Chairs/Timeline • Phase 2 (November-January) • Start placing documents into the appropriate section of the binder. • Contact Institutional Research & the Office of Assessment for any additional data. • Complete Sections 1, 2, 4, 8, and 10.

  15. The Role of Chairs/Timeline • Phase 3 (February-April) • Provide an update to the Office of Assessment and the Assessment Committee. • By the first week of February, complete all data-gathering activities. • Write and complete the remaining sections of the report.

  16. The Role of Chairs/Timeline • Phase 4 (May-July) • Complete the report and submit it to the Office of Assessment and the Assessment Committee. • Make suggestions that will support your program through the next cycle. • Make suggestions (if any) for curricular changes and modifications. • Meet with the Assessment Committee to review the report.

  17. Questions

  18. What is “Good” Assessment? Start with clear statements of the most important things you want students to learn from the course, program, or curriculum. Teach what you are assessing. Help students learn the skills needed to do the assessment task. For example, if you are giving a writing assignment, help your students understand how you define good writing and give them feedback on drafts. Because each assessment technique is imperfect and has inherent strengths and weaknesses, collect more than one kind of evidence of what students have learned.

  19. What is “Good” Assessment? Make assignments and test questions crystal clear. Write them so that all students will interpret them in the same way and know exactly what they are expected to do. Before creating an assignment, write a rubric: a list of the key things you want students to learn by completing the assignment and to demonstrate on the completed assignment. Likewise, before writing test questions, create a test “blueprint”: a list of the key learning goals to be assessed by the test and the number of points or questions to be devoted to each learning goal.

  20. What is “Good” Assessment? Collect enough evidence to get a representative sample of what your students have learned and can do. Collect a sufficiently large sample that you will be able to use the results with confidence to make decisions about your course, program, or curriculum. Score student work fairly and consistently. Before scoring begins, have a clear understanding of the characteristics of meritorious, satisfactory, and inadequate papers. Then use a rubric to help score assignments, papers, projects, etc., consistently. Use assessment results appropriately. Never base any important decision on only one assessment. (Failure to adhere to this maxim is one of the major shortcomings of many high-stakes testing programs.) Assessments shouldn’t make decisions for us or dictate what we should teach; they should only advise us as we use our professional judgment to make suitable decisions.

  21. What is “Good” Assessment? • Features • Each program should have 4 to 6 primary student learning outcomes. • Each program should describe at least one direct and one indirect method for capturing data for each SLO. • Every program should be assessing at least one or two of its SLOs each year.

  22. Documenting Assessment • Data should be gathered from rubrics, in-house tests, and external instruments such as surveys. • A plan for collecting this data must be established, along with roles and responsibilities. • Data should be shared across the department. Transparency is key.

  23. Documenting Assessment • Use templates as much as possible when reporting data. • Gather a little bit of data each year. • Strategically map your assessment plan between PPRs. • Find balance between gathering “good enough” evidence and a sustainable system.

  24. Closing the Loop/Meaningful Modifications Closing the Loop: A step in an institutional effectiveness or assessment cycle. Many people think of closing the loop as a final step; it is, however, a step that is used to begin the cycle again, based on an analysis of what has been accomplished and learned up to that point. At Cedar Crest, assessment measures are used to determine program strengths and challenges. Analysis follows and leads to some decision about improving the program or continuing the program without change. It is that decision that serves to close the loop.

  25. Closing the Loop/Meaningful Modifications What do the findings tell us now? Did our actions improve learning? What else do the findings show? What’s the next step? What have we learned about our assessment process? What can be improved?

  26. Closing the Loop/Meaningful Modifications • Must document closing-the-loop actions. • Must allocate resources, in part, based on these closing-the-loop actions. • Must demonstrate that we are a “learning” and adaptive organization.

  27. Thank you.
