
Increasing faculty investment in program review and program assessment: a view from the trenches



Presentation Transcript


  1. Increasing faculty investment in program review and program assessment: a view from the trenches. CIHE Assessment Forum, December 6, 2012

  2. Increasing faculty investment in program review and program assessment

  3. Committee Composition • Internal Members • External Voices

  4. Committee Dynamic I: Academic Sparring . . . or:

  5. Committee Dynamic II: Deliberative Dialogue?

  6. From Status Determination to Continuous Improvement

  7. So we applied for and received a grant from the Davis Educational Foundation to improve program review and program assessment in the VSC.

  8. Our plan was to . . . • Create a steering committee • Educate ourselves about best practices in PR and PA • Research local faculty attitudes/perceptions • Develop resources to support faculty conducting reviews • Improve the process at PR meetings

  9. But then, faculty began speaking. • Many hated the PR process. • They saw it as risky and potentially punitive. • Thus, reports were guarded; those attending PR meetings often were defensive. • Faculty of accredited programs, particularly, resented the additional work.

  10. In addition, we found other problems.
  • Who wrote the reports?
  • Who knew what they said?
  • Who knew the outcome of the PR process?
  • How were the reports used
    • by the faculty?
    • by the president and dean?

  11. So, what was the point? Why were we doing all this?

  12. It became clear that past PR efforts had few if any benefits; faculty engaged in them only from a “compliance” point of view.

  13. So we revised our objectives, raised our aspirations, and redirected the grant. • We wanted faculty to see that these activities could benefit their programs and students. • For that, we needed to remove the punitive elements of the policy. • Indeed, if at all possible, we needed to sever the connection to the old policy. • So we sought to change and rename the policy. • And we did.

  14. Our first win: the low-hanging fruit
  • The Board agreed that accredited programs should not have to undergo the PR process (11/4/10).
  • Importantly, this convinced many faculty
    • that our commitment to positive change was serious and
    • that the Board was willing to listen to faculty.

  15. Then we moved on to the harder work of changing the policy as it affects the rest of the academic programs.

  16. This required surgery.
  • Eliminating a much-disputed cost-revenue analysis
  • Eliminating the requirement that faculty address such issues as
    • “competitive advantages and disadvantages,”
    • institutional recruiting strategies,
    • etc.

  17. More surgery • And, most important, eliminating the stipulation that PR could result in Board decisions that included—Gasp!—“Termination.” But we needed to do this . . .

  18. . . . while beefing up honest self-evaluation of program effectiveness.

  19. Finally, April 28, 2011: Program Review and Continuous Improvement Process (PReCIP) adopted.

  20. What was the process of our two-year effort?

  21. The process. • Ten day-long steering committee (SC) retreats • Fall of year 1 was devoted to learning how faculty viewed the process and what was needed to improve it. • Each college held faculty focus group meetings. • We administered a faculty survey.

  22. The process, continued. • Took steps to change Board policy • With chancellor’s support, advanced the “modest proposal” regarding accredited programs. • SC developed proposal for a comprehensive policy change as well as substantial changes to the self-study “template.”

  23. The process, continued.
  • In year 2, work shifted to increasing faculty expertise in PA. This took many forms.
  • VTC’s dean and department chairs read Walvoord’s Assessment Clear and Simple; chairs took turns facilitating discussions of each chapter.
  • JSC’s dean focused several of the chairs’ meetings on assessment; chairs took turns presenting and getting feedback on their educational objectives, assessment plans, and resulting data.

  24. The process, continued. • Castleton and CCV faculty attended NEEAN workshops and conferences. • JSC’s SC members prepared an RFP, inviting faculty to request support for an assessment-related project. • Several colleges brought in consultants, e.g., Peggy Maki and Martha Stassen.

  25. The process, continued. • May 2012: system-wide retreat held for faculty of programs that were scheduled for 2013 review. • Five SC faculty developed an on-line PR and PA manual to support faculty writing self-studies and working on program assessment.

  26. VERMONT STATE COLLEGES ASSESSMENT GUIDE RELATED TO PROGRAM REVIEW AND CONTINUOUS IMPROVEMENT PROCESS (PReCIP) REPORTS: TABLE OF CONTENTS
  • PURPOSE AND USE OF THIS GUIDE
  • BASICS OF ASSESSMENT
  • THINGS TO KNOW BEFORE YOU START
  • TROUBLESHOOTING PROBLEMS YOU MIGHT ANTICIPATE
  • ASSORTED BEST PRACTICES AND WISDOM
  • GLOSSARY OF TERMS AND INDEX
  • APPENDIX 1 – ILLUSTRATIVE EXAMPLES AND GUIDANCE

  27. The process, continued. • VSC deans developed statements regarding the desired qualities of outside members and their role.

  28. The process, continued. • The SC developed three instruments to evaluate PReCIP products and processes.

  29. The role of steering committee faculty • We were very fortunate to have selected outstanding faculty for the steering committee. • All were highly respected. • Most were tenured senior faculty; but the SC also included a few early-/mid-career faculty. • They represented a broad range of disciplines. • They had a broad range of experience with assessment. • All were willing to play leadership roles among their peers.

  30. The role of steering committee faculty, continued • At most colleges, SC faculty became the principal spokespersons and public advocates for PR/PA and its importance/value. • Most became coaches/mentors of faculty who were going through the process. • CCV’s faculty member periodically writes articles on assessment for the dean’s monthly newsletter. • All worked closely with their deans to assess how the process was being conducted and to seek ways to improve it.

  31. The role of the deans
  • Often in a role secondary to SC faculty, deans help explain the new policy and process.
  • Often with the assistance of SC faculty, deans help faculty develop expertise in program assessment.
  • Deans obtain essential resources.
  • Deans play a critical role in raising institutional awareness of the significance of PR and PA at the college. Toward these ends:
    • JSC’s strategic plan now includes a priority related to the continuous improvement of academic programs.
    • In what is also an annual budget meeting, each VTC academic department presents its educational objectives and outcomes to the president’s Cabinet.

  32. The faculty experience: Rounds 1 and 1.5

  33. Lyndon’s Round 1: 2011-2012
  • Three departments
  • Five programs
  • Three different approaches
  • Everybody else watching (or refusing to look)

  34. One campus-wide initial response:

  35. One common response:

  36. First Round Departments
  • Computer Info Systems – one full-time faculty
  • Mountain Rec Mgmt – four full-time faculty
  • Natural Sciences – four full-time faculty (including one SC member)

  37. Strategy: Preparation
  • Fall 2011 – campus-wide meeting
    • Reports due May 2012, May 2013: all department faculty
    • Reports due subsequent years: one department representative
  • Info-sharing, brainstorming responses across disciplines

  38. Strategy: Implementation
  • Spring 2012 – initial internal deadlines
    • spaced evenly throughout term
    • “easiest” sections due first
  • Reality: You want these when??
    • three departments, three experiences
    • common threads emerge

  39. Common Threads
  Concerns
  • PReCIP felt like the old Policy 101
    • first round = starting from the ground up
    • may remain a problem for whole first cycle
  • Very difficult to complete during semester
    • faculty time constraints
  Discoveries
  • Despite problems, this process is far more productive than Policy 101

  40. Experiences
  • Degree of collaboration during writing
    • CIS
    • MRM
    • NS
  • Writing process
    • Reflective
    • Entire document meaningful
    • Reflection obviously helpful for faculty work
  • Summer Meetings
    • Uniformly beneficial

  41. Moving Forward Towards the Next Cycle
  • Due date later in the summer to acknowledge difficulties of the undertaking during the term.
  • Establish the most productive way to move towards writing the next report.
  • Pass the word!
    • Participants actively encourage departments reporting in successive years.
    • Helps to break down silos.

  42. Round 1: Most important lessons
  • Improve departmental collaborative writing process
  • Extend due date farther into the summer
  • Summer review meetings are really helpful!
    • excellent atmosphere for constructive criticism and new collaborations
  • Implement continual preparation for next review

  43. Progress for Round 1.5
  • Departments have begun active planning
  • Some skepticism remains
  • Another SC member directly involved – a distinct benefit
  • Generally on track

  44. New process much improved!
  • Truly reflective
  • A useful summary – guides assessment reflection
  • Supports much better preparation for the next report
  • Increased buy-in after first round
