
Shared Governance, Evaluation of Leaders and Governance Structures and OMG What a Mess I’m in!

Explore the personal insights and thoughts on the dynamic journey of addressing Recommendation 5 in the assessment conference. Learn about the refinement of governance structure policies, clear communication and reporting processes, and annual evaluation of leaders for institutional improvement.

sstoddard




Presentation Transcript


  1. Best Practices in Assessment Conference, March 23, 2009. Shared Governance, Evaluation of Leaders and Governance Structures, and OMG What a Mess I’m in! … thoughts and personal insights of the life- and career-changing journey into the dynamics of addressing Recommendation 5 …

  2. Recommendation 5 (ACCJC letter, January 21, 2008) “The team recommends, to ensure appropriate participation and input, that the college refine its current governance structure policies by including written definitions of the roles and responsibilities for all constituent groups and formalize processes and structures for clear, effective communication and reporting relationships. In addition, the college should implement an annual evaluation process to assess the effectiveness of leaders and decision-making which leads to institutional improvement.”

  3. … the background scene … (circa August 2008)
  • the college had been warned by the commission
  • the ACCJC report and expectations were on everyone’s mind
  • there was a lack of campus-wide awareness of work in progress to address Recommendation 5
  • there was disharmony at the college
  • there had been a no-confidence vote in the Chancellor
  • “vetting” and “policies and procedures” were the campus watchwords
  • there was “contract stress” – Lingle had demanded a two-year extension of the contract, with no salary increases
  • the economy was sour, oil prices and electricity bills were soaring, and the presidential campaign was in full swing (McCain was leading – and predicted to win!)
  • the French Report had also tagged the Office of Institutional Research as possibly the group to coordinate the evaluation, and since everyone thinks that IR can simply say, “make it so!”, I thought, out of self-preservation, that I ought to get involved in the formative stages (bad idea! next time just write a memo, as Jack should have done in “Red October”!)

  4. … evaluation versus assessment … what’s the difference? … be prepared to get scars if you use the wrong term … because, if you say

  5. “Evaluation” – when you say evaluation, you get a 900-pound gorilla affectionately named “EllenNancyJaniceKathleenPamFrankPaulandTara” on your back

  6. “Assessment” – but when you say “assessment”, then everybody is happy and the world is a beautiful place …

  7. … the joint UHPA and Faculty Senate meeting …
  • August 27, 2008: the start of my journey
  • the dialogue at the meeting indicated that no one knew what was going on
  • misinformation was being presented
  • there was no knowledge of the French Report, which described a course of action in response to the ACCJC’s Recommendation 5

  8. … the French Report …

  9. Three Recommendations for Recommendation 5 (French Report, p. 1) “Recommendation 5 needs to be understood as three equally important parts: First, WCC needs to develop an annual evaluation process to assess the effectiveness of its leadership and decision-making structures. Second, the college needs to identify one group as the monitor of the evaluation process, with the responsibility of widely communicating the results of the study to the campus community, and then using the results to make suggestions for improvement. Finally, the college needs to act upon these suggestions to implement needed institutional improvements. By incorporating all three parts into an annual evaluation process, the college can be sure that it will satisfy Recommendation 5 – and, in doing so, promote continuous improvements for the institution.”

  10. … the floating, to-be-shot-down proposal to get at Rec 5 … (thank the force that I stated that “this proposal should be reviewed and modified such that it receives college-wide acceptance and evolves into an institutionalized process of evaluation of leadership, decision-making and governance structures”)

  11. … the joint Faculty Senate and Administration meetings and the Institutional Effectiveness Committee …
  • September to November: meetings with the Faculty Senate, the joint Faculty Senate and Administration meetings, and the IEC on a weekly basis to develop the annual evaluation process
  • strategic hallway discussions and endless dialogue – and that’s a lot of “voices” wanting to change the proposal (and a lot of lashes and scars on my back!)
  • must “follow the threads” – and we had a deadline to meet in submitting the Accreditation Progress Report, due March 15, 2009 …

  12. … the committees and councils … (18 committees, 11 councils, 10 ad hoc committees)

  13. … the final list of “decision-making governance structures” … (as developed by the “best minds” on campus, that is, the Faculty Senate and the Administration)

  14. … and the flowsheet of assessment …

  15. … morphs to the original proposal … (reached through negotiation, dialogue and just plain stubbornness)
  • the use of “self-assessment” by the leader or chair in place of prescribed measurable levels
  • outcome statements would be used to state the improvement goals, based on the surveys and self-assessments of the leader or governance structure
  • the results would be posted on a “private” webpage and included in annual assessments and program reviews
  • what did not change was that the GSIEC was an autonomous, independent subcommittee that reported to the IEC and was composed of five senior, respected, trustworthy, fair, reasonable, objective, non-judgmental, clear-headed, tenured and/or secure staff! (who, btw, were excused from all obligations to all other committees – which caused considerable angst about a leadership void on certain committees)
  • the process was collaborative and facilitated by the faculty

  16. … the survey subcommittee …
  • once the structures and flow of evaluation (er, assessment) were agreed on, the next step was to have the Institutional Effectiveness Committee (IEC) create the “Survey Committee” to develop the perception surveys
  • the “Survey Committee” comprised faculty skilled in surveys (the “Socialites”) and a few who just dropped in to add their viewpoints
  • and the Governance Improvement Committee (GIC) became the Governance Subcommittee of the Institutional Effectiveness Committee (GSIEC)

  17. … the survey committee …
  • was tasked to develop the final perception surveys based on the “2008 Faculty/Staff Pilot Survey”
  • needed to align the survey questions with Standard IV.A – for both the leader and the governance structure, and for both “members” and “non-members”, to ascertain the responses from both groups
  • and mix in practical considerations of survey length, number of surveys, and other factors
  • was ready to fly to Kauai to get away after pushing the button to send out the first set of surveys
  • (and btw, learned that there are two kinds of people in the world: those that use commas and want the verbs at the end of sentences, and those that don’t; and you always need a final arbiter – an editor!)

  18. … the 2008 pilot surveys for “members” and “non-members” …

  19. … the perception surveys …

  20. … round 1 surveys …

  21. … round 2 surveys … (emails were sent with a link to member committees)

  22. … the GSIEC Policies and Procedures…

  23. … the perception survey results for …
  • member and non-member surveys
  • leaders and governance structures
  • how to display the results?
  • how to interpret the results?
  • how to use the results?
  • what about written comments?

  24. … member leader survey results …

  25. … member leader column 100 chart …

  26. … member leader area 100 chart …

  27. … non-member leader column 100 chart …

  28. … non-member leader area 100 chart …

  29. … how to interpret the results …
  • the easiest way is to use the area graphs
  • you want to be “broad-banded” in the right places
  • “are you broad-banded?”
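The “column 100” and “area 100” charts referenced above are 100%-stacked views: each question’s response counts are normalized to percentages so that every question’s bands sum to 100 and can be compared side by side. A minimal sketch of that normalization step, with invented question text, response categories, and counts (none of these figures come from the actual surveys):

```python
def to_percent_bands(counts):
    """Convert raw response counts per question into percentage bands
    that sum to 100, the form a 100%-stacked column/area chart plots."""
    result = {}
    for question, responses in counts.items():
        total = sum(responses.values())
        result[question] = {
            level: round(100 * n / total, 1) for level, n in responses.items()
        }
    return result

# Hypothetical Likert-style counts for two survey questions.
raw = {
    "Q1: Leader communicates clearly": {
        "Agree": 12, "Neutral": 5, "Disagree": 2, "Don't know": 1,
    },
    "Q2: Decisions are transparent": {
        "Agree": 8, "Neutral": 6, "Disagree": 4, "Don't know": 2,
    },
}

bands = to_percent_bands(raw)
# Each question's bands now sum to 100, so a wide "Agree" band in the
# area graph means you are "broad-banded in the right places".
```

Plotting the `bands` values as stacked columns (or as a stacked area across questions) reproduces the chart style shown on the preceding slides.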

  30. “the self-assessment form”

  31. … the self-assessment example …

  32. … the “fuzzy rubric” …

  33. … the GSIEC “seal of approval” or the “please re-do” request …

  34. … communication of self-assessments and outcome improvement statements …
  • to be posted on a private campus webpage (most likely Laulima)
  • to be incorporated in annual assessments and program reviews
  • follow-up next year, in the second year of surveys, re: improvement statement outcomes

  35. … anecdotal insights, observations and surprises …
  • the surveys and assessment process actually change daily interactions – reflective outcomes come out of them
  • knowing that this process is occurring permeates one’s attitude, so that daily interactions are more collegial and less confrontational
  • friendly emails are being received; a modicum of civility and collegiality has returned
  • the benefits include the two poles moving back to the center (dogs and cats living together!)
  • a culture of evidence for governance improvement is developing
  • this is a “new protocol” contributing to assessment of leaders and governance structures – administrators and faculty must change and accommodate the developments
  • faculty must understand the difference between an advisory role and a decision-making role, and present cogent arguments for change – not simply whine that “even so, we still want it”
  • dialogue has elicited discussions such as “how do you get range 3, 4, or 5 faculty to contribute their share to the college?”
  • why are there so many “I don’t knows” in the surveys? two suggested reasons: the mode or mechanism needs change, or – the flip side – no one is paying attention

  36. … where do we go from here …
  • assessment of the GSIEC and the annual evaluation process
  • recommendation to the ACCJC regarding changes
  • utilization of resources
  • do governance and decision-making improve as a result of this process?
  • is this a contributing, productive aspect of self-governance?
  • if you’re considering this at your campus, consider the following: the process must be worked out until all the players are comfortable with it and buy into it; it must be transparent; it must fit your needs; it must be workable; and consider the resources required to be successful

  37. … the cycles I’ve seen … the end of my journey …
  1975-1980 Mel Sakaguchi “behavioral objectives”
  1985-1990 Joyce Tsunoda “competencies”
  2005-2009 John Morton “SLOs”
  2015-???? rising admin star “reading, ‘riting, and ‘rithmetic”
  thank you for your indulgence – now I’m going to put on Bilbo’s and Frodo’s ring and disappear before I really make the governance thing worse!
