
AQIP System Portfolio

Presentation Transcript


  1. AQIP System Portfolio: Overview of Appraisal Feedback Report, Planning Council, July 8, 2009

  2. Overview of AQIP Process • AQIP – Academic Quality Improvement Program • Accreditation process intended to help the institution move forward, rather than look backward (as in PEAQ) • Submitted electronic system appraisal portfolio in November 2008; basis for reaccreditation of the institution • 100 pages with links, responding to a set of prompts / questions • 9 categories that are analyzed • Questions related to context, process, results and improvements • Writing team: Cecelia, Ethelle, Carrie and Linda

  3. Feedback Report • DSU’s appraisal portfolio reviewed by a team selected by AQIP • Reviewers read and respond to the information provided with ratings of SS, S, O and OO (super strengths, strengths, opportunities and outstanding opportunities) • Feedback report – approximately 60 pages • Intended to guide the institution’s continuous improvement processes • Overall, our review was very positive and complimentary for an institution new to continuous improvement • Good News: no significant accreditation issues

  4. AQIP Portfolio Categories • Helping Students Learn • Accomplishing Other Distinctive Objectives • Understanding Student / Stakeholder Needs • Valuing People • Leading and Communicating • Supporting Institutional Operations • Measuring Effectiveness • Planning Continuous Improvement • Building Collaborative Relationships

  5. #1: Helping Students Learn • Focuses on the teaching-learning process within the formal instructional context, but also on how the entire institution contributes to teaching and learning • Strongest category (also the largest and longest category relative to number of questions and length of responses) • Ratings: SS – 3; S – 19; O – 18; OO – 1

  6. #1: Helping Students Learn Super Strengths: • Learning objectives for general education and academic programs • Collection and analysis of assessment data • Strong commitment to academic advising and other student support services Support for Super Strengths: • AQIP Action Project on Academic Advising, funded through Title 3 • AQIP Action Project on general education assessment

  7. #1: Helping Students Learn Outstanding Opportunities: • No evidence of how admission and placement policies were established and reviewed. (Note: this is a system issue, not a DSU issue, but it probably does need attention at the system level)

  8. #1: Helping Students Learn From Feedback Report: • Learning objectives for system gen ed, institutional graduation requirements and major field requirements are in place, and data is being collected • Institution needs to implement systematic processes for reviewing and analyzing data relative to student performance to guide changes in programs and student learning experiences

  9. #2: Accomplishing Other Objectives • Focuses on distinctive objectives that complement student learning and institutional mission • Note: portfolio narrative focused on • Institutional mission / integration of technology • DSU’s contribution to the Governor’s 2010 objectives • DSU’s contribution to BOR’s Opportunities Plan • Ratings: SS – 0; S – 4; O – 7; OO – 3

  10. #2: Accomplishing Other Objectives Strengths: • Clear focus on mission, supported across the institution • Relationships with external constituents are often directly linked to IT mission Support for Strengths: • AQIP Action Project: establishing D.Sc. program Outstanding Opportunities: • Incorporate continuous review into IEC process • No measures or results relative to accomplishing BOR / Governor’s 2010 initiatives

  11. #2: Accomplishing Other Objectives From Feedback Report: • Processes to measure and assess other distinctive objectives exist at various stages but were not discussed in depth in the narrative • Lack of well-documented processes may undermine DSU’s ability to achieve the objectives or set aggressive targets for the future. • Unclear what role, if any, DSU stakeholders play in achieving or defining DSU’s activities related to SDBOR and Governor’s 2010 initiatives

  12. #3: Understanding Stakeholder Needs • Examines how your institution works to understand student and other stakeholder needs • Ratings: SS – 0; S – 9; O – 13; OO – 1

  13. #3: Understanding Stakeholder Needs Strengths: • Excellent methods for collecting data from student groups; maintains student representation in nearly all committees • Builds relationships with prospective and accepted students • Solicits feedback from alumni and employers • Student satisfaction results published / shared • NSSE results include strong student satisfaction; enrollment, retention and graduation rates increasing Support for Strengths: • Title 3 investments in activities linked to retention

  14. #3: Understanding Stakeholder Needs • Outstanding Opportunities: • No evidence that institution identifies changing needs of key stakeholders such as parents, BOR, area legislators, donors, conference attendees and outreach service recipients; no evidence of systematic processes for analyzing and responding to changing needs

  15. #3: Understanding Stakeholder Needs From Feedback Report: • Relationships with students are a priority for DSU • Data collected relative to student and other stakeholder needs • Voice of the off-campus students (61% of student population) may be under-represented in that data; additional data collection and analysis needed relative to distance students. • DSU needs to define a process for identifying stakeholder needs from the data collected and for evaluating strengths/weaknesses of relationships with students and other stakeholders.

  16. #4: Valuing People • Focuses on institutional commitment to development of employees • Ratings: SS – 0; S – 8; O – 13; OO – 3

  17. #4: Valuing People Strengths: • Staff involved in establishing applicant credentials for open positions; systems in place for reviewing applicant credentials; systems in place for hiring and orienting new employees • Good training programs in place and results from CQS used to guide decisions about future training • Systems in place for recognizing accomplishments; administrative evaluations tied to strategic plan • Increase in grants and contracts

  18. #4: Valuing People • Outstanding Opportunities: • DSU collects data related primarily to student satisfaction but less data related to employee satisfaction, health and safety, etc. and no data on how this affects institutional development • Building and documenting processes associated with use of employee data would lead to stronger culture of valuing people and continuous improvement • No comparative data provided relative to results from other higher education institutions

  19. #4: Valuing People From Feedback Report: • DSU collects data and monitors some indicators related to satisfaction, health and safety, and well-being for employees. • DSU should focus more attention on: • Analyzing data and identifying key results to aid in decision-making • Processes that contribute to valuing people • Identification of effective measures for those processes

  20. #5: Leading and Communicating • Focuses on institutional leadership and communication structures, networks and processes, and how those processes guide the institution in setting directions, making decisions, seeking new opportunities, and building and sustaining a learning environment. • Ratings: SS – 0; S – 12; O – 12; OO – 1

  21. #5: Leading and Communicating Strengths: • Strategic plan developed with input from stakeholders; Dashboard system has been implemented to track results • Functional units expected to develop action plans relative to strategic plan • Variety of techniques used to build leadership skills • Uses SSI and CQS results to assess effective leadership and communications; results comparable to other institutions Support for Strengths: • Investment in institutional effectiveness through Title 3

  22. #5: Leading and Communicating Outstanding Opportunities: • No evidence of how leaders communicate a shared mission, vision, values and expectations for learning, continuous improvement, ethics and equity, social responsibility or community service and involvement.

  23. #5: Leading and Communicating From Feedback Report: • DSU could improve the quality of its leadership by: • evaluating its leadership processes • developing leadership skills among its employees • using systematic processes to gather, analyze, and use data in its decision-making instead of relying on informal, personal contacts

  24. #6: Supporting Institutional Operations • Focuses on institutional support processes that help provide an environment where learning can thrive. • Ratings: SS – 0; S – 3; O – 6; OO – 3 • Weakest category (S’s vs. O’s)

  25. #6: Supporting Institutional Operations Strengths: • Significant level of data collected relative to student support needs • Day-to-day accountability and a service-oriented culture at the institution provide an appropriate foundation for support services • Budget deficiencies identified by units during annual budget hearings Support for Strengths: • Institutional effectiveness efforts funded through Title 3

  26. #6: Supporting Institutional Operations • Outstanding Opportunities: • No specific measures / results provided relative to learner and administrative support service processes • Few results provided beyond satisfaction data and financial data • Expenses exceeded revenues for last three fiscal years reported, resulting in reduction of net assets of approximately $1.5 million between FY05 and FY07. (Note: may have been misinterpretation of data provided)

  27. #6: Supporting Institutional Operations From Feedback Report: • Sound hardware and software infrastructure to share information and data • DSU takes an organized approach to assessment and accreditation activities, with posted reviews and updates available online.

  28. #7: Measuring Institutional Effectiveness • Focuses on the collection, analysis and use of information to manage the institution and to drive performance improvement. • Ratings: SS – 0; S – 6; O – 8; OO – 0

  29. #7: Measuring Institutional Effectiveness Strengths: • Data collected for strategic planning, unit planning, etc.; data stored by OIEA and identity of individuals protected • Nationally normed assessment instruments used where available; institutional effectiveness and academic assessment plans and results are posted on web • Institutional peers selected by VP Council, which facilitates access to comparative data Support for Strengths: • Institutional effectiveness activities funded through Title 3 • AQIP Action Project on general education assessment • Focus on institutional effectiveness within IEC; focus on updating assessment plans within Assessment Committee

  30. #7: Measuring Institutional Effectiveness Opportunities: • Processes are in place to collect data; institution needs to use this infrastructure to refine processes and use data collected to inform change. • No comparative data provided for institutions outside SDBOR system • No evidence that budget decisions are linked to institutional goals for improvement

  31. #7: Measuring Effectiveness From Feedback Report: • Efforts to create a Dashboard system are commendable, but DSU needs to develop a feedback loop and advance use of the Dashboard by faculty and staff. • Data collected relative to performance on strategic initiatives, but DSU has not evaluated the processes and systems associated with these initiatives. • Institutional Effectiveness Committee (IEC) has a very important role, but there are no processes to guide and improve the effectiveness of the committee. Planning and evaluation processes need to be documented.

  32. #8: Planning Continuous Improvement • Focuses on planning processes and strategies and action plans to help institution achieve mission and vision. • Ratings: SS – 0; S – 8; O – 14; OO – 0

  33. #8: Planning Continuous Improvement Strengths: • Strategic planning process included environmental scan and input from stakeholders • Strategic plan includes long-range goals • Requests for new resources must be tied to institution’s strategic goals / outcomes • Functional units develop their own action plans and projects relative to strategic plan • Favorable progress toward 2007-2012 strategic goals and some goals already met Support for Strengths: • Focus on institutional effectiveness through IEC process

  34. #8: Planning Continuous Improvement Opportunities: • Confusion between analyzing data and performance v. analyzing processes and systems • Need system of measures to evaluate planning process with process measures (elements such as timelines for each phase, participants and roles, efficiency of data collection and analysis) and satisfaction measures, through surveys. • Strategic plan includes long-term goals but no short-term / intermediate goals to ensure that long-term goals are met • Unclear how the units, the IEC, and Vice Presidents’ Council work together to ensure appropriate measures and performance projections are set

  35. #8: Planning Continuous Improvement From Feedback Report: • DSU is in very early stages of a comprehensive continuous improvement program. Still need to document processes, conduct deeper analysis of data and use results from that analysis to plan improvements. • Confusion between results obtained from collecting data and results obtained from analysis of collected data. Not clear if strategic planning process was a one-time event or a continuing process. • Suggest using a high-performing AQIP institution as a mentor

  36. #9: Building Collaborative Relationships • Focuses on institution’s relationships (current and potential) to analyze how they contribute to institution’s mission. • Ratings: SS – 0; S – 4; O – 10; OO – 0

  37. #9: Building Collaborative Relationships Strengths: • Representatives from key stakeholder groups sit on advisory boards and committees • Shared governance structure allows staff, faculty and students to interact, communicate and express opinions / concerns • DSU gathers and reports trend data on relationships with alumni, K-12 constituents, etc. • DSU has an established system for setting long-term goals and for setting performance targets Support for Strengths: • Focus on institutional effectiveness through Title 3 funding

  38. #9: Building Collaborative Relationships Opportunities: • Little evidence of how DSU systematically creates, prioritizes, or sustains collaborative relationships or how DSU monitors and analyzes the effectiveness of those relationships or how the institution reacts to changing needs of partners • DSU needs a comprehensive plan with specific timelines for analyses related to building and fostering internal and external relationships. By creating a systematic approach DSU could build a sustainable way to evaluate its critical relationships and identify opportunities to improve them and develop them further. • DSU has a well-defined recruitment calendar for traditional students but does not address how non-traditional students (a significant portion of the student body) are targeted/recruited. • No comparative data provided relative to relationship-building, etc.

  39. #9: Building Collaborative Relationships From Feedback Report: • DSU has numerous collaborative relationships and collects and publishes related data but focus is on managing current relationships rather than creating new collaborative relationships or improving the relationships over time. • DSU needs to move to the next level of continuous improvement by identifying the processes used to monitor, evaluate, and improve these relationships and by using measures to evaluate processes and relationships. • DSU needs to align its collaborative relationships with its strategic priorities.

  40. Strategic Issues from Reviewers Strategic Issue 1: The various processes presented by DSU appear to be lacking a fundamental aspect that is required for continuous improvement to occur. Continuous improvement requires that the process consist of pieces that are linked to form a closed or continuous loop: steps chosen to achieve a specified objective -> data collected relative to outcomes of the process and to critical steps in the process -> data analyzed to determine whether the objectives of the process were achieved -> improvements to the process planned.

  41. Strategic Issues from Reviewers Strategic Issue 2: DSU will benefit from clearly documenting the processes that lead to the achievement of the objectives for the various categories. For example, clear documentation of the steps in the strategic planning process, advisement process, changing courses or curriculum, etc. will help DSU evaluate the value of each step in the processes, identify measures, and then gather and analyze more effective data so the institution can plan improvements.

  42. Strategic Issues from Reviewers Strategic Issue 3: Systematic processes are needed for reviewing and analyzing data that are collected. Data collection is distinctly different from data analysis. For example, DSU’s Dashboard provides data but does not provide results from the analysis of the data.

  43. Strategic Issues from Reviewers Strategic Issue 4: Continuous improvement requires that strategic goals and initiatives be reconsidered on a regular basis. While DSU has clearly stated objectives, the impression is that they were established and have never been reconsidered or evaluated through the analysis of collected data and other sources of evidence.

  44. Overall Summary • The review was a very fair and positive one • Opportunities and Outstanding Opportunities all focused on process and evaluation of process – the concept of continuous quality improvement • Is there an action project on process in our future??  Remember: we control how we respond to this evaluation and what we do with the reviewers’ recommendations

  45. Next Step • Planning Council members assigned to one of 9 categories • Read System Portfolio for that category and read Reviewers’ Feedback Report for that category • Thoughtfully evaluate O’s and OO’s: • Was portfolio misinterpreted / misunderstood by reviewers and therefore feedback is irrelevant (or did they not follow a critical link)? • Was narrative poorly presented (writing team blew it!) and therefore feedback is irrelevant? • Were reviewers accurate in their analysis? If yes, is this an issue we need to work on? If yes, then help us identify opportunities for growth / change • As part of Planning Council Retreat: recommend action projects or individual unit plans to help resolve the issue (develop a process, analyze data, chart results and change) or identify projects already in process that will help address the issue

  46. Questions?
