
Institutional Effectiveness at the University of North Alabama




Presentation Transcript


  1. IE Presentation, November 20, 2008. Institutional Effectiveness at the University of North Alabama. Dr. Andrew L. Luna, Institutional Research, Planning, and Assessment.

  2. Connections to IE?

  3. Telephones and IE? Walter Shewhart, W. Edwards Deming, Joseph Juran – Hawthorne Works, Bell Laboratories

  4. FACT... The Shewhart Cycle is the foundation for all quality and continuous improvement processes that we use today.
  [Diagram: the Shewhart Cycle – Plan, Do, Check, Act – repeated for continuous improvement]

  5. Points of Discussion
  • Similarities between the Shewhart Cycle and Institutional Effectiveness
  • Overview of Institutional Effectiveness at UNA
  • Review of Outcomes and Improvement Processes
  • Review of Assessment
  • Questions

  6. More on the Shewhart Cycle
  • Plan – Create a strategy for what you want to do and how you will measure success
  • Do – Follow the plan and do what you say you will do
  • Check – Assess the effectiveness of the current plan by looking at the outcome measures for success
  • Act – Make changes to the strategies to improve the measured outcomes
  • Repeat the cycle! (A minimal code sketch of one pass through the loop follows below.)
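  The cycle reads naturally as a loop in code. The sketch below is a hypothetical illustration only: the plan/do_work/check/act functions and the pass-rate numbers are invented stand-ins, not part of any UNA process.

      # Hypothetical sketch of one run of the Plan-Do-Check-Act loop;
      # all names and numbers are invented for illustration.

      def plan():
          # Plan: state the goal and how success will be measured
          return {"target_pass_rate": 0.80, "strategy": "baseline instruction"}

      def do_work(p):
          # Do: carry out the plan; here we simulate a measured result
          return 0.72 if p["strategy"] == "baseline instruction" else 0.83

      def check(p, measured):
          # Check: compare the measured outcome against the planned target
          return measured >= p["target_pass_rate"]

      def act(p):
          # Act: revise the strategy when the target was missed
          p["strategy"] = "revised instruction"
          return p

      p = plan()
      while not check(p, do_work(p)):
          p = act(p)  # adjust the strategy, then repeat the cycle
      print("Target met with:", p["strategy"])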

  7. Why is the Shewhart Cycle Important?
  • If you can’t measure something, you can’t understand it…
  • If you can’t understand it, you can’t control it…
  • If you can’t control it, you can’t improve it…
  • If you can’t improve it…then why the heck are you doing it?

  8. So, What is Institutional Effectiveness?
  • A sharpened statement of institutional mission and objectives
  • Identification of intended departmental/programmatic outcomes or results (Plan)
  • Establishment of effective means of assessing the accomplishment of those outcomes and results (Do, Check, Act)
  FACT... Institutional Effectiveness is primarily undertaken to improve what we do…not just to pass accreditation.

  9. Shewhart Cycle and SACS: Macro IE
  Core Requirement 2.5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; (3) demonstrate the institution is effectively accomplishing its mission.”
  [Diagram: the three clauses of the requirement mapped onto the Plan, Do, and Check/Act phases of the Shewhart Cycle]

  10. Key Points to Core Requirement 2.5
  • Emphasizes an expectation that the institution is the primary focal point for compliance
  • Sets expectations for the description of planning and evaluation processes that are active and continuous rather than static or single occurrences
  • Points to a clear and strong expectation for documentation of the systematic review of institutional mission, goals, and accomplishments consistent with its mission
  • Sets expectations for the documented use of results of institutional planning and evaluation to achieve institutional improvements

  11. Shewhart and SACS, Cont.: Micro IE
  Comprehensive Standard 3.3.1: “The institution identifies expected outcomes for its education programs … and its administrative and educational support services; assesses whether it achieves those outcomes; and provides evidence of improvement based on analysis of those results.”
  [Diagram: the standard’s clauses mapped onto the Plan, Check, and Do/Act phases of the Shewhart Cycle]

  12. Key Points to Comprehensive Standard 3.3.1
  • Emphasizes the unit level of individual educational programs and support services
  • The expected achievements of educational programs and support services should be articulated, and evidence presented concerning accomplishments
  • Distinguishes between program outcomes and learning outcomes
  • Sets expectations that improvement is guided by the establishment and evaluation of program and learning outcomes

  13. Shewhart and SACS, Cont.: General Education and IE
  Comprehensive Standard 3.5.1: “The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.”
  [Diagram: the standard mapped onto the Plan and Do/Check/Act phases of the Shewhart Cycle]

  14. Key Points to Comprehensive Standard 3.5.1
  • General Education should be part of the institutional mission
  • The expected achievements of the General Education program should be articulated, and evidence presented concerning accomplishments
  • Improvement should be guided by the establishment and evaluation of learning outcomes

  15. Overview of Institutional Effectiveness: Focus on Assessment
  [Diagram: Mission and Strategic Goals drive Institutional Effectiveness, which branches into two tracks:]
  • Continuous Improvement of Programs and Departments – Comprehensive Dept./Program Review, measured by Program Outcomes (Quality Indicators, Productivity Indicators, Viability Indicators)
  • Continuous Improvement of Student Learning – Evaluation of Learning, measured by Learning Outcomes (what graduates know, what graduates can do, and what attitudes/values graduates possess)

  16. Institutional Effectiveness System at UNA
  • Annual Report – the Annual Action Plan and Assessment Report (AAPAR)
  • Comprehensive Program and Department Review – the Five-Year Review
  • Review of General Education – a five-year cycle of General Education assessment

  17. Schematic of Institutional Effectiveness Process
  [Flowchart: in each of Years 1–5, departments file Annual Reports and selected departments undergo a Five-Year Review; in parallel, General Education Areas I–IV are each assessed on a five-year cycle, culminating in an overall Gen. Ed. assessment; once the five-year cycle completes, OIRPA, the IE Committee, and the Gen. Ed. Committee conduct a review of strategic goals, and the cycle repeats]

  18. Five-Year Program/Department Review Timeline (pending IE Committee approval)
  [Timeline, September through the following September, with milestones including: departments that underwent last year’s five-year review submit the outcomes of that review as AAPAR priority initiatives; OIRPA submits the Five-Year Enrollment report to academic departments; OIRPA meets with Deans/VPs for an overview; Deans/VPs meet with departments to discuss the review; OIRPA conducts an assessment workshop for the UNA campus; OIRPA initiates individual department meetings; OIRPA meets with departments up for review; Five-Year Reviews are completed and sent to the Dean/VP; OIRPA submits an overview of the Five-Year process to the IE Committee]

  19. Annual Action Plan and Assessment Report Timeline (pending IE Committee approval)
  [Timeline, September through the following September, with milestones including: OIRPA submits an AAPAR overview to the IE Committee; the first part of the AAPAR is due for the current fiscal year, with one priority initiative for the next fiscal year; the second part of the AAPAR is completed by departments; next-FY priority initiatives are set by Deans, then by VPs; President, VP, and Dean initiatives are due; SPBS reviews next-FY priority initiatives; budget initiatives based on the priority initiatives are established]

  20. Outcomes
  • Operational Outcomes – measures of how well the institution/division/department is meeting or exceeding requirements
  • Learning Outcomes – statements of the knowledge, skills, and abilities the individual student possesses and can demonstrate upon completion of a learning experience or sequence of learning experiences (e.g., course, program, degree)

  21. Problems with Outcomes
  • Outcomes are too broad
  • Outcomes do not address core requirements/competencies or the mission
  • Outcomes are not measurable

  22. Types of Measurement
  • Discrete or attributes data – binary data with only two values, e.g., Yes/No, Good/Bad, On/Off, Male/Female, Pass/Fail
  • Continuous or variable data – information that can be measured on a continuum or scale, e.g., height/weight, temperature, test scores, time, distance
  (A short code illustration of the two types follows below.)
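  The practical difference is in how each type is summarized: attribute data support counts and proportions, while variable data support means and other scale statistics. The lists below are made-up examples, not UNA data.

      from statistics import mean

      # Made-up examples: pass/fail is attribute (discrete) data,
      # raw test scores are variable (continuous) data.
      pass_fail = ["Pass", "Fail", "Pass", "Pass", "Fail"]
      scores = [88, 72, 91, 79, 65]

      # A proportion is the natural summary for two-valued data
      pass_rate = pass_fail.count("Pass") / len(pass_fail)
      print(f"Pass rate: {pass_rate:.0%}")      # Pass rate: 60%

      # A mean (or any scale statistic) suits continuous data
      print(f"Mean score: {mean(scores):.1f}")  # Mean score: 79.0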

  23. Bloom’s Taxonomy of Learning Outcomes

  24. Bloom’s Taxonomy, Cont.

  25. Forms of Measurement
  Longitudinal data are gathered over an extended period: Semester 1, Semester 2, Semester 3, …, Semester t

  26. Forms of Measurement, Cont. Cross-sectional data represent a snapshot of one point in time
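  In data terms, the two forms have different shapes. The records below are hypothetical illustrations, not actual institutional data.

      # Longitudinal: the same students are measured repeatedly across
      # semesters, so change over time can be tracked.
      longitudinal = {
          "student_001": {"Semester1": 74, "Semester2": 78, "Semester3": 85},
          "student_002": {"Semester1": 81, "Semester2": 80, "Semester3": 88},
      }

      # Cross-sectional: every student is measured once, at one point in time.
      cross_sectional = {"student_001": 85, "student_002": 88, "student_003": 72}

      # Only the longitudinal form supports questions about growth:
      growth = {s: t["Semester3"] - t["Semester1"] for s, t in longitudinal.items()}
      print(growth)  # {'student_001': 11, 'student_002': 7}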

  27. What is Improvement?
  • Measurable actions that increase learning, efficiency, effectiveness, and/or the bottom line
  • Decrease the bad
  • Increase the good
  • Decrease variability

  28. Decrease Variability? What the heck is that?
  Class A scores: 100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42 (mean = 75.5, STD = 21.93)
  Class B scores: 91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60 (mean = 75.5, STD = 8.42)
  (The sketch below verifies these figures.)
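  Both classes have the same mean, but Class B’s scores cluster far more tightly, which is exactly what “decrease variability” is after. The sketch below recomputes the slide’s statistics from the scores as given; statistics.stdev is the sample standard deviation.

      from statistics import mean, stdev

      class_a = [100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42]
      class_b = [91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60]

      # Same mean, very different spread: Class B is far more consistent.
      for name, scores in (("Class A", class_a), ("Class B", class_b)):
          print(f"{name}: mean = {mean(scores):.1f}, STD = {stdev(scores):.2f}")
      # Class A: mean = 75.5, STD = 21.93
      # Class B: mean = 75.5, STD = 8.43  (the slide truncates 8.426… to 8.42)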

  29. Inputs, Processes, and Outcomes
  [Diagram: the inputs (X’s) – measurement, materials, methods, environment, people, and machines – feed a process whose measured results are the outcomes (Y’s)]
  (A toy rendering of this model follows below.)
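  One common reading of this picture is Y = f(X’s): the measured outcome is a function of the controllable inputs. The toy model below only makes that reading concrete; the weights and input scores are invented.

      # Toy model of the slide's input/outcome picture: outcome Y is a
      # function of the input X's. All weights and scores are invented.

      def outcome_y(xs):
          weights = {"people": 0.30, "methods": 0.30, "materials": 0.15,
                     "machines": 0.10, "measurement": 0.10, "environment": 0.05}
          return sum(weights[name] * value for name, value in xs.items())

      inputs = {"people": 80, "methods": 75, "materials": 70,
                "machines": 85, "measurement": 90, "environment": 65}
      print(round(outcome_y(inputs), 2))  # 77.75; improving any X shifts Y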

  30. Assessment Steps
  • Develop learning/operational objectives
  • Check for alignment between the curriculum/business process and the objectives
  • Develop an assessment plan
  • Collect assessment data
  • Use results to improve programs/departments
  • Routinely examine the assessment process and correct it as needed
  (A sketch of an assessment-plan record follows below.)
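  Viewed as a data structure, an assessment plan is essentially a mapping from each objective to its measure, target, findings, and actions. The record below is hypothetical; the objective, measure, and target are invented examples.

      # Hypothetical assessment-plan record; all field values are invented.
      assessment_plan = {
          "objective": "Graduates can analyze data using descriptive statistics",
          "measure": "Embedded course assignment scored with a common rubric",
          "target": "80% of students score 3 or higher on a 4-point rubric",
          "findings": None,  # filled in after assessment data are collected
          "actions": None,   # improvements decided once findings are reviewed
      }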

  31. Types of Assessment – Direct (Academic)
  • Published Tests
  • Locally Developed Tests
  • Embedded Assignments and Course Activities
  • Competence Interviews
  • Portfolios

  32. Types of Assessment – Direct (Educational Support/Administrative)
  • People enrolled/participating/served
  • Work accomplished
  • Revenue generated
  • Turnaround time
  • Decrease in nonconformities

  33. Types of Assessment – Indirect
  • Surveys
  • Interviews
  • Focus Groups
  • Reflective Essays

  34. How Can OIRPA Assist?
  • Create university-wide reports – five-year departmental reports
  • Analyze university-wide assessment data – NSSE, CAAP
  • Hold workshops on assessment and IE
  • Work with individual departments on annual reports, program review, and outcomes assessment
  • Provide ad hoc data reports for departments
  • Work with committees to develop assessment plans – IE Committee, Gen. Ed. Committee

  35. Questions or Comments?
