
A Model For Evaluating Institutional Research Functions


Presentation Transcript


  1. A Model For Evaluating Institutional Research Functions • AIR 2000, May 20, 2000 • Frank Doherty, Director of Institutional Research, James Madison University

  2. Objectives • Learn how to describe what you do. • Develop a systematic plan to evaluate the IR functions. • Freebie: Learn a method to evaluate other administrative offices on your campus.

  3. Schedule

  4. JMU OIR Evaluation • 1992 SACS Visiting Team Report • “Although OPA (now OIR) has occasionally evaluated the usefulness of some of its products and services, evaluation has not been established as a routine matter. Thus, the Committee recommends that the University establish regular and ongoing evaluation mechanisms for the institutional research function.”

  5. Evaluation Is User-Oriented • The objective is program improvement and accountability • Seek information that will improve the office • User control of the evaluation is very important

  6. Elements of the Evaluation • Program design • Evaluation design • Program review team • Data collection and analysis • Reporting and recommendations • Improvement plan • Ongoing evaluation

  7. Program Design Philosophy • You cannot evaluate that which you cannot describe • First step in self-study • Facilitates clarification of program goals and operation—wonderful communication device • Aids the planning process • Serves as an implementation guide • Provides a sense of the whole • Documents program operation

  8. Program Design • Discrepancy Evaluation Model • Systems approach • Inputs • Processes • Outputs • Compare performance with standard (gap analysis) • OIR Program Design • Network • Input-Process-Output statements
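
The heart of the Discrepancy Evaluation Model is the comparison of observed performance against a stated standard. As a minimal sketch only (the component, measure, and numbers below are hypothetical, not taken from the JMU design), the gap-analysis step could be recorded like this:

```python
from dataclasses import dataclass

@dataclass
class Standard:
    component: str   # program-design component the standard belongs to (hypothetical name)
    measure: str     # what is being measured
    target: float    # the standard the program sets for itself

def gap_analysis(standard: Standard, observed: float) -> str:
    # Discrepancy Evaluation Model: discrepancy = standard minus observed performance.
    discrepancy = standard.target - observed
    if discrepancy <= 0:
        return f"{standard.component} / {standard.measure}: meets standard"
    return f"{standard.component} / {standard.measure}: discrepancy of {discrepancy:g}"

# Hypothetical example: a user-satisfaction standard of 4.0 on a 5-point scale.
print(gap_analysis(Standard("Reporting", "user satisfaction (1-5)", 4.0), 3.6))
```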

  9. Evaluation Plan Philosophy • States intentions publicly • Organizes complexity of evaluation effort • Facilitates and justifies evaluation resource allocation decisions • Serves as a “standard” for judging an evaluation effort

  10. Evaluation Design • Overall plan for the evaluation • Concerns/Issues • Program-specific • Common • Questions • Information sources/methodology • OIR Evaluation Design

  11. Program Review Team • Consists of 8-10 staff recommended by the office • Chair is not from the office but is appointed by the division head • Collects data • Writes the report and recommendations • Recommendations are discussed with the division head and supervisor • Annual objectives are developed to address the recommendations

  12. Ongoing Program Review • OIR program review is conducted every three years • Online survey • Accountability and use of results • Annual objectives based on recommendations

  13. Components • Components Of An Effective Program Review at James Madison University

  14. Summary • IR evaluation should be: • Thorough • User-oriented • On-going and accountable • OIR evaluation report • http://www.jmu.edu/instresrch/present/air99/oireval.pdf

  15. Program Design Exercise • Program design consists of two parts • Network • Input-Process-Output statements

  16. Network • Numbering and levels • Functional dependencies • Let’s create a Network
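
One way to picture the network is as numbered components arranged in levels, with functional dependencies pointing at the components whose outputs they rely on. The sketch below is illustrative only; the component numbers and names are invented, not JMU's actual network:

```python
# A minimal sketch of how a program-design network could be recorded:
# numbered components at levels, each listing the components it functionally depends on.
network = {
    "1.0": {"name": "Office management",   "level": 1, "depends_on": []},
    "1.1": {"name": "Data administration", "level": 2, "depends_on": ["1.0"]},
    "1.2": {"name": "Reporting",           "level": 2, "depends_on": ["1.0", "1.1"]},
}

for number, comp in sorted(network.items()):
    deps = ", ".join(comp["depends_on"]) or "none"
    print(f"{number} {comp['name']} (level {comp['level']}) depends on: {deps}")
```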

  17. IPO Statements • Inputs • Processes • Outputs • JMU OIR Network

  18. Inputs • Things which set processes into motion and keep them running • resources • receptors • staff • independent groups/organizations • preconditions • enabling outputs from other components

  19. Processes • Described as event-sequences • Process descriptions of • intended interactions of people, • materials and media, and • current context in which they take place • Be specific • indicate who is doing what to whom, how, when, where, and for how long • Linked to outputs

  20. Outputs—Terminal • Two types: terminal objectives and enabling objectives • Terminal objectives are changes or products which result from program-controlled processes, are intended to be fed into the external environment, and are the outputs for which the program holds itself accountable (the bottom line)

  21. Outputs—Enabling • Enabling objectives result from program-controlled processes • are used within the program rather than outside it • “enable” the achievement of terminal objectives • can be the output of one process and an input into another
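
Slides 18-21 describe the three parts of an IPO statement and the two kinds of outputs. The small sketch below is hypothetical (component names, products, and wording are invented for illustration); it shows how an enabling output of one component can reappear as an input to another component, whose terminal output leaves the program:

```python
from dataclasses import dataclass, field

@dataclass
class Output:
    description: str
    kind: str  # "terminal" (fed to the external environment) or "enabling" (used within the program)

@dataclass
class IPOStatement:
    component: str
    inputs: list[str]
    process: str
    outputs: list[Output] = field(default_factory=list)

# Hypothetical components: an enabling output (a cleaned extract) of one component
# becomes an input to another, whose terminal output leaves the program.
extract = IPOStatement(
    component="Data administration",
    inputs=["staff", "student information system access"],
    process="Staff pull and clean the census extract each term",
    outputs=[Output("validated census extract", "enabling")],
)
reporting = IPOStatement(
    component="Reporting",
    inputs=["staff", extract.outputs[0].description],  # enabling output reused as an input
    process="Staff compile the extract into the annual fact book",
    outputs=[Output("published fact book", "terminal")],
)
print([o.kind for o in reporting.outputs])
```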

  22. IPO Development Exercise • Let’s develop an IPO for your office. • IPO Exercise

  23. Evaluation Plan • Address primary needs of area • “What do you need to know?” • May want to address common institutional issues and questions • Customer satisfaction • Planning • Use of results • Etc.

  24. Stages of Evaluation • Design evaluation • Input evaluation * • Process evaluation * • Output evaluation * • Cost-Benefit Analysis

  25. Design Evaluation • Assessment of substantive adequacy of a program’s design • Is this likely to be a good program? • Examination of the substance, assumptions, and structure of a program prior to installation.

  26. Input Evaluation • Appropriate for new programs and replication efforts • Installation evaluation: inputs are present as prescribed by the program design, planned processes have been set in motion, and design preconditions have been met • Stipulated preconditions are critical • Fiscal monitoring

  27. Process Evaluation • Monitors continued operation and sequential accomplishment of enabling objectives • Formative: Discrepancy reports used to modify and improve program operations

  28. Process Evaluation • Sets the stage for summative evaluation • Documents and defines the “treatment” until the program process is stable • Clarifies the relationship between program activities and the accomplishment of interim objectives • Evaluation plans should emphasize process evaluation • Particularly useful during early stages of program operation

  29. Output Evaluation • Refers to terminal objectives only • Have terminal objectives been achieved? • Investigation of causation • Most useful when preceded by formative evaluation • Previous evaluation stages contribute to program stability and improvement

  30. Evaluation Design

  31. Selection Criteria • Critical functional importance • Areas that are problematic • Areas of direct concern to external evaluation audiences (e.g., an accrediting agency) • Areas of concern to internal evaluation audiences (e.g., customer satisfaction) • Areas where information is needed soon

  32. Evaluation Concerns Identification • Common models of organization • By design component • By cross-cutting function • By evaluation stage

  33. Evaluation Questions • Derived from a larger area of concern • Guide to collection of performance information • What kind of performance information is necessary to answer questions posed? • Determine standard for each variable identified

  34. Evaluation Questions • Develop for each evaluation concern • Should direct systematic collection of performance information • Evaluation question directs one to performance information

  35. Design Referents • Relate the program design to the evaluation design • The design referent should point to a component in the program design and indicate whether the question relates to an input, a process, or an output

  36. Information Needs • Provides the rationale for each question • Explains what kind of information is sought • Indicates how the collected information will be used, and by whom

  37. Information Needs Justification • Record keeping • Routine monitoring • Verification of preconditions • Management troubleshooting • Functional criticality • Accountability • Bargain information

  38. Information Needs Continued • Each information need should tell the reader the purpose of the information • (F) for formative • (S) for summative • Sometimes a need can be both F and S

  39. Sources of Information • Task #1: Brainstorm information possibilities for each question • Task #2: Pick and choose from possibilities • Factors to consider: • Reliability and validity • Cost (time and resources)

  40. Report Dates • Establish a ballpark estimate of when discrepancy reports should be available • May differ from audience to audience

  41. Evaluation Design Exercise • Let’s create an evaluation design • Evaluation Concerns • Evaluation Questions • Design Referent • Information Needed • Source of Information • Date Information Needed
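
The exercise columns above map naturally onto a single row of an evaluation design. The sketch below is a hypothetical row, not one drawn from the JMU OIR design; the field comments echo the earlier slides on concerns, questions, design referents, information needs (with the F/S purpose tag), sources, and report dates:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EvaluationDesignRow:
    concern: str             # larger area of concern
    question: str            # evaluation question derived from the concern
    design_referent: str     # program-design component plus Input/Process/Output
    information_needed: str  # information that answers the question, tagged (F) and/or (S)
    source: str              # where the information will come from
    date_needed: date        # ballpark date the discrepancy report should be available

# Hypothetical row for illustration only.
row = EvaluationDesignRow(
    concern="Customer satisfaction",
    question="Are campus users satisfied with the timeliness of ad hoc reports?",
    design_referent="Reporting (Process)",
    information_needed="Satisfaction ratings on timeliness (F and S)",
    source="Online survey of report requesters",
    date_needed=date(2000, 4, 1),
)
print(row.question)
```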

  42. Data Analysis • Questions determine methods • Multiple methods used • Statistical analysis of data • Document review • Surveys • Interviews • Focus groups
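
As a toy illustration of the survey strand of the analysis (the questions, ratings, and 4.0 standard below are invented, not JMU data), a review team might summarize online-survey responses and flag items that fall below the standard:

```python
from statistics import mean

# Hypothetical online-survey responses on a 1-5 satisfaction scale, keyed by question.
responses = {
    "Timeliness of reports":   [4, 5, 3, 4, 4],
    "Usefulness of fact book": [5, 4, 4, 5, 3],
}

STANDARD = 4.0  # assumed standard for the gap comparison
for question, scores in responses.items():
    avg = mean(scores)
    flag = "meets standard" if avg >= STANDARD else "below standard"
    print(f"{question}: mean {avg:.2f} ({flag})")
```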

  43. Reporting and Recommendations • Reports are organized by evaluation issue/concern • Self-Study team develops recommendations
