
Using Quantitative Data for Effective Program Evaluation

Presentation Transcript


  1. Using Quantitative Data for Effective Program Evaluation • Jared C. Schultz, Ph.D., CRC, Utah State University

  2. Misleading Title? • The first critical question is not “How do we measure?” • The first critical question is “What do we measure?” • This presentation is not an overview of quantitative data collection procedures. • This presentation is intended to provide a foundation to address the question of “What do we measure?”

  3. Transformational Learning • Adult learners bring the context of their experience to learning • This has both positive and negative effects on learning • “becoming critically aware of one’s own tacit assumptions and expectations and those of others and assessing their relevance for making an interpretation” (Mezirow, 2000, p. 4) • Take a few minutes and write down your beliefs, attitudes, and emotions regarding program evaluation: positive and negative, ambitions and concerns.

  4. What Program Evaluation is Not • Only a requirement for reporting to RSA and other oversight entities. • Program evaluation, quality assurance, and continuous improvement are not events or things to be implemented quickly. They represent shifts in organizational ideology and culture and require a long-term, systemic view.

  5. What is Program Evaluation? • “The use of social research methods to systematically investigate the effectiveness of social intervention programs…” (Rossi, Lipsey, & Freeman, 2004). • Data Collection and Analysis • Illuminates Effectiveness and Informs Continuous Improvement • Is useful for all aspects of organizational functioning.

  6. An issue of organizational culture. • A systemic method of addressing quality of services, outcomes, and personnel. • Addresses processes and systems within the organization.

  7. Role of Data • Informs Program Development • Informs Program Evaluation • Informs Program Improvement • Informs Professional Hiring • Informs Professional Evaluation • Informs Professional Improvement / Development

  8. Bead Experiment • We need to hire: • 2 Counselors • Supervisor • Data Analysis Specialist
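This hands-on hiring exercise looks like a version of Deming's red bead experiment, in which volunteers "produce" by drawing beads from a shared lot and are then rated on results that the system alone determines. A minimal simulation sketch in Python, using made-up lot sizes and draw counts (none of these numbers come from the presentation), shows why the "best" and "worst" performers in such an exercise are usually just sampling variation:

```python
import random

# Hypothetical setup (not from the slides): a lot of 4000 beads,
# 20% red ("defects"), and each counselor draws 50 beads per round.
LOT = ["red"] * 800 + ["white"] * 3200
DRAWS_PER_ROUND = 50
ROUNDS = 4

random.seed(1)
for counselor in ["Counselor A", "Counselor B"]:
    defects = [sum(b == "red" for b in random.sample(LOT, DRAWS_PER_ROUND))
               for _ in range(ROUNDS)]
    print(counselor, defects, "mean:", sum(defects) / ROUNDS)

# Both counselors' red-bead counts scatter around the same system average
# (about 10 per draw of 50), so praising or blaming individuals for these
# numbers rewards noise in the system, not differences in skill.
```

Re-running with different seeds flips which "counselor" looks better, which is exactly the point the live demonstration makes about evaluating people on outcomes driven by the system.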

  9. Processes and Systems • Process – the use of inputs in an action to achieve a desired outcome; inputs include people, materials, equipment, methods, measurement, and environment. • System – a network of processes within an organization that work together toward the aim of the organization. (Neave, 1990; Deming, 2000)

  10. An Example of Systems Thinking • A state agency implemented an incentive payment program that was directly linked to case closures.

  11. Incentive Program Goals • Improve Rehabilitation Counselor retention and decrease turnover • Increase Rehabilitation Counselor compensation • Increase the number of successful closures (productivity) • Originally presented as a trial program

  12. Since Implementation • Voluntary terminations have increased from an average of 15 per year to 21 per year • Voluntary resignations have not declined since the incentive program was introduced

  13. Compensation • An average of 25 counselors receive the incentive in a given year ($894 before taxes) • On average, 91 counselors receive no incentive • The average increase in compensation across the board is $180 before taxes per counselor over the 3-year period (see the sketch below)
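A quick back-of-the-envelope check shows how an across-the-board figure like this is derived and why it comes out so small. The sketch below (Python) assumes a stable headcount of 25 + 91 = 116 counselors and the per-recipient payout quoted above; actual payouts and staffing vary from year to year, so it will not reproduce the reported $180 exactly.

```python
# Back-of-the-envelope check on the across-the-board incentive figure.
# Assumes a stable headcount (25 recipients + 91 non-recipients) and the
# per-recipient payout from the slide; real staffing and payouts vary by year.
recipients = 25
non_recipients = 91
payout = 894                          # dollars before taxes, per recipient

total_counselors = recipients + non_recipients
spread_across_all = recipients * payout / total_counselors
print(f"Incentive dollars spread across all counselors: ${spread_across_all:.0f}")
# Comes out in the same neighborhood as the reported $180: meaningful for the
# 25 counselors who receive it, nearly invisible when averaged over the staff.
```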

  14. Increase Productivity

  15. Asking the Program Evaluation Question • Identify the Problem • Define the problem in terms of processes and systems • Draw out the process • Identify the way that processes influence the system

  16. Models and Theories • Models and theories provide the context for making the questions concrete. • Provide the structure for understanding the variables. • Don’t assume that models and theories do not apply to your unique situation.

  17. References
  Deming, W. E. (2000). Out of the crisis. Cambridge, MA: The MIT Press.
  Mezirow, J. (2000). Learning to think like an adult: Core concepts of transformation theory. In J. Mezirow and Associates (Eds.), Learning as transformation: Critical perspectives on a theory in progress. San Francisco: Jossey-Bass.
  Neave, H. R. (1990). The Deming dimension. Knoxville, TN: SPC Press.
  Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.
