
Making Personnel Development Effective: Using Outcome Data for Program Improvement



  1. Center to Improve Project Performance Making Personnel Development Effective: Using Outcome Data for Program Improvement Tom Fiore Cynthia Helba Susan Berkowitz Michael Jones Jocelyn Newsome OSEP Project Directors’ Conference, Washington, D.C., July 23, 2012

  2. Overview • A structure for using data for program improvement • Project logic model • Evaluation inquiry model • Feedback loops • Useful tools • Surveys • Sampling • Interviews and focus groups • Summary

  3. Basic Project Logic Model • Goals: Increase the ability of teachers to make evidence-based practice decisions about teaching and intervention dilemmas that arise in daily practice • Activities: Develop, validate, and disseminate training modules • Outputs: Middle school teachers download and report using modules • Outcomes: Middle school SPED teachers demonstrate increased ability to make and apply evidence-based practice decisions

  4. Basic Project Logic Model—Data of Interest • Output—something that is counted • [Number of] Training modules developed • [Number of] Training events held • [Number of] Individuals trained • Outcome—something that is measured • Trainees report, at 6 months post-training, that they learned something and are using it [percent of learning or percent of information used] • Trainees demonstrate proficiency on trained skills [scores obtained through an observation protocol] • Customers of those trained demonstrate improved performance [scores obtained through a standardized instrument]
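
To make the count/measure distinction concrete, here is a minimal sketch in Python; the trainee records, field names, and values are entirely hypothetical.

```python
# Hypothetical 6-month follow-up records for trained teachers
trainees = [
    {"id": "T01", "followed_up": True,  "using_skills": True},
    {"id": "T02", "followed_up": True,  "using_skills": False},
    {"id": "T03", "followed_up": False, "using_skills": None},
]

# Output: something that is counted
num_trained = len(trainees)

# Outcome: something that is measured, e.g., the percent of follow-up
# respondents who report using what they learned
respondents = [t for t in trainees if t["followed_up"]]
pct_using = 100 * sum(t["using_skills"] for t in respondents) / len(respondents)

print(f"Individuals trained (output): {num_trained}")
print(f"Reporting use at 6 months (outcome): {pct_using:.0f}%")
```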

  5. Evaluation Inquiry Model • A logic model provides a representation of the linear process of the theory of action for the project • An evaluation inquiry model is non-linear and is a discovery model focused on the theory of change for the project • Specifically, it puts project planning and decision-making functions at the center of the model, and it embeds outcomes within that function

  6. Evaluation Inquiry Model [Diagram: Vision and Purpose-Setting, Planning and Decision-Making (with Outcomes 1, 2, 3 embedded within it), Implementation, and Evaluation (formative and summative)]

  7. Evaluation Inquiry Model • Four functions to focus on when using data for program improvement • Vision and Purpose-Setting • Define the fundamental intention of the project • Establish an overall direction and purpose • Planning and Decision-Making • Allocate resources • Design implementation strategies • Establish measurable expectations

  8. Evaluation Inquiry Model • Four functions to focus on when using data for program improvement (cont.) • Implementation • Execute strategies and activities • Evaluation • Establish metrics for measuring outputs and outcomes • Collect, analyze, and interpret data

  9. Evaluation Inquiry Model • Why outcomes are inside the Planning and Decision-Making box—keeping an eye on the prize • Having outcomes at the far right implies a strictly linear process • But projects that use data to follow their progress don’t operate linearly • Also, having the outcomes at the end suggests that they can be achieved, once and for all, within the life of the project—that the work is done

  10. Evaluation Inquiry Model with Feedback Loops [Diagram: the same model with four feedback loops, A–D, connecting Evaluation to Planning and Decision-Making (Feedback 1), to Outcomes 1–3, to judgments of Worth (Feedback 2), and through Reflection back to Vision and Purpose-Setting; the loops are described on the next slide]

  11. Evaluation Inquiry Model with Feedback Loops • Evaluation and analysis plans are developed—outputs and outcomes associated with activities are identified, selected, and measured • Evaluative data are used to adjust plans, activities, and management—implementation is or is not occurring as expected • Evaluative data are used to determine the worth or value of outputs and immediate outcomes in ultimately achieving the desired long-term outcomes • Reflections on the worth or value of activities, outputs, and outcomes lead to modifications of the vision/purpose, which leads to changes in plans and activities

  12. Collecting and Using Data as Feedback Evaluative feedback will be useful to your project if: • The evaluation data provide accurate information about the results of your project’s activities and outputs • The data are targeted so that they are useful for the specific purpose of making changes to your project • The data are accurate and can be collected within your project’s resources • The data are available in time to allow for change to your project

  13. Useful Tools for Obtaining Formative Feedback Three tools for collecting data for formative purposes: • Surveys - Perhaps the most common means of collecting data about professional development activities - Must be done correctly to obtain accurate and analyzable information • Sampling - An often overlooked way to make data collection more efficient and accurate • Interviews - Another common means of collecting feedback about project activities - A method to understand data collected through other means, and to drill down to specific information needed to make changes to a project

  14. Surveys Step 1: Data Analysis Plan • Most important step • Questionnaire development should be driven by project goals
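
One way to keep item development tied to project goals is to draft the analysis plan as an explicit map from evaluation questions to items and analyses before writing the questionnaire. A minimal sketch, with invented question wording, item IDs, and analyses:

```python
# Hypothetical analysis plan: every questionnaire item should serve a
# stated evaluation question, or it is a candidate for deletion.
analysis_plan = [
    {"evaluation_question": "Do trainees use the modules after training?",
     "items": ["Q3", "Q4"],
     "analysis": "percent reporting use at 6 months, by school"},
    {"evaluation_question": "Do trainees apply evidence-based decisions?",
     "items": ["Q7a", "Q7b", "Q8"],
     "analysis": "mean observation-protocol score, pre vs. post"},
]

planned_items = {item for row in analysis_plan for item in row["items"]}
print(sorted(planned_items))  # items not listed here don't belong on the form
```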

  15. Surveys Developing a Data Analysis Plan

  16. Surveys Field Considerations • What languages will you ask in?  Shapes item development because of wording and cultural implications • Are there other components of your project evaluation or your data collection activities, e.g., classroom observation or other surveys?  Shapes length and administration of questionnaire

  17. Surveys Step 2: Item Development

  18. Surveys Reality Check

  19. Surveys Reality Check If a respondent cannot understand the question, cannot remember the answer (or never knew it), doesn’t want to give you the answer, or can’t figure out how to give you the answer, then you can’t ask the question.

  20. Surveys Elements of a Good Item • Asks only one question at a time (avoids double-barreled questions) Problem: Do you exercise regularly and eat healthy foods? • Contains a clear threshold for answering “yes” Problem: Have you seen a doctor in the past month? • Provides a timeframe Problem: How many times do you eat in a restaurant? • Provides a timeframe appropriate to the topic Problem: In the past 12 months, how many times have you sneezed?

  21. Surveys Elements of a Good Item (continued) • Uses clear terminology and plain language Problem: How strong is your fertility motivation? • Gives exhaustive and mutually exclusive response options Problem: From which of these sources did you first learn about the tornado in Derby? Radio Television Someone at work While at home While traveling to work

  22. Surveys No item is an island • How questions are answered can be shaped by: • Ordering of items Should U.S. reporters be allowed into the Soviet Union? Should reporters from the Soviet Union be allowed into the U.S.? • Mode of administration Asked aloud by an interviewer: How many sex partners have you had in the past 12 months? VERSUS Self-administered: How many sex partners have you had in the past 12 months? ________

  23. Surveys Step 3: Review and Testing • Content Expert Review A content expert review can identify improper use of terms or concepts in the field How many students do you serve under Part C?VERSUS How many children do you serve under Part C? • Methodological Expert Review A methodological review can identify well-known (but not obvious) issues with items In the past week VERSUS In the past seven days

  24. Surveys Step 3: Review and Testing (continued) • One-on-One Testing This testing involves an in-depth interview with people like your respondents. It explores how they are understanding and answering the questions. Has a student in your classroom been expelled this year? When did that happen? Tell me how you came up with your answer • Field Testing A field test is an opportunity to see how the questionnaire actually works, identifying any items respondents have difficulty answering (or interviewers have difficulty asking). It is also an opportunity to test procedures and protocols.

  25. Surveys Step 4: Rinse and repeat.

  26. Surveys Reading List • Don Dillman, Jolene Smyth & Leah Melani Christian. Internet, Mail, and Mixed-Mode Surveys. 3rd edition. 2009. • Robert Groves, Floyd Fowler, Mick Couper, James Lepkowski, Eleanor Singer & Roger Tourangeau. Survey Methodology. 2004. • Janet Harkness et al. Survey Methods in Multinational, Multiregional, and Multicultural Contexts. 2010. • Gordon Willis. Cognitive Interviewing: A Tool for Improving Questionnaire Design. 2005.

  27. Sampling • A Hypothetical Example • A program is implemented in 20 high schools in Montgomery County, MD • Within each school, 10 teachers are selected to be trained • Once trained, teachers instruct “upper-class” students • Trained students are then instructed to mentor 3 to 5 freshmen throughout the school year

  28. Sampling • A Hypothetical Example (continued) • Program needs to be evaluated • Have teachers been trained? • Once trained, have they instructed students in the program? • Once students are instructed, have they mentored fellow students? • Is it having a positive effect on the mentored students?

  29. Sampling Sample vs. Census Census: every unit in a population (e.g., schools, or persons) is surveyed Example: all teachers, all mentor students, and/or all mentored students

  30. Sampling Sample vs. Census (continued) Sample: a portion of the population is surveyed Example: a portion of the teachers, of the mentor students, of the mentored students are selected

  31. Sampling • Why sampling? Why not a census? Sampling • saves time, • saves money, and • if done correctly, answers questions more accurately

  32. Sampling • How can a sample be more accurate than a census? • Generally, a census is a mammoth undertaking • Many kinds of errors can be introduced • A lower response rate • Lax data integrity • A sample may allow for increased nonresponse follow-up • More resources can be spent on data integrity

  33. Sampling • Considerations when implementing sampling How precise do your estimates need to be? • Do you only want a general idea of how the program is doing, or • Are you making comparisons between groups?
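
For a rough sense of what precision a given sample buys, the usual approximation for a proportion from a simple random sample is MOE = z * sqrt(p(1-p)/n). A minimal sketch; the sample sizes and the conservative p = 0.5 planning value are illustrative:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion (simple random sample)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (50, 100, 200):
    print(f"n={n}: +/- {100 * margin_of_error(n):.1f} percentage points")
# n=50: +/- 13.9   n=100: +/- 9.8   n=200: +/- 6.9
```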

  34. Sampling • Considerations when implementing sampling (continued) • How large should the sample be? Depends on many factors • Complexity of the sample • Often, the more complex the sample is, the larger it needs to be • Try to keep it “simple”
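
The same formula can be solved for n. A minimal sketch of a first-cut sample-size calculation with a finite population correction; the target precision and population size are invented for illustration:

```python
import math

def required_n(moe, population, p=0.5, z=1.96):
    """Simple-random-sample size for a proportion, with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / (moe ** 2)            # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction

# e.g., +/- 5 percentage points among 200 trained teachers
print(required_n(moe=0.05, population=200))  # -> 132
```

A complex (e.g., clustered) design typically needs a larger sample than this simple-random-sample figure, which is one reason to keep the design “simple.”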

  35. Sampling • Considerations when implementing sampling (continued) • A sample should be RANDOM • Results can be generalized to the population • Leads to unbiased estimates • Increases credibility of results
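
Drawing a random sample is straightforward once there is a complete list of the population. A minimal sketch, assuming a hypothetical roster of 200 trained teachers:

```python
import random

roster = [f"teacher_{i:03d}" for i in range(1, 201)]  # the sampling frame

rng = random.Random(2012)          # fixed seed makes the draw reproducible
sample = rng.sample(roster, k=50)  # every teacher has an equal chance

print(sample[:5])
```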

  36. Sampling • Considerations when implementing sampling (continued) • If a sample is not RANDOM • Can lead to biased estimates and poor credibility • Not all samples can be random • Avoid “convenience” samples • The sample should be designed during program planning

  37. Sampling • Considerations when implementing sampling (continued) • The complexity of the survey instrument • How long is the instrument? • Does the survey use established measures or new and unproven measures?

  38. Sampling • Considerations when implementing sampling (continued) • What resources are available? • How much time and money can be spent? • How many people are available to work on the survey, from inception to data collection to analysis of the data?

  39. Sampling • Another Consideration – Measurement Scales • E.g., ordinal • Rank your favorite three flavors of ice cream 1. Mint chocolate chip 2. Toffee bar crunch 3. Vanilla Is 1 close to 2, or is 1 a clear favorite?
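
Because the distance between adjacent ranks is unknown, ordinal responses are better summarized by frequencies (or medians) than by averaged ranks. A minimal sketch with hypothetical rankings:

```python
from collections import Counter

rankings = [  # each respondent's ordering, favorite first
    ["mint chocolate chip", "toffee bar crunch", "vanilla"],
    ["toffee bar crunch", "mint chocolate chip", "vanilla"],
    ["mint chocolate chip", "vanilla", "toffee bar crunch"],
]

first_choices = Counter(r[0] for r in rankings)
print(first_choices.most_common())
# [('mint chocolate chip', 2), ('toffee bar crunch', 1)]
```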

  40. Sampling If you have questions… ...consult a statistician and/or a survey methodologist.

  41. Sampling • The statistician should ask you questions • What is it that you’re trying to answer? • What are you measuring? • How precise do you need your estimates to be? • Are you looking to make comparisons between groups? • Etc. etc. • The answers to these kinds of questions will determine what kind of sample design will best suit your purposes and how large your sample should be.
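
For the hypothetical mentoring program above, one design a statistician might suggest is to stratify by school so every school is represented, then sample teachers within each school. A minimal sketch; the school names, roster, and per-school sample size are invented:

```python
import random

# 20 schools x 10 trained teachers, per the hypothetical example
schools = {f"school_{s:02d}": [f"s{s:02d}_teacher_{t}" for t in range(1, 11)]
           for s in range(1, 21)}

rng = random.Random(7)
sample = []
for school, teachers in schools.items():
    sample.extend(rng.sample(teachers, k=3))  # 3 per school -> 60 teachers total

print(len(sample), sample[:3])
```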

  42. Interviews and Focus Groups What Are Qualitative Methods of Data Collection? Both in-depth interviews and focus groups pose questions designed to ascertain and explore people’s views on a given subject in their own terms and framework of understanding. Both use semi-structured or open-ended instruments that concentrate on clearly defined subjects or sets of interrelated topics. Often (but not only) used to explore how a program or project is being implemented (process evaluation)

  43. Interviews and Focus Groups What Are Qualitative Methods of Data Collection? (continued) Intensive interviews are usually conducted one-on-one, by phone or in person; they can vary in length from 30 minutes to several hours (more than about 1½ hours in one sitting is usually too long) Focus groups pose a question or series of questions on a selected topic or theme to a group of 4-10 persons

  44. Interviews and Focus Groups Why Use Qualitative Methods of Data Collection? Can add depth of understanding to evaluation question(s) not obtainable via close-ended survey responses; allow for probing and back-and-forth between interviewer/moderator and participants Can be complementary to survey component (e.g., help to explain survey responses) or stand alone (e.g., as a way of gaining deeper understanding of selected issues)

  45. Interviews and Focus Groups Selecting Cases for Qualitative Methods Principles and goals of purposive sampling are not the same as for probability sampling To add deeper understanding of “information-rich” cases Depends on goals of evaluation/important analytic categories: e.g., can select to achieve regional coverage; to account for differences in school, program, or classroom structures; can choose only “successful” cases; can select for typical or for extreme cases

  46. Interviews and Focus Groups Designing In-Depth Discussion Guides and Focus Group Moderator Guides Concise; logical development of themes or topics (may not correspond to order of evaluation questions) Questions should be open-ended (not yes/no), not leading (“Don’t you think that...?”), and should lend themselves to thoughtful exploration Framed broadly enough to encourage discussion, but not so broadly that the respondent has no idea what is being sought (“Tell me about yourself”)

  47. Interviews and Focus Groups Common Misconceptions about Qualitative Data Collection It’s easy It’s “purely subjective” Anyone with good people skills can be a good qualitative interviewer/focus group moderator

  48. Interviews and Focus Groups Qualities Needed in Qualitative Data Collectors Not the same as those for interviewers administering close-ended surveys Should have knowledge of subject matter Active role Ability to guide dialogue, draw out respondents, probe on important issues but w/o leading/steering Adaptable to context but not ceding control

  49. Interviews and Focus Groups Analysis of Qualitative Data Value of collecting qualitative data can be seriously impaired w/o someone who has the experience and expertise to make systematic sense of it Qualities needed in analysis of qualitative (word-based, thematic) data not necessarily the same as those important in collection of these data Best when combined/integrated well with survey findings (if mixed method study)

  50. Interviews and Focus Groups What Value Does Qualitative Data Add to a Formative Evaluation? Can identify important and/or unexpected themes, issues, or areas of concern that can be acted upon in “real time”/enable course correction Better understanding of the “world view” and issue-framing of specific subgroups of interest Allows for a more fluid process and evolving understanding in guiding both the evaluation and the program or project that is the subject of the evaluation
