
Planning How to Conduct the Evaluation



Presentation Transcript


  1. Planning How to Conduct the Evaluation Dr. Suzan Ayers Western Michigan University (courtesy of Dr. Mary Schutten)

  2. Relationship of focus and planning • The structure of evaluation consists of: • focusing the evaluation • collecting, organizing, analyzing & reporting info • administering the evaluation (see CIPP notes) • Evaluation should be conducted flexibly • Evaluator needs clear understanding of evaluation’s purpose and role

  3. Focusing the evaluation determines what information is needed • Origin and context of the proposed evaluation • Identifying/selecting questions, criteria, standards • Once the evaluation questions are known, the next step is to determine what information is needed to answer each question

  4. Planning the evaluation consists of collecting info, organizing info, analyzing info, reporting info, and administering the evaluation SAMPLE • Evaluation Question: Have the critical program activities occurred on time and within budget? • Things the evaluator would need to know: • Which activities were viewed as critical? • Program time frames and budget by activity • When each activity began/ended • Total cost of each critical activity

  5. Identifying Design and Data Collection Methods • Involve client and stakeholders in deciding necessary info to best answer each question • Designs specify organization/structure for data collection • Causal designs: (quasi-)experimental designs • Descriptive case studies, multiple regression, or other statistical methods to answer evaluation Qs • Descriptive designs: describe (case study), analyze the program, show a trend (time series), assess public opinions (cross-sectional), illustrate a process (thick description) • Commonly used in needs assessment and process studies

  6. Evaluator and stakeholders examine each question carefully to identify any important research design issues • Most evaluations use multiple research designs or combinations • Important to discuss early to see if: • groups are available, appropriateness of random assignment, time for collecting data, etc. • Is the design “do-able?”

  7. Identifying Appropriate Information Sources • Once needed information is agreed upon, the source(s) of that information must be specified • “Source”: group of individuals or location of existing information that answers each Q • Who will have information or access to it? • Who will be able to collect those data?

  8. Using existing data as information source • Does necessary information already exist in a readily available form? • Commonly used information sources • Program recipients, deliverers, persons who have knowledge of the program recipients, public documents/databases • Policies that restrict information sources • Do policies exist concerning collecting data from clients or existing files? • Confidentiality, anonymity, privacy, IRB protocols

  9. Client involvement in identifying sources • Evaluator, by training and experience, often can identify key sources of information • Client will be able to identify sources of information that may be missed by the evaluator • Collaboration yields helpful answers and enhances sense of shared ownership

  10. Identifying Data Collection Methods and Instruments • Data collected directly from individuals identified as sources of information • Self reports • interviews, surveys, rating scales, focus groups, logs/journals • Personal Products: • tests (objective, essay), performances (simulations, role-play, competency testing), work samples (portfolios) • Data collected by independent observer • Narrative accounts • Observation forms (rating scales, checklists)

  11. Data collected with technological device • Audiotape • Videotape • Time-lapse photos • Others • BP cuffs, speed, graphic recordings of performance skills, computer collation of participant responses • Data collection from unobtrusive measures

  12. Data collected from existing information • Public documents • federal, state, local databases, etc. • Review of organizational documents • client files, notes of employees/directors, audits, minutes, publications • personal files • correspondence, e-mails, etc.

  13. After identifying methods for use from the preceding slides, it is important to review the adequacy of the techniques • Will the info collected provide a comprehensive picture? • Are the methods legal and ethical? • Will the cost of data collection be worthwhile? • Can data be collected w/o undue disruption? • Can data be collected w/in time constraints? • Will the information be reliable and valid for the purposes of the evaluation?

  14. Determining Appropriate Conditions for Collecting Information • 3 key issues around data collection: • Will sampling be used? • How will data actually be collected? • When will data be collected? • Specifying sampling procedures to be employed • Sampling helps the researcher draw inferences about the population in the study • Sampling is a tool to use whenever resources and time are limited • Sampling useful when it will not diminish the confidence of results • Sample size must be appropriate; too small a sample is of limited value (see the sketch below)
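Below is a minimal Python sketch of the sampling step: drawing a simple random sample from a participant roster. The roster, seed, and sample size are hypothetical placeholders, not values from the slides.

```python
import random

# Hypothetical roster of program participants; in practice this would
# come from client files or an existing program database.
participants = [f"participant_{i}" for i in range(1, 501)]

# Draw a simple random sample without replacement. Fixing the seed
# makes the draw reproducible for audit purposes.
random.seed(42)
sample = random.sample(participants, k=50)

# Too small a sample is of limited value: report the sampling fraction
# so stakeholders can judge whether inferences to the population hold.
print(f"Sampled {len(sample)} of {len(participants)} "
      f"({len(sample) / len(participants):.0%})")
```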

  15. Specifying how the information will be collected • Who will collect data? • For interviews, focus groups, etc., how will characteristics of the evaluator influence data collection? • What training should be given to people collecting the data? • In what setting should data collection take place? • Confidentiality protected? • Special equipment, materials needed? • Evaluators need a bigger ‘bag of tools’ than most researchers to examine a broad array of phenomena

  16. Specifying when the information will be collected • When will the information be needed? • When will the information be available? • When can the information conveniently be collected?

  17. Determining Appropriate Methods to Organize, Analyze, Interpret Information • Develop a system to code, organize, store, and retrieve data • For each evaluation question, specify how collected information will be analyzed • Identify statistical/summarizing techniques • Designate some means for conducting the analysis (sketched below) • Interpreting results (statistical reports do not speak for themselves) • Share information with clients to gain perspective on potential interpretations of the data • Criteria/standards guide interpretation of some Qs • Eval plan should allow recording of multiple or conflicting interpretations • Interpretations should consider multiple perspectives
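As a minimal sketch of one “means for conducting the analysis,” the Python below organizes coded responses by evaluation question and applies a simple summarizing technique (the mean). The records and question IDs are hypothetical.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical coded records: (evaluation_question_id, respondent_id,
# coded_value). In practice these come from the study's data store.
records = [
    ("Q1", "r01", 4), ("Q1", "r02", 5), ("Q1", "r03", 3),
    ("Q2", "r01", 2), ("Q2", "r02", 4),
]

# Organize: index responses by evaluation question for retrieval.
by_question = defaultdict(list)
for question_id, respondent_id, value in records:
    by_question[question_id].append(value)

# Analyze: one summarizing statistic per question. Interpretation still
# needs criteria/standards and stakeholder input; the numbers do not
# speak for themselves.
for question_id, values in sorted(by_question.items()):
    print(f"{question_id}: n={len(values)}, mean={mean(values):.2f}")
```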

  18. Determining Appropriate Ways to Report Evaluation Findings • Using a matrix is an appropriate way to plan reporting (Fig 13.1; sketched below) • Audience, content, format, date, context of presentation • Suggested Qs (Brinkerhoff, Brethower, Hluchyj, & Nowakowski, 1983) • Are reporting audiences defined? • Are report formats and content appropriate for audience needs? • Will the evaluation report provide balanced information? • Will reports be timely and efficient? • Is the report plan responsive to the audiences’ rights to information?
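The reporting matrix of Fig 13.1 can be sketched as one row per audience, carrying the content, format, date, and context fields the slide lists. All entries below are hypothetical placeholders.

```python
# One row per audience; field values are illustrative only.
reporting_matrix = [
    {"audience": "funding agency", "content": "full findings",
     "format": "written technical report", "date": "2024-06-30",
     "context": "delivered with executive summary"},
    {"audience": "program staff", "content": "interim results",
     "format": "oral briefing with handouts", "date": "2024-03-15",
     "context": "regular staff meeting"},
]

for row in reporting_matrix:
    print(f'{row["audience"]}: {row["format"]} due {row["date"]}')
```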

  19. Appropriate Ways to Report Evaluation Findings Worksheet approach; summarize 8 topics for each Q: 1. Information required to answer the question 2. Design(s) to be used to collect information 3. Source(s) of that information 4. Method(s) for information collection

  20. 5. Information-collecting arrangements: sampling procedure, collection procedure, schedule for collection 6. Analysis procedures 7. Interpretation procedures (including standards) 8. Reporting procedures: audiences, content, format, schedule, context
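A minimal sketch of the worksheet as a data structure: one record per evaluation question covering the eight topics above. The class and field names are illustrative, not taken from the worksheet itself.

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationQuestionPlan:
    question: str
    information_required: str                     # 1. info needed
    design: str                                   # 2. design(s)
    sources: list = field(default_factory=list)   # 3. source(s)
    collection_methods: list = field(default_factory=list)       # 4.
    collection_arrangements: dict = field(default_factory=dict)  # 5.
    analysis_procedures: str = ""                 # 6.
    interpretation_procedures: str = ""           # 7. incl. standards
    reporting_procedures: dict = field(default_factory=dict)     # 8.

# One filled-in row, using the sample question from slide 4.
plan = EvaluationQuestionPlan(
    question="Have critical activities occurred on time and within budget?",
    information_required="activity start/end dates; cost per activity",
    design="descriptive time series",
    sources=["program records", "project director"],
    collection_methods=["document review", "interview"],
    collection_arrangements={"sampling": "none (census of activities)",
                             "schedule": "quarterly"},
)
```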

  21. The Management Plan • Final task in planning the evaluation study is describing how it will be carried out • Management plan essential to help oversee the project • Who will do it? • How much will it cost? • Will it be within budget?

  22. Evaluation management is multifaceted • supervise staff • serve as liaison to evaluation clients, participants, and stakeholders • identify and cope with political influences • communicate, communicate, communicate • Evaluation, whether by one person or a team, cannot afford to be disorganized or haphazard • Management plan needed to structure and control resources • Good management plans specify for each evaluation Q (Fig 13.4): • tasks & timelines, personnel/resources for each task, cost (see the sketch below)
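A minimal sketch of a management plan in the spirit of Fig 13.4: for each evaluation question, tasks with timelines, personnel, and cost, rolled up so the plan can be checked against the budget. Task names, dates, and dollar figures are hypothetical.

```python
# Tasks keyed by evaluation question; all values are illustrative.
management_plan = {
    "Q1": [
        {"task": "develop interview guide", "start": "2024-01-08",
         "end": "2024-01-19", "personnel": ["evaluator"], "cost": 1200},
        {"task": "conduct staff interviews", "start": "2024-01-22",
         "end": "2024-02-16", "personnel": ["evaluator", "assistant"],
         "cost": 3400},
    ],
}

# Roll tasks up to a per-question cost for budget checks.
for q, tasks in management_plan.items():
    print(f"{q}: total cost = ${sum(t['cost'] for t in tasks):,}")
```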

  23. Managing and estimating time for conducting evaluation • PERT (program evaluation & review technique) & Gantt charts are commonly used to estimate time on tasks • Gantt charts (Fig 13.6): simple displays that include chronologically scaled time frames for each evaluation task • Y (vertical) axis: tasks • X (horizontal) axis: time scale • Horizontal line drawn for each task shows the time needed (see the sketch below) • Help highlight interim deadlines or milestones that must be met to stay on time in the study • Well-specified milestones are an essential monitoring tool
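A minimal Gantt-chart sketch along the lines of Fig 13.6, using matplotlib: tasks on the vertical axis, a time scale on the horizontal axis, and one bar per task showing the time needed. Task names and durations are hypothetical.

```python
import matplotlib.pyplot as plt

# (name, start_day, duration_days); all values are illustrative.
tasks = [
    ("Finalize design",      0, 10),
    ("Develop instruments", 10, 20),
    ("Collect data",        30, 40),
    ("Analyze data",        70, 20),
    ("Report findings",     90, 15),
]

names = [t[0] for t in tasks]
starts = [t[1] for t in tasks]
durations = [t[2] for t in tasks]

fig, ax = plt.subplots()
ax.barh(names, durations, left=starts)  # one horizontal bar per task
ax.set_xlabel("Days from project start")
ax.invert_yaxis()  # first task at the top, as in a conventional Gantt
plt.tight_layout()
plt.show()
```

Milestones can be overlaid as vertical lines (e.g., ax.axvline) so interim deadlines are visible on the same chart.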

  24. Analyzing personnel needs and assignments • Quality of evaluation depends heavily on those who carry it out • Are qualified individuals available to carry out the tasks? • “Personnel role specifications” for all tasks (Suarez, 1981) • Specify who would manage the study, complete the eval design, select or develop instruments, collect data, analyze data, write summary reports, etc…

  25. Estimating costs and developing budgets • Staff salary and benefits • Consultants • Travel and per diem (for staff/consultants) • Communications (postage, phone, etc.) • Printing and duplication • Data processing • Printed materials • Supplies and equipment • Subcontracts • Overhead (facilities, utilities) • If the initial estimate exceeds expectations, review each line item (see the sketch below)
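A minimal budget sketch: sum the line items above and flag the plan for review when the estimate exceeds what the client expects to spend. All dollar figures are hypothetical.

```python
# Line items from the slide; amounts are illustrative only.
line_items = {
    "staff salary and benefits": 42000,
    "consultants": 6000,
    "travel and per diem": 3500,
    "communications": 800,
    "printing and duplication": 1200,
    "data processing": 2500,
    "printed materials": 900,
    "supplies and equipment": 1500,
    "subcontracts": 0,
    "overhead": 8600,
}

expected_total = 60000
total = sum(line_items.values())
print(f"Estimated total: ${total:,}")
if total > expected_total:
    # Review each line item, largest first, to find reductions.
    for item, cost in sorted(line_items.items(), key=lambda kv: -kv[1]):
        print(f"  review: {item} (${cost:,})")
```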

  26. Agreements and contracts • Potential problems that arise during the evaluation can be more easily resolved if client and evaluator share a firm understanding • A well-documented agreement on important procedures, reached before launching the evaluation study, is very helpful • http://www.wmich.edu/evalctr/checklists/contracts.pdf
