Evaluation: Methods & Concerns

This presentation outlines the framework of social research design, quantitative and qualitative research methods, the need for judicial training, and the evaluation process. It also covers the advantages and disadvantages of quantitative and qualitative data collection methods and the complementarities between them.

Presentation Transcript


  1. Evaluation: Methods & Concerns Otojit Kshetrimayum, V.V. Giri National Labour Institute, Noida, otojit@gmail.com

  2. Framework • Social Research Design • Quantitative Research • Qualitative Research • Need for Judicial Training? • Evaluation • Case Studies • Judicial Academy: Training Methods

  3. Social Research Design • a "blueprint" for research, dealing with at least four problems: • what questions to study, • what data are relevant, • what data to collect, and • how to analyze the results.

  4. Social Research Design Identifying the problem → Review of Literature → Developing Hypothesis/Research Questions → Research Methodology (Qualitative/Quantitative) → Data Collection → Data Analysis & Interpretation → Conclusion

  6. Quantitative Method
  • Consists of counts or frequencies, rates or percentages, or other statistics that document the actual existence or absence of problems, behaviors, or occurrences.
  • Examples of quantitative data collection methods:
  • Surveys
  • Questionnaires that ask closed-ended questions, such as pre- and post-tests
  • Performance tests
  • Clinical tests, such as urine and blood tests
  • Observation checklists
  • Archival research that provides statistical data
  • Advantages: easy to administer; can include a relatively large number of questions; can yield large samples; easier to summarize (see the sketch below); and more widely accepted as a form of evidence regarding program effectiveness.
  • Disadvantages: data may not be as rich or as detailed as qualitative methods; written surveys/questionnaires may be difficult for some participants; may not provide all the information needed to interpret the findings; and large amounts of data may require more sophisticated analysis approaches.
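To make the "easier to summarize" advantage concrete, here is a minimal sketch in Python of the descriptive summary an evaluator might produce from pre- and post-test scores. The scores and the summarize helper are hypothetical, invented purely for illustration.

```python
# A minimal sketch of summarizing quantitative pre-/post-test data.
# All scores below are hypothetical, invented purely for illustration.
from statistics import mean, stdev

pre_scores  = [52, 61, 48, 70, 55, 63, 58, 66]   # hypothetical pre-test scores
post_scores = [68, 72, 60, 81, 70, 74, 69, 79]   # hypothetical post-test scores

def summarize(label, scores):
    """Print the basic descriptive statistics evaluators typically report."""
    print(f"{label}: n={len(scores)}, mean={mean(scores):.1f}, sd={stdev(scores):.1f}")

summarize("Pre-test ", pre_scores)
summarize("Post-test", post_scores)

# Average gain per participant, pairing each pre score with its post score
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
print(f"Mean gain: {mean(gains):.1f} points")
```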

  7. Qualitative Method
  • Consists of descriptions of problems, behaviors, or events; can provide narrative accounts of people's thoughts and opinions about their experiences, attitudes, and beliefs.
  • Examples of qualitative data collection methods include:
  • Key informant or individual interviews
  • Focus groups
  • Open-ended questions on a survey or questionnaire
  • Logs, journals, diaries, and/or essays
  • Stories/case studies
  • Participant observations/field notes
  • Document review: examining written records such as logs, correspondence, meeting minutes, news articles, or other published accounts
  • Advantages: capture more depth and provide insight into the "why" and "how" of attitudes and behaviors; clarify quantitative data and sometimes put it into the context of people's lives and experiences. This makes the quantitative data easier to understand, provides more detail and nuance, and explains what the program means to the people involved.
  • Disadvantages: time-consuming to capture and analyze (see the coding sketch below); more subjective and may be difficult to summarize and compare systematically; generally viewed as less reliable because the data are more subjective than quantitative methods; and may yield smaller sample sizes.
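Qualitative material has to be coded before it can be summarized, which is part of why it is slower to analyze. The sketch below shows a deliberately simple first pass: tagging hypothetical open-ended responses against a hand-built codebook of themes and tallying frequencies. The responses, themes, and keywords are all assumptions for illustration; real qualitative coding is a far more careful, iterative process.

```python
# A minimal sketch of a first-pass "coding" step for open-ended responses:
# tag each answer with themes from a hand-built codebook, then tally them.
# Responses and codebook are hypothetical illustrations.
from collections import Counter

responses = [
    "The case-law sessions were useful but too short",
    "More time for discussion with other judges would help",
    "Useful materials, though the schedule felt rushed",
]

codebook = {                      # theme -> trigger keywords (assumed)
    "usefulness": ["useful", "help"],
    "time_pressure": ["short", "rushed", "time"],
    "peer_exchange": ["discussion", "other judges"],
}

theme_counts = Counter()
for text in responses:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(k in lowered for k in keywords):
            theme_counts[theme] += 1

for theme, n in theme_counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(responses)} responses")
```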

  8. Complementarities • The "qualitative versus quantitative" debate • How the two techniques can be integrated (e.g., mixed-methods research / interdisciplinary research)

  9. Need for Judicial Training?

  10. Evaluation • Systematic acquisition and assessment of information to provide useful feedback about some object. • Evaluation as the systematic collection and analysis of data needed to make decisions

  11. Evaluation: Steps
  • STEP 1: Get an overview of the programme
  • STEP 2: Determine why you are evaluating
  • STEP 3: Determine what you need to know and formulate research questions
  • STEP 4: Figure out what information you need to answer the questions
  • STEP 5: Design the evaluation
  • STEP 6: Collect information/data
  • STEP 7: Analyze the information
  • STEP 8: Formulate conclusions
  • STEP 9: Communicate results
  • STEP 10: Use results to modify the programme
  Source: WHO

  12. Types of Evaluation-1
  1. Formative Evaluation: conducted during the planning and design of the programme; provides immediate feedback for programme modification and improvement; ongoing; helps to determine programme strengths and weaknesses. Types:
  • Needs assessment: determines who needs the programme, how great the need is, and what might work to meet it;
  • Evaluability assessment: determines whether an evaluation is feasible and how stakeholders can help shape its usefulness;
  • Structured conceptualization: helps stakeholders define the programme or technology, the target population, and the possible outcomes;
  • Implementation evaluation: monitors the fidelity of the programme or technology delivery;
  • Process evaluation: investigates the process of delivering the programme or technology, including alternative delivery procedures.

  13. Types of Evaluation-2
  2. Summative Evaluation: concerned with the evaluation of an already completed programme; determines whether the programme has achieved its goals; summarizes the strengths and weaknesses of a programme. Types:
  • Outcome evaluation: investigates whether the programme or technology caused demonstrable effects on specifically defined target outcomes;
  • Impact evaluation: broader; assesses the overall or net effects, intended or unintended, of the programme or technology as a whole;
  • Cost-effectiveness and cost-benefit analysis: address questions of efficiency by standardizing outcomes in terms of their costs and values;
  • Secondary analysis: re-examines existing data to address new questions or to use methods not previously employed;
  • Meta-analysis: integrates the outcome estimates from multiple studies to arrive at an overall or summary judgement on an evaluation question (see the worked sketch below).
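As a worked illustration of the meta-analysis bullet above, the sketch below pools effect estimates from several studies using fixed-effect, inverse-variance weighting (w_i = 1 / SE_i^2). The study names, effect sizes, and standard errors are hypothetical; real studies would supply these values.

```python
# A minimal sketch of a fixed-effect (inverse-variance) meta-analysis,
# the pooling step the slide refers to. All inputs are hypothetical.
studies = [
    ("Study A", 0.40, 0.15),   # (name, effect size, standard error)
    ("Study B", 0.25, 0.10),
    ("Study C", 0.55, 0.20),
]

weights = [1 / se**2 for _, _, se in studies]          # w_i = 1 / SE_i^2
pooled = sum(w * es for w, (_, es, _) in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```

Inverse-variance weighting simply gives more precise studies more say in the pooled estimate; a real meta-analysis would also check heterogeneity across studies before trusting a fixed-effect summary.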

  14. Donald Kirkpatrick's Four Levels of Evaluation: Reaction, Learning, Behaviour, and Results
  • Level One - Reaction Evaluation: a measure of satisfaction. Usually, participants get a form at the end of a course or educational event asking them to tick boxes, often on a five-point scale, to record their immediate reaction to the course or event. Often there is also a space where participants can write specific comments. The form usually includes questions about what other courses and topics the learner would find useful.
  • Level Two - Learning Evaluation: a measure of learning. Tries to measure what knowledge or skills participants have acquired and retained. This can be done through a test, either at the end of the course or days or months afterwards, or alternatively through a post-course evaluation asking participants what they have retained (see the sketch after this slide).
  • Level Three - Behaviour Evaluation: a measure of behaviour change. Assesses whether there have been any behavioural changes as a result of the education programme, typically through observation or interviews.
  • Level Four - Results Evaluation: a measure of results. Tries to identify whether the education generated change in the recipients' organisation.
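A Level Two (learning) check often reduces to comparing each participant's pre- and post-course test scores. A minimal sketch, assuming hypothetical scores and that scipy is available for the paired t-test:

```python
# A minimal sketch of a Level Two (learning) check: compare paired
# pre- and post-course test scores. Scores are hypothetical.
from scipy import stats

pre  = [55, 60, 48, 72, 66, 59, 63, 70]
post = [70, 71, 58, 80, 75, 68, 74, 77]

t_stat, p_value = stats.ttest_rel(post, pre)   # paired t-test on the gains
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the post-course scores differ systematically
# from the pre-course scores, i.e. measurable learning occurred.
```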

  15. Case Studies
  • Delhi Judicial Academy (2015-16)
  • No. of training programmes: 42
  • No. of training programmes for officers/judges: 39
  • No. of training programmes for officials/ministerial staff: 03
  Topics:
  • Strengthening the Justice Delivery System: 20
  • Course on the New Emerging Areas of Law: 06
  • From Judging to Justicing: 06
  • Environment Awareness & Stress Management: 04

  16. Case Studies
  • Gujarat State Judicial Academy (2015-16)
  • No. of training programmes: 25
  • No. of training programmes for officers/judges: 22
  • No. of refresher courses for Deputy Section Officers: 03

  17. Gujarat State Judicial Academy

  18. Gujarat State Judicial Academy

  19. Judicial Academy: Training Methods
  1. Needs Assessment: regularly assess and analyse participants' learning needs, responsibilities, and performance
  2. Learning Objectives: should be specific, realistic, and measurable
  3. Learning Activities: should promote active participation and engage all learning styles
  4. Learning Environment: the physical environment should support learning and the learning objectives
  5. Evaluation: determine (during and after the activity) whether the learning activities achieved the stated learning objectives and met the participants' expectations

  20. Thank You
