
Community Data-Driven Decision Making in Education Workshop DAY 2

Community Data-Driven Decision Making in Education Workshop DAY 2. Manila, May 24 – 26, 2011. Diving into Data Part I: Assessing the Data You Have. Wednesday, May 25, 2011. Jumpstart Storytelling.



Presentation Transcript


  1. Community Data-Driven Decision Making in Education Workshop, DAY 2. Manila, May 24–26, 2011

  2. Diving into Data, Part I: Assessing the Data You Have. Wednesday, May 25, 2011

  3. Jumpstart Storytelling 60-second Stories. Divide into groups of four. Each person in the group has 60 seconds to tell their colleagues a story drawn from their personal experience: Tell a story about how education really made a positive difference in someone’s life.

  4. Jumpstart Storytelling Rotate and Retell. Each person rotates to a new group and retells their 60-second story to the new audience.

  5. Jumpstart Storytelling Clusters and Chains. Choose which story most impacted you, either because it was so compelling, or because it was highly informative and relevant to the topic. Find that storyteller, place a hand on their shoulder and keep it there. 

  6. Data "In God we trust, but from everyone else we expect data."

  7. Data • Data literacy - a basic understanding of how data can be used to inform instruction. • Raw data - data that has not been processed. • Information - the end product of data processing.

  8. Why is data literacy important? • Enables educators to determine whether data is a good measurement of student performance • Enables educators to use good data in planning, implementation, assessment, and revision of instruction

  9. Cycle of inquiry

  10. Qualitative Data • Answers questions related to "how" issues • Answers questions in ways that cannot easily be put into numbers • For example: "What are the most successful management practices of the Philippines' most successful head teachers?"

  11. Quantitative Data • Answers questions related to "how much" • Uses numbers to answer questions • Categorizes data into patterns as the primary basis for organizing and reporting results • For example: "How many children have learned to read by grade 2 in the Philippines?" "How many grade 1 and grade 2 teachers in the Philippines have received training in teaching reading?"

  12. Baseline Data • Data to find out about an existing issue • Aimed at providing information on a potential intervention • Establishes a comparison point for later data collection that will demonstrate a change

  13. Monitoring and Evaluation Data • Shows links between what was done and what resulted. Logic Model: Input (what you put in) → Activity → Output (what happened) → Outcome (what changed) → Impact (what difference the change made)

  14. Data Quality Standards 1) Validity: Does the data represent what you intended to measure? 2) Reliability: Was the data measurement process stable? Could it be repeated with the same results?

  15. Data Quality Standards 3) Precision: Does this data have the right level of detail to answer my question? 4) Integrity: Has the data been properly organized? Are there errors in the data? 5) Timeliness: Is this data available in time to make use of it?

  16. Education Management Information Systems (EMIS) • An EMIS produces, manages, and disseminates educational data and information, usually within a national Ministry or Department of Education. • At the student and classroom level there may be little or no performance data available, so decision makers need to create their own data collection tools and do their own research to improve instruction.

  17. SWOT Analysis SWOT stands for: Strengths Weaknesses Opportunities (to use data) Threats (barriers or challenges to using data) It is a framework for analysis – in this case, we’ll use it to strategically inventory the data sets you already have available in your school settings.

  18. SWOT Analysis Step One: Use data terminology to describe the qualities of your data set. Step Two: Discuss and list the strengths, weaknesses, opportunities (for using), and threats to using each data set that is available to you at your school. Example: NAT data

  19. Diving into Data, Part II: Methods and Tools to Collect Data. Wednesday, May 25, 2011

  20. Data Collection Methods


  26. Example Assessment Tools for Education • What method? • What are some strengths of this tool? Some weaknesses? • How would you adapt this tool to your own school setting (if at all)?

  27. Designing an Assessment Tool • Form your research question: What do you need to learn from collecting and analyzing data? • Avoid collecting the same data in another way – revisit your inventory of data sets you already have, and make sure that you really need to collect new data; confirm the ‘gap’ • Consider methods: which method will be the quickest and easiest way to get the data you need? • Be objective: avoid bias in your tool

  28. Share your tool and process • Tell us about the tool • Tell us about the data collection and organization process • Who will you collect data from using this tool? • When will you collect it? • Who will collect the data (implement the tool)?

  29. Analyzing Data Wednesday, May 25, 2011

  30. Analyzing Data Objective data reveals the truth of a situation; data can only be objective when it is analyzed in an appropriate way.

  31. Analyzing Data Prepare your data set for analysis: • Ensure the method of data collection was appropriate • Ensure the size of the data set is adequate to offer the information you need • Clean the data: look for duplicates, errors, and missing entries • Anything else?
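The cleaning step above can be sketched in code. This is an illustrative example (not from the workshop materials); the record fields `student_id` and `score` are hypothetical, and a real data set would need checks suited to its own fields.

```python
def clean(records):
    """Drop duplicate and incomplete records before analysis."""
    seen, cleaned, dropped = set(), [], 0
    for rec in records:
        key = (rec.get("student_id"), rec.get("score"))
        if None in key:      # missing entry: skip it
            dropped += 1
            continue
        if key in seen:      # exact duplicate: skip it
            dropped += 1
            continue
        seen.add(key)
        cleaned.append(rec)
    return cleaned, dropped

records = [
    {"student_id": 1, "score": 15},
    {"student_id": 1, "score": 15},    # duplicate
    {"student_id": 2, "score": None},  # missing score
    {"student_id": 3, "score": 18},
]
cleaned, dropped = clean(records)
print(len(cleaned), dropped)  # 2 records kept, 2 dropped
```

Even this simple pass answers the slide's questions: how many records are usable, and how many were lost to duplication or missing entries.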

  32. Quantitative Data • a type of data used to describe information that can be counted or expressed numerically • the aim is to come up with a statistical record that describes what is observed • makes use of tools such as surveys and questionnaires to gather numerical data • is considered "objective" – but how do we reconcile this claim?

  33. Quantitative Data 3 most common ways of describing quantitative data: • Mean • Median • Mode

  34. Quantitative Data Mean – the average • To compute: sum all the values and divide by the number of values • Example: Math test scores 15, 13, 18, 16, 14, 17, 12 • Mean: 105 ÷ 7 = 15

  35. Quantitative Data Mean Advantage – includes every value in the data set as part of the calculation. Disadvantage – can be influenced by outliers. Example:
  Staff:  1    2    3    4    5    6    7    8    9    10
  Salary: 15k  18k  16k  14k  15k  15k  12k  17k  90k  95k
  Mean = Php 30.7k
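Both examples can be checked with Python's standard-library `statistics` module; this sketch is not part of the original slides, but it reproduces the slide figures exactly.

```python
from statistics import mean

# Slide 34: Math test scores
scores = [15, 13, 18, 16, 14, 17, 12]
print(mean(scores))  # 15

# Slide 35: staff salaries in thousands of Php
salaries = [15, 18, 16, 14, 15, 15, 12, 17, 90, 95]
print(mean(salaries))  # 30.7 – pulled well above the typical salary by the two outliers
```

Note how the two 90k+ salaries drag the mean to 30.7k even though eight of ten staff earn under 19k, which is exactly the disadvantage the slide describes.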

  36. Quantitative Data Median – the value found at the exact middle of the sorted values. To compute: Example 1 – if the number of values is odd. Given: 17 12 13 19 15 a. Arrange the values in ascending order: 12 13 15 17 19 Median: 15 In general, the median sits at position (n + 1) / 2; here (5 + 1) / 2 = 3, so the 3rd value.

  37. Quantitative Data Cont... Median • Example 2 – if the number of values is even. Given: 10 12 13 16 17 18 19 21 a. Arrange the values in ascending order: 10 12 13 16 17 18 19 21 The middle position is (n + 1) / 2 = (8 + 1) / 2 = 4.5, so the median is the average of the 4th and 5th values: (16 + 17) / 2 = 16.5 Median: 16.5
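The odd and even cases are both handled by `statistics.median`, which sorts the values internally; a quick check of the two slide examples (added here for illustration, not part of the original deck):

```python
from statistics import median

# Odd number of values: the single middle value after sorting
odd = [17, 12, 13, 19, 15]
print(median(odd))   # 15

# Even number of values: the average of the two middle values
even = [10, 12, 13, 16, 17, 18, 19, 21]
print(median(even))  # 16.5
```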

  38. Quantitative Data Mode – the most frequently occurring value - normally used for categorical data - not necessarily unique Example: In a given set of test scores 15 21 17 16 15 10 13 15 Mode: 15
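The slide's example, plus the "not unique" caveat, can be illustrated with the standard library (this sketch is an addition, not part of the original deck; `multimode` requires Python 3.8+):

```python
from statistics import mode, multimode

scores = [15, 21, 17, 16, 15, 10, 13, 15]
print(mode(scores))  # 15 – it occurs three times, more than any other score

# The mode is not necessarily unique: here two values tie
print(multimode([1, 1, 2, 2]))  # [1, 2]
```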

  39. Quantitative Data - Mode

  40. Quantitative Data - Mode References: http://www.socialresearchmethods.net/ http://statistics.laerd.com/statistical-guides/measures-central-tendency-mean-mode-median-faqs.php

  41. Exercise: The Cycle in Miniature Steps in the process of data driven decision-making: • Define the question • Develop a data collection tool • Collect the data • Analyze the data • Present the findings • Utilize the findings Challenge Exercise: Do an entire evaluation, from start to finish, in 1 hour!

  42. Our Area of Inquiry We want to play the music during break that will make the most people happy.

  43. Examples of Assessment Tools Wednesday, May 25, 2011

  44. Assessment Tools 2 examples of rigorous assessment tools that can be adapted for use in different countries: • EGRA – Early Grade Reading Assessment • SSME – Snapshot of School Management Effectiveness Question for participants: What lessons can you take away from the development and use of these tools that you can apply at your school/classroom level?

  45. Assessment Tools EGRA – Early Grade Reading Assessment “The ability to read and understand a simple text is one of the most fundamental skills a child can learn.” Purpose of EGRA: simple instrument that can report on foundation levels of student learning, including assessment of the first steps students take in learning to read. • Can be used as a snapshot at a national/regional level, OR by teachers and individual students in school.

  46. Assessment Tools EGRA – Early Grade Reading Assessment • Testing: piloted in Kenya, Egypt, and other countries; still under development. • Includes an oral reading fluency component • Designed as a diagnostic tool for Ministries and donors to identify areas for improvement at the overall system level

  47. Assessment Tools SSME – Snapshot of School Management Effectiveness Purpose: let school, district, provincial, or national administrators or donors learn what is currently going on in their schools and classrooms and assess how to make their schools more effective. Data collection methods: direct classroom and school observation; student assessments; interviews with parents, teachers, and principals

  48. Assessment Tools SSME – Snapshot of School Management Effectiveness Tested: piloted in Jamaica and Peru in 2007; many more countries since. Piloting showed the tool can provide statistically reliable data and successfully discriminate (with statistical precision) between effective behaviors that are already common and those that still need to be developed. • Can distinguish between more and less effective groups of schools How can it be adapted for different countries? • Translation into the language of instruction • Adapt questions to the local context, such as questions about the scope of Parent Teacher Associations (PTAs) • Currently intended for comparison within a country

  49. Assessment Tools SSME Tools Review Imagine you can adapt this tool to use within your school. • What will this tool help you measure? • How does this tool align with tools you already have? Would the tool give you any data that you don’t already have that would be useful to you? • Does this tool align with your School Improvement Plan? How?
