
Consider the Evidence

Evidence-driven decision making for secondary schools. A resource to assist schools to review their use of data and other evidence. Module 7: Changing and Evaluating.


Presentation Transcript


  1. Consider the Evidence
Evidence-driven decision making for secondary schools
A resource to assist schools to review their use of data and other evidence
7 Changing and Evaluating

  2. Evidence-driven decision making
This module is part of a resource about how we use data and other evidence to improve teaching, learning and student achievement. Today we are looking at the final stage of this process: changing our practice and evaluating the impact of that change.

  3. The evidence-driven decision making cycle
• Trigger: Data indicate a possible issue that could impact on student achievement
• Speculate: A teacher has a hunch about a problem or a possible action
• Explore: Check data and evidence to explore the issue
• Question: Clarify the issue and ask a question
• Assemble: Decide what data and evidence might be useful
• Analyse: Analyse data and evidence
• Interpret: Insights that answer your question
• Intervene: Plan an action aimed at improving student achievement
• Act: Carry out the intervention
• Evaluate: Evaluate the impact of the intervention
• Reflect: Reflect on what has been learned, how practice will change

  4. Changing and Evaluating
• Trigger: Clues found in data, hunches
• Explore: Is there really an issue?
• Question: What do you want to know?
• Assemble: Get all useful evidence together
• Analyse: Process data and other evidence
• Interpret: What information do you have?
• Intervene: Design and carry out action
• Evaluate: What was the impact?
• Reflect: What will we change?

  5. The evidence-driven decision making cycle
• Trigger: Clues found in data, hunches
• Explore: Is there really an issue?
• Question: What do you want to know?
• Assemble: Get all useful evidence together
• Analyse: Process data and other evidence
• Interpret: What information do you have?
> Intervene: Design and carry out action
• Evaluate: What was the impact?
• Reflect: What will we change?

  6. Professionals making decisions
How do we decide what action to take as a result of the information we get from the analysis? We use our professional judgement.

  7. Professional decision making
We have evidence-based information that we see as reliable and valid. What do we do about it? If the information indicates a need for action, we use our collective experience to make a professional decision.

  8. Professionals making decisions
Have my students not achieved a particular history standard because they have poor formal writing skills, rather than poor history knowledge? The answer was Yes ... so I need to think about how to improve their writing skills. How will I do that?

  9. Professionals making decisions
Do any particular groups of year 11 students attend less regularly than average for the whole cohort? The analysis identified two groups – so I need to think about how to deal with irregular attendance for each group. How will I do that?
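A question like this can be answered with a very simple calculation once attendance data are assembled. The sketch below is illustrative only: the group names and attendance rates are invented, and a school would normally draw these figures from its student management system.

```python
# Illustrative sketch: compare each group's average attendance rate with the
# whole-cohort average. All group names and figures below are invented.
attendance = {
    "Group A": [0.92, 0.88, 0.95, 0.79, 0.91],
    "Group B": [0.96, 0.93, 0.97, 0.90, 0.94],
    "Group C": [0.81, 0.76, 0.85, 0.78, 0.80],
}

# Cohort average across every student, regardless of group.
all_rates = [rate for rates in attendance.values() for rate in rates]
cohort_avg = sum(all_rates) / len(all_rates)

for group, rates in attendance.items():
    group_avg = sum(rates) / len(rates)
    flag = "below cohort average" if group_avg < cohort_avg else "at or above cohort average"
    print(f"{group}: {group_avg:.0%} ({flag})")
print(f"Cohort average: {cohort_avg:.0%}")
```

Any group flagged as below the cohort average would then become the focus of the "how will I do that?" discussion; the calculation only identifies the groups, not the reasons.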

  10. Professionals making decisions
You asked what factors are related to poor student performance in formal writing. The analysis suggested that poor homework habits have a significant impact on student writing. You make some professional judgements and decide:
• Students who do little homework don’t write enough
• You could take action to improve homework habits – but you’ve tried that before and the success rate is low
• You have more control over other factors – like how much time you give students to write in class
So you conclude – the real need is to get students to write more often

  11. Deciding on an action
Information will often suggest a number of options for action. How do we decide which action to choose? We need to consider
• what control we have over the action
• the likely impact of the action
• the resources needed

  12. Planning for action
• Is this a major change to policy or processes?
• What other changes are being proposed?
• How soon can you make this change?
• How will you achieve wide buy-in?
• What time and resources will you need?
• Who will co-ordinate and monitor implementation?

  13. Planning for action
• Is this an incremental change? Or are you just tweaking how you do things?
• How will you fit the change into your regular work?
• When can you start the intervention?
• Will you need extra resources?
• How will this change affect other things you do?
• How will you monitor implementation?

  14. Timing is all
• How long should we run the intervention before we evaluate it?
• When is the best time of the year to start (and finish) in terms of measuring changes in student achievement?
• How much preparation time will we need to get maximum benefit?

  15. Planning for evaluation
We are carrying out this action to see what impact it has on student achievement. We need to decide exactly how we’ll know how successful the intervention has been. To do this we will need good baseline data.

  16. Planning for evaluation
• What evidence do we need to collect before we start?
• Do we need to collect evidence along the way, or just at the end?
• How can we be sure that any assessment at the end of the process will be comparable with assessment at the outset?
• How will we monitor any unintended effects?
Don’t forget evidence such as timetables, student opinions, teacher observations …

  17. The evidence-driven decision making cycle
• Trigger: Clues found in data, hunches
• Explore: Is there really an issue?
• Question: What do you want to know?
• Assemble: Get all useful evidence together
• Analyse: Process data and other evidence
• Interpret: What information do you have?
• Intervene: Design and carry out action
> Evaluate: What was the impact?
• Reflect: What will we change?

  18. Evaluate the impact of our action
Did the intervention improve the situation that triggered the process? If the aim was to improve student achievement, did that happen?

  19. Evaluate the impact of our action
Was any change in student achievement significant? What else happened that we didn’t expect? How do our results compare with other similar studies we can find? Does the result give us the confidence to make the change permanent?
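One way to put a number on "was the change significant?" is a paired comparison of each student's baseline and end-of-intervention scores, which is only possible if the baseline data described earlier were collected. The sketch below uses invented scores and the Python standard library to compute the mean gain and a paired t statistic; in practice a school might do the same calculation in a spreadsheet or statistics package.

```python
import statistics

# Invented scores for illustration: the same students assessed before and
# after the intervention, on comparable assessments.
baseline = [42, 55, 48, 61, 50, 39, 57, 44]
after = [49, 58, 55, 63, 57, 41, 60, 52]

# Per-student gain, then its mean and sample standard deviation.
gains = [post - pre for pre, post in zip(baseline, after)]
mean_gain = statistics.mean(gains)
sd_gain = statistics.stdev(gains)
n = len(gains)

# Paired t statistic: mean gain divided by its standard error.
t_stat = mean_gain / (sd_gain / n ** 0.5)

print(f"Mean gain: {mean_gain:.1f} marks")
print(f"Paired t statistic: {t_stat:.2f} (n = {n})")
```

A t statistic well above 2 suggests the gain is unlikely to be chance alone, but with a small sample and no comparison group the result should be weighed cautiously alongside the other evidence the slide mentions.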

  20. Evaluate the impact of our action
A school created a new year 13 art programme. In the past students had been offered standard Design and Painting programmes, internally and externally assessed against the full range of achievement standards. Some students had to produce two folios for assessment and were unsure of where to take their art after leaving school. The new programme blended drawing, design and painting concepts and focused on electronic media. Assessment was against internally assessed standards only.

  21. Evaluate the impact of our action
• Did students complete more assessments?
• Did students gain more national assessment credits?
• How did student perceptions of workload and satisfaction compare with teacher perceptions from the previous year?
• Did students leave school with clearer intentions about where to go next with their art than the previous cohort?
• How did teachers and parents feel about the change?

  22. Evaluate the intervention
• How well did we design and carry out the intervention? Would we do anything differently if we did it again?
• Were our results affected by anything that happened during the intervention period - within or beyond our control?
• Did we ask the right question in the first place? How useful was our question?
• How adequate were our evaluation data?

  23. Think about the process
• Did we ask the right question in the first place? How useful was our question?
• Did we select the right data? Could we have used other evidence?
• Did the intervention work well? Could we have done anything differently?
• Did we interpret the data-based information correctly?
• How adequate were our evaluation data?
• Did the outcome justify the effort we put into it?

  24. The evidence-driven decision making cycle
• Trigger: Clues found in data, hunches
• Explore: Is there really an issue?
• Question: What do you want to know?
• Assemble: Get all useful evidence together
• Analyse: Process data and other evidence
• Interpret: What information do you have?
• Intervene: Design and carry out action
• Evaluate: What was the impact?
> Reflect: What will we change?

  25. Future practice
• What aspects of the intervention will we embed in future practice?
• What aspects of the intervention will have the greatest impact?
• What aspects of the intervention can we maintain over time?
• What changes can we build into the way we do things in our school?
• Would there be any side-effects?

  26. Future directions
• What professional learning is needed? Who would most benefit from it?
• Do we have the expertise we need in-house or do we need external help?
• What other resources do we need?
• What disadvantages could there be?
• When will we evaluate this change again?
