
Best Practices and Use of Data


Presentation Transcript


  1. Best Practices and Use of Data Module 1 Direct Access to Achievement webinar series Jan. 24, 2012

  2. Today’s Content • Knowledge gaps identified • Best practices in use of OAKS data

  3. Knowledge Measure revealed some concerns in two key areas: • How to use OAKS results to guide decisions in planning instruction or intervention—connecting results to adult actions • Interpreting information from OAKS reports.

  4. Knowledge gaps: data-driven decisions • Using patterns of group performance to evaluate gaps between the expected and the instructed curriculum • Test-triggered clarity vs. teaching to the test • Teaching for generalizability • Monitoring implementation (cause data)

  5. Practice Polling: Select the answer that best describes you: • A. Teacher • B. Administrator (Building or Central Office) • C. ESD personnel • D. Other

  6. Interpreting Reports: OAKS and instructional alignment When you want to identify possible gaps in instructional alignment with grade level content standards, which group of students would provide you with the best information?

  7. Select the best answer: • Students on the “bubble” of meeting standards. • Students who have far to go to meet standards. • Students who have far to go that also have special needs. • Students exceeding the standards.

  8. Which of the following will provide information to evaluate the implementation of a change in instructional practice? • Collecting formative data on student achievement. • Collecting summative student achievement data. • Collecting data on adult actions. • Collecting demographic data on classes in the school.

  9. Without information about adult actions, student outcomes appear to result from a hodgepodge of programs and practices.

  10. A program or practice doesn’t have a direct effect on a student…adults do!

  11. It is an iterative process! • Analyze student data • Plan adult response: instruction, intervention, assessment • Monitor implementation for adjustments (adults’ actions) • Check progress of adult implementation and progress of students for impact • Adjust adult response: instruction, intervention, assessment

  12. Test-triggered clarity versus teaching to the test

  13. Using tests to clarify the curriculum is different from teaching to the test when: • Teachers use items from the test to determine their instructional curriculum. • Teachers use the results of assessment to understand the cognitive demand of the grade level standards. • Teachers use the test results to build identical items for teacher-made tests. • Teachers use test preparation materials to provide students with a variety of questions just like the test.

  14. Knowledge Gaps: Data Interpretation • Understanding the use of a scale score to ensure equivalent/comparable scores between students on computer-adaptive tests. • Factors that should be considered when interpreting group reports on OAKS, particularly trend reports.

  15. Scale scores (RIT scores) on OAKS Turn to a partner and discuss your answer: Why are scale scores (RIT scores) used to report OAKS results instead of raw scores (percent correct)?
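On an adaptive test such as OAKS, each student answers a different set of items, so percent correct is not comparable between students. The sketch below is a toy illustration, assuming a Rasch-style model of the kind that underlies RIT scales; the item difficulties, responses, and estimation routine are invented for this example and are not ODE's actual scoring procedure.

```python
import math

def p_correct(theta, b):
    """Rasch model: probability of a correct answer, given student
    ability theta and item difficulty b on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_theta(difficulties, responses, steps=200, lr=0.05):
    """Crude maximum-likelihood ability estimate by gradient ascent.
    The Rasch log-likelihood gradient is sum(observed - expected)."""
    theta = 0.0
    for _ in range(steps):
        grad = sum(r - p_correct(theta, b)
                   for b, r in zip(difficulties, responses))
        theta += lr * grad
    return theta

# Hypothetical data: Student A was routed to easier items, Student B
# to harder ones (difficulties and 1/0 responses are invented).
easy_items, a_resp = [-1.0, -0.5, 0.0, 0.5], [1, 1, 1, 0]
hard_items, b_resp = [0.5, 1.0, 1.5, 2.0], [1, 1, 0, 0]

print("A: 75% correct, theta = %.2f" % estimate_theta(easy_items, a_resp))
print("B: 50% correct, theta = %.2f" % estimate_theta(hard_items, b_resp))
```

Student B answers a smaller share of items correctly yet earns the higher ability estimate, because the items were harder; that is exactly the comparison a raw percent-correct score gets wrong, and why a scale score is reported instead.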

  16. When reviewing trends for an intact cohort of students, what factors are important to consider? Selected sites will report out factors.

  17. What factors do you consider when viewing this trend for these classes? [Trend chart: OAKS results over time for School 1, Classes 1–3]
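One factor that frequently matters here is whether the cohort is actually intact: students moving in and out between test windows can shift a class average with no change in instruction. A minimal pandas sketch, assuming a long-format score extract with hypothetical column names (student_id, school_year, rit_score):

```python
import pandas as pd

# Hypothetical extract: one row per student per tested year.
scores = pd.read_csv("oaks_scores.csv")

# Keep only students with a score in every year of the window,
# so mobility does not masquerade as growth or decline.
n_years = scores["school_year"].nunique()
years_per_student = scores.groupby("student_id")["school_year"].nunique()
intact_ids = years_per_student[years_per_student == n_years].index
intact = scores[scores["student_id"].isin(intact_ids)]

# Compare the trend for all tested students vs. the intact cohort.
print(scores.groupby("school_year")["rit_score"].mean())
print(intact.groupby("school_year")["rit_score"].mean())
```

If the two trend lines diverge, student mobility, not classroom practice, may be driving the pattern in the chart.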

  18. Module Content • Module 1—Best Practices in Use of OAKS Test Opportunities • Module 2—Going Deeper into Analysis: From Patterns and Trends to Instructional Implications • Module 3—Problem Identification, Problem Clarification and Follow-Through

  19. Follow-up and continuous dialogue: DATA Project Blog

  20. Best Practices Guide for Use of OAKS http://www.ode.state.or.us/wma/teachlearn/testing/admin/best-practices-in-administering-oaks.pdf

  21. Balanced Assessment Plan • Throughout the year: formative assessment • During the year: interim assessment • End of the year: summative assessment

  22. Decision-making Matrix When to use OAKS and what to do with the results

  23. Who should take the OAKS test in the fall or early winter? Discuss the decision rules you currently use to decide whether to test a student on the OAKS in the fall or early winter: “Students are administered the OAKS in the fall or early winter when…”

  24. When is it appropriate to administer OAKS to students, in the fall or early winter? When students have demonstrated proficiency in the grade level content based on classroom-derived evidence.

  25. Check your decision rules: Are you using OAKS fall and early winter testing… • … as part of interim or predictive assessment practice for specific groups of students to: • calibrate teachers’ formative judgment? • evaluate pacing? • evaluate scope and sequence of content?

  26. Check your decision rules: Are you using OAKS fall and early winter testing… • … to provide supporting evidence to focus a student or specific group of students on enrichment activities or extending learning beyond grade level?

  27. Check your decision rules: Are you using OAKS fall and early winter testing… • …to manage computer labs and other resources? • …to meet accountability requirements? What are the pros and cons for using fall and early winter testing to manage resources or meet accountability requirements?

  28. Think through the consequences of using fall and early winter testing

  29. How are you overcoming obstacles in managing resources and meeting accountability requirements? Sites share strategies that have mitigated the obstacles.

  30. How do you respond to students’ results from fall or early winter testing on OAKS? Students who achieve a score that meets or exceeds standards: Discuss your next steps for these students.

  31. Response to fall or early winter testing Meets: • Retest only if consistent with district procedures and if a different result is expected, based on additional classroom-derived evidence and/or after growth in learning is achieved. • Use instructional time to advance knowledge and skills. Exceeds: • Utilize instructional time to advance knowledge and skills or to develop additional skills. • Retesting on OAKS is never appropriate for students who exceed the standard!

  32. Response to fall or early winter testing Does Not Meet: • Investigate potential causes for the discrepancies between classroom-derived evidence and summative results. • Resolve any data discrepancies. • Provide targeted instructional supports where appropriate. • Re-test only if consistent with district procedures and if a different result is expected based on additional classroom-derived evidence and/or after growth in learning is achieved.
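The retest rule on the last two slides is the same for every performance level except Exceeds, so it can be read as one small decision procedure. A hypothetical encoding for illustration only; the function and argument names are invented, not an ODE specification:

```python
def may_retest(performance, district_allows, different_result_expected):
    """performance: 'exceeds', 'meets', or 'does_not_meet'.
    Hypothetical encoding of the retest guidance above."""
    if performance == "exceeds":
        return False  # retesting is never appropriate for Exceeds
    # Meets / Does Not Meet: only within district procedures, and only
    # when classroom-derived evidence or achieved growth in learning
    # predicts a different result.
    return district_allows and different_result_expected
```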

  33. Decision Matrix • Who should take the OAKS test in the late winter/spring?

  34. When is it appropriate to administer OAKS to students in the late winter/spring? Test all students who do not have a valid test for the current school year.

  35. Check your decision rules: Are you using OAKS late winter/spring testing… • To ensure each student has had sufficient instructional time and support or intervention as needed to meet grade level standards?

  36. Check your decision rules: Are you using OAKS late winter/spring testing… • To meet accountability requirements? • To align with the direction of the SMARTER common assessment? SMARTER will be administered in the late winter/spring rather than throughout the year.

  37. What are the pros and cons associated with testing students in late winter/spring?

  38. How do you respond to the results of late winter/spring testing? Students who achieve a score that meets or exceeds standards—discuss your next steps for these students.

  39. Response to late winter/spring testing Meets: • Review and revise the instructional plan based on the collection of formative and summative data. • Re-test only if consistent with district procedures and if a different result is expected based on additional classroom-derived evidence and/or after growth in learning is achieved. Exceeds: • Utilize instructional time to advance knowledge and skills or to develop additional skills. • Retesting on OAKS is never appropriate for students who exceed the standard!

  40. Response to late winter/spring testing Does Not Meet: • Provide targeted instructional supports where appropriate. • Re-test only if consistent with district procedures and if a different result is expected, based on additional classroom-derived evidence and/or after growth in learning is achieved.

  41. When is the use of OAKS not beneficial? • When testing students who have already met or exceeded the state’s achievement standard for the grade level. • When classroom-derived evidence indicates that achievement is far below the state’s achievement standard. • When administering any assessment to a student early without a specific plan to use the results. • When testing supplants instruction because of resource management.

  42. How will these best practices help prepare teachers and students for CCSS and SMARTER?

  43. Summary • Knowledge Measure revealed a few areas of concern. This content is important to address and reinforce. • OAKS Best Practices provides a useful guide and template for making an intentional connection between student data and the adult actions that may impact student data.

  44. Blog Assignment • Blog address: • DATA Project page > Webinar series > Blog • www.oregondataproject.org/content/advanced-data-training-blog • Register online • Post group work, if possible

  45. Blog Assignment • What adult actions, based on student data, are you monitoring as follow up to data team meetings?
