Summer Data Camp 2013


Presentation Transcript


  1. Summer Data Camp 2013

  2. Data Driven Instruction Support Teams

  3. NY State Public School ELA 4th Performance vs. Free-Reduced Rates [scatter plot: Pct. Proficient (y-axis, 10%–100%) vs. Pct. Free-Reduced Lunch (x-axis, 10%–100%)]

  4. NY State Public School ELA 4th Performance vs. Free-Reduced Rates [the same scatter plot, shown again]

  5. THE FOUR KEYS: • DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: • 1. ASSESSMENTS • 2. ANALYSIS • 3. ACTION • 4. in a Data-driven CULTURE

  6. Let’s assess the present state of data-driven instruction and assessment at your district…

  7. So WHY are you here?

  8. THE FOUR KEYS: • DATA-DRIVEN INSTRUCTION AT ITS ESSENCE: • ASSESSMENTS (Interim, Aligned, Reassess, Transparent) • ANALYSIS • ACTION • in a Data-driven CULTURE

  9. ASSESSMENT BIG IDEAS: • Standards and objectives are meaningless until you define how to assess them. • Because of this, assessments are the starting point for instruction, not the end.

  10. Results from Districts that Implemented DDI, using “Assessments for Learning”

  11. DATA-DRIVEN RESULTS:

  12. Greater Newark Charter: Achievement by Alignment

  13. Holabird Academy: Coaching to Achievement

  14. Excellence Charter School—3rd Grade Math *District, NYC, and state results are for 2006.

  15. OF vs. FOR • Assessment for Learning: How can we help students learn more? Used to inform students about themselves and how teachers can support student growth. The building blocks, the foundation for change. Ex. interim assessments and at-the-moment assessments not used for a report card grade. • Assessment of Learning: How much have students learned up to this time? A check to verify learning; assessment for accountability. Ex. state assessments, Regents, local assessments, and classroom tests used to give a report card grade.

  16. Typical Grade Book Test Information

  17. eDoctrina allows the analysis of assessment results by: • ITEM • STUDENT • CLASS • CLASS TO CLASS

  18. Question by Question Analysis

  19. Standard by Standard Analysis

  20. Student by Student Analysis (have the student explain why)
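
Slides 17–20 describe pivoting one set of item-level results along different dimensions. A minimal sketch of those four views in Python with pandas (hypothetical column names and scores; an illustration of the idea, not eDoctrina's actual interface):

    import pandas as pd

    # Hypothetical item-level results: one row per student per question.
    results = pd.DataFrame({
        "student":  ["Ana", "Ana", "Ben", "Ben", "Cal", "Cal"],
        "class":    ["4A", "4A", "4A", "4A", "4B", "4B"],
        "question": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
        "standard": ["4.NF.1", "4.NF.2", "4.NF.1", "4.NF.2", "4.NF.1", "4.NF.2"],
        "correct":  [1, 0, 1, 1, 0, 1],
    })

    # ITEM / question-by-question analysis: which items were missed most?
    print(results.groupby("question")["correct"].mean())

    # Standard-by-standard analysis: how did each tested standard fare?
    print(results.groupby("standard")["correct"].mean())

    # STUDENT / student-by-student analysis: who needs reteaching, and on what?
    print(results.groupby("student")["correct"].mean())

    # CLASS TO CLASS comparison: did one section outperform another?
    print(results.groupby("class")["correct"].mean())

All four analyses are the same table grouped on a different key, which is why a single item-level export can drive question-, standard-, student-, and class-level reports.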

  21. ASSESSMENTS: • PRINCIPLES FOR EFFECTIVE ASSESSMENTS: • COMMON INTERIM: • At least quarterly • Common across all teachers of the same grade level • DEFINE THE STANDARDS—ALIGNED: • To the state test (format, content, & length) • To the instructional sequence (curriculum) • To college-ready expectations

  22. ASSESSMENTS: • PRINCIPLES FOR EFFECTIVE ASSESSMENTS: • REASSESS: • Standards that appear on the first interim assessment appear again on subsequent interim assessments • WRONG ANSWERS: • Illuminate misunderstanding • TRANSPARENT: • Teachers see the assessments in advance

  23. ASSESSMENT BIG IDEAS: • In a multiple choice question, • the options define the rigor. • In an open-ended question, • the rubric defines the rigor.

  24. 1. 50% of 20:
  2. 67% of 81:
  3. Shawn got 7 correct answers out of 10 possible answers on his science test. What percent of questions did he get correct?
  4. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. What percentage of free throws did he make?
  5. J.J. Redick was on pace to set an NCAA record in career free throw percentage. Leading into the NCAA tournament in 2004, he made 97 of 104 free throw attempts. In the first tournament game, Redick missed his first five free throws. How far did his percentage drop from before the tournament game to right after missing those free throws?
  6. J.J. Redick and Chris Paul were competing for the best free-throw shooting percentage. Redick made 94% of his first 103 shots, while Paul made 47 out of 51 shots. Which one had a better shooting percentage? In the next game, Redick made only 2 of 10 shots while Paul made 7 of 10 shots. What are their new overall shooting percentages? Who is the better shooter? Jason argued that if Paul and J.J. each made the next ten shots, their shooting percentages would go up the same amount. Is this true? Why or why not?
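
A worked solution for problem 5, to make the arithmetic concrete (not on the original slide):

    Before the tournament:       97 / 104 ≈ 0.933 = 93.3%
    After five straight misses:  97 / 109 ≈ 0.890 = 89.0%
    Drop:                        93.3% − 89.0% ≈ 4.3 percentage points

The number of makes never changes; only the attempts grow. The same effect is why Jason's claim in problem 6 fails: ten extra makes shift two shooters' percentages by different amounts when their totals of makes and attempts differ.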

  25. THE FOUR KEYS: • ASSESSMENTS (Aligned, Interim, Reassess, Transparent) • ANALYSIS • ACTION • in a Data-driven CULTURE (Vision, Leadership, Calendar, PD)

  26. Presentation of the District’s DDI • Vision • Leadership Team • Calendar • PD for 2013-14
