
Making your NAPLAN Data Count




  1. www.decd.sa.gov.au/quality Reflect… Improve… Achieve Making your NAPLAN Data Count “In an ideal world, the teacher would have the precise and current knowledge of each student’s starting points and also of what assistance each student requires to move to the next level.” (Fullan et al., 2006)

  2. What’s New in 2012 • 5 years of data available for comparisons • Access to Question Item Analysis and Student Consolidated Reports in the updated Student Data Warehouse, including ‘How to’ videos • 2013 NAPLAN Tests: Tuesday, Wednesday, Thursday – 14, 15, 16 May. Writing genre: Persuasive; Marking Rubric as per 2012 • Check the MySchool website for ICSEA comparisons – a review of ICSEA is currently underway before the release of My School 4.0 • SCSEEC Second Stage National Report due for release in December

  3. Trends continuing from 2011 • SA results generally remain stable, with significant improvements in Year 3 and 5 Spelling and Year 7 Grammar & Punctuation, and significant declines in Year 5 Grammar & Punctuation and Years 5, 7 and 9 Writing • Literacy results are better than Numeracy results • Year 7 and 9 means are much closer to national means than Year 3 and 5 means • Lower percentages of SA students in the top two proficiency bands than the Australian averages, across all aspects and all year levels • Improvement in NMS compared to the Australian NMS since 2011 shows improved support for students achieving in the lower bands

  4. Data driven cycle of improvement using DIAf In implementing ongoing DIAf processes, sites and regions collect and analyse data to inquire collaboratively into practice and performance as part of Self Review and Performance Reporting. The findings inform our plans, strategies and priorities as part of Improvement Planning and targeted actions for Intervention and Support at all levels.

  5. Multiple Measures Framework (Bernhardt, 2004) • Demographic Data – e.g. Enrolments, Attendance, ATSI/ESL, Medical Conditions, Staff/Community Profile, Physical/Financial Assets • Perception Data – e.g. Parent and Staff Opinion Surveys: beliefs, values, attitudes • Process Data – e.g. Rubrics, Meeting Minutes, Flowcharts, Programs, Improvement Plans, Instructional Time, Policy/Procedure Documents, Photo/Video • Learner Achievement Data – e.g. EYLF, Running Records, NAPLAN, RRR, Standardised Tests (e.g. Waddingtons/Westwood), Teacher Assessments, SACE/TER, A–E, Portfolios

  6. How are we doing? A process for analysing NAPLAN – WHAT DO WE SEE IN THE DATA? • 1. Big Picture – School Summary: Aspect and Year Level Analysis (using Mean Scores, Proficiency Bands and National Minimum Standard) • 2. Big Picture – School Summary: Analysis of Progress/Growth (using % U, M, L) • 3. Big Picture – School Summary: Analysis of Performance Targets • 4. Class/Question Analysis: Analysis of Question Items (using % Correct) • 5. Individual Student Analysis

  7. Measures required for Analysis • Means (and Standard Deviation) • Proficiency Bands • National Minimum Standard (NMS) • Progress (% U, M, L) • Individual items (ACARA will be including gain scores (mean and median) and disaggregation of data by statistically similar schools on the My School site when this data is released)

  8. SA Results – Means: Year 3, 5, 7 and 9 students in Reading, by State/Territory, 2012. Most SA means are within 15 points of the Australian mean; SA Year 7 and Year 9 are very similar to the Australian means and NMS. • If the mean is the average, why are there 3 states higher than the Aust. mean and 5 states lower than the Aust. mean?
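The question on this slide has a concrete answer: the Australian mean is weighted by the number of students tested in each jurisdiction, so it need not sit in the middle of the eight state/territory means. A minimal sketch of the arithmetic, with invented figures (these are not actual 2012 NAPLAN results):

```python
# Illustrative only: each state's (mean score, students tested).
# Large states dominate the national mean, so more states can sit
# below (or above) it than a simple average of state means suggests.
states = {
    "NSW": (420, 90000),
    "VIC": (425, 68000),
    "QLD": (410, 60000),
    "SA":  (405, 20000),
    "WA":  (408, 30000),
    "TAS": (404, 6500),
    "NT":  (350, 3200),
    "ACT": (430, 5200),
}

# Unweighted average of the eight state means
simple_average = sum(m for m, _ in states.values()) / len(states)

# Population-weighted national mean (how the Australian mean is computed)
weighted_mean = (sum(m * n for m, n in states.values())
                 / sum(n for _, n in states.values()))

print(f"simple average of state means: {simple_average:.1f}")
print(f"population-weighted national mean: {weighted_mean:.1f}")
```

With these invented numbers the small, low-scoring NT barely moves the weighted mean, so five of the eight jurisdictions can fall below the national figure at once.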

  9. Comparison of Means - most SA 2012 within 15 points of Australia 2012

  10. SA Results – NMS: most SA within 1.5% of Australia, with some differences noted for Year 3 & 5

  11. National Comparisons 2008-2012, Year 3 reading Over time, SA stable, while improvement recorded nationally

  12. SA context: Socio-Economic Index (% of population classified in the lowest 20%); Geo-location/remoteness

  13. 1. Big Picture School Summary – Mean Scores • What assessments or supports are in place for students who are ‘Absent’ or ‘Exempted’, to show that ALL students are learning? • Check participation rates. Where levels are less than 90%, the data should be used with caution for whole-school decision making. • Check which aspects achieve the highest/lowest results. • Check the largest increases/decreases (by looking for changes over time in Mean Score and Proficiency Band, e.g. 6 → 7, or differences in mean scores of ±15). • Compare your results to your Index Category results over time.
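The participation and ±15-point checks on this slide can be sketched as a simple screening routine. The thresholds come from the slide; the `screen_aspect` helper and all data values are illustrative, not part of any DECD tool:

```python
def screen_aspect(name, participation, mean_this_year, mean_last_year):
    """Return human-readable flags for one NAPLAN aspect.

    Rules from the slide:
      - participation below 90%  -> interpret data with caution
      - mean-score change of 15+ points -> investigate
    """
    flags = []
    if participation < 0.90:
        flags.append(f"{name}: participation {participation:.0%} "
                     f"is below 90% – interpret with caution")
    change = mean_this_year - mean_last_year
    if abs(change) >= 15:
        direction = "increase" if change > 0 else "decrease"
        flags.append(f"{name}: mean {direction} of {abs(change):.0f} "
                     f"points – investigate")
    return flags

# Illustrative values only
for flag in (screen_aspect("Year 5 Reading", 0.93, 418, 436)
             + screen_aspect("Year 3 Numeracy", 0.87, 401, 399)):
    print(flag)
```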

  14. Below the national minimum standard [Chart: NAPLAN achievement scale from 0 to 1000, showing mean scores by year level and the region below the national minimum standard]

  15. 1. Big Picture School Summary – Proficiency Bands • Check levels in the lowest 2 bands (i.e. students at or below NMS). You will later need to check who these students are as part of the ‘individual student level analysis’. • Check increases/decreases in the highest 2 proficiency bands over time for each aspect. • Check in which band(s) the majority of students are located (i.e. for which aspects and year levels) over time.

  16. 1. Big Picture School Summary – NMS • Check National Minimum Standard percentages. These need to be analysed in conjunction with the detailed Proficiency Band reports via www.eduportal.sa.edu.au, as NMS only indicates demonstration of the most ‘basic’ elements of Literacy/Numeracy.

  17. 2. Big Picture – Progress. Progress Report (access via Student Data Warehouse) • Check which aspects have the highest/lowest percentages in lower/upper progress. • Check against previous years’ data for any consistent trends. • Note: If more than 30% of students sit in the ‘L’ lower 25% (or quartile), students are not “progressing” as much as students in other schools. Similarly, if the ‘U’ upper 25% is greater than 30%, students are “progressing” at a greater rate than students in other schools.
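The 30% rule of thumb on this slide can be expressed directly in code. The threshold is from the slide; the `interpret_progress` helper and the cohort figures are illustrative only:

```python
def interpret_progress(pct_upper, pct_middle, pct_lower):
    """Classify a school's U/M/L progress profile against the 30% rule.

    Each student falls in the Upper 25% (U), Middle 50% (M) or
    Lower 25% (L) of growth relative to students in other schools.
    """
    if pct_lower > 30:
        return "lower growth than students in other schools"
    if pct_upper > 30:
        return "greater growth than students in other schools"
    return "growth broadly in line with other schools"

# Illustrative cohort: 22% upper, 43% middle, 35% lower
print(interpret_progress(22, 43, 35))
```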

  18. Self Review: Analysis 1. OVERALL SCHOOL PERFORMANCE - MEAN SCORES, PROFICIENCY BANDS & NMS 2. OVERALL SCHOOL PERFORMANCE – PROGRESS/GROWTH

  19. Check whether any NAPLAN targets are on track to be met. To further develop/refine targets refer to www.decd.sa.gov.au/quality > Improvement Planning 3. Big Picture – Analysis of Performance Targets

  20. Self Review: Analysis 3. OVERALL SCHOOL PERFORMANCE – PROGRESS AGAINST TARGETS

  21. Performance Reporting: Acknowledgment of Achievement & Progress • Celebrate achievements and progress made towards goals and priorities, as evidenced by the analysis of data • Consider how you will share, acknowledge and celebrate the progress made with students, staff and the governing council/community • Include any achievements arising from the question item/individual analysis level.

  22. 4. Question Item Level Analysis • What question items are most commonly answered correctly/incorrectly compared to national results and/or Index Category results? Are these patterns the same in previous year levels and aspects? • Are large groups of students attracted to an incorrect answer? If so, why? What about open response questions? • Check for the largest/smallest positive and negative differences. Are there any patterns?
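Ranking question items by the gap between school and national percent-correct, as this slide describes, might look like the sketch below. The item IDs and all percentages are invented for illustration:

```python
# (item id, school % correct, national % correct) – invented figures
items = [
    ("R03", 62, 74),
    ("R07", 81, 78),
    ("R12", 45, 66),
    ("R15", 90, 85),
]

# Sort by the absolute size of the school-vs-national gap, largest first,
# so the items most worth investigating rise to the top.
by_gap = sorted(items, key=lambda i: abs(i[1] - i[2]), reverse=True)

for item, school, national in by_gap:
    diff = school - national
    print(f"{item}: school {school}% vs national {national}% ({diff:+d})")
```

Items with large negative gaps point to skills the cohort may not have been taught or consolidated; large positive gaps can confirm areas of strength.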

  23. Student Reports: Skills summary • The matrix of skills on the back page of the student report shows a ranking of the skills by bands from lowest to highest. • A student is most likely to be able to demonstrate the skills and understandings described in their band level and in the bands below.

  24. Writing (Persuasive) • Use the 2012 Writing Marking Rubric to map the achievement of students in your school against the achievement of students across the nation.

  25. Marking Criteria & Score Points: Student reports • In the writing test all students wrote a persuasive text to the prompt ‘Too much money is spent on toys and games’. • Note that the 2012 persuasive writing scale is the same as the 2011 scale, and results in writing cannot be compared to the writing scale used in 2008, 2009 and 2010, when students wrote a narrative text.

  26. Self Review: Analysis 4. CLASS/QUESTION LEVEL ANALYSIS

  27. Self Review: Analysis ANALYSIS OF TEACHER BASED ASSESSMENT DATA NAPLAN CONNECTIONS TO ASPECTS OF THE CURRICULUM AND TEACHING PRACTICE

  28. Which students are below national minimum standard? Which students are progressing well/not as well as expected in each band? 5. Individual Learner Analysis - Proficiency Bands & NMS

  29. 5. Individual Learner Analysis – Question item level • Which students answered the question correctly/incorrectly? • What skills were being assessed by these questions? (See Skills Matrix/Curriculum links.) • What are the possible explanations for why students chose those answers? Check test booklets or additional class assessments.

  30. 5. Individual Learner Analysis … Intervention and Support • Do other school-based assessments verify or indicate more detailed learner needs for intervention and support? • What more do we need to do to ensure success for each learner? • Student Consolidated Report (access via Student Data Warehouse)

  31. Self Review: Analysis INDIVIDUAL LEARNER ANALYSIS - OVERALL PERFORMANCE

  32. Appendix 1. Analysis Template 1 – Synthesis of results: individuals and whole-school implications

  33. Where to from here? … Improvement Planning • 1) What patterns emerge from the above analysis for whole school, year and class level groupings? • 2) What are the next steps or implications for improvement planning at the whole school, year and class level? • Note 1: Identify the learning area your site has identified as an area to improve (e.g. a focus on reading comprehension). • Note 2: Consider the implications for a whole school approach and the processes necessary for effective planning for improvement. This could include refinement of site priorities, goals and targets, additional planning discussions with staff, and further evaluation of current strategies. • Note 3: For further information about improvement planning visit www.decd.sa.gov.au/quality WHY ARE WE SEEING WHAT WE ARE? WHAT MIGHT WE DO ABOUT IT?

  34. Where to from here? … Improvement Planning • 3) What are the next steps or implications for pedagogy, curriculum and assessment at the whole school, year and class level? • Note 1: Consider what high-leverage teaching and learning strategies research or best practice would suggest as possible ways to improve student learning in the area(s) identified (i.e. links to local, national and international research/practice – including EYLF, TfEL, Lit. Sec., DIAf, Aust. Curric.; evaluate existing professional learning practices and literacy/numeracy projects/programs). • Note 2: Consider the extent to which staff have shared/common understandings of the required content knowledge about the curriculum outcomes and concepts taught. • Note 3: How will you articulate these strategies in your site improvement plan? Who will be responsible for actioning these, and by when? • Note 4: As part of your next steps you may consider analysis at the individual level (see below) or further investigation of key drivers/root causes in teaching and learning. WHY ARE WE SEEING WHAT WE ARE? WHAT MIGHT WE DO ABOUT IT?

  35. Where to from here? … Improvement Planning • 4) At the individual level, who needs extension or specific intervention and support based on this analysis? Which groups within classes would benefit from explicit instruction or extension in particular skills? • Note 1: Are there any specific teaching and learning strategies that need to be researched, investigated or actioned to support improvement for individual students in the areas identified? (Refer to the Wave model of Intervention & Support on the DIAf website.) • Note 2: For the questions answered incorrectly, check whether the concepts were taught and assess how effectively they were taught for these students. Consider whether the student was ready to learn the particular concept and/or the extent to which transferable learning opportunities were provided for the concept. • Note 3: How will you articulate these next steps or teaching and learning implications in your intervention and support programs, plans and practice, and in taking action to monitor and improve outcomes for these individual learners? WHY ARE WE SEEING WHAT WE ARE? WHAT MIGHT WE DO ABOUT IT?

  36. Useful National, State and Regional Contacts • Performance, Analysis and Reporting Consultants; Curriculum Consultants; Early Childhood Consultants; Aboriginal Education Consultants (Regional Office Contacts) • Media Contact: Lynne Hare, Manager, Media Liaison, 8226 7998 • Data & Educational Measurement: www.decd.sa.gov.au/accountability (use the ‘Assessments’ and ‘Student Data Warehouse’ tabs in the menu at the top of the page) • DIAf Resources: http://www.decd.sa.gov.au/quality • MCEECDYA: http://www.mceecdya.edu.au/mceecdya/ • NAPLAN: http://www.nap.edu.au/ • ACARA: http://www.acara.edu.au

  37. Useful Literacy & Numeracy Contacts & Resources • Literacy Secretariat: http://www.decd.sa.gov.au/literacy (refer to the practical guides for classroom teachers to engage with and explore Literacy: Spelling, Reading, Persuasive Writing, Assessment etc.) • Primary Maths/Science Strategy: http://www.scimas.sa.edu.au/scimas • Teaching and Learning Services: http://www.australiancurriculum.edu.au • The Australian Curriculum: http://www.australiancurriculum.edu.au
