
NAPLAN Workshop



Presentation Transcript


  1. NAPLAN Workshop Assessment for Better Learning using NAPLAN Data Presented by Peter Congdon, Principal Consultant – Kmetrics, on behalf of the VCAA

  2. Workshop structure • Main themes • How curriculum leaders and classroom teachers can use their school-level data to analyse the impact of their school’s learning programs. • How classroom teachers can use the responses to questions on the NAPLAN 2014 tests as a diagnostic tool to inform future teaching.

  3. Workshop Content • NAPLAN Data Service reports and functions • Methods of using the data and results for monitoring and improvement purposes • Working with data, interpreting data, describing data and developing an informed response to the data.

  4. Context – Professional practice • Using assessment data effectively has become embedded in teaching expectations and school improvement processes. • Use of data: • National Professional Standards for Teachers – Australian Institute for Teaching and School Leadership • Standard 5 - Assess, provide feedback and report on student learning

  5. Context – Using student data to improve teaching practice. Teachers need to be able to do the following: • Find the relevant pieces of data in the data system or display available to them (data location) • Understand what the data signify (data comprehension) • Figure out what the data mean (data interpretation) – substantive and contextual • Select an instructional approach that addresses the situation identified through the data (instructional decision making) • Frame instructionally relevant questions that can be addressed by the data in the system (question posing) Source: Teachers' Ability to Use Data to Inform Instruction: Challenges and Supports. U.S. Department of Education Office of Planning, Evaluation and Policy Development http://www2.ed.gov/rschstat/eval/data-to-inform-instruction/report.pdf

  6. https://naplands.vcaa.vic.edu.au Helpdesk phone: 1800 648 637. AIM results (2007 and earlier) are no longer available.

  7. Reference documents • Assessment materials • Test performance & content summary guides • Reporting guides • Descriptive exemplars of marking guides • Analysis strategies • Online tutorial assistance for reports can be accessed at http://usingassessmentdata.vcaa.vic.edu.au/naplan/index.aspx All available within the NAPLAN Data Service to support use of the results

  8. Box and Whisker Charts [Chart comparing the focus group against National, State and School reference groups]

  9. Normal Distribution [Chart: 100 students split by the 10th, 25th, 50th, 75th and 90th percentile cut-points into groups of 10, 15, 25, 25, 15 and 10 students]
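A quick way to see these cut-points in action: a minimal Python sketch (invented, roughly normal scores, not NAPLAN data) that splits a cohort of 100 students at the 10th, 25th, 50th, 75th and 90th percentiles, reproducing the 10/15/25/25/15/10 grouping on the slide.

```python
import numpy as np

# Hypothetical, roughly normal scale scores for a cohort of 100 students.
rng = np.random.default_rng(0)
scores = rng.normal(loc=500, scale=70, size=100)

# The five percentile cut-points shown on the slide.
cuts = np.percentile(scores, [10, 25, 50, 75, 90])

# Assign each student to a band: 0 = below 10th, ..., 5 = above 90th.
bands = np.digitize(scores, cuts)
counts = np.bincount(bands, minlength=6)

# By construction the counts come out as roughly 10, 15, 25, 25, 15, 10.
labels = ["<10th", "10-25th", "25-50th", "50-75th", "75-90th", ">90th"]
for label, n in zip(labels, counts):
    print(f"{label:>8}: {n} students")
```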

  10. Skewed Distribution [Chart: the same percentile cut-points (10th, 25th, 50th, 75th, 90th) marked on a skewed distribution of 100 students]

  11. NAPLAN Reporting Bands

  12. Example NAPLAN Summary Year 7 • What are the main features of these results? • Strongest in Writing and Spelling • Lower students not as low as the State's low students • Higher students not as high as the State's higher students in Reading, G&P and Numeracy • Is this a reflection of the school's teaching program, and/or a feature of this cohort? • How much of these differences is due to measurement imprecision?

  13. NAPLAN Year 7 Summary [Side-by-side summary charts for 2014 and 2013]

  14. Year 7 results • Usually, Year 7s have only been at your school for a few months prior to testing. • Results can reflect feeder school programs. • Consider grouping students by main feeder schools and sharing results across the network.

  15. School Summary Exercise - 5 mins • Review your School summary report(s) • Address the following • Strongest in; • Lower students compared to State’s low students; • Higher students compared to State’s higher students; • Major influence on results include; • Strategies to consider;

  16. Trend Data • Find evidence of the impact of change over five years • Shows the range of student achievement levels (box and whiskers) • Plots the mean student achievement level
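As a rough illustration of this display, here is a minimal matplotlib sketch (invented data, not the NAPLAN Data Service output) that draws a box-and-whisker plot per year, with whiskers at the 10th and 90th percentiles and the mean overlaid.

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented cohort scores for five years, drifting slightly upward.
rng = np.random.default_rng(2)
years = [2010, 2011, 2012, 2013, 2014]
cohorts = [rng.normal(520 + 5 * i, 65, 120) for i in range(len(years))]

fig, ax = plt.subplots()
# Whiskers at the 10th and 90th percentiles, as in the percentile charts above.
ax.boxplot(cohorts, labels=[str(y) for y in years], whis=(10, 90))
ax.plot(range(1, len(years) + 1), [c.mean() for c in cohorts], "o-", label="Mean")
ax.set_xlabel("Year")
ax.set_ylabel("NAPLAN scale score")
ax.set_title("Five year trend (illustrative data)")
ax.legend()
plt.show()
```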

  17. Main sources of variability: • Different students (work ethic, behaviour, home support) • Measurement imprecision (test properties, equating, group size) • School (leadership, resources) • Teacher (effectiveness) • Program (content, alignment)

  18. Five Year Trend Exercise – 5 mins Review your Five Year Trend report(s) and address the following for one chosen domain (Reading, Writing, Spelling, Grammar & Punctuation or Numeracy): • Compare performance relative to the State group • High = Top 25% v State Top 25% • Medium = Middle 50% v State Middle 50% • Low = Bottom 25% v State Bottom 25% • Identify influencers on results (cohort, school, teacher, programs) • Main sources of variability: different students (work ethic, behaviour, home support), measurement imprecision (test properties, equating, group size), school (leadership, resources), teacher (effectiveness), program (content, alignment)

  19. Group Summary Report How do our groups stack up against the State groups? What does this tell us about the cohort, school, teachers and programs?

  20. Group Summary Exercise – 5 minutes Review your Group summary report(s) and address the following: • Compare the performance of each group relative to the State group in one domain • High = Top 25% v State Top 25% • Medium = Middle 50% v State Middle 50% • Low = Bottom 25% v State Bottom 25% • Girls v State Girls • Boys v State Boys • LBOTE v State LBOTE • ATSI v State ATSI • Identify influencers on results (cohort, school, teacher, programs) • Main sources of variability: different students (work ethic, behaviour, home support), measurement imprecision (test properties, equating, group size), school (leadership, resources), teacher (effectiveness), program (content, alignment)

  21. Assessment Area Report • Raw score average, State (36 items @ 60% correct) = 36 × 0.60 = 21.6 • Raw score average, School (36 items @ 52% correct) = 36 × 0.52 ≈ 18.7 [Report shows the number of items and the percentage of items answered correctly, including short-answer questions]
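The arithmetic above is simply the number of items multiplied by the proportion correct; the short Python check below reproduces the slide's figures and the resulting raw-score gap.

```python
# Average raw score = number of items x proportion correct.
items = 36
state_pct, school_pct = 0.60, 0.52

state_avg = items * state_pct     # 21.6
school_avg = items * school_pct   # 18.72, reported as 18.7
print(f"State average:  {state_avg:.1f} of {items} items")
print(f"School average: {school_avg:.1f} of {items} items")
print(f"Raw score gap:  {state_avg - school_avg:.1f} items")  # about 2.9
```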

  22. Assessment Area Exercise – 2 minutes • Identify if any dimensions have been flagged as significantly different from the State • Calculate or estimate the Raw score difference between your students and the State on one or more dimensions

  23. Writing Criteria Report

  24. Writing Criteria Exercise – 2 mins. Compare modal scores (modal score = most common score). Which criterion are you relatively strongest on? [Example from the report: one criterion with School modal score 3-4 v State 4; another with School 2 v State 3]
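For anyone computing modal scores by hand from a class list, the sketch below (with an invented score list) finds the most common score for one criterion; a tie between two scores corresponds to the "3-4" style entries in the report.

```python
from collections import Counter

# Hypothetical scores for one writing criterion across a class.
school_scores = [2, 3, 3, 4, 4, 4, 3, 2, 3, 4]

counts = Counter(school_scores)
top = max(counts.values())
modes = sorted(score for score, n in counts.items() if n == top)

# A tie such as [3, 4] is what the report shows as "3-4".
print("School modal score(s):", modes)
```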

  25. Item Analysis Report • How classroom teachers can use the responses to questions on the NAPLAN 2014 tests as a diagnostic tool to inform future teaching • Finding skills of relative strength or weakness – graphical format • Understanding student weaknesses – table format

  26. Item Analysis Report [Screenshot: test and group details, with links to test details and to the test items]

  27. Item Analysis Report – Graph Finding skills of relative strength or weakness [Graph legend distinguishes items harder than the State from items easier than the State]

  28. Item Analysis – Understanding student weaknesses

  29. Item Analysis – Understanding student weaknesses Half of the group could do it (option B). The remainder had problems. Were they related to format, language, concept, process, knowledge, skill, ...? [Annotations from the worked item: 44 + 50 = 94; 50 is half of 100; 22 is half of 44 – distractor options A and D]

  30. One quarter of the group could do it (option A). The remainder had problems. Were they related to format, language, concept, process, knowledge, skill, opportunity to learn, ...?

  31. Student Responses - Individuals

  32. Item level Diagnostics • By comparing your students' success at the item level to that of all other students in the state, you can: • Look for relative differences as the test progresses • Did our students answer all the items? • Were they consistently above, below or similar to the rest of the state? • What do any differences represent? Put it into context: • Is language or vocabulary an issue? • Is test taking an issue – format, motivation, terminology? • Are there items that represent areas of the curriculum not yet introduced? • How does this compare to the rest of the state? • Are there areas taught, but not as well as expected? • Consider what is happening at other Year levels – curriculum mapping. • Where are the relative strengths – how can you learn from them?
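A minimal sketch of this comparison is below, assuming you have percent-correct per item for the school and the State; the item names, figures and the 5-point flagging threshold are all illustrative, not the report's actual rule.

```python
# Hypothetical percent-correct per item for the school and the State.
school = {"Q1": 78, "Q2": 64, "Q3": 41, "Q4": 55}
state = {"Q1": 74, "Q2": 70, "Q3": 52, "Q4": 54}

for item, school_pct in school.items():
    diff = school_pct - state[item]
    # The 5-point threshold is an arbitrary illustration, not an official rule.
    if diff <= -5:
        note = "harder than State: investigate"
    elif diff >= 5:
        note = "easier than State: a relative strength"
    else:
        note = "similar to State"
    print(f"{item}: school {school_pct}%, state {state[item]}%, diff {diff:+d} ({note})")
```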

  33. Item Analysis Exercise – 2 minutes • Identify items indicating relative strengths and weaknesses • Follow up (for classroom teachers): understand and capitalise on strengths; investigate relative weaknesses and develop a plan in response; discuss and share with colleagues

  34. Zone of Proximal Development [Diagram, innermost to outermost: what students can already do independently; the Zone of Proximal Development – what students are capable of learning with guidance and support from teachers and peers; the level of potential after other steps have been made.] Vygotsky and other educational professionals believed education's role was to give children experiences that were within their zones of proximal development, thereby encouraging and advancing their individual learning.

  35. Substantive descriptions of achievement levels • Using the NAPLAN items to identify skills, knowledge, procedures, ... • National bands – parent report skill descriptors, www.nap.edu.au

  36. Describing the Zone of Proximal Development – for most raw scores of 5-6 [Example spelling items – easier: drafting, blizzard, tertels; harder: Seriusly, orkwardly (misspellings as they appear in the test items)]

  37. Relative Growth • How is relative growth defined? • Each student's level of relative growth is determined by comparing their current year NAPLAN result to the results of the group of all 'similar' Victorian students. 'Similar' students are defined as those that had the same NAPLAN score two years ago. • Compared to these similar students, if a student's current NAPLAN score is in the: • highest 25%, their growth level is categorised as 'High' (Green) • middle 50%, their growth level is categorised as 'Medium' (Yellow), and • lowest 25%, their growth level is categorised as 'Low' (Red). • Note that the percentages within each category will vary from school to school.
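A minimal sketch of this categorisation rule follows; the peer group here is simulated, whereas the official calculation compares each student with all Victorian students who had the same score two years earlier.

```python
import numpy as np

def growth_category(current_score, similar_scores):
    """Categorise a student's growth against 'similar' students."""
    q25, q75 = np.percentile(similar_scores, [25, 75])
    if current_score > q75:
        return "High"    # top 25% of similar students (Green)
    if current_score < q25:
        return "Low"     # bottom 25% of similar students (Red)
    return "Medium"      # middle 50% (Yellow)

# Simulated current-year scores of students who shared the same score
# two years ago (hypothetical values, not VCAA data).
similar = np.random.default_rng(1).normal(540, 60, 500)
print(growth_category(610, similar))  # e.g. "High"
print(growth_category(540, similar))  # e.g. "Medium"
```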

  38. Relative Growth [Chart showing example student growth scores: +138, +18, −30, +15]

  39. Relative Growth Exercise – 2 minutes • Identify domains with low relative growth greater than 25%, and/or • Identify domains with high relative growth greater than 25% • Was the relative growth even across the starting Bands? • Follow up (co-ordinators and classroom teachers): implementation and effectiveness of differentiation

  40. Working with NAPLAN Data • Principal • Analyse Summary and Trend results • Conduct program evaluations • Facilitate staff access to work with results • Co-ordinator • Map the relatively high and low performances against the delivery of the curriculum • Look for class/group differences • Work with colleagues. • Classroom teacher • Diagnose misconceptions, weaknesses and strengths. • Develop teaching plans in response.

  41. Reporting Back • Summative position • Overall location of the students and subgroups • Shape of the distributions • Are students being left behind? • How spread out are they? • Is there too much focus on the low achieving students – top students held back? • How would you describe the location of your students against the state? • How spread out are your students in the different dimensions? • Trends • How have your results changed over time? • Consider both the location and the shape of the distribution. • Are there identifiable factors that may be contributing? • Student level effects: motivation, engagement, home • Teacher level effects: method, style, experience, support, workload • School level effects: leadership, resources, programs • Growth • Are students maintaining their position relative to the state? • Is the top growing as fast as the bottom relative to the state? • Is growth even across the dimensions? • Curriculum Mapping • Item level Diagnostics • Can you find relative strengths and weaknesses? • What can you and the school adjust based on these findings? • Link to curriculum scope and sequence • Link to programs and pedagogy

  42. Thank you for attending. Please use the rest of this time to go over your results and clarify your interpretations. PowerPoint presentation available at www.kmetrics.com.au Further help is available by contacting me directly: Peter Congdon, Principal Consultant Mobile: 0434 000 561 email: peter@kmetrics.com.au web: www.kmetrics.com.au
