
Language Arts Day!



  1. Language Arts teachers are invited to gather at these sessions to study NeSA-Reading and NeSA-Writing data and use this information to consider how to continue to improve student achievement. In addition to the data analysis and NeSA updates, we will study recent research, pedagogy, and strategies directly related to language arts instruction. Language Arts Day! October 24, 2011

  2. Purposes • Understand NeSA protocol, resources, and results • Study NeSA results • Use data to inform decisions for improving student achievement • Study explicit instruction • Share experiences, tips, and ideas

  3. Getting Started • Introductions • Norms • Parking Lot • Wikispace: http://esu6la.wikispaces.org • Agenda • Handouts & Copies • Survey

  4. 2011 NeSA Testing • Think • What were the challenges of the NeSA-R or NeSA-W? • How did (or might) these be overcome? • What went well? • Have you made any curricular adjustments? • Pair • Someone with similar responsibilities • Share • 3-5 minutes

  5. Interaction Sequence Ask all students the question. Pause (3+ seconds). Put students on-the-clock. “You have 2 minutes to share your answer with your partner.” Students share their thoughts with a partner. Select student(s) to respond. Purposeful Selection: Call on students you have visited. Random Selection: Call on students so every student has an opportunity to be selected. Volunteer Selection: Allow volunteer responses. APL (Sharer, Anastasio, & Perry, 2007, pp. 80-85) • Conference with 1 or 2 pairs • Check student answers • Probe • Provide answers when missing

  6. NeSA Soup! DAC CAL DRC DRS AYP TOS PLD C4L

  7. Understanding the NeSA-R & NeSA-W http://www.education.ne.gov/assessment/NeSA_Presentations.htm

  8. Nebraska schools should use NeSA data to . . . • Provide feedback to students, parents and the community • Inform instructional decisions. • Inform curriculum development and revision. • Measure program success and effectiveness. • Promote accountability to meet state and federal requirements.

  9. NeSA is . . . • A criterion-referenced summative test. • A measurement of the revised Nebraska Reading Standards specific to vocabulary and comprehension. • A tool including 45 to 50 multiple-choice items. • A test administered to students online OR by paper/pencil.

  10. What are . . . Tables of Specification

  11. What are . . . Performance Level Descriptors

  12. NeSA . . . • Produces a raw score that converts to a scale score of 0-200. • Allows for students to be classified into one of three categories: Below the Standards, Meets the Standards, Exceeds the Standards. • Provides comparability across Nebraska school buildings and districts.

  13. SCALE SCORE – a student’s transformed version of the raw score earned on NeSA ~NeSA Terminology~

  14. How are performance levels determined? • Cut score processes: • Contrasting Group Method – 400+ teachers • Bookmark Method – 100+ teachers • State Board of Education Reviewed • Examined results of both processes • Examined NAEP and ACT results for Nebraska • Made decisions within recommended range at public meeting

  15. RAW SCORE – the number of items a student answers correctly (“right”) on the NeSA-R, as reported on NeSA Reports and the Conversion Chart ~NeSA Terminology~

  16. What is the difference between a raw score and a scale score? What is a raw score? A raw score is the number of correct items. Raw scores have been typically used in classrooms as percentages: 18/20 = 90% correct. ~NeSA Terminology~

  17. What is a scale score? A scale score is a “transformation” of the number of items answered correctly to a score that can be more easily interpreted between tests and over time. The scale score maintains the rank order of students (i.e., a student who answers more items correctly gets a higher scale score). For NeSA, we selected 0-200 and will use it for all NeSA tests, including writing. ~NeSA Terminology~

  18. Why convert raw scores to scale scores? Raw scores are converted to scale scores in order to compare scores from year to year. Raw scores should not be compared over time because items vary in difficulty level. Additionally, raw scores should not be compared across different content area tests. Scale scores add stability to data collected over time that raw scores do not provide. ~NeSA Terminology~
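The rank-order point above can be illustrated with a toy linear conversion. This is only a sketch: the actual NeSA conversion is defined by the official conversion charts (not a simple formula), and the function name here is invented for illustration.

```python
# Toy sketch of a raw-to-scale conversion (illustrative only: the real
# NeSA conversion is defined by official conversion charts, not by a
# linear formula).

def to_scale_score(raw: int, max_raw: int, scale_max: int = 200) -> int:
    """Map a raw score in 0..max_raw linearly onto the 0..scale_max scale."""
    return round(raw / max_raw * scale_max)

# Two test forms with different item counts: equal relative performance
# lands on the same scale score, so students stay rank-ordered and
# scores remain comparable even when raw totals differ.
print(to_scale_score(36, 48))  # 150
print(to_scale_score(45, 60))  # 150
```

The sketch shows why a raw 36 on one form and a raw 45 on another can represent the same performance once scaled, which is exactly what raw scores alone cannot convey.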

  19. On score reports why is the . . . SCALE SCORE CONVERTED TO PERCENTILE RANK? The percentile rank was placed on the score reports because our Technical Advisory Committee felt that parents would want to know their child’s position in relation to other test takers. A percentile rank of 84 means the child scored better than 84% of the students who took the test that year. ~NeSA Terminology~
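The percentile-rank idea described above can be sketched in a few lines. The function and cohort data are hypothetical illustrations, not the actual NeSA computation.

```python
# Sketch of a percentile rank: the percent of test takers whose score
# falls below a given student's score (hypothetical data, not NeSA's
# actual procedure).

def percentile_rank(score: float, all_scores: list) -> float:
    """Return the percent of scores in all_scores strictly below score."""
    below = sum(1 for s in all_scores if s < score)
    return 100 * below / len(all_scores)

cohort = [110, 120, 130, 140, 150, 160, 170, 180, 190, 200]
print(percentile_rank(165, cohort))  # 60.0
```

A student scoring 165 in this toy cohort outscored 6 of 10 test takers, hence a percentile rank of 60.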

  20. What does the Scale Score Look Like in Action? Although the test items are comparable, they are different.

  21. Scale Score • Think • What are the key points about scale scores that you would share with a parent who has questions about the NeSA-R? • Ink • Write 2-3 points. • Link • Find a partner; give one, and get one. Repeat.

  22. Name That Concept! AKA Talk a Mile a Minute and Password

  23. NeSA Related Terms Table of Specifications (TOS) District Assessment Contact (DAC) Scale Score Adequate Yearly Progress (AYP) Percentile Rank

  24. NeSA Related Terms Check 4 Learning (C4L) Raw Score Performance Level Descriptors (PLDs) Data Reporting System (DRS) Criterion-Referenced Assessment

  25. NeSA Reports What can we learn from this report? Do we have other data to support these results? What are the implications of this report?

  26. NeSA REPORTS • Individual Student Report • School Student Roster • School Indicator Summary • School Performance Level Summary • District Reading Indicator Summary • District Performance Level Summary • District Report of School Performance

  27. Individual Student Report

  28. School Student Roster

  29. School Indicator Summary

  30. School Performance Level Summary

  31. District Reading Indicator Summary

  32. District Performance Level Summary

  33. District Report of School Performance

  34. Step 1:Define the Situation What can we learn from each report? What is the data telling us (strengths and concerns)?

  35. Step 2:Establish hypotheses Why are we getting these results?

  36. Step 3:Verify / Refute Hypotheses Do we have other data to support these results?

  37. Step 4:Create the Action Plan How can we use this NeSA data? What is the goal? (How much change is expected and by when?) What will be done to reach the goal(s), and how will progress toward goal(s) be measured?

  38. Curriculum Alignment Examine PLDs and Tables of Specification. Are the tested indicators in our curriculum? If so, where? When are they taught? How are they instructed? At what DOK (Depth of Knowledge) level? By whom?

  39. Instructional Effectiveness Examine PLDs and Tables of Specification. Do our students have opportunity to learn (and practice) the tested indicators? Is our instruction efficient and effective? How are students performing on the indicators on a day-to-day basis? Are we assessing them locally to find out what they know and can do?

  40. Test Preparation Have our students used practice tests? Are our students familiar with the testing tools? Are we familiar with appropriate accommodations? Have we applied them?

  41. March 26 - May 4, 2012 2012 NeSA-R Testing • Grades 3-8, 11 • Standardized, secure testing procedures • Paper and pencil or online (already submitted) • Two independent sessions • Untimed • Cuts remain the same (0-84, 85-134, 135-200) • What can you do or not do? (pages 19-21 & 30-34 in SAA-8)
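The cut scores listed above (0-84, 85-134, 135-200) map a scale score to the three performance levels from slide 12; a minimal sketch:

```python
# Classify a 0-200 NeSA-R scale score into a performance level using
# the cuts stated on the slide above (0-84, 85-134, 135-200).

def performance_level(scale_score: int) -> str:
    if not 0 <= scale_score <= 200:
        raise ValueError("scale score must be between 0 and 200")
    if scale_score <= 84:
        return "Below the Standards"
    if scale_score <= 134:
        return "Meets the Standards"
    return "Exceeds the Standards"

print(performance_level(84))   # Below the Standards
print(performance_level(135))  # Exceeds the Standards
```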

  42. 2012 NeSA-W Testing • January 23 – February 10, 2012 Grade 4 • narrative • two 40-minute sessions (timed) • #2 pencil • holistically scored in 2012 (analytically scored in 2013) • same cut scores as previous years (new in 2013)

  43. 2012 NeSA-W Testing • January 23 – February 10, 2012 Grades 8 & 11 • descriptive (8); persuasive (11) • online test administration • one ~90-minute session (untimed; 2011 avg. = 45-65 min.) • analytically scored (composite + 4 weighted domains) • online dictionary and thesaurus; no spell-check • 6,000 character limit (approx. 3 pages) (SAA-8, p. 45-51)

  44. 2012 NeSA-W Testing • January 23 – February 10, 2012 Grades 8 & 11 (cont.) • software update (wrap, spaces, dictionary) • no “tab” (advise students to use 3-5 spaces) • font size, spacing, margins do NOT affect scoring • new cut score set in April 2012 • composite score converted to scale of 0-70 • can print practice & operational tests (SAA-8, p. 45-51)

  45. The way I see it…  • We have 3 years of data to consider. • We can do some things right now… • test procedures, format • general test-taking skills • motivation • initial analysis, hypotheses, instructional change • curriculum alignment • effective instruction • …and, we need to have a long-term, sustainable approach. • analysis of trends • hypotheses, instructional change, study results, etc. (PDSA cycle) • diagnosis/intervention plan

  46. 2011 Statewide Results

  47. 2011 Statewide Results

  48. Improving Adolescent Literacy… http://ies.ed.gov/ncee/wwc/PracticeGuide.aspx?sid=8

  49. Explicit Instruction http://explicitinstruction.org/?page_id=80
