
SAGE Student Assessment of Growth and Excellence


Presentation Transcript


  1. SAGE – Student Assessment of Growth and Excellence: Utah’s Computer Adaptive Assessment. Judy W. Park, Ed.D., Associate Superintendent, Utah State Office of Education. October 22, 2014

  2. SAGE Assessment System • Formative: optional instructional tools for teachers • Interim: optional fall and winter testing • Summative: spring end-of-course testing

  3. SAGE Assessment System • Computer Adaptive Testing • English Language Arts: Grades 3–11 • Mathematics: Grades 3–8, Math I, Math II, Math III • Science: Grades 4–8, Earth Science, Biology, Physics, Chemistry • Writing: Grades 3–11; a 60-minute writing prompt (opinion/argumentative) and a 60-minute writing prompt (informative/explanatory); replaces the Direct Writing Assessment (DWA) given in Grades 5 & 8
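To make the "computer adaptive" idea concrete: an adaptive engine keeps a running estimate of the student's ability and picks each next question near that estimate, so both low- and high-achieving students see questions at their level. The sketch below is a minimal illustration only, assuming a Rasch item model and a tiny hypothetical item pool; the item names, difficulties, and update rule are invented for the example and are not details of the actual SAGE/AIR engine.

```python
import math

# Hypothetical item pool: item id -> difficulty on the ability (theta) scale.
ITEM_POOL = {"item_a": -1.2, "item_b": -0.4, "item_c": 0.3, "item_d": 1.1, "item_e": 2.0}

def probability_correct(theta, difficulty):
    """Rasch model: chance that a student of ability `theta` answers the item correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def next_item(theta, answered):
    """Select the unanswered item whose difficulty is closest to the current estimate."""
    remaining = {item: b for item, b in ITEM_POOL.items() if item not in answered}
    return min(remaining, key=lambda item: abs(remaining[item] - theta))

def update_theta(theta, difficulty, correct, step=0.7):
    """Nudge the ability estimate by the response residual (observed minus expected)."""
    expected = probability_correct(theta, difficulty)
    return theta + step * ((1.0 if correct else 0.0) - expected)

# One simulated step: a student starting at the average estimate (0.0) gets the
# mid-difficulty item first, answers it correctly, and the estimate rises.
theta, answered = 0.0, set()
item = next_item(theta, answered)                      # -> "item_c"
theta = update_theta(theta, ITEM_POOL[item], correct=True)
answered.add(item)
```

In a real adaptive test the update would be a maximum-likelihood or Bayesian ability estimate and item selection would also weigh content coverage and exposure rules; the step-based update above is only a stand-in to show the select-respond-update loop.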

  4. Historical Review • 2007 • Governor Huntsman’s Blue Ribbon Panel on Assessment • Recommendations approved by the Panel and State Board of Education • Implement EPAS (ACT) testing for all 8th, 10th and 11th grade students • Implement Computer Adaptive Testing to replace CRTs. • 2008 • SB 2002 Establishes K-12 Computer Adaptive Pilot • Sevier District • Juab District • 2010 • SB 16 Allows computer adaptive testing to replace CRT testing • Number of pilot schools/districts increase

  5. Historical Review • 2011 • Computer adaptive testing allowed for federal accountability • Number of schools/districts in pilot increases • 2012 • K-12 Computer Adaptive Testing Pilot • Charters – 8 • Districts – 10 (some to all of their schools) • HB 15 – allocates $6.7M for statewide implementation of computer adaptive testing

  6. Pilot Success (6 Years): Lessons Learned • Adaptive testing works for low-achieving through high-achieving students • Interim testing – multiple times during the year (fall-to-spring growth, spring-to-spring growth) • Robust reporting • Formative testing

  7. Historical Review – 2012 • HB 15 provides $6,700,000 in ongoing funding for adaptive assessment • SB 97 provided one-time funding of $7,600,000 for technology • State Board of Education appoints RFP committee • RFP written, with statewide review • State Board of Education appoints RFP selection committee • Proposals reviewed and scored, and vendor selected

  8. American Institutes for Research (AIR) • Washington, D.C.-based nonprofit • The only organization currently delivering statewide, online adaptive tests approved for ESEA accountability • 1,600 people work in the areas of assessment, education research and technical assistance, health, human development, and international development.

  9. Historical Review – 2013: SAGE developed by Utah Educators • Every test question was developed and/or reviewed by 293 Utah residents • Educators • Parents • Stakeholders focused on fairness (cultural, gender and ethnic sensitivity) • Every test question was developed and/or reviewed in 4–5 separate committees. After each committee review, the questions were edited and revised to incorporate the committee’s feedback. • Development committee • Content committee • Fairness committee • Passage review committee • Parent review committee

  10. Historical Review – 2013 SAGE developed by Utah Educators/Residents • SAGE item bank (number of questions) • 11,783 questions • 400 – 450 questions per grade/course • 5,533 new Utah developed questions • 5,599 questions previously used in the Utah CRTs • 651 questions from Delaware and Hawaii • These questions were reviewed in every Utah committee except development

  11. How did parents review the SAGE test? • 15-member parent panel • Chosen by the Speaker of the House, Senate President and State Board of Education Chair • At least two parents reviewed every question, and some parents viewed all questions • Of the 11,783 test questions, only 43 items were removed (about 0.4%)

  12. SAGE - 2014 Comments from the Field • “The SAGE system is overwhelmingly working. It has been a really positive experience.” Hal Sanderson, Canyon District Assessment Director • “CRTs were so easy they were boring. SAGE made me think.” Moab HS student • “I hate these tests! I can’t just guess anymore, I actually have to think and show that I understand.”  Woods Cross HS student • “The new item types really make me have to look at my instruction.  No longer will my students do well on tests because they can read well, they really have to know the content.” Weber District Jr. High Science Teacher

  13. SAGE – 2015: Feedback to inform 2015 • Focus groups • Teacher surveys • Feedback solicited in all presentations/meetings • E-mails and phone calls

  14. SAGE Results Reality • A simple equation: new, more rigorous standards + new, more rigorous assessments = a reduced percentage of students proficient

  15. SAGE Results Reality • Reduced proficiency is • A result of more rigorous standards • A result of more rigorous assessments • A result of raising the bar/expectations for all students • Reduced proficiency is not • Decreased student performance • Decreased instructional excellence • Decreased school achievement • Student proficiency will increase as students, parents and teachers work together implementing the standards and assessments

  16. SAGE Results • 2014 SAGE results available in October • Beginning November 2014, SAGE results are immediate • Extensive reporting system: student reports for parents/students, classroom reports for teachers, and school and district reports for school administrators • Interim and summative results link to formative tools

  17. SAGE & Accountability • SAGE Proficiency • Percentage of students proficient • 2014 is baseline year • State and Federal Reporting • Grading Schools • UCAS Federal Accountability • PACE Report Card

  18. SAGE Test Questions http://sageportal.org/training-tests/

  19. www.schools.utah.gov

  20. Why new standards? Why now? • Old standards were not adequate for success after high school. • 40% of students in college need remediation in at least one academic subject. • The US Chamber of Commerce ranks Utah students low in post-secondary workforce readiness. • By 2020, 74% of jobs will require more than a high school diploma. • Prosperity 2020 and the Governor agree: we must raise the bar for students of all ages. • 90% of elementary students must achieve math and reading proficiency by the end of third grade by 2020. • 66% of Utah residents should achieve post-secondary training by 2020. http://prosperity2020.com/

  21. Utah’s Leaders support new standards “What I do believe and what I do support is that…we need to have high standards, and that’s not in just math and…language arts/reading, it’s in all of our curriculum. We need to have high standards. We need to, in fact, raise the bar. I think everybody understands that. And I haven’t met anybody yet that doesn’t agree with that.” http://www.schools.utah.gov/fsp/College-and-Career-Ready/Meetings/2012-Spriing-Directors/Common-Core-FACTS---Brenda-Hales.aspx

  22. What are Utah’s New Standards? • Standards are the expectations for what students should know and be able to do. • They are not curriculum. • The State Board of Education approved the Utah Core standards for English language arts and mathematics in 2010. • The standards meet nationally and internationally competitive benchmarks. For additional information about standards: Understanding the Utah Core (http://www.utahpublicschools.org/Utah-core-links.html) and the Governor’s Utah Core Standards page (http://www.utah.gov/governor/standards).

  23. Example of an old item measuring an old standard vs. a new item measuring new standards. [Images: old item example and new item example]

  24. How will SAGE be reported? A scale score and a proficiency level are reported for each test taken by a student. The scale score shows the student’s performance on a test, converted to a common scale (a number from 100 to 900). SAGE has vertical scales in mathematics and English language arts. These scales link the subject-based assessments from grade to grade to provide data on student growth over time (example: 3rd grade ELA to 4th grade ELA). SAGE science assessments do not include a vertical scale because proficiency in one grade or course does not necessarily rely on content from the previous grade or course (example: chemistry to physics).

  25. How will SAGE be reported? • Proficiency levels indicate the student’s progress towards College and Career Readiness (CCR) • Scale scores vs. proficiency levels • Scale scores indicate the individual level of what a student knows and is able to do. • Proficiency levels interpret the scale score into categories: Highly Proficient, Proficient, Approaching Proficient, and Below Proficient. • Level 4 (Highly Proficient) and Level 3 (Proficient): on track for CCR (proficient) • Level 2 (Approaching Proficient) and Level 1 (Below Proficient): not on track for CCR (not proficient)
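The slides give the reporting scale (100–900) and the four level names, but not the actual cut scores, which vary by grade and subject. Purely to illustrate how a scale score maps to a proficiency level and to the on-track/not-on-track grouping above, here is a small sketch with placeholder cuts; the numbers 500/600/700 are invented for the example and are not SAGE’s real cut scores.

```python
# Placeholder cut scores -- the real SAGE cuts vary by grade and subject and are
# NOT these numbers; they only illustrate the scale-score -> level mapping.
HYPOTHETICAL_CUTS = [
    (500, "Level 1: Below Proficient"),        # scores below the first cut
    (600, "Level 2: Approaching Proficient"),
    (700, "Level 3: Proficient"),
]

def proficiency_level(scale_score):
    """Map a 100-900 scale score to one of the four SAGE proficiency levels."""
    if not 100 <= scale_score <= 900:
        raise ValueError("SAGE scale scores range from 100 to 900")
    for cut, label in HYPOTHETICAL_CUTS:
        if scale_score < cut:
            return label
    return "Level 4: Highly Proficient"

def on_track_for_ccr(scale_score):
    """Levels 3 and 4 count as on track for College and Career Readiness."""
    return proficiency_level(scale_score).startswith(("Level 3", "Level 4"))

# Example: a score of 640 falls in Level 3 (Proficient) under these placeholder
# cuts, and therefore counts as on track for CCR.
assert proficiency_level(640) == "Level 3: Proficient"
assert on_track_for_ccr(640)
```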

  26. SAGE Results: Spring 2014 Operational Field Test

  27. How was a proficient score determined? • Educators took the tests • Viewed preliminary results • Considered national data • Participated in “bookmarking” • Determined preliminary proficiency levels

  28. What performance data did educators use to inform the proficiency levels in SAGE? [Figure: the test items ordered from easiest (item 1) to most difficult (item 22), with cut points marking Approaching Proficiency and Proficiency] • Looked at all items from least to most difficult • Used expert experience and judgment to determine proficiency levels • Participants of a large stakeholder group that included state school board members, superintendents, community advocates, parents, and the Governor’s Office reviewed the results and affirmed the recommendations from standard setting. • The Utah State Board of Education approved the cut (proficiency) scores on Sept. 5, 2014.
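The slide names the “bookmark” procedure but not the statistics behind it. As a rough sketch only: panelists page through the items ordered from easiest to hardest and place a bookmark where a just-proficient student would stop answering reliably; the cut point is then the ability level at which the bookmarked item is answered correctly with a chosen response probability. The code below assumes a Rasch model and the common RP67 criterion (0.67), neither of which is stated in the slides, and the item difficulties are invented; converting the resulting ability value to the 100–900 reporting scale would be a separate step not shown here.

```python
import math

def bookmark_cut_theta(ordered_difficulties, bookmark_index, rp=0.67):
    """Bookmark-method sketch: return the ability (theta) at which a student answers
    the bookmarked item correctly with probability `rp`, under a Rasch model.

    `ordered_difficulties` lists item difficulties from easiest to hardest;
    `bookmark_index` is where a panelist placed the bookmark."""
    difficulty = ordered_difficulties[bookmark_index]
    # Solve 1 / (1 + exp(-(theta - b))) = rp  for theta.
    return difficulty + math.log(rp / (1.0 - rp))

# Hypothetical ordered item booklet of 22 items (difficulties rising from -2.0 in
# steps of 0.2), with a panelist's "Proficient" bookmark at the 15th item (index 14).
difficulties = [-2.0 + 0.2 * i for i in range(22)]
proficient_cut = bookmark_cut_theta(difficulties, bookmark_index=14)   # ~1.51
```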

  29. What performance data did educators use to inform the proficiency levels in SAGE? • Educators took into account Utah’s expectations along with nationally recognized and accepted assessments to move toward the broader goal of preparing students for success in college and the workplace. • Educators considered information from national indicators:   • ACT (American College Test) • NAEP (National Assessment of Educational Progress) data  For more information about these assessments, please visit: http://nces.ed.gov/nationsreportcard/ http://www.act.org

  30. How does SAGE compare to the old CRTs? Comparisons are not typically valid. Both tests measure academic knowledge and skills, but they use different methods for doing so. Year-to-year trend data for SAGE will be available in the coming years. Comparing different performances: don’t! Proficiency levels have changed. What was good enough in the past no longer is. Imagine a test that evaluates a student’s running speed. In the past, where we may have told a student her running was fast enough, we now have a higher expectation and a more rigorous test. The same student’s running must improve, and in addition she must jump hurdles before we can say she is fast enough. And in the academic world, “fast enough” means ready for college and career.

  31. What do the new scores look like? • As expected, fewer students are proficient. • Students do not suddenly know less. Teachers are not teaching less. • The bar measuring expectations on the learning continuum moved. Teachers and students will need time to make the adjustment. • Of course, this is NOT an indication of a decrease in student achievement; rather, it reflects an increase in expectations.

  32. What do the new scores look like? • Typically, scores shift downward after new standards and assessments. Look at Kentucky and New York. [Chart: proficiency-rate declines after new standards and assessments in Utah, Kentucky, and New York]

  33. How will families see their data and be able to respond? • Families will receive their student’s individual report from their teacher and local school on or after October 27th. • Teachers will be available to explain the new reports, and interpretive guides will be provided. • Families and Utah educators can work together using this data to improve each student’s post-high-school success. • If your student is not yet proficient on one or more of the SAGE assessments, talk to his or her teacher to understand the plan to get there.

  34. Teachers speak about the new test: http://stream.schools.utah.gov/videoarchive/assessment/sage%20(2).mp4
