
What Are States Doing to Prepare For the Next Generation of Assessments?


Presentation Transcript


  1. What Are States Doing to Prepare For the Next Generation of Assessments? Planning For 2014-2015 and Beyond
  John Olson and Barry Topol
  National Conference on Student Assessment, New Orleans, June 25, 2014

  2. Overview
  • In 2012, ASG initiated a new state assessment survey program focused on state planning for 2014-15, when new state- and consortia-led assessments were slated to be implemented.
  • In 2013, a second round of state surveys took place from May to August, with 42 states participating, compared to 33 states in the first round.
  • The increase in the number of states surveyed was driven partly by states' desire to know more about what other states are doing to prepare for 2014-15.

  3. 2013 Survey Topics
  The latest survey topics included:
  • plans and recent changes in state assessment programs and consortia membership
  • funding and costs for current and future assessment components
  • state implementation of technology
  • future plans for assessment sustainability
  • test security issues
  • other issues related to the assessment consortia and plans for the future

  4. Survey Database
  Goals of the state assessment survey program:
  • Gather important information on key issues and challenges states face in transitioning to the new assessment systems
  • Summarize how states are responding to these issues and challenges
  • Make all data available to participating states in an easy-to-use, accessible format
  All data from the states, including state assessment cost data, were included in a searchable database and Excel file, which ASG delivered to participating states in January 2014.

  5. Highlights of State Survey Findings
  • Technology implementation was still the number one concern of states; however, states reported some encouraging progress on using technology
  • Costs for the new assessments continued to be a concern for some states
  • Many states said they were “fully committed” to their consortium; at the time, Plan A for most was still the PARCC or SBAC assessment, and Plan B was usually to continue the current state assessment
  • States said test security was becoming an increasingly important issue and concern for them, and many useful new documents and resources were mentioned, e.g., the CCSSO TILSA Test Security Guidebook, the NCME white paper on test integrity, the Operational Best Practices report, etc.
  www.assessmentgroup.org

  6. Session Presenters
  • This session will focus on the results of the data collection efforts for the most recent survey period and present key findings and information from states. Three state representatives will share their latest plans and perspectives on preparing for 2014-15 and address critical issues in their states.
  • Juan D’Brot, West Virginia DOE
  • Roger Ervin, Kentucky DOE
  • John Weiss, Pennsylvania DOE
  • Barry Topol, ASG (Discussant)

  7. State Assessment Costs
  • ASG has attempted to put all states on a common footing in reporting state assessment cost numbers
  • We use NCLB-mandated grades (3-8, plus one year of HS) and domains (math, reading, writing, science) only for the first set of calculations
  • EOC assessments that are also used for accountability purposes are factored into the cost calculations for the appropriate grade(s)
  • Extra grades tested in math/reading, writing, and science are excluded from the cost figures, except in calculating the total assessment spending per student (last column)
  • ASG cost figures are therefore potentially lower than what others report as spending on consortia-equivalent assessments (see the sketch below)
  www.assessmentgroup.org
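As an aside, the following is a minimal sketch of how a per-student cost figure could be normalized under the methodology described on this slide; the grade set, subject list, cost records, and student count are all hypothetical and do not come from ASG's data or code.

```python
# Hypothetical sketch of an ASG-style cost normalization (illustrative only).
# Filters follow the methodology on slide 7; every figure below is invented.

NCLB_GRADES = {"3", "4", "5", "6", "7", "8", "HS"}          # grades 3-8 plus one HS year
NCLB_SUBJECTS = {"math", "reading", "writing", "science"}   # NCLB-mandated domains

# Each record: (grade, subject, total contract cost in dollars)
sample_costs = [
    ("3",  "math",    400_000),
    ("3",  "reading", 380_000),
    ("HS", "math",    250_000),  # EOC used for accountability -> counted toward HS
    ("2",  "reading", 150_000),  # extra grade -> excluded from the core figure
    ("HS", "arts",     90_000),  # non-NCLB domain -> excluded from the core figure
]
students_tested = 50_000  # hypothetical NCLB-grade enrollment

# Core figure: NCLB-mandated grades and domains only.
core_spend = sum(cost for grade, subject, cost in sample_costs
                 if grade in NCLB_GRADES and subject in NCLB_SUBJECTS)

# Total per-student figure (last column of the cost table): all assessment spending.
total_spend = sum(cost for _, _, cost in sample_costs)

print(f"Core per-student cost:  ${core_spend / students_tested:.2f}")   # $20.60
print(f"Total per-student cost: ${total_spend / students_tested:.2f}")  # $25.40
```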

  8. State Assessment Costs
  • The average PARCC state pays $35 per student for math/ELA (incl. writing), while the average SBAC state pays $19 for the same grades and subjects
  • The simple average of all states' spending on math/ELA (incl. writing) is $28, and the weighted average spending is $25 (the difference between the two averages is illustrated below)
  • The average PARCC state spends $54 per student for all assessment spending, while the average SBAC state spends $36 a student
  • The simple average of all states' spending for all assessments is $50, and the weighted average is $47
  www.assessmentgroup.org
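For readers less familiar with the terminology, this short sketch illustrates the difference between a simple (per-state) average and an enrollment-weighted average of per-student spending; the three state figures are invented for illustration and are not survey data.

```python
# Simple vs. enrollment-weighted average of per-student spending (hypothetical figures).
states = [
    # (per-student spending in $, students tested)
    (40.0, 100_000),  # smaller state, higher per-student cost
    (22.0, 900_000),  # larger state, lower per-student cost
    (30.0, 300_000),
]

# Simple average: every state counts equally, regardless of size.
simple_avg = sum(spend for spend, _ in states) / len(states)

# Weighted average: each state counts in proportion to the students it tests,
# so large, lower-cost states pull the figure down.
weighted_avg = sum(spend * n for spend, n in states) / sum(n for _, n in states)

print(f"Simple average:   ${simple_avg:.2f}")    # $30.67
print(f"Weighted average: ${weighted_avg:.2f}")  # $25.23
```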

  9. State Assessment Costs
  • 11 SBAC states surveyed spend more than $22.30 for math, reading, and writing (M, R, and W), while 8 spend less
  • 7 PARCC states surveyed spend more than $24.50 for M, R, and W, while 2 states spend less and 1 state spends the same
  • Note that our methodology could result in a lower spending estimate for M, R, and W than states report to others
  www.assessmentgroup.org

  10. Summary and Conclusions www.assessmentgroup.org

  11. Summary and Conclusions
  • Since the survey was conducted, a Common Core backlash (national test, cost, student data privacy) has taken hold, which has recently led several states to drop out of the assessment consortia
  • More states could leave the consortia in the coming months
  • There is still time for states to develop their own assessments based on the standards they choose
  • But they need to move quickly
  • Some test vendors are also developing Common Core-based assessments as an alternative to the consortia
  • States want rigor AND reasonable cost
  • Consortia-developed tests are seen as the highest-quality alternative by most states
  www.assessmentgroup.org

  12. Summary and Conclusions
  • With PARCC's reduction of its anticipated per-student summative assessment price to $24.50, there are now 3 affordable alternatives below the average price a state pays today for its summative assessment
  • Potential future products from the consortia, as well as the promise of unified assessment systems and a common reporting scale, make their products compelling
  www.assessmentgroup.org

  13. Summary and Conclusions
  • States appear to be making good progress in moving to online assessment. The future may finally be getting here.
  • Almost all new assessment implementations are online
  • Roughly 2/3 of surveyed states are doing significant testing online
  • However, states are still extremely concerned with their ability to implement full-scale online assessment (their number 1 issue)
  • State funding for technology is not forthcoming; only CA has received significant funding for technology
  • Will other funding sources (E-rate, philanthropic sources) fill the gap?
  • Strange things seem to happen when implementing online assessment (OLA) at high volumes. Recent experience in online testing is not comforting (e.g., OK, MN, IN, KY), although the consortia field tests went well.
  www.assessmentgroup.org

  14. Summary and Conclusions
  • AI (artificial intelligence) scoring is another critical element in test affordability, but its current use in states is limited, and we hear efforts to score new item types have been disappointing.
  • The vendor/customer partnership required to make AI scoring successful has been slow to develop
  • Most remaining state assessment departments that we surveyed were willing to stick with their consortium
  • Many like the proposed rigor of the tests
  • Most want an integrated system (SBAC) and one system for both elementary and high school grades (both), but others are moving to a mixed approach
  www.assessmentgroup.org

  15. Summary and Conclusions
  • Contingency planning is more prevalent than in prior surveys, which also provides some comfort to states
  • Most states' plan is to use their existing test, augmented with CCSS items, if an alternative assessment is needed for 2014-2015
  • However, the decision to stay or leave is a political decision, not an assessment one
  • States appreciate getting the data from our survey and having a broader perspective on the issues affecting many states
  • ASG thanks states for their participation in the survey program and welcomes ideas to make the process faster and more efficient, as well as suggestions for new topics
  • The next survey kicks off in July!
  www.assessmentgroup.org

  16. Contact Info
  John F. Olson, 617-965-1490, jolson@assessmentgroup.org
  Barry Topol, 210-859-9920, btopol@assessmentgroup.org
