
The Importance of Quality Education Data

Marilyn Seastrom, National Center for Education Statistics. National Forum on Education Statistics, 2012 Summer Forum Meeting.


Presentation Transcript


  1. The Importance of Quality Education Data Marilyn Seastrom, National Center for Education Statistics, National Forum on Education Statistics, 2012 Summer Forum Meeting

  2. Data-Driven Decision Making is an ED Priority Secretary Duncan on the importance of data: “we need robust data systems to track student achievement and teacher effectiveness.” People “. . . need to understand data.” “We must tell the truth and we must tell it clearly. We cannot communicate an undecipherable code.” “There's so much opportunity for growth and progress in this area. . . . It takes courage to expose our weaknesses with a truly transparent data system. It takes courage to admit our flaws and take steps to address them.”

  3. Data-Driven Decision Making The Elementary and Secondary Education Act (ESEA) calls for public reporting of results from state assessments of student achievement by subject and grade, overall and for student subgroups: gender, race and ethnicity, English proficiency status, migrant status, disability status, and economic status. These data and others at the school, district, and state levels form the basis for monitoring accountability.

  4. Data-Driven Decision Making A 2012 search of the ED web site for “data-driven decision making” finds over 2,000 entries on the use of data: at the local level, for instructional decisions, to measure accountability, to award formula grants for literacy programs, to ensure disadvantaged students receive maximum benefits, for improvement of neighborhood schools, etc.

  5. Data-Driven Decision Making Education data are used in determining the allocation of program funds: Special education, Adult education, Title I, School Improvement Grants, Race to the Top, Early Learning Challenge, Investing in Innovation, Teacher Incentive Fund, and Promise Neighborhoods.

  6. Data-Driven Decision Making The increased focus on education data is also evident in the press: graduation and dropout rates, performance on international assessments of mathematics and science, and performance gaps between subgroups of students on academic assessments.

  7. Data-Driven Decision Making The use of education data to inform discussions among policy-makers, in the press, and within the general public continues to grow. As a result, the increased emphasis on the use of data has shined a bright light on the quality of the data. Thus the emphasis on “The Importance of Quality Education Data.”

  8. What Is Data Quality? Information Quality Guidelines are required of ED and other agencies subject to the Paperwork Reduction Act. The guidelines are intended to ensure and maximize the quality of information disseminated by Federal agencies, where quality is measured by the objectivity, utility, and integrity of that information, and to ensure that influential data meet higher standards of reproducibility.

  9. Information Quality Guidelines Objectivity involves a focus on ensuring accurate, reliable, complete, and unbiased information; Utility refers to the usefulness of the information to its intended users, including reproducibility and transparency of the information; and Integrity refers to the security of information, ensuring that the information is not compromised through corruption or falsification.

  10. NCES Mission and Duties • To collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations. Such data must • Meet the highest methodological standards; • Be objective, secular, neutral, and nonideological and free of partisan political influence and racial, cultural, gender, or regional bias; and • Be relevant and useful to practitioners, researchers, policymakers, and the public.

  11. NCES Mission and Duties • The Statistics Commissioner may, as appropriate, use information collected from, among others, States and local educational agencies, including information collected by States and local educational agencies for their own use. • NCES may establish a national cooperative education statistics system for the purpose of producing and maintaining, with the cooperation of the States, comparable and uniform education information and data.

  12. National Forum on Education Statistics • Created in 1989 to collaboratively pursue improvements in our education data system. • Has published 40 reports on a wide range of topics: • Data handbooks and education standards and codes, • Education facilities and technology, • Privacy and safety/security, and • Researcher access to education data.

  13. Forum in the Early Years • The first Forum report, in 1990: “A Guide to Improving the National Education Data System” • Credo: Good data help to make good policies. • Objective: To provide adequate, timely, useful, accurate, and comparable data to education policymakers at all levels. • Focused on recommendations for reporting at the state and national levels.

  14. Forum Now • The original credo and objective are still relevant • Credo: Good data help to make good policies. • Objective: To provide adequate, timely, useful, accurate, and comparable data to education policymakers at all levels. • To serve current data needs, the focus must expand to include data at all levels: school, district, state, and national.

  15. Data Review Procedures at ED • State data coordinators submit data through EDFacts. • Edit reports are generated from initial EDFacts edits. • Additional edits by Census/NCES flag potential errors: • Potential errors in State data and related local data are Critical; • Other potential errors in local data are non-Critical. • For Critical errors, States must confirm or correct.

  16. Time for a Change: Spring 2012 • Current edits • Do not take school or district size into account, • Compare the current year with the prior year to identify differences of +/-20%, and • Yield too many false positives. • ED initiated a contract vehicle to redesign the CCD data editing procedures.
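The +/-20% prior-year rule described in the bullets above can be illustrated with a short sketch (the function name and the handling of a zero prior-year count are my own assumptions, not NCES code):

```python
def flag_prior_year_edit(current, prior, threshold=0.20):
    """Old-style edit check: flag a count that differs from the prior
    year by more than +/-20 percent, regardless of school size."""
    if prior == 0:
        # Assumption: any nonzero count following a zero year is flagged.
        return current != 0
    return abs(current - prior) / prior > threshold

# A small school growing from 10 to 13 students (a 30% change) is
# flagged, even though a 3-student swing is routine -- a likely
# false positive.
print(flag_prior_year_edit(13, 10))      # True
# A large school growing from 1,000 to 1,150 (15%) is not flagged.
print(flag_prior_year_edit(1150, 1000))  # False
```

The check's indifference to school size is exactly why it yields so many false positives for small schools.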

  17. Time for a Change: Spring 2012 • May 4: U.S. News and World Report released “America’s Best High Schools 2012,” with gold, silver, and bronze awards to close to 5,000 top-performing schools. • Based on CCD data and other sources.

  18. Time for a Change: Spring 2012 • May 9: the Las Vegas Sun reported that the 13th-place ranking of Green Valley High School (GVHS) was based on incorrect data. • The rankings included 2009-10 CCD data with 477 students and 111 teachers (a 4 to 1 ratio), and an AP exam pass rate of 100 percent. • School data show 2,850 students and 111 teachers (a 24 to 1 ratio), and an AP exam pass rate of 64 percent. • Questions were also raised about data for Clark County.

  19. A Call to Action NCES received a late-afternoon call from a Las Vegas Sun reporter on May 8th. The next day, reporters inquired about the data submission and data verification procedures. ED initiated an investigation. The first submission was correct for GVHS, but another school triggered a critical error. The wrong file was uploaded to correct that error, incorrectly changing data for GVHS and six Clark County schools.

  20. A Call to Action ED’s investigation found that neither set of edits was repeated on the resubmission: a programming error affected the initial edit, and time constraints, along with a decision based on incorrect information, prevented the additional Census/NCES edits. Additional problems were identified in California: errors in grade-specific school-level enrollment counts fell into the non-critical edit set and were not pursued. May 11: Why are bad data still on the NCES web?

  21. NCES Response Identified errors were set to missing in the online data tool and posted to an errata report, with a letter of explanation from the NCES Commissioner. NCES reviewed school and school district enrollment, teacher counts, pupil to teacher ratios, grade 12 enrollments, and free and reduced-price lunch data for 2009-10 and 2010-11, first for the 5,000 ranked schools and then for all schools and districts. New editing strategies were applied, and identified errors were set to missing and posted to the errata list.

  22. NCES Response Error reports were generated and sent to States with a request for corrections or explanations of why the identified values are correct. A quick turnaround was requested to minimize the delay in issuing corrections to the 2009-10 data and in releasing the 2010-11 data. Additional submissions will be incorporated into future revisions.

  23. New Edit Procedures Compute the average variation over the prior 4 years, based on the differences between each year and the other 3 years. Compute the average variation between the target year and the four preceding years: the average of the differences between the target year and each of the 4 prior years. To identify potential errors, compare the average variation for the target year to that for the prior years relative to established parameters: for example, more than 3 times the prior-year variation and greater than 10.
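One reading of the steps above can be sketched as follows; the pairwise-absolute-difference interpretation and the function names are my assumptions, and the 3x and 10 thresholds are the slide's example parameters, not necessarily the production values:

```python
from itertools import combinations

def avg_pairwise_variation(values):
    """Average absolute difference between each year and the others."""
    pairs = list(combinations(values, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

def flag_multi_year_edit(prior_years, target, ratio_param=3.0, abs_param=10.0):
    """Flag the target year when its average difference from the 4
    prior years is both more than `ratio_param` times the baseline
    variation among those years and greater than `abs_param`."""
    baseline = avg_pairwise_variation(prior_years)
    target_variation = sum(abs(target - v) for v in prior_years) / len(prior_years)
    return target_variation > ratio_param * baseline and target_variation > abs_param

# Stable enrollment near 500 followed by an implausible report of 120:
print(flag_multi_year_edit([495, 502, 498, 505], 120))  # True
# A plausible report of 510 is not flagged:
print(flag_multi_year_edit([495, 502, 498, 505], 510))  # False
```

Because the baseline is taken from each school's own history, the check scales naturally with school size, unlike the flat +/-20% rule it replaced.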

  24. New Edit Procedures Analyze changes in the ratios that use the counts: Membership & student-teacher ratio; Teachers & student-teacher ratio; Free lunch & free lunch to membership ratio; Grade 12 count & percent grade 12 students. BOTH the ratio and the count must be identified as potential errors. Establish parameters based on the largest outliers to minimize the number of false positives.
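A hypothetical sketch of the paired count-and-ratio rule, using GVHS-style numbers from slide 18; `variation_flag` and its thresholds are illustrative assumptions, not the published edit specification:

```python
def variation_flag(history, value, rel=3.0, floor=0.0):
    """Flag `value` when its average distance from `history` exceeds
    both `rel` times the baseline spread of `history` and `floor`.
    (Illustrative thresholds, not the actual NCES parameters.)"""
    n = len(history)
    baseline = sum(abs(a - b) for a in history for b in history) / (n * n)
    deviation = sum(abs(value - h) for h in history) / n
    return deviation > rel * baseline and deviation > floor

def flag_membership_with_ratio(member_hist, member_now, teacher_hist, teacher_now):
    """Both the membership count AND the student-teacher ratio it feeds
    must look anomalous before the record is a potential error."""
    count_flag = variation_flag(member_hist, member_now, floor=10.0)
    ratio_hist = [m / t for m, t in zip(member_hist, teacher_hist)]
    ratio_flag = variation_flag(ratio_hist, member_now / teacher_now, floor=2.0)
    return count_flag and ratio_flag

members, teachers = [2840, 2860, 2855, 2845], [110, 112, 111, 111]
# A GVHS-style collapse to 477 students trips both checks:
print(flag_membership_with_ratio(members, 477, teachers, 111))   # True
# Modest growth to 2,900 trips only the count check, so it passes:
print(flag_membership_with_ratio(members, 2900, teachers, 111))  # False
```

Requiring both flags is what keeps ordinary year-to-year drift from swamping reviewers with false positives.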

  25. Next Steps Feedback to date: with few exceptions, the potential errors identified using the new edits were either corrected or attributed to the introduction of virtual learning opportunities. Refine and expand the new editing approaches. Add the identification of schools with virtual learning students to the data collection. Solicit additional input from data providers in schools, districts, and states.

  26. Next Steps State and national data are only as good as the local data they are built from. To provide good data to help make good policies, it is imperative that we all work together to produce high-quality data at all levels.

  27. NCES Contact Information NCES website: http://nces.ed.gov/ NCES newsflash: sign up at http://ies.ed.gov/newsflash/ CCD website: http://nces.ed.gov/ccd/ CCD Errata Report: http://nces.ed.gov/ccd/elsi/ Marilyn Seastrom Marilyn.Seastrom@ed.gov (202) 502-7303 Thank you!
