
Welcome! 2012 NCES Common Core of Data Non-Fiscal Coordinator Training





Presentation Transcript


  1. Welcome! 2012 NCES Common Core of Data Non-Fiscal Coordinator Training Patrick Keaton and Robert Stillwell, NCES

  2. Overview • What is CCD and how is it used • Recent data issues • CCD Process (Collection, Editing, Publishing) • Other topics • Determining the values of unknown data • Virtual schools • Awards

  3. The Common Core of Data • NCES' annual collection of basic data on public school systems in the states and state-level jurisdictions in the U.S. and affiliated land areas. • CCD nonfiscal information is obtained from state-level education officials through the EDEN/EDFacts data collection system.

  4. CCD Surveys/Components There are 8 components in the Common Core of Data. Nonfiscal • School, Agency, and State Universe files • Agency and State Dropout and Completer files Fiscal • Agency and State Universe files • Teacher Compensation Survey

  5. Where Can I Find the CCD? CCD Home Page: nces.ed.gov/ccd On the NCES page, you can also find data tools such as ELSi, BAT, and the locators.

  6. CCD Mission • NCES is committed to providing the public with a complete and accurate accounting of the U.S. Public Elementary and Secondary Education System • Rather than having data users contact each state independently, we would rather have one place to access all the data • Our goal is to provide accurate and consistent data in a timely manner.

  7. How are CCD Data Used? • Used by policymakers, news agencies, and the general public to understand the state of the nation's public education system • Cited in many nationally recognized reports, such as the Condition of Education, and in various news articles • Used as a sampling frame for surveys to limit the burden on schools and districts By creating a robust and accurate clearinghouse of data, users can access the information they need.

  8. How are CCD Data Used? • Other organizations use it to aid schools: • E-Rate, REAP, Title I • Private organizations (Verizon, Gates Foundation) • Federal offices (FAFSA, Homeland Security) More accurate data = better decisions.

  9. Recent Data Issues • Recently, data errors were identified that called into question the credibility of these school- and agency-level data. NCES has worked with SEAs to identify, explain, and correct unlikely data anomalies. The updated 2009-10 and 2010-11 data files will be released in the coming weeks. • Due to the increased use and scrutiny of these school- and LEA-level data, NCES has changed its policies governing the review and publication of CCD data. Major data anomalies must be explained for inclusion. • In the future, CCD will include full edit specifications and will flag the results of those edits on our data products.

  10. The CCD Process • There are three major steps in the CCD Process • Collection • Editing • Publishing

  11. The CCD Process • Key Players • National Center for Education Statistics (NCES) – CCD data stewards • EDFacts – collects the CCD non-fiscal data and sends data extracts to the U.S. Census Bureau • U.S. Census Bureau – edits the CCD non-fiscal data and works with EDFacts and CCD Coordinators to verify data and revise errors • Census and other contractors – work with NCES to publish the CCD data

  12. The CCD Process - Collection • States submit CCD data to EDFacts. • Please review your Match, Summary, and Edit Reports in EDFacts. • For the most part, Census edits match EDFacts edits. • These edits are designed to show you potential errors as soon as you submit the data. We want to show you problems in the data that you may have missed.

  13. The CCD Process - Collection • Every year, NCES, Census, and EDFacts meet to discuss the edits that you see. We try to determine which edits need to be added, removed, or adjusted. • We realize the burden on states to provide ED with quality data. We want to make sure that your time is not wasted on faulty error reports.

  14. Schedule for 2011-12 CCD-Related Data

  15. The CCD Process - Collection • Importance of timely data • It is critical that states meet the deadlines to the best of their ability • If a state cannot meet a deadline, please inform EDFacts and Census so they can plan accordingly • Continue to update data in EDFacts as more accurate and complete data become available

  16. The CCD Process – Editing • CCD Mission: NCES is committed to providing the public with a complete and accurate accounting of the U.S. Public Elementary and Secondary Education System • Error reports should be used to help guide the cleaning of both NCES CCD data and SEA data • Edits are made when the SEA cannot or will not either provide a full explanation for an identified anomaly or update the data with more accurate values • Imputations are made at the state level

  17. The CCD Process – Editing • Current goals: • Reduce the number of false-positive error identifications • Create edits to check each variable at every level of reporting and aggregation • Evolve edits to be increasingly sophisticated and flexible • Match edits to the collection cycle • Insist upon, capture, and report an explanation for every identified error • Suppress or impute any data that cannot be explained • Future goals: • Put results of error testing alongside the data (especially preliminary data)

  18. The CCD Process – Error Reports • Importance: These are the main method to inform you of the potential errors we have found in your data. • False Positives: New edits are designed to significantly reduce the number of false positive errors identified • Use moving averages • Control for quantifiable and unquantifiable attributes • Continuous improvement – we consider the evolution of our edits key to moving forward • Now broken up by EDEN files • How else can we make these more digestible? • Include sub-reports broken down by LEA?

  19. The CCD Process – Error Tests • Error reports are generated by submitting data to a number of tests • Match Tests: Status checks, school types, characteristics • Data Validity Tests: Do these data conform to established parameters (numeric, string)? Are addresses real? Is an address duplicated too many times? • Internal Validity Checks: • Grades Offered vs. Reported Enrollment • Subtotals vs. Detail • Teachers vs. Membership (a school with membership must have teachers) • Cross-Level Validity Checks: • School vs. State, LEA vs. State, School vs. LEA • Longitudinal Validity Checks: • Current vs. Prior Year • 5-Year Checks

  20. The CCD Process – Error Tests: Match Tests • Match Tests are the most fundamental tests performed • A school must close; it cannot simply disappear • A school must open (new, added, future, re-opened); it cannot simply appear • Decisions to close a school should not be taken lightly • You lose the longitudinal track record of the school • You become unable to test the effectiveness of programs or practices • A school is not a name or a building • When a school changes so substantially that it is no longer comparable to prior years, it should close • School Type, Magnet Status, Charter Status, and Grades are all factors to weigh when deciding to continue or close a school

  21. The CCD Process – Error Tests: Data Validity Tests • Simple data type tests • Numeric data in numeric fields • String data in string fields (length considerations) • Address testing • We require a valid location address for a school (no "123 Fake Street") • LEA addresses for all associated schools • Census analysts will look to outside sources to update any addresses that appear inaccurate • Census will inform all states of any inaccurate addresses found and the addresses used in replacement

  22. The CCD Process – Error Tests: Internal Validity Tests • Sum-checks: Do the details sum to the totals and subtotals? • Grades Offered vs. Membership • If a grade is offered, something should be reported for that grade-level subtotal • Grades Offered determine grade span and school level, and are the cause of a large portion of the errors identified in a school or LEA • Membership and Teachers • If a school has students, it must have teachers • Students and teachers should be reported at the school where they sit/work • It makes no difference who holds the teacher's contract; a teacher is a teacher

  23. The CCD Process – Error Tests: Internal Validity Tests (cont.) • Staff • All staff counts are FTE counts • LEAs must have LEA staff • LEAs that operate schools with students must have guidance counselors, librarians, support staff, and administrators
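The internal validity bullets above can be sketched as a simple check routine. This is an illustrative sketch only: the field names (`membership`, `grade_counts`, `teachers_fte`) are assumptions, not the actual EDFacts data-group names, and the real CCD edits are far more extensive.

```python
def internal_validity_errors(school):
    """Illustrative internal-validity checks; `school` is a dict whose
    keys are assumed names, not actual EDFacts data groups."""
    errors = []
    # Sum-check: grade-level details should sum to reported membership
    if sum(school["grade_counts"].values()) != school["membership"]:
        errors.append("grade counts do not sum to membership")
    # Grades offered vs. membership: each offered grade needs a subtotal
    for grade in school["grades_offered"]:
        if grade not in school["grade_counts"]:
            errors.append("grade %s offered but no count reported" % grade)
    # A school with students must have teachers (FTE counts)
    if school["membership"] > 0 and school["teachers_fte"] <= 0:
        errors.append("membership reported but no teachers")
    return errors
```

A record reporting 45 students across two grades but zero teacher FTEs would, under this sketch, trigger only the membership-without-teachers error.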

  24. The CCD Process – Error Tests: Cross-Level Validity Tests • Reporting at the school, LEA, and state levels • It is nearly impossible to have more of {students, teachers, staff} at the school or LEA level than at the state level • Comparisons between the school and the LEA are less valid due to differences in data definitions • Where a student sits and a teacher teaches vs. who pays for them • Some comparability is still expected • School-level charter status flags vs. LEA type

  25. The CCD Process – Error Tests: Longitudinal Validity Tests • Current Year vs. Prior Year Percent Change • Triggers if CY differs from PY by more than 10% at the SEA level • These checks are being phased out • New methodology: 5-year stability comparison • The average amount of differentiation over all possible comparisons among the 4 prior years is computed (PYD) • The average amount of differentiation between the current year and the 4 prior years is computed (CYD) • Trigger if CYD > [3] * PYD and CYD > [10] • Both the targeted count (usually a subtotal) and an applicable ratio must trigger for the error to move forward • Tolerances are based on the previous year across all states so that only the biggest outliers are identified
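As a sketch, the 5-year stability comparison above could be computed as follows. The tolerances 3 and 10 are the bracketed defaults from the slide, and the function checks a single series only; per the slide, a real edit moves forward only when both the targeted count and an applicable ratio trigger.

```python
from itertools import combinations

def stability_flag(prior_years, current_year, ratio_tol=3.0, abs_tol=10.0):
    """Sketch of the 5-year stability comparison: `prior_years` holds the
    four prior-year values; tolerances are the slide's bracketed defaults."""
    # PYD: average difference over all pairwise comparisons of prior years
    pairs = list(combinations(prior_years, 2))
    pyd = sum(abs(a - b) for a, b in pairs) / len(pairs)
    # CYD: average difference between the current year and each prior year
    cyd = sum(abs(current_year - p) for p in prior_years) / len(prior_years)
    # Trigger only if CYD exceeds both the relative and absolute tolerances
    return cyd > ratio_tol * pyd and cyd > abs_tol
```

For a count that has hovered around 100 across four prior years, a jump to 250 would trigger (CYD of roughly 150 against a PYD of about 2), while a move to 103 would not.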

  26. The CCD Process – Error Tests: 5-Year Stability Comparison • Membership & student-teacher ratio • Teachers & student-teacher ratio • Non-white students & percent non-white • Male students & percent male • Elementary students & percent elementary • Free lunch eligible & percent eligible • Administrators & administrator-to-school ratio • Support staff & support-staff-to-student ratio • Guidance counselors & counselor-to-student ratio • And on, and on, and on…

  27. The CCD Process – Error Reports • What should you do? • Respond • Ask districts to provide an explanation • Check your coding • If you receive too many errors of any one type, there is likely a systematic problem that either needs to be resolved or can be explained • What if the district has already validated the data? • Ask them to provide a full explanation • Send them the error reports to respond to • Inform them that without an explanation their data will not be accepted and will not be included in the CCD • What if the data are not in my system? • Report them as "unknown," "missing," or "uncollected."

  28. The CCD Process – Error Reports • What will CCD do? • Accept and document valid and comprehensive explanations • Ask for more information/resubmissions • Reject data that has been identified but cannot be explained • Suppress, flag or impute all rejected data items • The CCD will no longer report data that doesn’t meet our quality standards • This is a moving target. As our edits improve so too will our standards.

  29. The CCD Process – Editing: Moving Forward • Edits will be run as data come in, and reports will be relayed to data submitters within a day of submission • Systems are being set up to facilitate this process • Edit reports will be broken down by data group and by LEA to help you communicate with SEA and LEA staff • Full reports and individual LEA reports will be provided to SEA Coordinators for review and (if necessary) dissemination to LEAs • Data will be released at pre-determined times (quarterly) • Error flags and explanations will accompany each data release • If your data are not in, we will not hold up other states' data releases • Errors can be resolved, and resolutions will be reported on an ongoing basis

  30. The CCD Process - Publishing • Once NCES has received the data from Census, NCES can publish the data. • The data is published through: • Data files • Data tools • Reports

  31. The CCD Process – Publishing - Current • Typically, NCES has released its main files in three versions: • Preliminary – Subset of the main data file • Mainly names, addresses, phone numbers, NCES IDs, and unedited membership and staff counts • Released only at the school and agency levels • Final – Full data files at all three levels • One Year Later – Edited state-level file to correct issues

  32. The CCD Process – Publishing - Future • NCES will begin publishing the data in a different way • Preliminary • State submitted data • Full data file • Minor edits with flags indicating potential data issues • Provisional – Full data files at all three levels – This file will be updated quarterly • Final – LOCKED data files

  33. Other CCD data issues

  34. Determining the Values of Unknown Data • What are the issues? NCES has to interpret the value of data that are not submitted to EDFacts. We have to decide whether the value is intended to be: • Zero • Missing • Not Applicable

  35. Determining the Values of Unknown Data What is the difference between Missing, Not Applicable, and 0? • Missing indicates that the situation exists but you are unable to report the data at this time • Not Applicable means that the value does not apply to that school/agency (e.g., 9th graders at a K-6 school) • Reporting zero indicates that the value is a true reported 0.

  36. Determining the Values of Unknown Data • What does NCES do to resolve the issue(s)? • NCES has developed various business rules to determine the value. • For example, if the school doesn’t offer second grade, then the unreported number of second graders would be not applicable. • If the entire state has unreported data (magnet flags for example), then we need to determine if it is missing or not applicable.
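The unreported-second-grade example above can be sketched as a tiny decision rule. The function name, grade codes, and return labels here are illustrative assumptions, not actual CCD business-rule codes.

```python
def interpret_unreported(value, grade, grades_offered):
    """Sketch of the business rule for an unreported grade-level count;
    names and return labels are illustrative, not actual CCD codes."""
    if value is not None:
        return value              # a true reported count, including zero
    if grade not in grades_offered:
        return "NOT_APPLICABLE"   # the school does not offer this grade
    return "MISSING"              # grade is offered but no count was given
```

Under this sketch, an unreported grade 2 count at a school that does not offer grade 2 resolves to not applicable, while the same gap at a school that does offer grade 2 resolves to missing.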

  37. Determining the Values of Unknown Data • What can states do to help prevent/resolve the issue(s)? • Use the business rules to lessen your burden: • If you report that a school offers a grade, data are expected for that grade. • Don't report more grades than a school actually offers (e.g., vocational schools offering K-12). • If you report zero for a subtotal, you do not need to report the details.

  38. Determining the Values of Unknown Data • The broken record… schools with membership but no teachers or staff • The purpose of the teacher and staff counts at the school level is to inform data users how many persons are employed in the instruction of the student population. If, for whatever reason, you are unable to determine the number of teachers/staff at a given school, please report the count as missing/unknown. Do not report these data as zero or leave the count out entirely.

  39. Determining the Values of Unknown Data • Questions?

  40. Virtual Schools • What are the issues? • NCES is working on collecting data on whether a school is a virtual school • Since there are many different types of virtual schools, we want to make sure we ask the right questions.

  41. Virtual Schools • What are the questions? • Does this school offer virtual courses? • How many students attend these courses? • How many students attend via virtual courses exclusively? • What suggestions would you give us about collecting these data?

  42. Virtual Schools • Questions?

  43. Reconstituted schools • Starting with the 2010-11 school year collection, the reconstituted flag has been added to the N029 File Specs • Definition – An indication that the school was restructured, transformed, or otherwise substantially changed as a consequence of the state’s accountability system under the ESEA or state law, or as a result of a School Improvement Grant (SIG), but is not recognized as a new school for Common Core of Data (CCD) purposes. • NCES will add this flag beginning with the 2010-11 CCD files.

  44. Reconstituted Schools • Questions?

  45. Awards • Criteria for awards: • The following are the four criteria used for the 2010-11 awards: • 1. Submit all 10-11 data groups that comprise CCD (directory, grades offered, FRL, LEP, membership, staff, and CCD school) by April 30th. • 2. Submit all 10-11 data groups that comprise CCD (directory, grades offered, FRL, LEP, membership, staff, and CCD school) by May 31st.

  46. Awards • Criteria for awards: • 3. Submit 09-10 data (including dropout/completer data) that are of good quality (no more than one or two missing items). • 4. Work cooperatively with Census to provide necessary corrected data. • States that met all four criteria received a gold award, and states that met three criteria received a silver award.

  47. Contact Information
