
Verifying Yearend Accountability Data

Verifying Yearend Accountability Data, by Angie Crandall, MVECA. Presentation goal: to provide information and insight about reading and interpreting accountability-related ODE/EMIS data verification reports.



Presentation Transcript


  1. Verifying Yearend Accountability Data by Angie Crandall MVECA OEDSA

  2. Presentation Goal • To provide information and insight about reading and interpreting accountability-related ODE/EMIS data verification reports. • NOTE: ODE/EMIS changes might occur in the future that could affect the content of this presentation. • The ODE/EMIS web site and official ODE communications will contain the most accurate, updated information. OEDSA

  3. Objectives To learn about: • Resources that explain the rules for the Ohio Accountability System and some important Accountability concepts • Reading and interpreting Yearend ODE/EMIS reports used to verify EMIS data that will appear on the Local Report Card OEDSA

  4. Ohio Accountability System Rules • Why do I need to know the rules? • Law and policies are implemented in calculations performed on data reported by districts in EMIS. • Knowing these rules helps to identify reporting issues/errors. • Who is responsible for doing this? • A tremendous amount of data is reported in EMIS; is it possible for one person to track all of it? • It is important [for someone in your district] to know these rules and monitor for changes, since these rules can be adjusted based upon changes in law/policy or clarification. OEDSA

  5. Ohio Accountability System Rules (cont’d) • ODE Office of Accountability web site • http://www.ode.state.oh.us/accountability/ • LRC/Accountability information by year • http://www.ode.state.oh.us/accountability/lrc.asp • Worth book-marking! • Can be used to research information about the Ohio Accountability System from prior years, learn what is happening in the current year, and what is planned for future years. OEDSA

  6. Ohio Accountability System Rules (cont’d) • http://www.ode.state.oh.us/accountability/lrc.asp OEDSA

  7. Ohio Accountability System Rules (cont’d) • What do I need to know about the rules? • For example, it would be helpful to know: • What the general terms mean; • E.g. What is “AYP”? What is “full academic year”? • For 2005-06, which subject/grade level tests are included: • As indicators; • In the performance index, and; • In AYP determinations. • Are there any changes to previously established calculations or “business rules” for 2005-06? OEDSA

  8. Ohio Accountability System Rules (cont’d) • Which subject/grade level tests are included: • as indicators; • in the performance index; • in AYP determinations. How district ratings are determined. Changes to previously established calculations or “business rules”. Information for 2005-06. OEDSA

  9. Ohio Accountability System Rules (cont’d) • Changes in Accountability System business rules/calculations for 2005-06 can be found at: • http://www.ode.state.oh.us/accountability/changes0506.asp • Some examples of changes include: • MR/DD students' scores will count in the resident district totals. • The full academic year (FAY) definition is continuous enrollment from October Count Week through March 19th. This is the new definition of the “Majority of Attendance IRN” (MOA IRN), so the FAY calculation will be done by local software systems and sent to EMIS as the MOA IRN. • Other important changes related to the OGT and LRC calculations can be found by following the above link. OEDSA

  10. Testing Rules & Resources • Ohio Statewide Testing Program Rules Book • http://www.ode.state.oh.us/proficiency/Rules/Rules_Book_1_31_06.pdf • Monthly communications from the Offices of Curriculum, Instruction, and Assessment • http://www.ode.state.oh.us/proficiency/Monthly_Comms/default.asp • Achievement & OGT Test Score Conversion Tables • http://www.ode.state.oh.us/proficiency/Technical_Data/Default.asp • (See Statistical Summaries) OEDSA

  11. EMIS Resources for 2005-06 • FY2006 EMIS manual • http://www.ode.state.oh.us/EMIS/documentation/manual/2006 • FY2006 EMIS changes • http://www.ode.state.oh.us/EMIS/changes/ • LRC/Accountability prototypes for 2005-06 • http://www.ode.state.oh.us/emis/documentation/default.asp# • Testing Record Valid Combinations • http://www.ode.state.oh.us/emis/OtherResources/default.asp#testingcombinations • Watch the EMIS web site for updated 2005-06 LRC and Accountability Report Reference Guides. (These are excellent resources!) • Also, watch for EMIS Newsflashes. OEDSA

  12. ODE Accountability-related Reports • The following three ODE-generated accountability reports produced at year-end, are designed to be used together: • Accountability Workbooks • LRC Workbooks • Where Kids Count (WKC) files OEDSA

  13. How can I use the Accountability workbooks? • To identify for districts/community schools and buildings: • The number of indicators made; • The performance index score; • Whether or not adequate yearly progress was made, and; • The building/district’s report card designation. OEDSA

  14. Design of the Accountability Workbook • This report provides an overview of district/building accountability status/statistics. • For example: OEDSA

  15. How can I use the LRC workbooks? • They can be used to: • Verify student counts on statewide tests; • Validate the performance level counts; • Pinpoint which student test records need to be verified individually, and; • Verify data that will be publicly released. • NOTE: Most of the information on the LRC workbook that does not appear on the Report Card will be available to the public on the ODE web site. OEDSA

  16. Design of the LRC Workbook • The LRC Workbook contains disaggregate data that appear on the Accountability Report and other information that will appear on the Local Report Card. • For example: OEDSA

  17. How can I use the WKC files? • This file includes student level data that can be used to verify test data for students included in the % proficient calculations on the Accountability Worksheet. • Specifically, to verify student results included in: • The “% proficient” calculation for determining whether the building/district made AYP (for all students and by subgroup); • The “% proficient” calculation for state indicators, and; • The attendance rate. OEDSA

  18. More about the WKC File • This is a district-level file that consists of several columns of data, including: • Student identifying information; • Subgroup membership; • Where kids count information; • Assessment information (by subject for all tested subjects), and; • District Alternate Assessment Cap Information (included or not). OEDSA

  19. More about the WKC File (cont’d) • Before using the WKC file to verify data, look to see if any students were not included in calculations due to the District Alternate Assessment Cap. • This is column 4 on the Acct-Statewide_Indicators(D) tab in the District Accountability Workbook. • If the district did not exceed the alternate assessment cap, the columns in the WKC file pertaining to the state and federal caps should be “N” (no). OEDSA
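The cap check above can be sketched in a few lines of Python. This is an illustrative sketch only: the column names (`state_cap_excluded`, `federal_cap_excluded`) and the sample SSIDs are assumptions, not the official WKC file layout.

```python
# Sketch of the cap check: if the district did not exceed the alternate
# assessment cap, the state/federal cap columns should be "N" for every row.
# Column names are assumptions, not official WKC headers.
wkc_rows = [
    {"ssid": "AB1234567", "state_cap_excluded": "N", "federal_cap_excluded": "N"},
    {"ssid": "CD7654321", "state_cap_excluded": "N", "federal_cap_excluded": "N"},
]

flagged = [r["ssid"] for r in wkc_rows
           if r["state_cap_excluded"] != "N" or r["federal_cap_excluded"] != "N"]

print("Students excluded by a cap:", flagged)  # [] when the cap was not exceeded
```

Any SSIDs that do turn up in `flagged` are the students to set aside before reconciling the remaining WKC records against the workbook counts.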

  20. More about the WKC File (cont’d) • The WKC files include more detailed information than the WKC reports; e.g. performance levels, scores. • These can also be imported into Excel, or another program, for detailed verification/analysis. • Information included: • Data reported by your district during the current school year, for students who took one or more of the statewide subject/grade level tests, and students in untested grade levels (for the attendance rate.) • Data reported by other districts that are educating residents of your district through a special education cooperative agreement. OEDSA
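Importing the WKC file into a program other than Excel, as the slide suggests, might look like the following sketch. The header names and sample rows are invented for illustration; the real file's layout should be taken from the current LRC/Accountability Report Reference Guides.

```python
import csv
import io

# Sketch of loading a WKC extract for detailed verification/analysis.
# The header names below are illustrative assumptions, not the official layout.
sample = io.StringIO(
    "ssid,grade,math_level,attend_home_status\n"
    "AB1234567,07,Proficient,*\n"
    "CD7654321,07,Basic,2E\n"
)
rows = list(csv.DictReader(sample))
print(len(rows), "student records loaded")
```

Once loaded this way, the same filters and sorts used in Excel (by subgroup, performance level, or attending/home status) can be applied programmatically and repeated each year.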

  21. More about the WKC File (cont’d) • It is helpful to know which students are included in each calculation. • An important filter is “attending/home status”. • NOTE: It is possible for students that are NOT included in the “% proficient” calculation to be included in the WKC file. • For example, students with attending-home status “2E” (open-enrollment students educated elsewhere) may be included in the WKC file for a district. Those students would NOT be included in the “% proficient” calculation, but would still be included in the file. OEDSA

  22. More about the WKC File (cont’d) • For 2006, because of changes in the full academic year definition, there will be fewer columns in this file. • The full academic year will be based upon the Majority of Attendance IRN. • Specifically, the following columns will no longer be included in that file: • March_IRN • Oct_IRN OEDSA

  23. More about the WKC File (cont’d) • ODE uses the data reported for each student to determine where a student counts using the WKC business rules, and the Full Academic Year criteria. • Errors can happen when entering, loading, and/or reporting data, so it is very important for someone closest to the data to verify its accuracy. • It is important to understand what each data element means and how it is being used, to determine whether or not the data are reported accurately. OEDSA

  24. Example #1 Let’s say a district did not meet the standard for the 7th grade math indicator. What can they do? OEDSA

  25. 1. Look at the state indicator data. • Check to see if the district met the indicator. • Where to look: • Report: District Accountability • Tab: Acct-Statewide_Indicators(D) • Row: 7th Grade Math Indicator, 18th row • Column(s): 5-7 OEDSA

  26. 2. Review the data behind the state indicators. If the indicator was missed, check to see by how many students. (Look at Columns 1 & 2 in the same row, on the same worksheet.) OEDSA

  27. 3. Verify whether the “District Alternate Assessment Cap” has been exceeded. • Look at Columns 3 & 4 on the Acct-Statewide_Indicators(D) tab in the District Accountability Workbook • If the district exceeds the cap, ODE excludes students from the # of students considered to be at or above the proficient level, until the district is at or below the cap. If these are “0”, the district did not exceed the cap. OEDSA

  28. 4. Find individual 7th grade Math Achievement records. • Filter the data in the WKC file to find students who took the 7th grade math achievement test. OEDSA
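Step 4's filter can be sketched like this; the field names ("grade", "math_level") and the sample records are assumptions about the WKC layout, not the documented column names.

```python
# Sketch: pull the 7th grade math achievement records out of a WKC extract.
# Field names and sample data are illustrative assumptions.
students = [
    {"ssid": "AB1234567", "grade": "07", "math_level": "Accelerated"},
    {"ssid": "CD7654321", "grade": "07", "math_level": "Basic"},
    {"ssid": "EF1112223", "grade": "08", "math_level": "Proficient"},
]
grade7_math = [s for s in students if s["grade"] == "07" and s["math_level"]]
print(len(grade7_math), "7th grade math records to review")
```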

  29. 5. Look for records with scores below the proficient level. • Sort by “Math_Level”, look for students that scored below the proficient level (Basic or Below-basic), and check: • For students that counted in the district and/or a building, were students really enrolled in the district for a “Full Academic Year”? • Maybe one or more of these students had breaks in enrollment between the end of October Count Week and March 19th? • Were scores reported accurately? • e.g. Check to see if there are students you thought should have passed but did not and verify results. OEDSA
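The sort-and-filter in step 5 can be sketched as below. The performance level names come from the slide; the field names are assumptions.

```python
# Sketch: sort by performance level and collect the below-proficient records
# for individual verification (FAY enrollment, score accuracy).
BELOW_PROFICIENT = {"Basic", "Below-basic"}
results = [
    {"ssid": "AB1234567", "math_level": "Proficient"},
    {"ssid": "CD7654321", "math_level": "Basic"},
    {"ssid": "EF1112223", "math_level": "Below-basic"},
]
to_verify = sorted((r for r in results if r["math_level"] in BELOW_PROFICIENT),
                   key=lambda r: r["math_level"])
print([r["ssid"] for r in to_verify])
```

Each SSID in `to_verify` is a record to check by hand: was the student really enrolled for a full academic year, and was the score reported accurately?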

  30. 6. Look for records with scores at or above the proficient level. • For students scoring at the Proficient, Accelerated, or Advanced Levels, find students in the “State Total”, and check: • If any of those students should be counting at the district level. • Specifically, are there students that are only counted at the state level, who are reported as not meeting the “full academic year” criteria (MOA IRN=******), but who were continuously enrolled in the district from the end of October Count Week through March 19th? • Are there students not being counted at your district because the student status or attending/home IRN indicator was reported inaccurately for the student’s situation? • Are there students missing from the WKC file, because there are fatal errors on the aggregation reports? OEDSA
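The first check in step 6 can be sketched as follows. The masked MOA IRN value ("******") and the performance level names come from the slide; the field names and sample rows are assumptions.

```python
# Sketch: flag proficient-or-above students counted only at the state level
# because the MOA IRN is masked ("******"), so their enrollment spans can be
# re-checked against local records. Field names are assumed.
AT_OR_ABOVE = {"Proficient", "Accelerated", "Advanced"}
rows = [
    {"ssid": "AB1234567", "math_level": "Advanced", "moa_irn": "123456"},
    {"ssid": "CD7654321", "math_level": "Proficient", "moa_irn": "******"},
]
check_enrollment = [r["ssid"] for r in rows
                    if r["math_level"] in AT_OR_ABOVE and r["moa_irn"] == "******"]
print(check_enrollment)  # students whose enrollment spans should be re-checked
```

If any of these students were in fact continuously enrolled from the end of October Count Week through March 19th, correcting the reported data would move a passing score into the district's "% proficient" numerator.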

  31. Other Places to Verify Data (in Addition to the WKC File) • Are there students who took the test for whom results are not being reported? • Make sure students reported in a “tested grade level”: • Have a test record reported, and; • That information on the testing record is reported accurately (e.g. appropriate “Required Test Type”). • NOTE: The ODE/EMIS Team has published a list of valid testing combinations that can be reported to EMIS. If you run into a situation that you think is being reported accurately, but the combination is not on the list, contact your designated ITC to see if there is another way to code the situation. OEDSA

  32. Other Places to Verify Data (cont’d) • Check with your EMIS coordinator to see if there are errors appearing on the aggregation reports for a student because s/he does not have a testing record. • ODE also typically sends a file with a list of SSIDs for which a Test Record was expected, but not received. (It is really important to review that list, because there could be funding implications.) • If that list includes any students that you think should not be required to have a testing record reported, verify that you are reporting them accurately. If you are reporting them according to EMIS guidelines, but they are still appearing, inform your ITC. OEDSA

  33. Example #2 Now, let’s say the district missed AYP in math. What can they do? OEDSA

  34. 1. Check the AYP Summary. • In the District Accountability Workbook, look at the Acct-AYP_Summary(D) tab, columns O-S. Let’s say AYP was “not met” in Math “% proficient”. OEDSA

  35. 2. Identify areas where AYP was “Not Met”. • Look at the AYP Summary table on the Acct-AYP_Summary(D) tab, columns A-M. • In order for a district to meet AYP, all students (as a group) and all subgroups that are included in the district calculation must be at or above the AYP goals. What this means is that if “Not Met” appears anywhere in this table, the district has failed to meet AYP. OEDSA
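The all-or-nothing rule in step 2 amounts to a single check over the summary table, sketched below with invented subgroup/subject labels.

```python
# Sketch of the all-or-nothing AYP rule: a single "Not Met" cell anywhere in
# the summary table means the district missed AYP. Labels are illustrative.
ayp_summary = {
    ("All Students", "Math"): "Met",
    ("Economically Disadvantaged", "Math"): "Not Met",
    ("All Students", "Reading"): "Met",
}
district_met_ayp = all(v != "Not Met" for v in ayp_summary.values())
print("District met AYP:", district_met_ayp)
```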

  36. 3. Identify which subgroup(s) did not meet AYP. • In this case, let’s say the district did not make AYP in Math because the “Economically disadvantaged” subgroup did not make AYP. • Look more closely at that subgroup to see how close that group is to meeting the target. OEDSA

  37. 4. Identify which subgroups are closest to meeting AYP for combined grade levels. • Look at the Math % Proficient results on the “Acct-AYP_Proficiency(D)” tab. Let’s say that group was 2 percentage points away from meeting AYP. OEDSA

  38. 5. Examine subgroup results by grade level. • To identify where to start verifying that data are reported accurately, we need to look more closely at how students in that subgroup performed by grade level on the Math tests. • We are trying to find out “Which grade levels did not meet the % Proficient target in Math?” OEDSA

  39. 5. Examine subgroup results by grade level. (cont’d) • Look at the Math results for the Economically Disadvantaged subgroup on the Acct-AYP_ProficiencyDetail(D) tab, in the District Accountability Workbook to identify which grade levels did not meet the Current Year Target % Proficient or 2 Year Average Target % Proficient. OEDSA

  40. 6. Review subgroup results within grade level, by test type/accommodations. • Next, you could look at the “LRC-Proficiency#(D)” worksheet in the District LRC Workbook to find out the performance levels achieved by students in the “economically disadvantaged” subgroup, by type of test (standard or alternate assessment). This tells you where student performance appears to be lowest. (These are the data to verify, using the WKC files.) This step is optional. OEDSA

  41. 7. Use the WKC files to verify individual subgroup records. • We can use the WKC file (in Excel) to verify the accuracy of the 8th grade Math results reported in EMIS for students in the “economically disadvantaged” subgroup. OEDSA
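Combining steps 5 and 7, a by-grade breakdown of the subgroup's math results can be sketched as below. The field names, flag values, and sample records are assumptions; the real targets would come from that year's AYP goals.

```python
# Sketch: % proficient in math by grade for the economically disadvantaged
# subgroup, to see which grade levels pulled the combined rate down.
# Field names and sample records are illustrative assumptions.
AT_OR_ABOVE = {"Proficient", "Accelerated", "Advanced"}
rows = [
    {"grade": "08", "econ_disadv": "Y", "math_level": "Basic"},
    {"grade": "08", "econ_disadv": "Y", "math_level": "Proficient"},
    {"grade": "07", "econ_disadv": "Y", "math_level": "Advanced"},
]
by_grade = {}
for r in rows:
    if r["econ_disadv"] == "Y":
        met, total = by_grade.get(r["grade"], (0, 0))
        by_grade[r["grade"]] = (met + (r["math_level"] in AT_OR_ABOVE), total + 1)
pct_proficient = {g: 100 * met / total for g, (met, total) in by_grade.items()}
print(pct_proficient)  # {'08': 50.0, '07': 100.0}
```

The grade with the lowest rate is where to start verifying individual records.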

  42. 8. Use the WKC files to verify individual records for results below the standard. • For students who score at the Basic or Below-basic levels, check: • Were students really enrolled in the building for a “Full Academic Year”? • Maybe one or more of these students had breaks in enrollment between the end of October Count Week and March 19th? • Were scores reported accurately? • e.g. Check to see if there are students you thought should have passed but did not, and verify results. • Is the disadvantagement element correctly reported? • Has this information been updated for yearend? OEDSA

  43. 9. Use the WKC files to verify individual records for results at or above the standard. • For students scoring at the Proficient, Accelerated, or Advanced Levels, check: • Are there students who currently count at the state level, who should be counted at the district/building level? • Specifically, are there students that are only counted at the state or district level, who are reported as not meeting the “full academic year” criteria, but who were continuously enrolled in the district from the end of October Count Week through March 19th? • Are there other students who were not reported as “economically disadvantaged” that should have been? OEDSA

  44. Other Items to Verify in Addition to Analyzing WKC Files • The “% proficient” subgroups are only evaluated for AYP if the group size exceeds 30 (45 for students with disabilities). • Check to see if any subgroups were evaluated for AYP at the district or at buildings, for which the subgroup size should not be that large. OEDSA
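The minimum-size rule on this slide can be sketched as a simple threshold check; the counts below are invented, and the thresholds (30, or 45 for students with disabilities) come from the slide.

```python
# Sketch of the minimum-n rule: a "% proficient" subgroup is evaluated for
# AYP only if its size exceeds 30 (45 for students with disabilities).
# The counts below are invented for illustration.
MIN_N_DEFAULT = 30
MIN_N = {"Students with Disabilities": 45}
subgroup_counts = {
    "Economically Disadvantaged": 62,
    "Students with Disabilities": 40,
    "Limited English Proficient": 12,
}
evaluated = [g for g, n in subgroup_counts.items()
             if n > MIN_N.get(g, MIN_N_DEFAULT)]
print(evaluated)  # only subgroups large enough to be evaluated for AYP
```

If a subgroup appears as evaluated in the workbook but its size in your records is below the threshold, that discrepancy itself is a reporting issue worth chasing.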

  45. Other Items to Verify (cont’d) • Were students reported as enrolled in the correct buildings? Were there any transfers between buildings not reflected in the data? • This could affect AYP and state test indicators for buildings. • For 2005-06, if students transferred between buildings between October count week and March 19th, then the student would not meet the full academic year criteria at the building level, and would not be included in building level statistics subject to FAY. OEDSA

  46. Other Items to Verify (cont’d) • Have buildings correctly identified students who are court-placed or parent-placed into institutions? • These students are reported with student status = P or T. (These students only count at the state level.) • If information about court placement is not reported by buildings, is there a process for sharing this information with the EMIS coordinator, or someone at the district level who can update the student status? OEDSA

  47. Other Items to Verify (cont’d) • Have students participating in the autism scholarship been identified and reported correctly? • These students do not count in the building/district for accountability purposes. OEDSA

  48. Other Items to Verify (cont’d) • Verify the accuracy of data reported by districts educating resident students through special education cooperative agreements. • The data reported by the educating district count in the “% proficient” statistic at the resident/sending district, if the students meet the “full academic year” criteria. OEDSA

  49. Other Items to Verify (cont’d) • Have assessment results been reported accurately for students attending an MR/DD facility? • In 2005-06, MR/DD students' scores will count in the resident district totals. • ODE indicates that these students should be included when calculating the 1% cap for alternate assessments allowed to count as proficient in the accountability calculations. • Source: http://www.ode.state.oh.us/accountability/changes0506.asp OEDSA

  50. Other Items to Verify (cont’d) • Have you verified the WKC files to make sure OGT results are accounted for appropriately? • Information about 2005-06 calculations that include OGT results can be found at: http://www.ode.state.oh.us/accountability/changes0506.asp OEDSA
