
TBAISD Regional Assessment, Data Warehousing and Reporting Services



  1. TBAISD Regional Assessment, Data Warehousing and Reporting Services 7/13/10

  2. About TBAISD
  • 24,000 students
  • 16 LEAs
  • 4 PSAs
  • 12 Non-Public
  • 5 counties
  • Diverse in terms of school size:
    • TCAPS @ 10,000
    • Crawford @ 50
  • Most districts lack any curriculum or technology administration.

  3. Current Reality
  • PLC wannabe;
  • Diverse assessment practices;
  • DRIP (data rich, information poor);
  • Collecting more than applying;
  • Fragmented data storage and reporting;
  • Loss of continuity over time;
  • Over-dependency on MEAP/MME;
  • Lack of “regionalized” PD models;
  • Lack of consistent use of data by service providers;
  • Frustration

  4. “A lack of clear and comprehensive data management standards has allowed us to devolve into a mess of incompatible parallel systems that gradually lost sync with each other. As a result, our ability to use data to make well-informed decisions was severely compromised.”
  Many, if not most, districts face the same challenge. Defining standards for information collection and documentation is a critical action for districts as they develop their student data-collection systems.
  - “The Administrator’s Guide to Data-Driven Decision Making,” Todd McIntire

  5. Questions
  • How do we “teach people to use data” if…
    • we don’t know what data they generate?
    • what they have is disorganized?
    • it is not mapped to a specific purpose?
    • it is not common across the region?
  • Is there a “least common denominator” of assessment that will enable the construction of REGIONAL:
    • procedures?
    • administration?
    • pricing contracts?
    • data systems?
    • PD?
    • service alignment?
    • instructional application?

  6. Assessment Committee
  • Assure that assessment data is used to improve instruction, curriculum and programming.
  • Establish a vision for a regional data set that can be used for individual identification and program evaluation (supports curriculum alignment, RtI, early literacy, college readiness, differentiation, teacher evaluation).
  • Structure support systems aligned to the desired data set (professional development, group pricing, data management, consulting services, regional collaboration).

  7. Assessment Profile

  8. Assessment Profile (cont)

  9. Assessment Profile (cont)

  10. Information Architecture
  Data Director (warehouse):
  • Fully rostered data structure (ISD, district, building, grade, teacher, student, demographics; see the sketch below);
  • Repository of all specified performance data sets (historical, and tied to UIC codes);
  • Repository of regional assessments and item banks;
  • Data aggregation across region and across assessments;
  • Reporting services (republishing/enhancing/streamlining what is available from individual testing services);
  • Ad-hoc report generation and data mining;
  • Supports localized online assessment and storage practices.
  Feeder assessments:
  • Kindergarten Readiness: administered to incoming students before enrollment; Boehm-3 and Parent Survey.
  • AIMSweb (target K-3): Early Literacy, Early Numeracy, Oral Reading, Computation, Math Concepts/Applications, Progress Monitoring.
  • NWEA (target 4-8): growth tracking, adaptive; Language Usage, Reading Comprehension, Math, Science.
  • Explore/Plan (target grades 8, 9, 10): performance against College Readiness standards; ACT predictor.
  • MEAP/MME/ACT: state-administered assessments aligned to Michigan grade-level and content-area standards.
  • Regional Common Assessments: test items developed by area teachers, aligned to GLCE/HSCE; unit-based for use as “test-lets” throughout the year; data reported to teachers to highlight curricular gaps in student learning; data aggregated across all users for collaborative use.
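As a rough illustration of the "fully rostered" structure and the UIC-keyed performance repository described above, here is a minimal sketch of one possible record layout. The class and field names are hypothetical, not DataDirector's actual schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record layout for illustration only; DataDirector's real schema will differ.
@dataclass
class RosterRecord:
    uic: str            # state Unique Identification Code: the join key for everything
    isd: str
    district: str       # LEA, PSA, or non-public school
    building: str
    grade: int
    teacher: str
    demographics: dict  # e.g. {"gender": "F", "economically_disadvantaged": True}

@dataclass
class AssessmentResult:
    uic: str            # ties every score back to the rostered student
    source: str         # "AIMSweb", "NWEA", "Explore", "PLAN", "MEAP", "Common GLCE", ...
    subject: str
    test_date: date
    score: float
    scale: str          # RIT, percentile, raw score, etc., as reported by the vendor

# Region-wide aggregation then reduces to joining results to the roster on `uic`
# and grouping by district, building, grade, or teacher as needed.
```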

  11. How will we gather data?
  [Diagram: assessment data flowing into DATA DIRECTOR from each source. District Student Information Systems (SIS1, SIS2) supply roster data; each LEA, non-public school, and the ISD supplies its NWEA, AIMSweb, Explore, and PLAN files; Kindergarten Readiness results and Common GLCE Assessments are entered directly.]
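A minimal sketch of the gathering step implied by the diagram, assuming each district drops its vendor export as a CSV file in a shared folder. The folder layout, column names, and function names are assumptions for illustration, not part of any DataDirector tooling.

```python
import csv
from pathlib import Path

def consolidate_exports(export_dir: str, source: str) -> list[dict]:
    """Read every district's export for one assessment (e.g. NWEA)
    and tag each row with its source so the warehouse can tell them apart."""
    rows = []
    for path in Path(export_dir).glob("*.csv"):      # one file per LEA/PSA/non-public
        with path.open(newline="") as f:
            for row in csv.DictReader(f):
                row["source"] = source               # e.g. "NWEA"
                row["district"] = path.stem          # e.g. "LEA1"
                rows.append(row)
    return rows

def write_import_file(rows: list[dict], out_path: str) -> None:
    """Write one region-wide file in the layout the import directive expects."""
    if not rows:
        return
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

# Example: roll all NWEA exports into a single upload file.
# write_import_file(consolidate_exports("exports/nwea", "NWEA"), "dd_nwea_upload.csv")
```

The same pattern would repeat for each assessment (AIMSweb, Explore, PLAN), with only the source tag and folder changing.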

  12. Data Warehousing/DataDirector
  DATA DIRECTOR holds, region-wide:
  • Student demographic/roster data
  • Kindergarten Readiness data
  • AIMSweb data
  • NWEA data
  • GLCE mastery data
  • Explore data
  • PLAN data
  • MEAP/MME/ACT data
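Once these data sets sit in one warehouse, a region-wide report is essentially a join on UIC plus a group-by. A hedged pandas sketch, with invented file and column names:

```python
import pandas as pd

# Hypothetical extracts from the warehouse; real file and column names will differ.
roster = pd.read_csv("dd_roster.csv")        # uic, district, building, grade
glce = pd.read_csv("dd_glce_mastery.csv")    # uic, strand, mastered (0/1)

# Percent of GLCE items mastered, by district, grade, and strand, across the region.
report = (
    glce.merge(roster, on="uic")
        .groupby(["district", "grade", "strand"])["mastered"]
        .mean()
        .mul(100)
        .round(1)
        .rename("pct_mastered")
        .reset_index()
)
print(report.head())
```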

  13. System Development Plan (in cooperation with the consortium, steered by the assessment committee)
  • Negotiate access to local Student Information Systems (SIS);
  • Invent/comply with DD demographic and rostering process;
  • Comply with DD import scripts for MEAP/MME (pre-upload check sketch below);
  • Import locally developed common assessment items/tests into DD;
  • Create DD assessment/data entry template for Kindergarten Round-up;
  • Establish data gathering and import scripts for “national” assessments (AIMSweb, NWEA, Explore, PLAN);
  • Centralize purchasing and rostering support for all assessments;
  • Establish “report development” process;
  • Establish Data Application PD model;
  • Launch
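To make the "comply with DD import scripts" items concrete, a pre-upload check might look roughly like this. The required column list is invented for illustration; the real DataDirector import directive defines its own layout.

```python
# Hypothetical pre-upload check; the actual import directive specifies the real layout.
REQUIRED_COLUMNS = {"UIC", "LastName", "FirstName", "Grade", "Building", "TestDate", "Score"}

def validate_upload(rows: list[dict]) -> list[str]:
    """Return a list of problems that would likely cause an import to fail."""
    problems = []
    if not rows:
        return ["file is empty"]
    missing = REQUIRED_COLUMNS - set(rows[0].keys())
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
    for i, row in enumerate(rows, start=2):          # row 1 is the header
        if not row.get("UIC", "").strip():
            problems.append(f"row {i}: blank UIC, record cannot be rostered")
    return problems
```

Running a check like this before every upload keeps bad rows out of the warehouse rather than letting them surface later as rejected imports.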

  14. Where we stand…
  • Locally developed common exams:
    • Items created;
    • Exams being constructed.
  • MEAP/MME:
    • 3 years of historic data loaded using available directives.
  • K-Readiness:
    • Assessment structures created;
    • Data collected;
    • Ready to upload.
  • AIMSweb:
    • All users consolidated under a single account umbrella;
    • ISD handles rostering (w/UIC), training, policies, and exports;
    • Import directive developed by DD;
    • Periodic data loading schedule established;
    • Regional pricing established.

  15. Where we stand…
  • NWEA:
    • Import directives developed;
    • Some header repairs pending;
    • 3 years of historic data collected, UIC matched, and uploaded into DD (matching sketch below);
    • ISD will take over all future rostering (with UIC);
    • Regional pricing established.
  • Explore/Plan:
    • 3 years of past data files purchased from ACT Inc.;
    • All historic PLAN data loaded with available directives;
    • Historic Explore data load pending directive repairs;
    • Beginning in 2010:
      • All districts agree to a common test window;
      • ISD orders for all 8th, 9th, and 10th graders;
      • ISD pays the invoice and re-bills LEAs;
      • ISD pre-identifies answer sheets with UIC;
      • ISD receives data files from ACT Inc. and uploads to DD using import directives.
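The "UIC matched" step for historic files, which come back from vendors without a UIC, could look roughly like the sketch below: match on name and birth date against the ISD roster and flag anything needing hand review before upload. The matching keys and file names are assumptions for illustration.

```python
import pandas as pd

# Hypothetical matching step for historic vendor files that lack a UIC.
roster = pd.read_csv("isd_roster.csv")       # uic, last_name, first_name, birth_date
historic = pd.read_csv("act_plan_2008.csv")  # last_name, first_name, birth_date, scores...

matched = historic.merge(
    roster, on=["last_name", "first_name", "birth_date"], how="left"
)
unmatched = matched[matched["uic"].isna()]   # these need hand review before upload
print(f"{len(unmatched)} of {len(historic)} records still need a UIC")

matched.dropna(subset=["uic"]).to_csv("dd_plan_upload.csv", index=False)
```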

  16. ISD Role in Assessment Management
  Account Management
  • Ordering and billing;
  • Transfer/merge requests;
  • Adding schools.
  Continuous Support
  • Data integrity police;
  • Data team consulting;
  • Continuous PD coordination;
  • Coordinate with ISD service providers.
  System Management
  • Manage permissions of local administrators;
  • System-level settings/configuration;
  • Data exports/imports into DD;
  • Report development;
  • Rostering and roster maintenance;
  • Tracking new features.
