
Educator Preparation Advisory Council (EPAC) Data Subcommittee, February 25, 2014






Presentation Transcript


  1. Educator Preparation Advisory Council (EPAC) Data Subcommittee, February 25, 2014 (Slide 1)

  2. Welcome and Introductions • Welcome • Dr. Sarah Barzee • Introductions • Future meetings: schedule 1-2 hour virtual meetings/conf calls with pre-work • EPAC workplan for 2013-14

  3. Underlying Assumptions of EPAC Charge • The quality of instruction plays a central role in student learning (academic, behavioral and social). • Educator preparation programs ensure baseline knowledge, skills and dispositions are demonstrated (CCT, SLS, CCSSO Learner Ready Definition, etc.) and contribute to the quality of instruction. • Overall goal is to improve programs, not ensure compliance, and provide useful information for improvement of preparation policy and practice. Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

  4. Purposes of Educator Preparation Program Evaluation • Ensuring accountability and monitoring program quality and providing reliable information to the general public and policy makers • Providing information to consumers to help them make choices about preparation programs and providing future employers information to support hiring decisions • Supporting continuous program improvement with relevant performance data and measures that can identify strengths and weaknesses of existing programs Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

  5. Validity of Program Approval Decisions • Must be based on multiple measures (quantitative and qualitative data) • Construct validity • Content validity • Predictive validity • Consequential validity • Evaluation system must be adaptable to changing educational standards, curricula, assessment and modes of instruction

  6. EDUCATOR PREPARATION PROGRAM APPROVAL Based on EPAC Principles • Data and Accountability System, with performance categories: Recruitment and Completion Rates; Employment and Retention Rates; Pre-Service Performance Rates; Educator Effectiveness (surveys, evaluation data); District Partnership Quality • Program Approval Process: review of programs based on EPAC Principles 1-5, multiple measures and qualitative criteria, as well as statutory requirements • Assessment Subcommittee: to review and make recommendations on new assessments to be developed as part of the accountability system • Together, these will determine the program approval process and decision by the State Board of Education (at the individual program, not institutional, level)

  7. Inter-related Work of EPAC Subcommittees • Program Approval: Develop a new, more rigorous program approval process and regulations to guide approval decisions by the State Board of Education (SBE), based on review of the efficacy of curriculum as well as accountability data on a program’s measures of quality. • Data Collection, Analysis and Reporting: Develop a new data collection, analysis and reporting system for institutional reporting and an accountability system for program approval, and provide biennial research data on supply and demand, to inform continuous improvement. Data from the accountability system will be linked with program approval decisions. • Assessment: Guide development of new assessment options including performance assessments, clinical experience evaluations and feedback surveys. Data from new and existing assessments will be used in the data and accountability system.

  8. Data Subcommittee Outcomes • EPAC Principles • Program Entry Standards • Staffing & Support of Clinical Experiences • Clinical Experience Requirements • District-Program Partnerships & Shared Responsibility • Program Completion & Candidate Assessment Standards • Program Effectiveness & Accountability • Develop: • An institutional reporting system • An accountability system to be used as part of program approval • Biennial data report on supply and demand • Use supports from CCSSO/NTEP cross-state collaborative, 2013 – 2015

  9. Creating a Culture of Evidence • Use quantitative and qualitative data to make valid inferences (interpretations and findings) that inform program improvement • Shift focus from compliance to inquiry and program improvement • Data must inform collaboration and shared responsibility for IHE faculty and staff to review and make changes in program structure, practices, policies and teaching • School-based faculty and administration also have a shared responsibility to collaborate with IHE partners Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

  10. Attributes and Measures Related to Program Quality • Two tables from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013 • Table 2.1 Attributes Related to TPP Quality and Evidence Used to Measure Them (page 27) • Table 2.2 Main Types of Evidence Used by Different TPP Evaluation Systems (page 60)

  11. 6 EPAC Principles & Alignment to Performance Indicators 1. Program Entry Standards 2. Staffing & Support of Clinical Experiences 3. Clinical Experience Requirements 4. District-Program Partnerships & Shared Responsibility 5. Program Completion & Candidate Assessment Standards 6. Program Effectiveness & Accountability

  12. Title II HEA Mandate for Accountability of Teacher Preparation Programs • The expectation of identifying At-Risk or Low-Performing preparation programs has been in place since the 1999-2000 Title II HEA mandate; this work will redefine the criteria for effective, at-risk and low-performing programs • See the current definition of At-Risk or Low-Performing institutions that has been in place since 2000 and has been the basis for reporting low-performing IHEs in the annual Title II state report

  13. Designing the Data and Accountability System • Identify accountability categories • Are the previously identified accountability categories sufficient? • Recruitment and Completion Rates • Employment and Retention Rates • Pre-Service Performance Rates • Educator Effectiveness (surveys, eval data) • District Partnership Quality • How will we weight each category? Are they all equally weighted? • How do we measure these? Which data points from existing assessments or new assessments do we use? • Is there a "trigger," i.e., any specific data point(s) or categories that would require immediate program review?
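To make the weighting and "trigger" questions concrete, here is a minimal sketch of one possible approach. All category names, scores, weights, and thresholds below are hypothetical placeholders for discussion, not decisions of the subcommittee.

```python
# Hypothetical sketch of a weighted accountability score with an off-cycle
# review "trigger." Scores, weights, and the trigger floor are illustrative
# assumptions only; none of these values have been adopted.

# Each category scored on a common 0-100 scale (hypothetical).
category_scores = {
    "recruitment_completion": 82,
    "employment_retention": 74,
    "pre_service_performance": 90,
    "educator_effectiveness": 68,
    "district_partnership_quality": 77,
}

# Equal weighting shown here; unequal weights are one of the open questions.
weights = {name: 0.20 for name in category_scores}

# Overall rating is the weighted sum of category scores.
overall = sum(category_scores[name] * weights[name] for name in category_scores)

# A "trigger": any single category below a floor prompts immediate review,
# regardless of the overall score (the floor value is a placeholder).
TRIGGER_FLOOR = 60
triggered = [name for name, score in category_scores.items() if score < TRIGGER_FLOOR]

print(f"Overall weighted score: {overall:.1f}")
if triggered:
    print("Off-cycle review triggered by:", ", ".join(triggered))
```

Changing the weights dictionary is all it would take to model unequal weighting, which makes it easy to compare alternative weighting schemes side by side.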

  14. Designing the Data and Accountability System • Data currently available: • Title II Report (refer to 2013 report https://title2.ed.gov/Public/Home.aspx) • Employment/Staff File Data http://www.sde.ct.gov/sde/cwp/view.asp?a=2613&q=322570 • Completer Rates (in Title II report and ETS Title II system) • Pass Rates (Praxis II, Foundations of Reading, CAT, ACTFL) http://www.sde.ct.gov/sde/cwp/view.asp?a=2613&Q=333728 • Assessments yet to be designed or implemented to provide necessary data points • Feedback surveys from teachers • Feedback surveys from employers • Pre-service assessments • Statewide student teaching/clinical experience evaluation instrument • Measure of IHE/District Partnership quality

  15. Cautions about Data Use and Reporting • Consider the source (self-report, district, IHE, federal sources, etc.) • Consider its "completeness" or missing data • Consider the quality of the data • Comparisons across different N sizes • Limitations of certain data points as part of the annual data reporting or as part of the accountability system • Evolution and adaptability of the data system and accountability system over time

  16. Case Study: Louisiana • Review Louisiana’s accountability system for educator preparation programs • First adopted in 2003 • Revised version adopted in 2013 • Review the 2011-2012 Annual Report for Teacher Preparation

  17. Designing IHE Data and Performance Reports: Next Steps • Discuss how to report educator preparation data and performance profiles annually or biennially • Build off of existing data systems such as Title II, Certification, CSDE Staff File, etc. • Review and consider a "dashboard" system for displaying annual profile data on each institution and individual program (to the extent that program-level data are available and meet the suppression test; see the sketch below) • Identify key design features desired for this on-line reporting system
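As an illustration of the suppression test mentioned above, a minimal sketch of small-N suppression for a dashboard cell. The threshold of 5 and the field names are assumptions for illustration, not an adopted CSDE rule.

```python
# Hypothetical small-N suppression for dashboard reporting. The minimum cell
# size (5) and the example programs are illustrative assumptions only.

MIN_CELL_SIZE = 5  # placeholder suppression threshold

def dashboard_cell(label: str, numerator: int, denominator: int) -> str:
    """Return a display value, suppressing rates based on too few candidates."""
    if denominator < MIN_CELL_SIZE:
        return f"{label}: *"  # suppressed to protect small groups
    rate = 100.0 * numerator / denominator
    return f"{label}: {rate:.0f}% (n={denominator})"

# Example: program-level completion rates of very different sizes.
print(dashboard_cell("Program A completion rate", 28, 32))
print(dashboard_cell("Program B completion rate", 3, 4))   # suppressed
```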

  18. Next Meeting (virtual) • Survey subcommittee for recommendations about: • Categories to be used in the accountability system • Weighting of categories • Underlying data points within each category • Overall rating system: identify levels (e.g., low-performing, at-risk, effective, etc.) • Identify whether system measures are calculated annually, biennially or on another cycle • Trigger for off-cycle review: which data point(s) or categories can trigger a program approval review outside of the established cycle, independent of the overall rating? • April meeting: debrief and finalize the above recommendations • Set dates for April, May and June meetings
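One of the survey questions above concerns the overall rating levels. A minimal sketch of how a composite score might map to levels; the cut-points (70 and 80) are placeholders purely for discussion, not recommendations.

```python
# Hypothetical mapping of an overall weighted score to rating levels.
# Level names echo the Title II categories discussed earlier; the
# cut-points are illustrative assumptions only.

def rating_level(overall_score: float) -> str:
    """Classify a program's overall score into a performance level."""
    if overall_score >= 80:
        return "effective"
    if overall_score >= 70:
        return "at-risk"
    return "low-performing"

print(rating_level(78.4))  # -> "at-risk" under these placeholder cut-points
```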

  19. Next Meetings • May: Presentation from Ed Klonoski, President of COSC, on dashboard system design for IHE profiles and performance reports (a design similar to what we have for school district profiles?) • June: Supply and Demand Study preview and summary of data compiled and analyzed
