Biennial Report Technical Assistance Meeting December 8, 2009
3 Major Activities of the System and Their Different Roles
• Program Assessment – Is the program in alignment with the standards?
• Biennial Reports – Is the program effective in developing qualified educators, and does the program use data to drive its program improvement efforts?
• Site Visit – Confirms that the Common Standards and Program Standards are implemented in an integrated, effective manner.
Accreditation System
• Biennial Reports – Are programs effective in preparing competent educators?
• Site Visit – Are Common Standards and Program Standards implemented in an integrated, effective manner?
• Program Assessment – Are programs aligned with standards?
Uses for Biennial Reports • Critical part of the accreditation cycle • Key piece of evidence that an institution is responsive to Common Standards 2 and 9 • Used by review teams during Program Assessment and Site Visits • Biennial Reports, Program Assessment, and Site Visits together give a more comprehensive picture of a program sponsor over time.
2009 – Where are we? • Just completed first year of full implementation of biennial reports in the accreditation system. • All programs from 3 cohorts submitted biennial reports • Approximately 47 institutions • Over 260 programs • Site visits that took place in Spring 2009 included biennial reports as part of their evidence.
Relationship to the Standards Common Standard 2 – Unit and Program Evaluation System The education unit implements an assessment system for ongoing program and unit evaluation and improvement. The system collects, analyzes, and utilizes data on candidate and program completer performance and unit operations. (continued on next page)
Common Standard 2 (continued) Assessments in all programs include ongoing and comprehensive data collection related to candidate qualifications, proficiencies, competence, and program effectiveness. Data are analyzed to identify patterns and trends that serve as the basis for programmatic and unit decision-making.
Common Standard 9 • Focuses on Candidate Competencies at the Program Level • Hold that thought…
Biennial Report Two sections to the report: • Section A – Submitted by each program. Current program context, recent changes, enrollment/completion data. Includes data from each approved program. • Section B – Submitted by the designated director of educator preparation programs. Overall trends and the institution’s action plan.
Biennial Report, Section A Purpose: Snapshot of each program’s processes for utilizing data to increase program effectiveness Part I. Contextual information/Changes Part II. Assessments of Candidates and Completers Part III. Analyses of data Part IV. Proposed Program Changes
Section A – Program-Specific Information Part I. Contextual Information General information to help reviewers understand the program, the context in which it operates, and what has changed significantly since the Commission approved the current program document. 1 page
Context • Number of candidates and completers
Context (continued) • When candidates begin/end the program • Cohort model • Program features • Internships • Serves inner-city schools • Bilingual program • Changes since the last site visit/approval
Section A, Part II – Candidate Assessment/Program Effectiveness Program describes assessment procedures and instruments it uses to ensure that candidates have the requisite competencies and that the program is effectively meeting its candidates’ academic and professional growth needs. ≤ 10 pages
Part II. Candidate Assessment An Overarching Chart can be helpful
Provide Aggregated Data • Provide Actual Aggregated Data for 4-6 Key Assessments • Data should reflect the last two academic years • For those submitting in fall 2009 – that would be 07-08 and 08-09 • For those submitting in fall 2010 – that would be 08-09 and 09-10
Examples of Candidate Data Evidence of candidate and completer competence through coursework and practicum • TPA – for MS/SS programs • Key assignments in coursework, observations during fieldwork, practicum, or clinical practice • Demonstrations/presentations prior to being recommended for a credential • Portfolios • Others
Examples of Program Effectiveness Data Evidence of program effectiveness for completers, employers, community • Completer and graduate surveys • Employer surveys/feedback • Retention rate in employment • Placement rates
Reporting the Information • Describe the type of data being collected (e.g., TPA, employer data) • Identify instrument(s) used to gather data • Describe process of collecting data • Include descriptive statistics such as the range, mean, median, mode or percent in each category
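The descriptive statistics listed above can be generated with Python's standard `statistics` module. This is a minimal sketch only; the rubric scale and score values below are invented for illustration, not drawn from any actual program report.

```python
from statistics import mean, median, mode

# Hypothetical candidate scores on a 4-point assessment rubric
scores = [3, 4, 2, 3, 3, 4, 3, 2, 4, 3]

print(f"Range: {min(scores)}-{max(scores)}")   # Range: 2-4
print(f"Mean: {mean(scores):.2f}")             # Mean: 3.10
print(f"Median: {median(scores)}")             # Median: 3.0
print(f"Mode: {mode(scores)}")                 # Mode: 3

# Percent of candidates at each rubric level
for level in sorted(set(scores)):
    pct = 100 * scores.count(level) / len(scores)
    print(f"Level {level}: {pct:.0f}%")
```

Reporting the percent at each rubric level, not just the mean, lets reviewers see how many candidates fall below the expected level of performance.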
Not all Data is Equal! There is data, good data, and better data for the purposes of program improvement.
Examples of Data Data: “100% of candidates successfully complete Ed 235,” or “The average grade for all candidates who took Ed 235 in Fall 2008 is 3.45.” What does either of these examples tell you about program quality or effectiveness?
Example of Better Data What competencies are covered by Ed 235? What key assignments, projects, fieldwork components, etc. are required? Example of possible data: data from a common rubric/scoring criteria tied to explicit standards, competencies, TPEs, or TPA tasks
Another Example of Good Data Student Teaching Final Evaluation – Exit Clinical Practice Completed by the Master Teacher and University Supervisor
Best Data Data from TPE observations during ED 235 (scored with a 4 pt. rubric) is compared to post program information such as the following: Employer Survey data verifies that first year teachers from the X program are effective in teaching…. What do these data sources tell you about program effectiveness?
Lessons Learned - Data Best Biennial Reports • Include data at a level that can be tied to candidate competencies outlined in the standards • Include BOTH candidate assessments and program/post-program feedback information (employer surveys, completer surveys) • Present data in a way that allows the reader to compare candidate and program performance relative to the standards.
Section A, Part III – Analysis of Data What do the data indicate about how well the program is performing in terms of developing candidates’ competencies?
Analysis of Data Program uses results of data analyses to identify: • How well candidates are performing • Areas where candidates are not performing as expected • How well completers are performing • Areas in which completers feel unprepared • How the program is perceived by employers • Strengths of the program and areas for growth
Analysis of Data Programs may take the data as a whole and reach some conclusions, or may analyze each data source separately. If the latter, any areas of conflict in the data should be addressed. Do not overlook areas where the data indicate some improvements are necessary.
Part IV. Use of Assessment Results for Program Improvement Program describes how it will use the results of the analyses of data to build on identified strengths and address areas in need of growth/improvement.
Use of Assessments for Program Improvement • What changes have been or will be made to the program? • What data will the program continue to watch over time? • Is there a need to improve the assessment tools themselves? • Make sure this section is linked to the data and analysis.
Program Improvements A chart or table may be useful here, too.
Biennial Report, Section B – Institutional Summary/Action Plan Purpose: Snapshot of the institution’s processes for utilizing data to increase program effectiveness The institution will review reports from each program; identify trends, institutional strengths, and areas needing growth that occur across programs; and describe a plan of action to improve the performance of all programs.
Section B – Institution • Summary is submitted by the unit leader: Dean, Director of Education, Superintendent, or Head of the Governing Board of the Program Sponsor • Summary identifies: • Trends observed across programs • Areas of strength • Areas for improvement • Next steps or a plan of action 1 page
Page Parameters • Section A for each program should be approximately 10 pages. • Section B should be approximately 1 page.
When are they due? • Biennial Reports are due immediately following years 1, 3, and 5 of the cohort cycle • Due to the Commission by August 15, October 15, or December 15 • They are due FOLLOWING the second academic year during which the data is collected.
How are they submitted? Electronically – via E-mail to BiennialReports@ctc.ca.gov
How are Biennial Reports Reviewed? Several levels of review: 1) Staff review 2) Program Assessment Reviewers 3) Site Visit Teams
Staff Review • CTC staff will review the reports and, if necessary, seek additional information. • Feedback will be provided to program sponsors in a timely manner (6-8 weeks). • A summary of the information from the Biennial Reports will be shared with the Committee on Accreditation.
Examples of Issues Identified • No data are submitted; only the assessment process is discussed • No candidate data are currently collected, analyzed, or utilized by the institution, but there is a plan to do so in the future • Links between the data, analysis, and program modifications are hard to see
Examples of Issues (continued) • Data are reported at a level that is difficult to link to the candidate competencies explicit in the standards • Data include only post-program effectiveness data or candidate data, not both • Areas of apparent weakness are not addressed at all in the analysis or program modifications
After the Staff Review In 4th Year of Cycle – Biennial Reports are provided to Program Assessment Reviewers (evidence for candidate assessment/competency standards) In 6th Year of Cycle – Biennial Reports are provided to Site Visit Team (evidence for Common Standards 2 and 9)
Resources • The Biennial Report template and more can be found at: http://www.ctc.ca.gov/educator-prep/program-accred-biennial-reports.html under Biennial Report information • Cheryl Hickey, firstname.lastname@example.org • Rebecca Parker, email@example.com