
ASSESSMENT CONFERENCE JANUARY 13, 2012


Presentation Transcript


  1. ASSESSMENT CONFERENCE JANUARY 13, 2012 ACCREDITATION SITE VISITS

  2. HISTORY OF SITE VISITS • DIVISION 010 – SITE VISIT PROCESS • DIVISION 017 – UNIT STANDARDS • DIVISION 065 – CONTENT STANDARDS

  3. HISTORY OF SITE VISITS (cont.) • Site team selected from higher education peers and K-12 educators. • Institution presents evidence at the TSPC office for review. • Team reviews evidence and visits the institution (unit). • Team evaluates evidence based on standards. • Purpose of the site visit is to determine compliance with Commission standards (Division 017).

  4. HISTORY OF SITE VISITS (cont.) • Programs first approved by the Commission and reapproved as part of the unit site visit. • Criticisms of the current process: • Process is subjective; • Evaluations are inconsistent; • Teams made recommendations based on site visit findings; • Undefined culture of evidence; • No program review process.

  5. Proposed Changes • Move purpose of accreditation from compliance to continuous improvement • Change the definition of culture of evidence: • Define required assessment systems; • Define required categories of data to demonstrate candidate competencies; • Define processes for use of data for program improvement.

  6. Proposed Changes (cont.) • Create a rigorous program review process as part of the accreditation process. • Emphasis on assessments, rubrics, and scoring guides • Emphasis on quality of data for purposes of continuous improvement • Use of data in the continuous improvement process

  7. Unit Site Visits • Key standards for accreditation • Candidate competencies evidenced by data • Assessment systems • Field experiences • Cultural competency/diversity and inclusion • Faculty qualifications • Unit resources and governance

  8. Conceptual Framework (584-017-1008) • The conceptual framework establishes the shared vision for a unit’s efforts in preparing educators to work effectively in P-12 schools. The framework provides direction for programs, courses, teaching, candidate performance, scholarship, service, and unit accountability. The conceptual framework is knowledge-based, articulated, shared, coherent, consistent with the unit and institutional mission, and continuously evaluated.

  9. Unit Site Visits (cont.) • Site team uses rubrics to determine whether standards are met • Allows for meeting standards while leaving room to identify Areas for Improvement (AFI)

  10. Program Review Process • New process in accreditation; • Evidence used to demonstrate the validity of candidate competency data during the unit site visit; • Program review process is virtual in nature, based on electronic exhibits; • Program reviews conducted six months prior to unit site visits

  11. Program Review Process (cont.) • The Commission has adopted a template for the program review process associated with site visits, major program modifications, and new endorsement programs • Intent is to provide clear directions on requirements for program review, addition, and modification. Electronic submission of materials is required for easier review by commissioners and site team members

  12. Program Review Process (cont.) PRINCIPLES TO FOLLOW FOR DATA COLLECTION • Candidates’ ability to impact student learning; • Knowledge of content; • Knowledge of content pedagogy; • Pedagogy and professional knowledge; • Dispositions as defined by state standards or the unit’s conceptual framework; • Technology
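  The six categories above lend themselves to a per-candidate record in a unit's assessment system. As a loose illustration only (the structure and field names below are hypothetical, not something the Commission prescribes):

    from dataclasses import dataclass, field

    @dataclass
    class CandidateEvidence:
        """Hypothetical per-candidate record covering the six required
        data-collection categories. Field names are illustrative only."""
        candidate_id: str
        impact_on_student_learning: list = field(default_factory=list)  # e.g., work sample scores
        content_knowledge: list = field(default_factory=list)           # e.g., licensure exam results
        content_pedagogy: list = field(default_factory=list)
        pedagogy_and_professional: list = field(default_factory=list)
        dispositions: list = field(default_factory=list)                # per state standards or unit framework
        technology: list = field(default_factory=list)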

  13. Program Review Process (cont.) • The following rubric will be used when considering whether the program meets state standards: • Acceptable: The program is aligned to state and/or national program standards. Assessments address the range of knowledge, skills, and dispositions stated in the standards or by the unit. Each assessment is clearly defined and uses a scoring guide. Each assessment is consistent with the complexity, cognitive demands, and skill required by the standard it is designed to measure. Each assessment measures what it purports to measure. Each assessment and scoring guide is free of bias.

  14. Program Review Process (cont.) • Assessment instruments do provide candidates or supervisors with guidance as to what is being sought. Assessments and scoring guides allow for levels of candidate proficiency to be determined. The assessments do address candidate content knowledge, content pedagogy, pedagogy and professional knowledge, student learning, and dispositions. • Field experience does meet the requirements of the standards. There is evidence that data have been summarized and analyzed. The data have been presented to the consortium. Syllabi clearly align with and address the program standards.

  15. Program Review Process (cont.) • Area for Improvement (AFI) Example: Key assessments do not provide candidates or supervisors with substantive guidance as to what is being sought (from candidates) • Rationale: Scoring guides use simple words (e.g., unacceptable, emerging, proficient, or exemplary) and are left to broad interpretation.

  16. Program Review Process (cont.) • AFI Example: Instruments and scoring guides do not allow for levels of candidate proficiency to be determined. • Rationale: Data demonstrate little or no distribution of candidates across the scoring-guide scale; nearly all candidates receive the same score (a quick check for this pattern is sketched below).
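  To make the rationale concrete, here is a minimal sketch (an editorial illustration, not part of the presentation) of how a program might screen its own rubric data for this AFI, assuming scores are stored as a simple list of level labels:

    from collections import Counter

    def lacks_distribution(scores, threshold=0.9):
        """Flag an assessment whose scores pile up on one scoring-guide level,
        i.e., the instrument fails to differentiate candidate proficiency.
        The 0.9 cutoff is an assumed choice, not a Commission rule."""
        counts = Counter(scores)
        modal_share = counts.most_common(1)[0][1] / len(scores)
        return modal_share >= threshold

    # 24 of 25 candidates rated "proficient" -> likely AFI
    print(lacks_distribution(["proficient"] * 24 + ["exemplary"]))  # True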

  17. Program Review Process (cont.) • State Program Review Results Report: • The State Program Review Results Report is the document that will be submitted by the program review site team to the Commission for review at the meeting prior to the submission of the unit’s Institutional Report (IR).

  18. Program Review Process (cont.) • The program review site team will make recommendations to the Commission regarding whether the Commission should: • extend full state recognition of the program(s); • extend recognition with conditions; or • deny recognition of the program(s). • (See Division 010 for the levels of program review recognition.)

  19. DATA REQUIREMENTS • These rules are effective January 1, 2012. Units subject to accreditation must meet all standards; with regard to assessment and data, they must implement as follows: • (a) During the 2012 calendar year, show they have an assessment system in place; • (b) During the 2013 calendar year, must have one year of data to evaluate; • (c) During the 2014 calendar year, must have two years of data to evaluate;

  20. DATA REQUIREMENTS (CONT.) • (d) During the 2015 calendar year, must have three years of data to evaluate; • (e) During the 2016 calendar year, must have four years of data to evaluate; • (f) During the 2017 calendar year, must have five years of data to evaluate; • (g) During the 2018 calendar year, if not evaluated after January 1, 2012, must have six years of data to evaluate.
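  Read as arithmetic, the (a) through (g) schedule says a unit evaluated in calendar year Y needs min(Y - 2012, 6) years of data, with 2012 itself requiring only that an assessment system be in place. A minimal sketch of that reading (an assumed interpretation of the slides, not official TSPC guidance):

    def required_years_of_data(calendar_year: int) -> int:
        """Years of assessment data a unit must have available for
        evaluation, per the 2012-2018 phase-in schedule."""
        if calendar_year <= 2012:
            return 0  # 2012: show only that an assessment system is in place
        return min(calendar_year - 2012, 6)  # capped at six years from 2018 on

    assert required_years_of_data(2013) == 1
    assert required_years_of_data(2016) == 4
    assert required_years_of_data(2019) == 6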
