
Patterns of Practice: Program Assessment in Wisconsin


Presentation Transcript


1. Patterns of Practice: Program Assessment in Wisconsin
Sources of Data in the Program Review and Approval Process

2. Setting the Context: Designing Chapter PI 34, the Wisconsin Quality Educator Initiative
• Consideration of Best Practice
  • CCSSO Standards Projects (INTASC, ISLLC)
  • NCATE 2000 Accreditation
  • NBPTS Certification
• These “best practice” models were correlated to identify the likely links between institution, program, and educator performance assessment systems.
• These linkages are reflected in Chapter PI 34, Wisconsin’s Quality Educator Initiative, intended as a “seamless” system of preparation and licensing based on review and assessment of performance in these three focal areas.

3. Three Focal Areas for Performance Assessment
• Focus on Institutional Performance: Institutional On-site Review
• Focus on Educator Preparation Program Performance: Licensure Program Review
• Focus on Candidate Performance: The Assessment Portfolio

4. Institutional Review: Domains of Evidence
• Domain I: Relevant Policies and Practices
• Domain II: Conceptual Framework, Program Design and Assessment
• Domain III: Institutional Evaluation of Performance
• Domain IV: Institutional Assessment System
• Domain V: Title II Reporting Data
• Domain VI: On-going, Systematic Collaboration

5. Data Capture in the Program Approval Process
• Given conditions:
  • Most of the evidence is ‘snapshot’ data
  • Evidence is gathered through multiple methodologies that vary widely across programs and institutions
  • Evidence is primarily summative in nature
• The program approval process reviews and verifies evidence of performance against a common standard of high quality, but does not rank performance by quality.

6. Domain I: Relevant Policies and Practices
• Patterns of practice in:
  • Unit organization and authority
  • Faculty review, promotion and tenure, workload, professional development, and expectations for service
  • Recruitment and retention of diverse and qualified faculty

7. Domain I: Relevant Policies and Practices
• Patterns of practice in:
  • Student advising, support, and records management
  • Criteria for program entry, progression, and exit
  • Recruitment and retention of diverse and qualified candidates

8. Analysis of Evidence from Domain I
• Data are routinely gathered by almost all institutions
• Data are mostly housed within separate institutional structures rather than in commonly accessible ‘warehouse’ structures (a brief illustrative sketch follows this slide)
• Data are not always routinely collected, aggregated, and analyzed
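To make the ‘warehouse’ idea concrete, here is a minimal sketch of the kind of aggregation this finding points toward. Everything in it is hypothetical: the office names, record fields, and candidate IDs are invented for illustration and do not come from the presentation or any Wisconsin institution.

    # Hypothetical sketch: merging candidate records scattered across
    # institutional offices into one commonly accessible 'warehouse' view.
    admissions = {"C001": {"entry_gpa": 3.4}, "C002": {"entry_gpa": 3.1}}
    advising = {"C001": {"credits_completed": 60}, "C002": {"credits_completed": 45}}
    placement = {"C001": {"student_teaching_site": "Madison East"}}  # invented site

    def build_warehouse(*sources):
        """Merge per-office records into a single dict keyed by candidate ID."""
        warehouse = {}
        for source in sources:
            for candidate_id, fields in source.items():
                warehouse.setdefault(candidate_id, {}).update(fields)
        return warehouse

    print(build_warehouse(admissions, advising, placement))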

9. Domain II: Conceptual Framework, Program Design and Assessment
• Patterns of Practice for the Institution
  • Mission
  • Vision
  • Research and Knowledge Base
  • Adopted ‘Best Practice’ models

10. Domain II: Conceptual Framework, Program Design and Assessment
• Patterns of Practice: Licensure Programs by Level
  • Core knowledge
  • Core skills
  • Core dispositions
• Based on standards as interpreted through mission, vision, research, and best practice

11. Domain II: Conceptual Framework, Program Design and Assessment
• Patterns of Practice: Licensure Programs by Level
  • Performance Tasks
  • Performance Benchmarks
  • Performance Indicators
• Based on concepts of valid assessment as interpreted through mission, vision, research, and best practice

12. Domain II: Conceptual Framework, Program Design and Assessment
A rich source of data on the factors that influence institutional and programmatic decisions about “what matters” versus “what can we count?”

13. Domain III: Institutional Evaluation of Performance
• For this domain, the institution provides evidence of HOW it systematically uses the data analyses generated by the Institutional Assessment System to inform decisions around policy, program quality, program change, and allocation of resources.
• Because this evidence relies so heavily on the Institutional Assessment System, we describe that system (Domain IV) first and return to Domain III afterward.

14. Domain IV: Institutional Assessment System
• State-wide patterns in:
  • Assessment of the impact of institutional policies and practices on program and candidate performance
  • Assessment of core knowledge, skill, and/or disposition by program level and across programs in an institution
  • Systems for assessing candidate progress over time
  • Models of Portfolio Assessment

15. Domain IV: Institutional Assessment System
• Patterns of practice in assessing proficiency in:
  • Standards
  • Communication Skills
  • Human Relations and Dispositions
  • Content Knowledge
  • Pedagogical Knowledge
  • Practice

16. Domain IV: Institutional Assessment System
• Patterns of practice in the use of program assessments to evaluate proficiency in each of the Wisconsin Educator Standards (a brief illustrative sketch follows this slide)
• Example: Wisconsin Teacher Standard One
  • Praxis II
  • GPA in the program’s academic subject content core
  • Lesson Plans – Content
  • Observed Dispositions toward the Content – Field and Student Teaching Evaluations
  • Candidate Reflection on Student Learning of Content
  • Self-selected portfolio artifact demonstrating exit-level proficiency in using content knowledge to impact pupil learning
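As a purely illustrative sketch of how such a standards-to-evidence mapping might be checked, the snippet below mirrors the Standard One example above. The completeness check, field names, and candidate record are assumptions made for illustration, not DPI policy or any institution’s actual assessment system.

    # Hypothetical sketch: one standard mapped to its evidence sources,
    # mirroring the Wisconsin Teacher Standard One example above.
    STANDARD_ONE_EVIDENCE = [
        "Praxis II",
        "Content core GPA",
        "Lesson plans (content)",
        "Observed dispositions (field/student teaching)",
        "Reflection on student learning of content",
        "Self-selected portfolio artifact",
    ]

    def missing_evidence(candidate_record):
        """Return the Standard One evidence sources a candidate still lacks."""
        return [e for e in STANDARD_ONE_EVIDENCE if e not in candidate_record]

    # Invented candidate record with only two evidence sources on file.
    record = {"Praxis II": 168, "Content core GPA": 3.5}
    print(missing_evidence(record))  # lists the four sources still missing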

17. Domain III: Institutional Evaluation of Performance
• Patterns of practice around:
  • Uses of evidence from the Institutional Assessment System to inform decisions about the overall quality of educator preparation in an institution
  • Uses of evidence from program assessments to inform decisions about the overall quality of educator preparation in a program
  • Uses of evidence from the Institutional Assessment System to inform institutional and program change

18. Domain V: The Institution’s Title II Report
• This is the percentage of program completers who pass the required Praxis II content knowledge examination.
• It is reported as an aggregate pass rate across all program completers in each licensure program at each institution (a worked illustration follows this slide).
• This report is federally mandated and is the most public of all the domains of evidence reviewed.
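As a worked illustration of that aggregation, the sketch below computes a per-program pass rate from invented completer records. The programs, counts, and outcomes are hypothetical; only the passes-divided-by-completers arithmetic reflects the report described above.

    # Hypothetical sketch: Title II-style pass-rate aggregation.
    # Each pair is (licensure program, passed Praxis II?) for an invented completer.
    completers = [
        ("Elementary", True), ("Elementary", True), ("Elementary", False),
        ("Mathematics", True), ("Mathematics", True),
    ]

    def pass_rates(records):
        """Aggregate pass/fail outcomes into a percentage per program."""
        totals, passes = {}, {}
        for program, passed in records:
            totals[program] = totals.get(program, 0) + 1
            passes[program] = passes.get(program, 0) + int(passed)
        return {p: round(100.0 * passes[p] / totals[p], 1) for p in totals}

    print(pass_rates(completers))  # {'Elementary': 66.7, 'Mathematics': 100.0}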

19. Domain VI: On-going, Systematic Collaboration with Employing Districts
• Collaboration in designing field and clinical experiences
• Collaboration to develop educator preparation programs to meet identified staffing needs
• Collaboration around Initial Educator Support, including mentor training
• Collaboration to support graduates through targeted professional development based on identified needs, including the need for Master Educators
• Collaboration to support graduates through continuing education opportunities and service on Goal Approval and PDP verification teams

20. Current Questions and Issues
• How do we use testing data to inform program design and change?
• How do we ensure that the evidence we are using to make decisions about program approval actually captures what matters most?
• How do we ensure that policies and practices around licensing have a positive impact on PK-12 student learning outcomes, particularly for students of color and economically disadvantaged students?
• How do we best collaborate within the profession to create a sense of community around a culture of evidence for determining program quality?
• How do we most effectively support the research community in identifying and researching compelling questions that affect policy and practice at all levels?

21. Contact Information
• Deborah Mahaffey, Assistant Superintendent, Division of Academic Excellence
  Deborah.mahaffey@dpi.state.wi.us, (608) 266-3361
• Judith Peppard, Director, Teacher Education, Professional Development and Licensing
  Judith.peppard@dpi.state.wi.us, (608) 266-0986
• Dr. Moreen Carvan, Assistant Director, Teacher Education, Professional Development and Licensing
  Moreen.carvan@dpi.state.wi.us, (608) 266-1788
