
Differential Effects of Participatory Evaluation in a National Multi-site Program Evaluation

Frances Lawrenz, University of Minnesota. Collaboratives for Excellence in Teacher Preparation.



Presentation Transcript


  1. Differential Effects of Participatory Evaluation in a National Multi-site Program Evaluation. Frances Lawrenz, University of Minnesota

  2. Collaboratives for Excellence in Teacher Preparation
  • The CETP program promotes comprehensive change in the undergraduate education of future teachers by supporting cooperative, multiyear efforts to substantially increase the quality and number of teachers well prepared in science and mathematics, especially members of traditionally underrepresented groups.

  3. CETP Characteristics
  • Each CETP was unique
  • All engaged faculty from STEM and education
  • All included several institutions of higher education (including community colleges) in a particular geographic area
  • All included mechanisms for improving undergraduate education in the sciences and mathematics
  • All had some sort of relationship with schools within the geographic area
  • All offered scholarships to students from underrepresented groups
  • The ranges within these were broad, however (e.g., from 3 institutions to 13 or more), and each had unique elements such as liaisons with Tribal Colleges or industrial/science internship placements

  4. History of CETP Evaluation
  • Initially, project-level emphasis on evaluation
  • Program monitoring evaluation required (a negative experience)
  • First conference with evaluators: Corridors for Collaboration
  • Second conference with evaluators
  • Funding of the CETP CORE evaluation

  5. Visit our web site www.education.umn.edu/CAREI/CETP

  6. CORE Evaluation Questions
  • Three main areas:
    • Institutions
    • K-12 Teachers
    • K-12 Students
  • Foci:
    • Collaborations
    • Comparisons to standards
    • Comparisons to other groups

  7. Institutions
  a. How supportive of SMET education reform are the policies and procedures of the participating CETP institutions? SOURCES: Deans/Department Chairs, Faculty, PI
  b. How successful have the CETPs been at course reform? SOURCES: Deans/Department Chairs, Faculty, College Students, PI
  c. What impact has CETP had on the system or structure of teacher education at the participating institutions? SOURCES: Deans/Department Chairs, Teachers, Principals, PI, Faculty, QRC

  8. Teachers
  a. How well do CETP teachers demonstrate the knowledge and skills espoused by the National Science, Mathematics and Technology Education standards? SOURCES: K-12 Students, CETP Teachers, Classroom Observation Protocol, Rubric, Principals of CETP
  b. How do CETP teachers and the classrooms they create differ from non-CETP teachers and the classrooms they create? SOURCES: K-12 Students, Teachers, Classroom Observation Protocol, Rubric, Principals
  c. What outcomes have the participating higher education institutions, their faculty members, or the CETP contributed to the K-12 schools? SOURCES: CETP Teachers, Deans/Department Chairs, Principals, PI, Faculty

  9. Students
  a. Are students learning what is expected in the SMET education standards? SOURCES: K-12 Students, PI, Classroom Observation Protocol, Rubric
  b. Are there differences in student outcomes for CETP and non-CETP teachers? SOURCES: K-12 Students, Classroom Observation Protocol, PI

  10. Positives
  • Good working relationships
  • Shared expertise: website, listserv, emails, and meetings
  • Quality data collection instruments
  • Group negotiations with NSF
  • More effort available for unique aspects
  • Incentives: $15,000; printing and mailing of instruments; help with IRB, data entry, and data return

  11. Negatives
  • Instruments not matched exactly to each CETP
  • Time- and labor-intensive data collection
  • All levels of participants required to participate
  • Training required to conduct classroom observations
  • Not required by NSF

  12. Recommendations
  • Plan for the program evaluation before funding begins
  • Require participation in the evaluation
  • Consider partial participation: either only some sites participating, or some sites collecting only some data
  • Have project evaluators in the loop from the beginning
  • Substantively involve PIs
  • Have the program evaluation data play a critical role in project evaluation
