Improving Teacher Quality Grants, Cycle 3: External Evaluation Report

Presentation Transcript
Improving Teacher Quality Grants, Cycle 3: External Evaluation Report

December 8, 2006

University of Missouri-Columbia

Evaluation Team


Evaluation Team

Principal Investigators

Sandra Abell

Fran Arbaugh

James Cole

Mark Ehlert

John Lannin

Rose Marra

Graduate Research Assistants

Kristy Halverson

Kristen Hutchins

Zeynep Kaymaz

Michele Lee

Dominike Merle

Meredith Park Rogers

Chia-Yu Wang



Context of the Evaluation

  • Improving Teacher Quality grants program, Cycle 3, 2005-2006

  • Focus on high-need schools

  • 9 professional development projects

  • Science and Mathematics, grades 4-8


Evaluation Model

Adapted from Guskey, 2000


Purpose of Evaluation

  • Formative evaluation

  • PD environment evaluation

  • Summative evaluation

    • Participant reaction

    • Participant learning—content knowledge and inquiry

    • Participant use of knowledge

    • Organization change

    • Student learning


Methods—Formative

  • Site visits

    • Interviews: teachers and staff

    • Observations

    • Formative feedback report


Methods—PD Environment

  • Teacher Participant Data Questionnaire

  • Site visits

    • Interviews: teachers and staff

    • Observations

  • Surveys to PIs (Teaching Philosophy Survey and Seven Principles)

  • PI preliminary report


Methods—Outcomes

  • Participant reactions

    • Site visits

    • Teacher Participant Survey 1 and 2

  • Participant learning—content knowledge

    • Project-specific tests (all 9 projects)

  • Participant learning—inquiry

    • Teaching Philosophy Survey

    • Seven Principles

  • Participant use of knowledge

    • Teacher Participant Survey

    • Interviews

    • Seven Principles

    • Implementation Logs


Methods—Outcomes (cont.)

  • Organization change

    • Higher Education Impact Survey

  • Student learning

    • Teacher-assessed (3 projects)

    • Teacher Participant Survey

    • MAP analyses


Participant Summary

  • 252 participants

  • 86% female; 81% white

  • 40% held a master's degree or higher

  • 76% held their first bachelor's degree in a field other than science or math

  • Represented 76 different Missouri school districts, 6 private schools, and 2 charter schools

  • Directly impacted 16,747 students in 2005-2006





Elem/Middle/Junior High Certification Status







Results

  • PD Environment

  • Participant Reactions

  • Outcomes

    • Participant Content Knowledge

    • Participant Knowledge of Inquiry

    • Participant Use of Knowledge of Inquiry

    • Organization Change

    • Student Learning



PD Environment—PI Beliefs (n=19)

Scale: 1 = least constructivist response, 3 = neutral, 5 = most constructivist



Participant Performance on Content Knowledge—Post/Pre Tests

Posttest scores presented as a percent of pretest scores.
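The post/pre metric described above can be sketched in a few lines. The scores below are entirely hypothetical, not the evaluation's actual data; the sketch only illustrates the arithmetic of reporting each posttest score as a percent of the pretest score.

```python
# Hypothetical illustration of the post/pre metric: each posttest score
# is expressed as a percent of the corresponding pretest score, so
# values above 100 indicate gains. (All scores here are made up.)

pretest = [62.0, 55.0, 70.0, 48.0]    # hypothetical project mean pretest scores
posttest = [74.0, 66.0, 77.0, 60.0]   # corresponding posttest means

percent_of_pretest = [100.0 * post / pre
                      for pre, post in zip(pretest, posttest)]
print([round(p, 1) for p in percent_of_pretest])
```

Under this convention, a value of 120 means the posttest mean was 20% higher than the pretest mean.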


Participant Change in Inquiry Knowledge

*p < .05. **p < .01. ***p < .001


Participant Change in Inquiry Usage

*p < .05. **p < .01. ***p < .001




Organization Change—Impact on Higher Education

  • Team members from five projects responded to the HEI Survey

    • Establishment of new science courses related to the PD projects

    • Establishment of new education courses

    • Redesign of courses to include more inquiry-based labs

    • New or strengthened collaborations between education and science

    • Increased grant writing activity on campus


School-Level Performance on MAP

  • MAP Index and % Top 2 Levels

  • Served vs. not-served schools, by high-needs status

  • Science: 2005-06 performance compared to prior years’ average

  • Math: no historical comparison possible; examined performance levels by group









Summary of Results

  • Overall, teachers were satisfied with their PD experiences

    • Valued most: staff, engaging in activities as students would, the opportunity to improve content knowledge, and working with other teachers.

    • Valued least: lectures, activities geared toward a different grade level or subject matter than they taught, and loosely structured follow-up sessions with no clear purpose.


Summary of Results (cont.)

  • Assessment components were less emphasized than content and inquiry components.

  • Teachers gained content knowledge

  • Evidence of some improved teacher practice attributed to projects.

  • Student learning data mixed.

  • Evidence of impact on higher education is limited but promising in some projects.


Conclusions: Effective Project Design Features

  • Projects demonstrated effective practice to varying degrees.

  • Alignment of content emphasis areas between projects and teacher/school needs is critical.

  • Shared vision/collaboration with team implemented in a variety of ways.

  • Effective emphasis areas: learning science/math through inquiry; collegial learning with teachers; long-term PD activities; sense of community.


Conclusions (cont.)

  • The “smorgasbord” approach, while well intentioned, seemed difficult to carry out.

  • Emphasis on mathematics in the overall Cycle 3 ITQG program was somewhat limited.

  • Individual projects improve over time.

  • Balancing the evaluator’s role between the overall program and individual projects continues to be an issue.


Limitations

  • Necessity of sampling.

  • Instruments align with the overall program, not with specific projects.

  • Low overall response rates

    • Implementation Logs

    • End of Project instruments

    • Higher education impact

  • Overall evaluation vs. project specific.

  • Lack of student achievement data, and alignment problems in the data that were available.

  • Impact on evaluation due to ongoing team collaboration with PIs and K-12 partners.


Recommendations

Project Directors:

  • Continue to build strong relations among PIs and instructional staff.

  • Build stronger K-12 partnerships.

  • Balance content and pedagogy.

  • Emphasize and provide opportunities for practice and feedback on classroom assessment.

  • Encourage participation in evaluation activities.

  • Take advantage of formative feedback.

  • Use literature on best practice when designing and implementing PD.


Recommendations (cont.)

External Evaluators:

  • Explore ways to reduce participant time on evaluation.

  • Be proactive in working with PIs and K-12 organizations.

  • Continue to work with PIs through all phases of evaluation.

  • Work with MDHE to examine our roles as evaluators.


Recommendations (cont.)

MDHE:

  • Continue funding multi-year projects.

  • Encourage true partnerships via RFP wording and reward systems.

  • Require that the majority of participants be from high-needs districts.

  • Require minimum hours of PD per project.

  • Support PI cross-fertilization of best practices.


Questions

Copies of the report and Executive Summary available at:

www.pdeval.missouri.edu

