
Optimizing Evaluation Quality & Cost Effectiveness: Evaluating the University of Texas Master Teacher Summer Institute (MTSI). Rahel Kahlert, MPA, Evaluator, Dana Center, University of Texas at Austin; Mary Walker, Ph.D., Director of the MTSI, University of Texas at Austin.



Presentation Transcript


  1. Optimizing Evaluation Quality & Cost Effectiveness: Evaluating the University of Texas Master Teacher Summer Institute (MTSI) Rahel Kahlert, MPA, Evaluator, Dana Center, University of Texas at Austin; Mary Walker, Ph.D., Director of the MTSI, University of Texas at Austin

  2. About the Master Teacher Summer Institute • Three weeks of training for in-service teachers, attending in teams or individually • Academic setting earning graduate credit • Inquiry/problem-based approach to learning in collaborative groups • Goal: to increase teachers’ content and pedagogical knowledge in physics

  3. About inquiry-based physics • Physics by Inquiry was developed by McDermott and the PERG at the University of Washington • In-service teachers learn to teach science as an inquiry-based process • One week each on: • Electric circuits • Optics • Kinematics (developed by Marshall & Castro)

  4. Evaluation standards used • Service orientation • Identify stakeholders • Make evaluation useful for decision makers • Cost effectiveness • Limited resources available • Maximizing resources • Balance different evaluation standards and their purpose

  5. Data sources Triangulation of data • To collect stakeholder perceptions • Application materials (use of project records to reduce data collection) • Focus groups • Open-ended surveys • Follow-up interviews • To measure teacher content knowledge • Pre-assessments • Post-assessments

  6. Stages of data collection From snapshot to continuous data collection • Previous evaluations: • One survey at the end of the MTSI • Snapshot approach • Current evaluation has several stages: • Prior to the Institute • During the Institute • At the end of the Institute • Follow-up several months later

  7. Qualitative data • Focus groups paired with surveys (pluses and minuses) • Open-ended short surveys • ++ Capture each teacher’s individual voice • ++ Provide reflection time • –– Lack interactivity • Focus groups • ++ Are interactive, making it possible to “dig deeper” • ++ Crystallize main themes and controversial issues • –– Individual voices might be overlooked • Combining both maximizes the advantages of each

  8. Quantitative data • Use of pre-existing validated assessments • Cuts the cost of developing assessments • Reduces threats to validity • Baseline data available • Quasi-control groups available • Methods • Test of the difference between two paired means • Grouping test items into clusters for detailed content analysis
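The paired-means test named on this slide can be sketched in a few lines. The scores below are hypothetical pre/post assessment percentages for illustration only, not actual MTSI data; the deck does not specify which test statistic was used, so this assumes a standard paired t statistic.

```python
# Sketch of a test of the difference between two paired means
# (paired t statistic). Scores are hypothetical, not MTSI data.
from math import sqrt
from statistics import mean, stdev

pre  = [45.0, 52.0, 38.0, 61.0, 47.0, 55.0, 43.0, 50.0]  # pre-assessment scores (%)
post = [58.0, 63.0, 49.0, 70.0, 55.0, 68.0, 51.0, 62.0]  # post-assessment scores (%)

# Each teacher serves as their own control: analyze per-teacher gains.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / sqrt(n))  # t statistic with df = n - 1
print(f"mean gain = {mean(diffs):.2f}, t = {t:.2f}, df = {n - 1}")
```

Pairing each teacher's pre- and post-scores removes between-teacher variation from the comparison, which is why the deck's use of baseline data makes this design possible.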

  9. Sample finding • Inquiry-based approach • Addresses teachers’ need for hands-on activities • Promotes visual and conceptual understanding of teachers (and students) • Follow-up evaluation shows: • Teachers used actual activities in the classroom • They shared these activities with other teachers

  10. Sample finding • The two aspects of inquiry-based physics • Helps to increase teachers’ content knowledge • Shows how teachers can use inquiry-based instruction to help students learn • Follow-up evaluation shows: • Teachers perceived that the first aspect was emphasized over the second • They were anxious about applying the method in their own classrooms

  11. Sample finding • Alignment • Topics were horizontally and vertically aligned • Close collaboration between MTSI faculty • No collaboration with MTSI Calculus • Follow-up evaluation shows: • Teachers took initiative in promoting alignment within their department • Organized professional development around alignment

  12. Sample recommendation Highlight the pedagogical aspect of inquiry-based teaching • Teach how to ask inquiry-based questions • Step back and explain why a section was taught in a certain way • Teachers were taught through the inquiry-based process, but wanted more emphasis on how to apply this process in their own classrooms

  13. Sample recommendation • Carefully address mixed-experience levels of participants • For example: Provide technology training both before and during the MTSI to address teachers’ mixed technological knowledge and interests • For example: Consider a pre-training session for basic mathematics concepts for teachers with weaker math backgrounds

  14. Next steps • Current question: • How does MTSI benefit teachers’ content knowledge and pedagogical knowledge? • Future question: • Does an increase in teacher content and pedagogical knowledge through inquiry-based MTSI have a positive impact on student learning? • Currently relying on teacher self-reports

  15. Conclusion • It is essential that the evaluator and stakeholders (e.g., the program director) work closely together • To ensure the usefulness of the evaluation • To avoid duplicating work • Several steps help cut costs • Use project records data • Use validated, existing assessments • Use focus groups rather than individual interviews

  16. Contact information Mary Walker marywalker@mail.utexas.edu Rahel Kahlert kahlert@mail.utexas.edu
