
  1. How effective is our Elementary Teacher Education Program? Two Projects: AGILE & Teacher Work Sample. Georgia A. Cobbs, Trent Atkins, The University of Montana

  2. Background • “The current political climate with its attention to teacher and student accountability and the shift in schooling from a norm-referenced, textbook driven system to a learner-centered, standards-based system has highlighted the need for a framework like work sampling that offers fodder for potential theoretical and empirical connections between preparation, teaching practices, and P-12 student learning” (Girod, Schalock, & Cohen, 2006).

  3. Teacher Education Needs • Teacher education is under scrutiny • How do we link teacher preparation to teaching practice to K-12 learning? • A complex correlation (Berliner, 2002) • Need for congruence between Higher Education and PK-12 • Need to improve literacy instruction preparation • How do we measure effectiveness?

  4. Basis for AGILE & TWS • Need to blur the line between general education & special education • Many barriers in PK-12 are the same issues found in higher education • Ultimately, many of the issues in PK-12 are the responsibility of higher education

  5. AGILE Guiding Questions • Does training, engagement, and ongoing support in RTI (Response to Intervention) have a positive impact on: #1 Student-teacher views of self-efficacy? #2 Student-teacher views of measurement? #3 Student-teacher knowledge of the principles of Effective Instructional Practices?

  6. AGILE Guiding Questions cont. #4 K-3 student reading skills? #5 K-3 student spelling skills? #6 K-3 student early numeracy skills? #7 K-3 student math computation skills?

  7. AGILE Methods • 2 professors met with 3 student teachers about 3-4 times per month • Collected data on elementary students about every 2 weeks • If students were not improving, instruction was modified • No specific strategies were prescribed (RTI)

  8. Math Computation

  9. Oral Counting

  10. K-3 Student Data cont.

  11. Implementation Highlights • School placements have been easier • Schools appreciate the training we are providing to students • More meaningful, transactional relationship with schools • Student-teachers are receiving more focused university supervision from UM faculty (Darrell and Trent)

  12. AGILE Epiphanies • The complete lack of training in intervention strategies • Taking on assessment and evidence-based intervention strategies during student teaching is too much • Assessment and evidence-based intervention strategies must be integrated into the teacher preparation curriculum

  13. Oregon Teacher Work Sample • Developed over the last 20 years • Standardized a method for TWS • Over 1,000 Student teachers • 10,000 K-12 students

  14. Why TWS at UM? • To prepare teachers to make a difference in the learning of children • To teach Teacher Candidates to make data-driven decisions • To help ensure teachers meet ethical obligations • To validate the effectiveness of Teacher Education programs

  15. Purpose of TWS • Effort to move the Teacher Education Program into a new era • Experience of documenting teaching effectiveness • Measure knowledge & skill gains of PK-12 students • Assess using a pre/post action research design.

  16. TWS Research Questions • Q1 At what grade was the most impact made? • Q2 Was there an impact difference between male and female students? • Q3 Is there an impact difference among the types of students? • Q4 Is there an impact difference among the types of lessons?

  17. Methodology • Senior-level Math Methods, 2011-2012 • Required assignment in methods • Teach a math lesson with a pre/post assessment • Email me the spreadsheet of data • Answer assigned questions

  18. Math Lesson (Researcher coded) • Book • Manipulative • Technology • Book/Manipulative • Book/Technology • Technology/Manipulative • Other

  19. Data Collection • Assignment to my UG/G students (N=102) • Part of their requirements for my course • Report back to me • Some worked in pairs • Some data were incomplete or had no clear pre/post (narrative, no test scores) • Final Teacher Candidate data N=76

  20. Suggestions for Assessment • Use the same assessment for both the pre and the post • Or the two assessments should be very similar • These could be chapter or unit tests included within a curriculum; in other cases you may be asked to use an assessment that is not part of a specific curriculum, or you may need to create your own assessment.

  21. Data Reporting • ANONYMOUS results to me • Set up a spreadsheet like the sketch below
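
The slide's spreadsheet image is not reproduced in the transcript, so what follows is a minimal sketch, assuming pandas, of what an anonymous results sheet with the variables analyzed later (grade, gender, ethnicity, learning needs, lesson type, pre/post scores) could look like. The column names are illustrative assumptions, not the actual course template.

```python
# Minimal sketch of an anonymous pre/post results sheet (pandas assumed).
# Column names are illustrative assumptions, not the actual course template.
import pandas as pd

records = pd.DataFrame(
    {
        "student": ["S01", "S02", "S03"],        # anonymous IDs, no names
        "grade": [1, 3, 5],                      # K coded as 0
        "gender": ["F", "M", "F"],
        "ethnicity": ["non-diverse", "diverse", "non-diverse"],
        "learning_needs": ["none", "special_ed", "gifted"],
        "lesson_type": ["book/manipulative", "technology", "book"],
        "pre_score": [40, 55, 62],
        "post_score": [58, 71, 80],
    }
)

# Gain score used throughout the findings slides: post minus pre.
records["gain"] = records["post_score"] - records["pre_score"]
print(records)
```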

  22. Demographics

  23. N=1169 Elementary Students

  24. Demographics

  25. Mean Score Differences by Grade Level

  26. Findings for GRADE LEVEL • To conduct analyses, grades were grouped into three categories: Early (K, 1, and 2), Middle (3 and 4), and Late (5 and 6). • When grouped this way, there were slight differences in mean scores: Early (17.25), Middle (17.54), and Late (20.05). • No statistically significant differences were found.
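
A minimal sketch, assuming pandas and SciPy rather than the authors' actual analysis tools, of how grades K-6 can be banded into Early/Middle/Late and the mean gains compared with a one-way ANOVA. The synthetic data are only there to keep the example runnable; the same approach could be applied to the learning-needs comparison later in the deck.

```python
# Sketch: band grades K-6 into Early / Middle / Late and run a one-way ANOVA
# on gain scores. Synthetic data for runnability; not the authors' script.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
df = pd.DataFrame(
    {
        "grade": rng.integers(0, 7, size=120),         # K coded as 0, grades 0-6
        "gain": rng.normal(18, 6, size=120).round(1),  # post minus pre
    }
)

bands = {0: "Early", 1: "Early", 2: "Early",
         3: "Middle", 4: "Middle",
         5: "Late", 6: "Late"}
df["grade_band"] = df["grade"].map(bands)

groups = [g["gain"].to_numpy() for _, g in df.groupby("grade_band")]
f_stat, p_value = stats.f_oneway(*groups)
print(df.groupby("grade_band")["gain"].mean().round(2))
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # slide reports no significant difference
```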

  27. Mean Differences by GENDER

  28. Findings for GENDER • Female students (18.65) scored slightly higher than male students (16.63). • Differences were not statistically significant.

  29. Mean Differences by ETHNICITY

  30. Findings for ETHNICITY • To conduct analyses, the ethnicity variable was dichotomized. • “Non-diverse” students (20.78) scored higher than diverse students (18.41). • Differences were not statistically significant.
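
For the two-group comparisons on these slides (male vs. female, and the dichotomized "diverse" vs. "non-diverse" variable), a common analysis is an independent-samples t-test on the gain scores. The sketch below assumes SciPy and uses synthetic gains; the group centers echo the slide's reported means but are otherwise made up.

```python
# Sketch of a two-group comparison of mean gains (Welch's t-test).
# Synthetic data; the 20.8 / 18.4 centers mirror the slide's reported means.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
non_diverse = rng.normal(20.8, 9, size=60)   # illustrative gain scores only
diverse = rng.normal(18.4, 9, size=40)

t_stat, p_value = stats.ttest_ind(non_diverse, diverse, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # slide reports no significant difference
```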

  31. LEARNING NEEDS

  32. Percent by LEARNING NEEDS

  33. Mean Differences by LEARNING NEEDS

  34. Findings for LEARNING NEEDS • To conduct analyses, ELL students (n = 2) were removed due to the small number. • Students in special education scored highest (22.57), students with no identified needs scored in the middle (18.19), and students identified as gifted and talented made the smallest gains (11.56). • Differences were not statistically significant.
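
A short sketch, again with made-up numbers, of the filtering step described here: dropping the ELL rows because n = 2 is too small to analyze, then comparing mean gains across the remaining learning-needs groups.

```python
# Sketch: remove the tiny ELL group, then compare mean gains by learning need.
import pandas as pd

df = pd.DataFrame(
    {
        "learning_needs": ["none"] * 5 + ["special_ed"] * 3 + ["gifted"] * 3 + ["ELL"] * 2,
        "gain": [18, 20, 17, 19, 16, 23, 22, 22, 12, 11, 12, 9, 10],
    }
)
df = df[df["learning_needs"] != "ELL"]  # drop the n = 2 ELL group
print(df.groupby("learning_needs")["gain"].mean().round(2))
```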

  35. LESSON TYPE

  36. Percent by LESSON TYPE

  37. Mean Differences by LESSON TYPE

  38. Findings for LESSON TYPE • Statistically significant differences were found among the lesson types. • Book with manipulatives had the most impact (statistically significant compared to all other lesson types). • Followed by technology combined with manipulatives (statistically significant compared to all others except the book-only lessons).


  40. Findings continued • Book by itself was the third most impactful (statistically different when compared to book with technology and to book with manipulatives) • Least impactful lesson types were • manipulatives (14.49) • technology (7.80) • book with technology (6.47)
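
The slides do not say which post-hoc procedure produced the pairwise "statistically significant" statements, so the following is only a hedged sketch of one common approach: pairwise comparisons of mean gains across the researcher-coded lesson types (SciPy assumed, synthetic data, no correction for multiple comparisons).

```python
# Sketch: pairwise comparisons of mean gains across lesson-type codes.
# Synthetic gains roughly echo the ordering reported on the slides.
from itertools import combinations

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
lesson_gains = {
    "book/manipulative": rng.normal(30.0, 8, 25),
    "technology/manipulative": rng.normal(24.0, 8, 25),
    "book": rng.normal(20.0, 8, 25),
    "manipulative": rng.normal(14.5, 8, 25),
    "technology": rng.normal(7.8, 8, 25),
    "book/technology": rng.normal(6.5, 8, 25),
}

for (name_a, a), (name_b, b) in combinations(lesson_gains.items(), 2):
    t_stat, p = stats.ttest_ind(a, b, equal_var=False)
    print(f"{name_a:>24} vs {name_b:<24} p = {p:.3f}")
```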

  41. Limitations • Not a standard assessment • Teacher Candidates not versed in Research Design • Coding of lesson type • Incomplete data: unknown demographics • Error in recording/analysis by teacher candidate or researcher

  42. Next Steps • Make a Google Form! • Standardize data collection • Teach Research Design to Teacher Candidates • Other ideas?

  43. Conclusions • Book/manipulative lessons may be most effective for teacher candidates • Need to research this more • There is a trend toward having teachers make more data-driven decisions • UM Teacher Candidates are making data-driven decisions in their teaching; perhaps more teachers will carry on as well

  44. Questions? • Comments? • Thank you!

  45. References • Berliner, D. (2002). Educational research: The hardest science of all. Educational Researcher, 31(8), 18-20. • Girod, M., Schalock, M., & Cohen, N. (2006). The teacher work sample as a context for research. Paper presented at the annual meeting of AACTE, San Diego, CA.
