
Using IDEA for Faculty Evaluation


Presentation Transcript


  1. Using IDEA for Faculty Evaluation Shelley A. Chapman, PhD Texas A & M University February 2013

  2. Plan for this Session • “Teaching Effectiveness” What it is • Uniqueness of IDEA • Conditions for the Good Use of IDEA • 3-Phase Process for Faculty Evaluation • Using Reports to Improve Teaching

  3. Teaching Effectiveness • Most surveys ask: How well do the instructor’s methods resemble those of a “model” teacher? • IDEA asks: How well do students rate their progress on the types of learning the instructor targeted?

  4. What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility

  5. Conditions for Good Use The instrument • Focuses on learning • Provides suggested action steps

  6. Conditions for Good Use The Faculty • Trust the process • Value student feedback • Are motivated to make improvements

  7. Conditions for Good Use The Campus Culture • Teaching excellence is a high priority • Resources to improve are provided • Student ratings are given appropriate weight

  8. Conditions for Good Use The Evaluation Process • Student ratings count for 30-50% of the evaluation of teaching • Based on 6-8 classes, more if classes are small (<10) • Not over-interpreted (3-5 performance categories)

  9. Underlying Philosophy of IDEA Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.

  10. Faculty Information Form

  11. Diagnostic Report Overview • How did students rate their learning experience? • What contextual factors impacted those ratings? • How do my scores compare to IDEA, my discipline, and my institution? • What might I do to facilitate better learning for my students next time?

  12. What the Report Can Provide • Suggested Action Steps • Context: Variables and Comparisons • Calculation of Scores

  13. Using IDEA As Part of a Faculty Evaluation Process

  14. Multiple Forms of Assessment: A Balanced Plan for Summative Evaluation

  15. Evidence of Good Teaching • Syllabi • Graphic Organizers • Assignments and project descriptions • Rubrics • Written Teaching Philosophy/Reflections • Samples of Student Work • CATs (Classroom Assessment Techniques) and results

  16. Evidence of Good Teaching • Classroom Observation • Classroom Visitation • Invited Presentations • Alumni Surveys • Focus Groups of Graduating Students

  17. Classroom Observations

  18. Classroom Observations

  19. Flow of Communication Map: seating diagram of the instructor and students (marked M or F) used to chart who speaks to whom during class.

  20. Evidence of Good Teaching: Administer Appropriately • Collect 6-8 reports (more if class size is <10) • Weight IDEA results at 30-50% of the overall evaluation

  21. Evidence of Good Teaching • Student comments: formative use • Be mindful of the standard error of measurement (±.3) • Use 3-5 performance categories
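A rough sketch of what the ±.3 standard error of measurement implies in practice: two averages whose error bands overlap should not be treated as meaningfully different. The threshold logic and the example scores below are illustrative, not taken from an actual report.

# Illustrative only: treat two course averages as "different" only when their
# +/-0.3 standard-error bands do not overlap (example scores are made up).
SEM = 0.3

def meaningfully_different(score_a: float, score_b: float, sem: float = SEM) -> bool:
    """True when the +/-sem bands around the two averages do not overlap."""
    return abs(score_a - score_b) > 2 * sem

print(meaningfully_different(4.1, 4.3))  # False: within measurement error
print(meaningfully_different(3.6, 4.4))  # True: likely a real difference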

  22. Three-Phase Process for Faculty Evaluation

  23. Three-Phase Process for Faculty Evaluation • I. Set Expectations • What does this entail regarding IDEA?

  24. Criterion-Referenced Standards • Use averages on the 5-point scale • Recognize that some objectives are more difficult to achieve • “Authenticate” the objectives (report page 1)

  25. Norm-Referenced Standards Use Converted Averages • IDEA • Discipline • Institution

  26. Comparison Information: Converted Averages

  27. T-Score Distribution (the gray band marks the Similar range): Much Lower 10%, Lower 20%, Similar 40%, Higher 20%, Much Higher 10%
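Converted averages are T-scores (mean 50, standard deviation 10), so the 10/20/40/20/10 split corresponds to cutoffs of roughly 37, 45, 55, and 63. The sketch below classifies a converted average using those approximate cutoffs; the exact boundaries used on IDEA reports may differ slightly.

# Sketch: map a converted average (T-score) to the five comparison categories.
# Cutoffs are approximations derived from the 10/20/40/20/10 split shown above,
# not official IDEA boundaries.
def comparison_category(converted_avg: float) -> str:
    if converted_avg >= 63:
        return "Much Higher"  # top ~10%
    if converted_avg >= 56:
        return "Higher"       # next ~20%
    if converted_avg >= 45:
        return "Similar"      # middle ~40% (gray band)
    if converted_avg >= 38:
        return "Lower"        # next ~20%
    return "Much Lower"       # bottom ~10%

print(comparison_category(57))  # Higher
print(comparison_category(48))  # Similar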

  28. Comparison Scores

  29. Create Categories of Performance • Five-category example: Below Acceptable Standards; Marginal, Needs Improvement; Meets Expectations; Exceeds Expectations; Outstanding • Three-category example: Does Not Meet Expectations; Meets Expectations; Exceeds Expectations

  30. Performance Categories: EXAMPLE

  31. Three-Phase Process for Faculty Evaluation II. Collect Data What do you look for regarding IDEA?

  32. Improving Online Response Rates • Create value for student feedback • Monitor and communicate through multiple modalities (Twitter, Facebook, other) • Prepare students: talk about it and put it in the syllabus

  33. Example: Course Syllabus

  34. For Personnel Decisions (report pages 1 and 2): What were students’ perceptions of the course and their learning?

  35. Things to Consider… Were the appropriate objectives selected? • How many? • Do they match the course? • How might you “authenticate” the objectives selected?

  36. FIF: Selecting Objectives • Select 3-5 as “Essential” or “Important” • Is it a significant part of the course? • Do you do something specific to help students accomplish the objective? • Does the student’s progress on the objective influence his or her grade? • Be true to your course.

  37. Things to Consider… What were the students’ perceptions of their course and their learning?

  38. How Did Students Rate Their Learning?

  39. How Did Students Rate Their Learning? (Note: if you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.)

  40. Progress on Relevant Objectives: report excerpt showing individual objective averages (4.3, 4.3, 4.1, 4.2, 3.6) on the 5-point scale.
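In IDEA reporting, Progress on Relevant Objectives combines the per-objective averages into one score, with Essential objectives typically weighted twice as heavily as Important ones. A minimal sketch of that weighting, using made-up ratings rather than the excerpt above:

# Sketch: weighted average of progress ratings, with Essential objectives
# counted double Important ones (ratings below are invented for illustration).
def progress_on_relevant_objectives(objectives):
    """objectives: list of (average_rating, 'Essential' or 'Important')."""
    weights = {"Essential": 2, "Important": 1}
    total = sum(avg * weights[kind] for avg, kind in objectives)
    return total / sum(weights[kind] for _, kind in objectives)

example = [(4.3, "Essential"), (4.1, "Essential"), (3.6, "Important")]
print(round(progress_on_relevant_objectives(example), 2))  # 4.08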

  41. Summary Evaluation: Five-Point Scale (weighted 50% / 25% / 25%)
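As the percentages suggest, the summary evaluation is a weighted average in which Progress on Relevant Objectives counts for half and the overall teacher and course ratings for a quarter each. A minimal sketch with illustrative scores (the component names are my reading of the slide, not a quotation from it):

# Sketch: summary evaluation as a 50/25/25 weighted average (scores invented).
def summary_evaluation(progress_on_objectives, excellent_teacher, excellent_course):
    return (0.50 * progress_on_objectives
            + 0.25 * excellent_teacher
            + 0.25 * excellent_course)

print(summary_evaluation(4.1, 4.4, 4.2))  # 4.2 on the five-point scale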

  42. Understanding Adjusted Scores

  43. Impact of Extraneous Factors • Gaining Factual Knowledge – Average Progress Ratings (Technical Report 12, page 40). Rows: Work Habits (Item 43); columns: Student Motivation (Item 39).
                High    High Avg.  Average  Low Avg.  Low
    High        4.48    4.38       4.28     4.13      4.04
    High Avg.   4.38    4.29       4.14     3.96      3.76
    Average     4.28    4.14       4.01     3.83      3.64
    Low Avg.    4.15    4.05       3.88     3.70      3.51
    Low         4.11    3.96       3.78     3.58      3.38

  44. Impact of Extraneous Factors (same table, extreme cells highlighted): classes high in both motivation and work habits average 4.48 progress on Gaining Factual Knowledge, while classes low in both average 3.38.
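A toy illustration of why adjustment matters, not IDEA's actual method (which relies on the regression models in Technical Report 12): compare a class's raw average to the rating expected for its motivation and work-habits profile, then credit or debit the difference. The expected values are the table cells quoted above; using the Average/Average cell as the overall mean is an assumption made only for this sketch.

# Toy illustration: classes full of motivated, hard-working students report
# more progress regardless of teaching, so raw averages are shifted by the
# gap between the overall mean and the expected rating for the class profile.
# Not the real IDEA adjustment, which uses regression (Technical Report 12).
EXPECTED = {
    ("High", "High"): 4.48,        # high motivation, high work habits
    ("Average", "Average"): 4.01,  # used here as the overall mean
    ("Low", "Low"): 3.38,          # low motivation, low work habits
}

def illustrative_adjusted(raw_avg, motivation, work_habits, overall_mean=4.01):
    expected = EXPECTED[(motivation, work_habits)]
    return raw_avg + (overall_mean - expected)

print(round(illustrative_adjusted(4.2, "Low", "Low"), 2))    # 4.83 (credited)
print(round(illustrative_adjusted(4.2, "High", "High"), 2))  # 3.73 (debited)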

  45. Raw or Adjusted Scores

  46. When to Use Adjusted Scores • Do raw scores meet or exceed expectations?* If yes, use raw scores. • If no, are adjusted scores lower or higher than raw scores? If higher, use adjusted scores; if lower, use raw scores. *Expectations defined by your unit.
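A small sketch of that decision chart as reconstructed above: keep raw scores when they already meet your unit's expectations; otherwise use the adjusted scores only when they are higher than the raw scores. The 4.0 threshold below is a made-up example of a unit-defined expectation.

# Sketch of the slide's decision logic for choosing raw vs. adjusted scores.
def scores_to_use(raw, adjusted, meets_expectations):
    if meets_expectations(raw):
        return "raw"  # raw scores already acceptable, no adjustment needed
    return "adjusted" if adjusted > raw else "raw"

meets = lambda score: score >= 4.0  # example unit-defined expectation
print(scores_to_use(4.2, 3.9, meets))  # raw
print(scores_to_use(3.7, 4.0, meets))  # adjusted
print(scores_to_use(3.7, 3.5, meets))  # raw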

  47. Three-Phase Process for Faculty Evaluation III. Use Data Which data will you use and how?

  48. IDEA Faculty Worksheet • Keep track of reports • Look for longitudinal trends • Use for promotion and tenure (Created by Pam Milloy, Grand View University; available from The IDEA Center website)

  49. Using the Data • Summative (pp. 1-2): criterion- or norm-referenced; adjusted or raw scores; categories of performance; 30-50% of the teaching evaluation; 6-8 classes (more if small) • Formative (p. 3): identify areas to improve; access applicable resources from the IDEA website; read and have conversations; implement new ideas

  50. Reflective Practice • Try new ideas (online, paper) • Talk with colleagues about what the reports say and what they mean • Use IDEA resources that are keyed to the reports
