
IDEA Student Ratings of Instruction


Presentation Transcript


  1. Insight Improvement Impact® IDEA Student Ratings of Instruction Texas A&M University February/March 2013 Created by: Shelley A. Chapman, PhD, Senior Educational Consultant

  2. Plan for this Session • What Makes IDEA Unique • Conditions for Good Use • Underlying Philosophy of IDEA • Faculty Information Form • Interpreting Reports • Questions and Answers

  3. Individual Development & Educational Assessment (IDEA) • Teaching Improvement • Faculty Evaluation • Curriculum Review • Program Assessment • Accreditation

  4. Insight Improvement Impact® A Non-Profit Organization • Kellogg Grant in 1975 • Separate organization and non-profit status in 2001 • Mission: To help colleges and universities as they seek to improve teaching, learning, and leadership

  5. What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility

  6. Conditions for Good Use The instrument • Targets learning • Provides suggested action steps for teaching improvement

  7. Conditions for Good Use The Faculty • Trust the process • Value student feedback • Are motivated to make improvements

  8. Conditions for Good Use • Campus Culture • Teaching excellence - high priority • Resources to improve - provided • Student ratings - appropriate weight

  9. Conditions for Good Use The Evaluation Process • 30-50% of evaluation of teaching • 6-8 classes, more if small (<10) • Not over-interpreted (3-5 performance categories)

  10. Reflective Practice Using Individual Reports • Online and paper formats • Try new ideas • Talk with colleagues • What the reports say and what they mean • IDEA resources that are keyed to reports

  11. Underlying Philosophy of IDEA Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.

  12. Faculty Information Form

  13. Student Learning Model Specific teaching behaviors are associated with certain types of student progress under certain circumstances. (Model components: Teaching Behaviors, Student Learning, Circumstances)

  14. Student Learning Model: Diagnostic Form • Teaching Behaviors: Items 1-20 • Student Learning: Items 21-32 • Summary Items: 40-42 • Research Items: 44-47 • Up to 20 extra items • Circumstances: Students (Items 36-39, 43); Course (Items 33-35)

  15. FIF: Selecting Objectives • 3-5 as “Essential” or “Important” • Is it a significant part of the course? • Do you do something specific to help students accomplish the objective? • Does the student’s progress on the objective influence his or her grade? Be true to your course.

  16. Common Misconception #1 Students are expected to make significant progress on all 12 learning objectives in a given course.

  17. Common Misconception #2 Effective instructors need to successfully employ all 20 teaching methods in a given course.

  18. Relationship of Learning Objectives to Teaching Methods

  19. Common Misconception #3 The 20 teaching methods items should be used to make an overall judgment about teaching effectiveness. Faculty Evaluation

  20. Course Description Items (FIF) • Bottom of page 1, top of page 2 • Used for research • Best answered toward end of term • Do NOT influence your results

  21. IDEA Online

  22. IDEA Online: Student Survey Delivery • Email/course-embedded URL • Blackboard Building Block • Email reminders • Start/end dates determined by institution • Submission is confidential and limited to one per student

  23. Online Response Rates: Best Practices • Create value for student feedback • Monitor and communicate through multiple modalities (Twitter, Facebook, other) • Prepare students: talk about it; include it in the syllabus

  24. Example: Course Syllabus

  25. IDEA Online: FIF Delivery • Email delivery/reminders • Start/end dates determined by Institution • Access is unlimited while available • Questions can be added to student survey • Objectives can be copied from previously completed FIFs

  26. Copying Objectives

  27. Reflective Practice with IDEA IDEA Student Ratings of Instruction Reports

  28. Diagnostic Report Overview • How did students rate their learning experience? • What contextual factors impacted those ratings? • How do my scores compare to IDEA, discipline, and institution? • What might I do to facilitate better learning for my students next time?

  29. 1. How did Students Rate their Learning? ¹ If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.

  30. Progress on Relevant Objectives: example scores on a 5-point scale (4.3, 4.3, 4.1, 4.2, 3.6)

  31. Summary Evaluation: Five-Point Scale (weighted 50% / 25% / 25%)

  32. 2. What contextual factors impacted those scores?

  33. Adjusted Scores • Student Work Habits (#43DF) • Student Motivation (#39DF) • Class Size (Enrollment, FIF) • Student Effort (multiple items) • Course Difficulty (multiple items)

  34. 2. What contextual factors impacted those ratings?

  35. 3. How do my scores compare to: IDEA, Discipline, Institution?

  36. Comparisons (Norms): Converted Averages • Able to compare scores on the same scale • T scores: average = 50, standard deviation = 10 • They are not percentiles or percentages
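The T-score conversion described above can be sketched as follows. The formula (50 + 10 × z) is the standard T-score definition; the norm-group mean and standard deviation used in the example are hypothetical illustrations, not IDEA's published norms.

```python
# Minimal sketch of a T-score conversion (standard definition:
# mean 50, standard deviation 10). The group_mean and group_sd
# values below are hypothetical, not IDEA's actual norm data.

def to_t_score(raw_avg, group_mean, group_sd):
    """Convert a raw 5-point average to a T score: 50 + 10 * z."""
    z = (raw_avg - group_mean) / group_sd
    return 50 + 10 * z

# Example: a course average of 4.2 against a hypothetical norm group
print(round(to_t_score(4.2, group_mean=4.0, group_sd=0.4), 1))  # 55.0
```

Because T scores are standardized, a 55 means half a standard deviation above the comparison-group average; it is not a percentile.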

  37. Comparisons (Norms): Converted Averages

  38. Comparison Scores Distribution • Much Lower: 10% • Lower: 20% • Similar (gray area): 40% • Higher: 20% • Much Higher: 10%

  39. Comparison Scores

  40. 4. What might I do to facilitate better learning next time?

  41. Page 2: What did students learn?

  42. Page 3: Suggested Action Steps (e.g., items #16, #18, #19)

  43. POD-IDEA Notes (IDEA Website)

  44. POD-IDEA Notes • Background • Helpful Hints • Application for online learning • Assessment Issues • References and Resources

  45. References and Links to Helpful Resources are Provided

  46. IDEA Papers: Resources for Faculty Evaluation and Faculty Development

  47. Reflective Practice • Paper or online • Try something new • Meet with colleagues to reflect • Interpret reports • POD-IDEA Notes • IDEA Papers

  48. Questions? ideasri.tamu.edu www.theideacenter.org Visit our IDEA Help Community!

  49. Teaching Goals Inventory
