Insight Improvement Impact® IDEA Student Ratings of Instruction Texas A & M University February/March 2013 Created by: Shelley A. Chapman, PhD, Senior Educational Consultant
Plan for this Session • What Makes IDEA Unique • Conditions for Good Use • Underlying Philosophy of IDEA • Faculty Information Form • Interpreting Reports • Questions and Answers
IDEA: Individual Development and Educational Assessment • Teaching Improvement • Faculty Evaluation • Curriculum Review • Program Assessment • Accreditation
Insight Improvement Impact® A Non-Profit Organization • Kellogg Grant in 1975 • Separate organization and non-profit status in 2001 • Mission: To help colleges and universities as they seek to improve teaching, learning, and leadership
What makes IDEA unique? • Focus on Student Learning • Focus on Instructor’s Purpose • Adjustments for Extraneous Influences • Validity and Reliability • Comparison Data • Flexibility
Conditions for Good Use The instrument • Targets learning • Provides suggested action steps for teaching improvement
Conditions for Good Use The Faculty • Trust the process • Value student feedback • Are motivated to make improvements
Conditions for Good Use The Campus Culture • Teaching excellence: a high priority • Resources to improve: provided • Student ratings: given appropriate weight
Conditions for Good Use The Evaluation Process • 30-50% of evaluation of teaching • 6-8 classes, more if small (<10) • Not over-interpreted (3-5 performance categories)
Reflective Practice Using Individual Reports • Online, Paper • What the reports say and what they mean • IDEA resources that are keyed to reports • Talk with colleagues • Try new ideas
Underlying Philosophy of IDEA Teaching effectiveness is determined primarily by students’ progress on the types of learning the instructor targets.
Student Learning Model Specific teaching behaviors are associated with certain types of student progress under certain circumstances. Teaching Behaviors → Student Learning, under given Circumstances
Student Learning Model: Diagnostic Form • Teaching Behaviors: Items 1-20 • Student Learning: Items 21-32 • Summary Items: 40-42 • Research Items: 44-47 • Up to 20 extra items • Circumstances: Students (Items 36-39, 43), Course (Items 33-35)
FIF: Selecting Objectives • 3-5 as “Essential” or “Important” • Is it a significant part of the course? • Do you do something specific to help students accomplish the objective? • Does the student’s progress on the objective influence his or her grade? Be true to your course.
Common Misconception #1 Students are expected to make significant progress on all 12 learning objectives in a given course.
Common Misconception #2 Effective instructors need to successfully employ all 20 teaching methods in a given course.
Common Misconception #3 The 20 teaching methods items should be used to make an overall judgment about teaching effectiveness. Faculty Evaluation
Course Description Items (FIF) Bottom of page 1, top of page 2 • Used for research • Best answered toward the end of the term • Do NOT influence your results
IDEA Online: Student Survey Delivery • Email/course-embedded URL • Blackboard Building Block • Email reminders • Start/end dates determined by institution • Submission is confidential and restricted to one per student
Online Response Rates – Best Practices • Create value for student feedback • Monitor and Communicate through multiple modalities: • Twitter • Facebook • Other • Prepare Students • Talk about it • Syllabus
IDEA Online: FIF Delivery • Email delivery/reminders • Start/end dates determined by Institution • Access is unlimited while available • Questions can be added to student survey • Objectives can be copied from previously completed FIFs
Reflective Practice with IDEA IDEA Student Ratings of Instruction Reports
Diagnostic Report Overview • How did students rate their learning experience? • What contextual factors impacted those ratings? • How do my scores compare to IDEA, discipline, and institution? • What might I do to facilitate better learning for my students next time?
1. How did Students Rate their Learning?
¹ If you are comparing Progress on Relevant Objectives from one instructor to another, use the converted average.
Progress on Relevant Objectives (example): Essential objectives are double-weighted, so an Essential objective rated 4.3 counts twice: (4.3 + 4.3 + 4.1 + 4.2 + 3.6) ÷ 5 = 4.1
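The worked average above can be sketched in a few lines of Python. This is a minimal illustration of IDEA's weighting convention (Essential objectives count twice, Important objectives once); the ratings are illustrative, not taken from a real report.

```python
# Illustrative sketch of a "Progress on Relevant Objectives" score.
# Assumes IDEA's convention: Essential objectives are double-weighted (2),
# Important objectives are single-weighted (1).

def progress_on_relevant_objectives(ratings):
    """ratings: list of (average_progress_rating, weight) pairs,
    where weight is 2 for Essential and 1 for Important objectives."""
    total = sum(rating * weight for rating, weight in ratings)
    count = sum(weight for _, weight in ratings)
    return round(total / count, 1)

# Example: one Essential objective (4.3, counted twice) and three
# Important objectives (4.1, 4.2, 3.6) -- matching the slide's numbers.
ratings = [(4.3, 2), (4.1, 1), (4.2, 1), (3.6, 1)]
print(progress_on_relevant_objectives(ratings))  # → 4.1
```

The double weight simply repeats the Essential rating in both the numerator and the divisor, which is why 4.3 appears twice in the slide's sum.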
Summary Evaluation: Five-Point Scale Weights: 50% / 25% / 25%
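The 50/25/25 weights combine three ratings into one five-point summary score. The sketch below assumes the components are Progress on Relevant Objectives (50%), the "excellent teacher" rating (25%), and the "excellent course" rating (25%), consistent with IDEA's published summary formula; the input scores are illustrative.

```python
# Hedged sketch of the five-point Summary Evaluation as a weighted average.
# Assumed component mapping: Progress on Relevant Objectives 50%,
# "Excellent Teacher" 25%, "Excellent Course" 25%. Inputs are illustrative.

def summary_evaluation(progress, excellent_teacher, excellent_course):
    return round(0.50 * progress
                 + 0.25 * excellent_teacher
                 + 0.25 * excellent_course, 1)

print(summary_evaluation(4.1, 4.3, 4.3))  # → 4.2
```

Because progress carries half the weight, a change in the Progress on Relevant Objectives score moves the summary twice as much as an equal change in either of the two global ratings.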
Adjusted Scores • Student Work Habits (#43DF) • Student Motivation (#39DF) • Class Size (Enrollment, FIF) • Student Effort (multiple items) • Course Difficulty (multiple items)
3. How do my scores compare to: IDEA, Discipline, Institution?
Comparisons (Norms): Converted Averages • Allow scores to be compared on the same scale
T Scores • Average = 50 • Standard Deviation = 10 • Not percentiles or percentages
Comparison Score Distribution • Much Lower: 10% • Lower: 20% • Similar (gray area): 40% • Higher: 20% • Much Higher: 10%
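A converted average is a T score: a linear rescaling of the raw average against the norm group so the mean becomes 50 and the standard deviation 10. The sketch below converts a raw average and bins it into the five comparison categories. The cut points are assumptions chosen to approximate the 10/20/40/20/10 split above under a normal distribution, not IDEA's official thresholds; consult the report itself for the real boundaries.

```python
# Sketch: raw average -> T score (mean 50, SD 10), then binned into
# comparison categories. Cut points are illustrative assumptions that
# roughly yield 10/20/40/20/10 splits under a normal distribution.

def t_score(raw, norm_mean, norm_sd):
    """Linear rescaling onto the T-score scale (mean 50, SD 10)."""
    return 50 + 10 * (raw - norm_mean) / norm_sd

def category(t):
    """Bin a T score into one of the five comparison categories."""
    if t < 37:
        return "Much Lower"
    if t < 45:
        return "Lower"
    if t <= 55:
        return "Similar"
    if t <= 63:
        return "Higher"
    return "Much Higher"

# Example: raw average 4.3 against a hypothetical norm group
# with mean 3.9 and standard deviation 0.5.
t = t_score(4.3, 3.9, 0.5)
print(round(t), category(t))  # → 58 Higher
```

Note that T scores are positions relative to the norm group, not percentages: a T score of 58 does not mean 58% of anything, only that the score sits 0.8 standard deviations above the comparison-group mean.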
Page 3: Suggested Action Steps • Items #16, #18, #19
POD-IDEA Notes • Background • Helpful Hints • Application for online learning • Assessment Issues • References and Resources
References and Links to Helpful Resources are Provided
IDEA Papers: Resources for Faculty Evaluation and Faculty Development
Reflective Practice • Paper or Online • Interpret reports • POD-IDEA Notes • IDEA Papers • Meet with colleagues to reflect • Try something new
Questions? • ideasri.tamu.edu • www.theideacenter.org • Visit our IDEA Help Community!