
Using Student Ratings to Improve Program Quality and Student Learning





Presentation Transcript


  1. Using Student Ratings to Improve Program Quality and Student Learning Dr. Randi Hagen, Flagler College Dr. Sarah Logan, Angelo State University Dr. William Pallett, The IDEA Center Dr. Barry Stein, Tennessee Tech University

  2. Introduction: IDEA Student Ratings • Presenters and content: • Bill Pallett: IDEA & Program Assessment • Randi Hagen: Student Learning Outcomes • Barry Stein: QEP at Tennessee Tech • Sarah Logan: Involving the Campus • Feel free to ask questions during the presentations. • At the end, we want to hear your campus stories.

  3. IDEA and Program Assessment: Aggregating Data • Assessment Questions: IDEA as Supporting Evidence • Are course emphases consistent with stated curricular purposes? • Do courses’ overall student progress ratings compare favorably to courses at other institutions? • When a learning objective is selected as Essential or Important, does student self-report of learning meet our expectations?

  4. Assessment Questions • What teaching methods might we employ more effectively to support student learning? • How do students’ work habits, motivation, etc. compare to those of students at other institutions? • How do students view course work demands? • What factors do instructors report have a positive/negative influence on student learning?

  5. Longitudinal Questions • Do results change over time in the desired direction? • Does IDEA provide supporting evidence that innovations and interventions have been successful?

  6. Student Learning Outcomes What should a student know and do as a result of taking this course at Flagler College?

  7. IDEA Summary of Your Teaching Effectiveness

  8. Instructor’s Progress on Specific Objectives

  9. Areas to Improve Your Teaching Effectiveness

  10. Getting Started Improving Your Teaching Effectiveness • Design your college’s professional development program around identified faculty weaknesses • Individual Use: POD-IDEA Center Notes • Individual or Group Use: IDEA Papers • IDEA Seminars

  11. Use of Data File • Progress on Relevant Objectives: progress on those objectives selected by the instructor as Important or Essential to this class. 1 = No apparent progress 2 = Slight progress 3 = Moderate progress 4 = Substantial progress 5 = Exceptional progress
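The 1–5 scale above lends itself to simple aggregation when ratings are exported for program-level review. The sketch below is a hypothetical illustration only; the objective labels, dictionary layout, and scores are assumptions for the example, not the actual IDEA data-file format:

```python
# Hypothetical sketch: aggregating progress-on-objectives ratings
# using the 1-5 scale described on slide 11. The objective names and
# scores below are invented for illustration.

ratings = {
    "Gaining factual knowledge": [4, 5, 3, 4, 4],
    "Learning to apply course material": [3, 4, 4, 5, 3],
}

# Labels for the 1-5 progress scale from the slide.
SCALE = {1: "No apparent", 2: "Slight", 3: "Moderate",
         4: "Substantial", 5: "Exceptional"}

for objective, scores in ratings.items():
    mean = sum(scores) / len(scores)
    # Report the mean and the nearest scale label.
    print(f"{objective}: mean {mean:.2f} ({SCALE[round(mean)]} progress)")
```

A department might run a calculation like this per course and compare the means against an external benchmark or prior terms, as the later slides on group reports describe.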

  12. Quality Enhancement Plan (QEP) • QEP and SACS Accreditation • IDEA • NSSE

  13. NSSE and IDEA Relationship

  14. Using IDEA Results for the QEP at Tennessee Tech University • Selecting a QEP Topic • Assessment Plan for the QEP • Measuring Progress • Identifying Problem Areas

  15. IDEA Teaching Evaluation Instrument • Frequency Goals Selected • Progress on Goals

  16. Frequency IDEA Objectives Selected

  17. Progress on IDEA Teaching Objectives

  18. Assessment Plan Options - IDEA Cost

  19. IDEA Works with Other Assessments • Enrolled Student Surveys (NSSE) • Alumni Surveys • Employer Surveys • Performance Measures*

  20. Involving the Campus: Angelo State University • Diagnostic form features • Improvement of teaching • Programs add items of interest to satisfy accrediting agencies and answer internal questions • Group reports • External comparisons at a point in time • Internal comparisons across time

  21. Using Group Reports • Campus-wide: the Provost reviews the university and college reports and compares results from multiple years • Deans review college and department results • University, college, and department results for Excellent Teacher, Excellent Course, and Progress on Objectives are on a network drive for everyone to use.

  22. Academic Departments • Meet to compare current department ratings to • External benchmark • Former department ratings • Current college ratings • Questions • In what ways are comparisons meaningful? • With what level of ratings are we satisfied? • What are the reasons for lower-than-anticipated ratings? • In what areas do we want to improve?

  23. Academic Programs • Faculty teaching the same course discuss choice of objectives (and list them on syllabi). • Faculty with highly structured curricula outline objectives for each course level. • Questions • What objectives are appropriate for certain courses? • In what ways do objectives differ for upper- vs. lower-level courses so that students receive a well-rounded educational experience?

  24. Summary: IDEA Student Ratings • Assessment • Students’ perceived learning on course goals • Efficacy of instructors’ teaching methods • Provide evidence for • Review and improvement of the core curriculum and academic programs • Accreditation and other reporting • Faculty development
