
Presentation Transcript


  1. Using Aggregate IDEA Data Across the College: Assessing Teaching Goals, Tracking Faculty Progress, and Supporting Curricular Review Chuck Phillips, PharmD, PhD IDEA User Group Meeting Chicago, IL May 30, 2010

  2. Don’t kill the assessment messenger A man in a hot air balloon realized he was lost. He reduced altitude and spotted a woman below. "Excuse me, can you help me? I promised a friend I would meet him an hour ago, but I don't know where I am."

  3. The woman below replied, "You are in a hot air balloon hovering approximately 30 feet above the ground. You are between 40 and 41 degrees north latitude and between 59 and 60 degrees west longitude."

  4. "You must be an assessment person," said the balloonist. "I am," replied the woman, "How did you know?"

  5. "Well," answered the balloonist, "everything you told me is technically correct, but I have no idea what to make of your information, and the fact is I am still lost. Frankly, you've not been much help so far."

  6. The woman below responded, "You must be a professor." "I am," replied the balloonist, "but how did you know?"

  7. "Well," said the woman, "you don't know where you are or where you are going. You have risen to where you are due to a large quantity of hot air. You made a promise which you have no idea how to keep, and you expect me to solve your problem. The fact is you are in exactly the same position you were in before we met, but now, somehow, it's my fault."

  8. Purpose
  • I'm a professor and I work with a great faculty interested in improving our program
  • Sometimes, assessment people "are the messengers"
  • Need to make IDEA results applicable to the program, chairs, and individual faculty
  • Changes need to be faculty driven, but with relevant assessment data

  9. Overview
  • Overview of our college
  • Assessing a college-wide teaching goal
  • Faculty and course tracking and uses
  • Curricular review uses
  • Sharing

  10. Drake Background
  • Professional and liberal arts university (5,600 enrollment)
  • College of Pharmacy and Health Sciences
    • Two programs (BS HSCI, PharmD)
    • 40 faculty, 860 students
  • IDEA Diagnostic evaluations
    • In use since 2004 (to improve teaching with a common tool)
    • Paper and online (35-40 courses/semester)

  11. I. Teaching Goal: "Great universities measure what they value" (anonymous)
  • How are we using our IDEA reports?
  • Teaching excellence: P&T, culture, etc.
  • Chose "Progress on Relevant Objectives" as the measure
  • Converted scores (see the sketch below)
  • Goal: 80% of courses similar to, higher, or much higher than the IDEA average
  • Started at about 65% of courses
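A minimal sketch of the converted-scores step, assuming IDEA's converted-average comparison categories on a T-score scale (mean 50); the cut points and the column name pro_cadj are illustrative assumptions, not values from the talk, and pandas stands in here for the college's actual SPSS workflow.

    # Hypothetical sketch: bin converted-adjusted PRO scores into the five
    # IDEA comparison categories, then compute the share of courses at or
    # above "Similar" (the college's 80% goal). Cut points are assumed.
    import pandas as pd

    def idea_category(score):
        # Map a converted score (T-score scale, mean 50) to a category.
        if score >= 63:
            return "Much Higher"
        if score >= 56:
            return "Higher"
        if score >= 45:
            return "Similar"
        if score >= 38:
            return "Lower"
        return "Much Lower"

    courses = pd.DataFrame({"pro_cadj": [52, 61, 44, 70, 48, 39]})  # toy data
    courses["category"] = courses["pro_cadj"].map(idea_category)
    at_or_above = courses["category"].isin(["Similar", "Higher", "Much Higher"])
    print(f"{at_or_above.mean():.0%} of courses at or above the IDEA average")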

  12. Process
  • Request the aggregate data file each semester
  • Add the data to a cumulative SPSS database
  • Add variables such as: AY, reliable results, active faculty
  • Run a crosstabs of PRO_CAdj_IDEA by AY (a rough pandas equivalent is sketched below)
  • Look into using PRO_Craw_IDEA, since professional/grad students are already motivated
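The slide describes the workflow in SPSS. A rough pandas equivalent follows; the file names, the reliability cutoff, and the N_responses column are assumptions, while PRO_CAdj_IDEA and AY come from the slide.

    # Append the new semester's aggregate file to a cumulative database,
    # derive flags, and crosstab the converted-adjusted PRO category by
    # academic year. Paths and the reliability threshold are illustrative.
    import pandas as pd

    cumulative = pd.read_csv("idea_cumulative.csv")        # running database
    semester = pd.read_csv("idea_spring2010.csv")          # new aggregate file
    semester["AY"] = "2009-10"                             # tag academic year
    semester["reliable"] = semester["N_responses"] >= 10   # assumed cutoff
    cumulative = pd.concat([cumulative, semester], ignore_index=True)
    cumulative.to_csv("idea_cumulative.csv", index=False)

    # Percent of courses in each PRO category, within each academic year
    print(pd.crosstab(cumulative["PRO_CAdj_IDEA"], cumulative["AY"],
                      normalize="columns"))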

  13. [Chart: Progress on Relevant Objectives vs. the IDEA national database; the share of courses at or above the IDEA average rose from 66.2% to 69.2%]

  14. What questions got asked afterwards?
  • Which courses are low?
    • Lower and much-lower courses were taught by a variety of faculty, from new to senior
    • Some faculty were bi-modal (strong in a favorite course, weaker in another)
  • Is "progress on relevant objectives" still the correct measure?
  • Are students taking it seriously?
  • Is the tool a good measure?

  15. Action: Faculty Development
  • What's the 'appropriate group' to review?
  • Is there education to do on selecting and teaching to these objectives? (in-house and guest speaker)
  • Are low scores related to new, first-time-taught courses?
  • Do instructors need more training?

  16. Actions
  • What to do if we don't achieve the goal?
  • Faculty development programs and faculty meetings on:
    • Soul searching on what I'm trying to achieve
    • How to choose objectives (the right ones, the right number)
    • Whether teaching methods should be adjusted (e.g., team-based learning)
    • Linking content and methods to the objective
  • Result: good discussions and a culture of assessment

  17. II. Curricular Issues
  • Annual assessment overview
  • Sharing information with faculty
  • Track data matching our culture and values:
    • How much discussion vs. lecture?
    • What are faculty asking of students?
    • What are faculty emphasizing?
    • Where are students making progress?

  18. [Chart: Class format over time. The increase in lecture is probably due to fewer individual lab sections being evaluated.]

  19. [Chart: Instructor-related course requirements rated "some" or "much" required. Reading and memorization were new categories in 2008-09.]

  20. [Chart: Percent of CPHS classes selecting each objective as Essential or Important on the Faculty Information Form (FIF)]

  21. [Chart: Student ratings of progress on objectives chosen as Essential or Important. Scale: 1 = no progress, 2 = slight, 3 = moderate, 4 = substantial, 5 = exceptional]

  22. [Chart: Amount and difficulty of course work, student ratings. Values within 0.3 are considered similar. Scale: 1 = much less than most courses, 2 = less than most, 3 = about average, 4 = more than most, 5 = much more]

  23. III. Faculty/Course Tracking
  • How can we use all of this data?
  • Reports for department chairs:
    • Track faculty progress over time
    • Track course progress over time
    • Compare an individual to the department average
  • Reports for faculty:
    • Summary for P&T portfolios
    • Tracking their own progress over time

  24. Faculty/Course Tracking
  • Run summary tables by:
    • Department and AY
    • Instructor, course number, and AY
  • Measures: PRO_Adj_Mean, Exc_Tchr_Adj_Mean, Exc_Crs_Adj_Mean, SumEval_Adj_Mean (see the sketch below)
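Continuing the pandas stand-in for the SPSS workflow, a sketch of those summary tables; the four measure names mirror the slide, while the grouping columns (Department, Instructor, Course) are assumed.

    # Mean adjusted scores by department/year and by instructor/course/year.
    import pandas as pd

    measures = ["PRO_Adj_Mean", "Exc_Tchr_Adj_Mean",
                "Exc_Crs_Adj_Mean", "SumEval_Adj_Mean"]

    df = pd.read_csv("idea_cumulative.csv")  # cumulative database from above

    # Department-level trend table for chairs
    dept_table = df.groupby(["Department", "AY"])[measures].mean().round(1)

    # Instructor/course-level trend table for individual tracking
    fac_table = df.groupby(["Instructor", "Course", "AY"])[measures].mean().round(1)

    print(dept_table)
    print(fac_table)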

  25. [Example report: department progress over time]

  26. [Example report: individual faculty progress over time]

  27. Faculty/Course Tracking
  • Helps department chairs:
    • Can show faculty how they compare to the department average (illustrated below)
    • Can show progress over time (by faculty member or by course)
    • No need to pull out old IDEA reports
  • Helps faculty:
    • Easy summary for P&T portfolios
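For the faculty-versus-department comparison a chair might hand out, a small illustrative query; the department and instructor names here are made up.

    # One instructor's adjusted PRO mean vs. the department average, by year.
    import pandas as pd

    df = pd.read_csv("idea_cumulative.csv")
    dept = df[df["Department"] == "Pharmacy Practice"]   # assumed department
    one = dept[dept["Instructor"] == "Smith"]            # hypothetical name

    report = pd.DataFrame({
        "instructor_mean": one.groupby("AY")["PRO_Adj_Mean"].mean(),
        "dept_mean": dept.groupby("AY")["PRO_Adj_Mean"].mean(),
    })
    print(report.round(1))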

  28. Pearls
  • Pulling the data takes little time (handled by the assessment office)
  • Coordinating data flow and action takes dedicated time (college committees)
  • Turns data into information
  • Makes assessment data 'visible' to faculty
  • Expect some resistance ("it's not valid," "it's not the only measure," etc.)
  • Generates rich faculty discussions on quality improvement
  • Provides a service to chairs and faculty

  29. Questions/Discussion
  chuck.phillips@drake.edu
  c/o Drake College of Pharmacy and Health Sciences
  2507 University Ave, Des Moines, IA 50311
  515-271-4980
