
A Funny Thing Happened on the Way to the Assessment*


Presentation Transcript


  1. A Funny Thing Happened on the Way to the Assessment* Presentation at the UMD mini-conference on assessment, September 23, 2009. Geoffrey G. Bell and Kjell Knudsen. * With apologies to Stephen Sondheim, Burt Shevelove & Larry Gelbart

  2. Purposes of our talk
  • By the end of this talk, we hope you will:
    • See how a shift from teaching-centered to assessment-centered course design generates radical rethinking of the course.
    • Understand what is meant by “course-embedded assessment measures.”
    • Have a sense of how students received the new assessment tools.

  3. Some background material
  • LSBE faculty have been assessing students for several years now in conjunction with AACSB reaccreditation efforts.
  • In the MBA program, one of the chief places we conduct assessment is MBA8411 - Policy Formulation & Implementation.
  • This course is a “capstone” course, normally the last one MBA students take before graduation.

  4. Traditional course design & assessment
  • Many of us have a “lead a horse to water” mentality about teaching and learning.
  • Result: the “traditional” method of course design and assessment:
    • Faculty decide what students should know; assessments are appended at the end as an add-on.
  • Problem: assessment becomes an additional burden on the instructor and may be unrelated to the course itself.

  5. Our approach
  • We planned our courses in close cooperation with Dr. Pat Borchert, who taught the course the prior year, and we built on her experience in assessment.
  • We decided early on to start with the assessment: develop the tool first, and build the course from there.
  • That is, we decided what tool we would use to assess student learning, and only then did we design the rest of our courses.

  6. Our assessment instrument
  • We used a “living case study” with a detailed rubric.
  • Students analyzed the airline industry and one firm in the industry, and presented their “recommendations to management.”
  • We assessed their performance with a detailed rubric (approximately 30 pages), which we developed alongside the assignment itself.
  • Students were given the rubric along with the assignment.

  7. The assessment tool is course-embedded!
  • One key thing to note is that our tool is a course-embedded measure.
  • An extensive case analysis is normally used in this course, so no additional burden was placed on us!
  • The measure was embedded directly in the course.
  • The assessment items in the rubric were a subset of the total assignment and would have been completed by students as part of their assignment even without the assessment.

  8. An example of the correspondence of learning objectives and measures

  9. A sample rubric page

  10. Key notes about the rubric
  • The rubric had clear behaviorally anchored scales, not just labels like “exemplary,” “competent,” and “developing.” We defined in terms of specific behaviors what “exemplary,” “competent,” and “developing” meant, which increased inter-rater reliability.
  • The actual assessed items were only a subset of the total rubric. That is, the assignment was broader in scope than the assessment.

  11. Our accidental revelations
  • When we completed designing the assessment tool, we made two unexpected discoveries:
    • The detailed case became the outline for the course itself.
    • We had shifted from asking “What do we want students to know?” to asking “What do we want students to do?” As students do what we want, they will demonstrate they’ve learned what we want them to know!

  12. Results of these revelations
  • We were better able to distinguish “core” knowledge from “peripheral” knowledge. The core knowledge was embedded in the assessment tool. (Why do we want students to know it if we’re not assessing it?)
  • The focus of the course shifted to a “learning by doing” mode. Students still learned the material, but they learned it specifically by working on an important task.

  13. Student response to the assignment
  • Students found the assignment very interesting but very time-consuming.
  • Some students wondered whether the company studied would pay for their advice!

  14. Student response to the rubric
  • Students made extensive use of the rubric in preparing their assignments.
  • Students reported that the rubric:
    • Guided them through the assignment and clarified what was expected of them.
    • Constrained their creativity and “out of the box” thinking.
    • Was too long and detailed.
  • Students thought one solution would be to present our expectations for “exemplary” performance and let them figure out the rest.

  15. Your take-aways from the session
  • Assessment can be much less burdensome to faculty when it is embedded in the course rather than added on at the end.
  • A well-designed assessment tool can aid in course design and help instructors and students distinguish between core and peripheral course content.
  • A behavior-based rubric increases the inter-rater reliability of assessments of student performance.
  • The rubric guided students through the assignment and helped them fully understand the assignment’s expectations, but it tended to constrain their creativity and out-of-the-box thinking.

  16. Final thoughts
  • Assessment doesn’t need to be burdensome.
  • It can facilitate both student learning and your teaching!
