
Welcome to the UA Assessment Showcase 2012!

Sponsored by the Office of the Provost, the Assessment Coordinating Council (ACC), and the Office of Instruction and Assessment. Actionable Assessment of Academic Programs: Principles and Practices for Usable Results. Jo Beld.


Presentation Transcript


  1. Welcome to the UA Assessment Showcase 2012! Sponsored by the Office of the Provost, the Assessment Coordinating Council (ACC), and the Office of Instruction and Assessment

  2. Actionable Assessment of Academic Programs: Principles and Practices for Usable Results. Jo Beld, Professor of Political Science, Director of Evaluation & Assessment. Assessment Showcase, April 17, 2012

  3. Agenda • Principles: Conceptual frameworks • Practices: Making assessment useful • Practices: Engaging faculty • Politics and policy: The bigger picture

  4. Conceptual frameworks Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users

  5. Conceptual frameworks Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”

  6. Conceptual frameworks Traditional assessment design: • Choose an assessment instrument • Gather and summarize evidence • Send a report to someone

  7. Conceptual frameworks Backward assessment design: • Identify intended users and uses • Define and locate the learning • Choose an assessment approach

  8. Conceptual frameworks Once you’ve defined your outcomes, start planning your assessment project here

  9. Making assessment useful Studio art major • Developed evaluation form for senior exhibit that doubles as assessment instrument • Addressed disconnect between student and faculty criteria for artistic excellence • Revised requirements for the major • Refocused common foundation-level courses

  10. Making assessment useful Chemistry major • Using ACS exam as final in Chem 371: Physical Chem • Students outperform national average and do well in kinetics despite limited coverage in course • Chem 371 being retooled to focus on thermodynamics and quantum mechanics

  11. Making assessment useful History major [chart: % with “exemplary” ability] • Gathering evidence in 2011-12 (voluntarily!) to examine sequencing in the major • Examining ability to understand and work with historiography in new intermediate seminars for the major

  12. Making assessment useful Statistics concentration • Collaboratively designed final exam question and grading rubric in Stats 270 to examine interpretation and communication of results; two faculty graded essays • Instructor adjusted teaching in response to findings

  13. Making assessment useful Management Studies concentration • Quiz scores: Teams outperform best individual students • Course evaluations: Students believe they learned “much” or an “exceptional amount” by working together in teams (73%) • Team-based learning being extended to other courses [chart: Mean Results – Management Studies 251 Course Quizzes]

  14. Making assessment useful Interdisciplinary programs • Collaboratively developed assessment questionnaire • Considering direct assessment of interdisciplinary proficiency using common rubric with program-level portfolio • Will consider whether all programs should have capstone course or experience

  15. Making assessment useful Benefits for individual courses: • Setting priorities for content/instruction • Revising/expanding assignments • Clarifying expectations for students • Enhancing “scaffolding” • Piloting or testing innovations • Affirming current practices

  16. Making assessment useful Benefits for the program as a whole: • Strengthening program coherence • Sending consistent messages to students • Revising program requirements • Extending productive pedagogies • Affirming current practices

  17. Making assessment useful More program benefits: • Telling the program’s story to graduate schools and employers • Enhancing visibility to disciplinary and inter-disciplinary associations • Supporting grant applications • Meeting requirements for specialized accreditation

  18. Making assessment useful Benefits for faculty members: • Efficiencies in curriculum and instruction • Confidence that what you’re doing is working • Collaboration and collegiality within and across departments • Professional development for early faculty • Better integration of adjunct faculty

  19. Making assessment useful How might assessment be useful for an individual course, your program as a whole, or your faculty colleagues?

  20. Engaging faculty • Consider your colleagues • De-mystify assessment • Reduce costs and enhance rewards

  21. Engaging faculty Consider your colleagues: Faculty roles, commitments, and disciplinary identities offer both incentives and disincentives to engage with assessment

  22. Engaging faculty Your colleagues as practitioners of their disciplines: Studio Art

  23. Engaging faculty Your colleagues as practitioners of their disciplines: Chemistry

  24. Engaging faculty Your colleagues as practitioners of their disciplines: Political Science

  25. Engaging faculty Demystifying assessment: • Not scholarship of teaching and learning • Not individual teaching evaluation • Not student satisfaction data • Not necessarily quantitative • Not rocket science (unless that’s what you teach!)

  26. Engaging faculty “Direct” assessment: Evidence of what students actually know, can do, or care about • “Indirect” assessment: Evidence of learning-related experiences or perceptions

  27. Engaging faculty Common direct assessment “artifacts” • Theses, papers, essays, abstracts • Presentations and posters • Oral or written examination items • Responses to survey or interview questions that ask for examples of knowledge, practice, or value

  28. Engaging faculty Common indirect assessment “artifacts” • Course mapping, course-taking patterns or transcript analysis • Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences • Reflective journals

  29. Engaging faculty But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?

  30. Engaging faculty Grading summarizes many outcomes for one student; assessment summarizes one outcome for many students

  31. Engaging faculty The purpose of assessment is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes

  32. Engaging faculty Reducing the costs of assessment • Use what you’ve already got • Borrow freely • Integrate assessment into work you are already doing • Share the work broadly • Limit your agenda

  33. Engaging faculty Reaping the rewards of assessment • Address questions that matter to faculty • Build in collaboration • Pair direct with indirect methods • Choose approaches that “multi-task” • Dedicate time for discussion and application

  34. Engaging faculty Plan intentionally for use of results • Borrow strategies from past successes in collective departmental action • Focus reporting on planned actions, not on the evidence itself • Weight Watchers, not the Biggest Loser • Dedicate time and resources for action

  35. Engaging faculty What can you do in your program to: • Link assessment to faculty identities and incentives • De-mystify assessment • Reduce costs OR • Enhance benefits?

  36. The bigger picture

  37. The bigger picture Accreditation by a federally-recognized accreditation agency is required for access to federal student aid • Recognition requires accreditors to evaluate whether an institution maintains clearly-specified educational objectives and is successful at meeting them.

  38. The bigger picture Guiding values of new HLC criteria: “A commitment to assessment would mean assessment at the program level that proceeds from clear goals, involves faculty at all points in the process, and analyzes the assessment results; it would also mean that the institution improves its programs…on the basis of those analyses.”

  39. The bigger picture

  40. Table Talk – Questions for Dr. Beld Table Facilitators (a.k.a. FLC members): • Paul Blowers, Chemical & Environmental Engineering • Eliud Chuffe, Spanish & Portuguese • Faiz Currim, Management Information Systems • Wendy Davis, Animal Science • Ryan Foor, Agricultural Education • Herman Gordon, Cellular & Molecular Medicine • Christopher Johnson, Educational Technology • Amy Kimme-Hea, English • Carl Maes, College of Optical Sciences • Katrina Miranda, Chemistry & Biochemistry • John Murphy, Pharmacy Practice & Science • Teresa Polowy, Russian & Slavic Studies • Claudia Stanescu, Physiology • Hal Tharp, Electrical & Computer Engineering • Deb Tomanek, Office of Instruction & Assessment

  41. Assessment & Research in Student Affairs Angela Baldasare, Ph.D. Director, Divisional Assessment & Research baldasar@email.arizona.edu

  42. Starting from Scratch: From Outcomes to Assessment Activities Aurelie Sheehan, Ph.D. Director, Creative Writing asheehan@email.arizona.edu

  43. Unlocking Assessment: Linking Findings to Outcomes David Cuillier, Ph.D. Director, University of Arizona School of Journalism cuillier@email.arizona.edu

  44. Our Process • Define learning outcomes • Measure at overall program level • Link findings specifically to outcomes • Make adjustments (report & faculty retreat) • Feedback loop – see if it worked

  45. EXAMPLES…

  46. Outcome #10: Technology MEASURE: 2009 survey of multimedia knowledge (scale of 0-9): Photoshop 6.24 • Final Cut 2.59 • Dreamweaver 0.76 • Soundslides 0.47 • Audacity 0.35 • CSS 0.35 • Flash 0.24 FINDING: Need more Soundslides/Audacity training ADJUSTMENT: Created multimedia class in 2010 FEEDBACK LOOP: Survey students again in 2012

  47. Outcome #9: Writing MEASURE: Survey of intern supervisors FINDING: Positive trajectory on student writing ADJUSTMENT: Keep doing what we’re doing

  48. Lessons learned • Faculty buy-in through program-level view • One person responsible • Model assessment plans (e.g., Elon) • Explicit reporting – state it clearly • Focus on findings, not methods

  49. Program Assessment: Raw Data to Findings Ingrid Novodvorsky, Ph.D. Director, College of Science Teacher Preparation Program novod@email.arizona.edu

  50. TEACHER PREPARATION PROGRAM Student Learning Outcomes—Core Understandings (These describe attributes of a well-prepared science teacher.)
