
Doing Assessment for the Right Reasons


Presentation Transcript


1. Doing Assessment for the Right Reasons
Victor M. H. Borden
Associate Professor of Psychology (IUPUI)
Associate Vice President for University Planning, Institutional Research, and Accountability (IU)
Presented at the University of Arizona, February 11, 2009

  2. A Motivational Talk

3. What is Assessment? (Pat Terenzini, Penn State)
• A range of methods used to gather information about student learning for purposes of improvement
• A problem
  • An accountability device concocted by others higher in the organizational food chain to ensure that work is being done well, with resource allocation decisions lying in wait just beyond the reach of the campfire’s light; professionally demeaning, a close relative of root canal work
• An opportunity
  • A means for enhancing teaching and learning, a series of activities that will help us do better the things we believe are important

4. What is Assessment? (Terenzini, cont.)
• A quality enhancement effort
  • Where we shift our focus from quality as input to quality as student learning outcome
• A process to address a set of questions
  • Do our graduates know and can they do what our degrees imply?
  • What do the courses and instruction we provide add up to for students?
  • Are they learning what we are teaching?
  • What knowledge and abilities do we intend students to acquire?
  • At what level do they succeed, and is that good enough?
• Ongoing, formative, and developmental
• Part of good education

5. What isn’t Assessment?
• Giving students grades
  • But evaluating grades can be part of assessment
• Simply testing or just collecting information
  • Evaluating tests and analyzing information is assessment
• Results should not be part of the faculty evaluation system
  • But faculty efforts to assess should be considered in promotion and tenure
• Neither quick nor easy
  • Formative and developmental
• Not solely an administrative duty
  • Primarily faculty-driven

6. Why Faculty Distrust Assessment (Donna Tillman, CSU Pomona)
• Distrust of how results will be used
• Unclear objectives/definitions
• Increased workload
• Lack of support/incentives
• Flavor of the month
• Already doing it
• Disrupts teaching
• Not connected well to improvement
• Promotes conformity/standardization
• Not reliable or valid

7. Intended Takeaway Points
• There are two basic reasons to assess student learning outcomes:
  • To improve the quality of your programs
  • To demonstrate the level of quality to others (prospective students, accreditors, potential funders/collaborators, etc.)
• The two reasons are perversely related
• Assessing student learning begins as an amateur activity, and we do not yet know if it ever reaches beyond that level

8. Takeaway Points (cont.)
• It’s not going away
• Demonstrating quality through student learning assessment will become an increasingly important factor in recruiting students
  • And especially top students

9. The Assessment/Accountability Dilemma
• Without accountability we would probably not engage
  • At least not systematically
  • Like brushing teeth and taking showers
• Accountability shapes assessment in ways that are not very palatable to academics
  • Away from formative, curriculum-embedded, authentic, and program-specific
  • Toward summative, normative, and uninformative (e.g., the Spellings Commission)

10. The Accountability “Culture Gap” (Nancy Shulock, CSU Sacramento)
• Policymakers want accountability to be unambiguous, concise, and quick. …[they] want to know, in no uncertain terms, whether goals are reached, whether students graduate, whether transfer rates are up or down, whether students are prepared to take their places in the 21st-century workforce. They do not want explanations, caveats, or excuses.

11. The Accountability “Culture Gap” (Nancy Shulock, CSU Sacramento, cont.)
• The academic community finds bottom-line approaches… threatening and inappropriate. [They] fear that such an approach can be punitive and can narrow society’s concerns to those aspects of higher education that can be readily measured, at the expense of dearly held values. They fear legislative intrusion into matters of educational expertise… They question how educational quality and equity can be quantified and assessed in a neat and tidy way and worry that quantitative measures create perverse incentives. They fear one-size-fits-all measures that ignore different missions, demographics, student bodies, resources, and factors outside their control. Most importantly, they resist legislative involvement in the measurement, or assessment, of student learning, which they believe to be a faculty responsibility.

12. Two Paradigms of Assessment (Peter Ewell, NDIR S1, 2008)

13. Ewell’s Conclusion
• On the one hand, there must be a real response to external demands for evidence along the lines of the “Accountability Paradigm,” but one which is proactive, genuine, and nuanced. On the other hand, the “Improvement Paradigm” must not be allowed to atrophy—if only because a genuine commitment to improve represents the best and most convincing evidence of accountability.

14. Moving Away from the Distraction
• Voluntary System of Accountability
  • A reasonable, politically expedient response or a threat to core academic values?
• Consumer information (CDS plus)
• Student satisfaction/engagement
• Standardized tests of student learning
  • CAAP, MAPP, and CLA
  • The “value-added” measure

15. Toward the Right Reasons
• To improve student learning and development
• To make teaching more worthwhile
• To ensure that faculty maintain their role as developers and guardians of curricula and of student learning
• To beat off the wolves

16. Why not Business as Usual?
• Is it working?
  • Can you answer that question convincingly?
  • If all your students are doing well, are your programs sufficiently rigorous?
• Are the conditions staying the same, such that what we’ve done in the past will continue to work as well?
• Are there new technologies that might enhance the learning experience?
  • Is there a match between the technologies we use and the technologies students use effectively?
• Do we have the same resources that allow us to do things the same way?

17. The Flavor of the Day
• Aren’t we better off just letting the fads come and go, while we pay attention to what really matters?
• Yes, but what really matters?
  • Pressures to demonstrate that college learning matters continue to increase as prices increase, the financial onus shifts more to the consumer, and economic well-being becomes more closely tied to college learning outcomes
  • As more institutions develop quality assurance capacities, there will be more pressure on others to do so to compete for the best students and faculty

18. What Really Matters?
• Learning matters
• What matters in learning?

19. What Really Works?
• Research findings converge
• Seven Principles of Good Practice in Undergraduate Education
  • Chickering & Gamson, 1991, New Directions for Teaching and Learning, No. 47
• Insights from neuroscience and anthropology, cognitive science and workplace studies
  • Ted Marchese: http://www.newhorizons.org/lifelong/higher_ed/marchese.htm

20. Seven Principles
• Encourages Contact Between Students and Faculty
• Develops Reciprocity and Cooperation Among Students
• Encourages Active Learning
• Gives Prompt Feedback
• Emphasizes Time on Task
• Communicates High Expectations
• Respects Diverse Talents and Ways of Learning

21. Marchese’s Insights
• Good teachers, like “reflective practitioners” in other professions, constantly test, adjust, and reframe their models of practice on the basis of experience and reflection

22. Marchese’s Insights (cont.)
• The more a teacher can emphasize…
  • learner independence and choice
  • intrinsic motivators and natural curiosity
  • rich, timely, usable feedback coupled with occasions for reflection
  • active involvement in real-world tasks emphasizing higher-order abilities
  • done with other people in high-challenge, low-threat environments that provide for practice and reinforcement
• …the greater the chances he or she will realize the deep learning that makes a difference in student lives.

23. Herbert Simon’s Observation
• Learning takes place in the minds of students and nowhere else, and the effectiveness of teachers lies in what they can induce students to do. The beginning of the design of any educational procedure is dreaming up experiences for students: things that we want students to do because these are the activities that will help them to learn this kind of information and skill. And then we can back off and ask what we have to do to get students to carry out these activities.

24. What Doesn’t Work
• Passive listening
• Obsession with coverage and lower-order thinking skills (i.e., memorization)
• Little student choice about what is studied and how it is studied
• Fear/anxiety-ridden instruction and evaluation
• Limited interaction with the instructor and other students
• High-stakes evaluation
• But good students will learn even under poor learning conditions
  • Can you imagine what they can do under the best conditions?

25. How Do Your Classes/Programs Rate?
• Do you scrutinize them against any kind of criteria?
  • What criteria?
    • Articulated objectives/outcomes
    • Effective learning process
• Who should care?
  • If you don’t, why should I consider attending or sending my son/daughter to UA?

26. Improving Academic Productivity
• “It’s Time to Improve Academic, Not Just Administrative, Productivity,” William Massy, Chronicle of Higher Education, 1/9/2009
• Not about how hard professors work
• What is needed is for most, if not all, colleges to mount systematic and well-resourced programs for analyzing and continuously improving the processes of teaching and learning
• Three examples:
  • The National Center for Academic Transformation’s Large Course Transformation Project
  • The University of Minnesota Rochester’s Center for Learning Innovation bachelor’s degree in health sciences
  • The University of Missouri System’s academic-audit process

27. So What Can/Should You Do?
• External tasks are imposed to incite action
• You have two choices:
  • Treat it as something you have to do for someone else and do as little as you can to comply
  • Use it as an opportunity to engage your faculty to improve the quality of courses and programs
• Either way, you end up doing about the same amount of work, so which should you choose?

28. Make Learning More Meaningful
• Approach learning like you approach scholarship
  • Collaboratively, as a discipline/community
  • Use the literature to inform practice
  • Develop conceptual frames, implementation methods, and assessment strategies (plan, do, check, act)
• Integrate assessment into teaching practice
  • Do not just add it on as an extra activity, but modify the way you approach teaching/learning

29. Typical Steps to Take
• Identify an individual or team to lead within the unit
  • Preferably with an interest in the scholarship of teaching and learning
• Develop some expertise
  • Attend conferences, retreats, workshops
• Integrate or ramp up reflection on teaching practices and curriculum in department meetings
• In other words, focus on improving, and use assessment as a means for doing so

30. It’s an Uneven and Bumpy Road
[Assessment Progress Table for School of Science departments: A = Accomplished; P = In Process; N = Not Started]

31. Ultimately It Needs to Become University-Wide
• Course
  • Classroom Assessment Techniques (Angelo and Cross)
  • Primary Trait Analysis (Walvoord)
• Program
  • Learning goals and curricular mapping
  • Portfolios, capstones, exams, etc.
• General Education
  • GenEd assessment plans and resulting assessments
• Co-Curricular and Academic/Student Support
  • Ongoing evaluation and improvement of support services
• Institutional Effectiveness
  • Surveys: NSSE, alumni, student satisfaction
  • Institutional analysis: retention, major migration, course-taking patterns

32. The Golden Rule of Assessment
• Do yourself what you expect other professional practitioners whose services you utilize to do

33. Best Wishes
• For an epiphany on the right reasons
• For common cause
• For a large piece of the bailout package
  • Although perhaps not quite as large as…
