Implementing Sustainable, Useful Assessment


Presentation Transcript


  1. Implementing Sustainable, Useful Assessment: So What’s the Problem?

  2. The Perception • “Seasoned observers have pointed out the irony of the academy, as an institution dedicated to discerning the truth through evidence, being so seemingly resistant to measuring quality through evidence. It is an irony that puzzles—and frustrates—a widening circle of stakeholders.” Perspectives, American Association of State Colleges and Universities – Spring 2006

  3. The Perception • “Too many decisions about higher education—from those made by policymakers to those made by students and families—rely heavily on reputation and rankings derived to a large extent from inputs such as financial resources rather than outcomes.” Report from the Dept. of Education’s National Commission on Education, 2006

  4. The Perception Accountability Agenda: • "Expand and refine evidence of institutional performance and student achievement, assuring that this vital evidence is used as the key factor in determination of the quality of higher education;" Council for Higher Education Accreditation

  5. Question to Ponder • Why, after over a decade of intensive effort by accrediting agencies and leaders in higher education, are we still struggling with outcomes assessment implementation? • How we answer this question determines what solutions we seek.

  6. Technical Deficit Hypothesis • We haven’t yet found the necessary technical tools – software, measuring instruments, procedures – that will help us satisfy our accrediting agencies without disrupting our teaching and research practices.

  7. Motivational Hypothesis • Although there is a need for technical improvements, we already have the necessary tools to successfully proceed. • The primary problem is that past and current incentives in higher education favor the status quo – input based assessment.

  8. Interaction Hypothesis • Technical and motivational factors interact. • To succeed, faculty must: • Have the necessary tools • Have the knowledge to use them • Be motivated to do so. • Deficiency in any area can impede progress.

  9. Tools / Knowledge / Motivation • Necessary to • Define & structure the process • Minimize clerical requirements • Provide institution-wide coordination • Provide common recording and reporting formats. • TracDat, iWebfolio, Measures …

  10. Tools / Knowledge / Motivation • Training • Startup Consultation • Technical & Conceptual – all levels • Resident Trainers/Advocates • Ongoing requirement • Must be formalized • Tied to the job role, not the person

  11. Tools / Knowledge / Motivation • Observation suggests that motivation presents the greatest challenge and has received the least systematic attention. • Note to Audience: Pay attention to what people do, not what they say they do.

  12. The Fundamental Issue • Inputs (What We Provide; the Independent Variables): Teaching Practices, Teacher Qualities, Courses, Programs, Facilities, Equipment, Credentials • Outputs (Results We Produce; the Dependent Variables): Student Learning, Alumni Achievements, Community Impact

  13. Consider … • Classroom Visits • Student Evaluations • Accreditation reviews • Program reviews • Where’s the focus – Inputs or Outputs?

  14. How It Is • Faculty/Staff focus centers on Inputs: $ equipment $, $ credentials $, $ facilities $, $ teaching $, not on Outputs

  15. What’s Expected • Faculty/Staff focus shifts from Inputs to Outputs: Student Learning, Alumni Achievements, Community Impact

  16. Consequences of Input Focus • Program effectiveness remains unknown • Drives up program costs • Impedes search for more effective inputs.

  17. So … • I’ve suggested that: • Progress toward effective outcome-focused assessment requires: • Appropriate tools • Knowledge to use them effectively • Motivating conditions • The motivational factors present the greatest challenge and have received the least systematic attention.

  18. Factor 1 • Relative economic advantage • Public impressed by inputs • The package sells, not the content • Economic advantage favors inputs, but events signal impending change.

  19. Factor 2 • Social Prestige • Impressive inputs = high prestige • More PhDs • Bigger labs • Famous faculty

  20. Factor 3 • Vested Interests • Faculty want their degrees to be valued. • Wealthy schools want their lavish resources to be valued. • Faculty feel safer with input assessment • Input focus allows grandiose outcome claims • Input focus enriches our environments • But strains our budgets

  21. Factor 4 • Ease of observing advantages • Inputs – advantages are obvious – easily measured • Outcomes – advantages are nebulous or unknown – difficult to measure

  22. Score: 4 to 0 • Economic advantage still favors inputs • Prestige still based largely on inputs • Vested interests support inputs • Advantage-recognition favors inputs

  23. Transition • We are currently in a transition phase. • Measurement of outcomes is being demanded, but inputs are still being used as the primary indicators of quality.

  24. Transition • Assessment based on learning outcomes holds the promise of clear advantages for students and society. • Hence the growing external demand • But tangible advantages for educational institutions and educators are still lacking. • Hence the grudgingly slow response

  25. Transition • Most of us do assessment to avoid aversive consequences, not to achieve positive outcomes. • We must come to recognize that pursuing outcome improvement is ultimately in our best interest.

  26. The Challenge • Convince boards, administrators and faculty that the rules of survival are changing. • Identify and put in place in-house incentives for outcome-focused assessment.

  27. Possibilities • Promulgation of positive outcomes and improvements • Provides recognition • Provides exemplars • Outcome-based resource allocation • Outcome-based personnel evaluation

  28. The Choices • Wait for the public consequences to be put in place and risk not being prepared. • Proactively implement in-house incentives favoring outcome-focused assessment. • What affects faculty & staff is what matters.

  29. Heads Up! The ball’s in your court … Don’t Blow It! Thank You
