‘Assessment of Student Learning’: A Programmatic Perspective

Presentation Transcript


  1. ‘Assessment of Student Learning’: A Programmatic Perspective. Doug Baker & Jenny Kindred, Faculty Members of the CAS Assessment Committee. Assessment Workshop, May 1, 2014

  2. How Do You Answer the Following Questions? • What does it mean to assess or evaluate student learning? • What is the difference between direct vs. indirect measures of student learning? • What does it mean to assess student learning from a programmatic perspective?

  3-4. Student Learning: Example Claims • Sylvia earned a “B” in Introduction to Linguistics, although she received a “C” on the final exam. • Donovan is considered a ‘great writer’ by his peers. He earned A’s in high school and in freshman composition. In his second college writing course, Donovan was absent 7 times, was often late, and appeared unmotivated. One time he said aloud: “This class is stupid.”

  5-6. Student Learning: Example Claims • In her chemistry class, Gretchen is able to define stoichiometry and equilibria; however, she appears unable to apply that knowledge in practice. • Since the Professional Readiness Exam reflects the “knowledge, skills, and understanding that an effective teacher needs in order to teach,” a student passing the test will most likely become an effective teacher.

  7-8. Student Learning: One Program’s Response • The journalism program is considered competitive and highly effective at placing students in internships; furthermore, most of the top students get jobs after graduation or attend graduate school. • Since the expectations and demands placed on journalists have changed, the journalism program plans to evaluate students in two or three courses on how well they are learning to adapt traditional journalistic practices to the evolving news environment.

  9-10. Claims and Evidence • What evidence supports claims that programs make about student learning? • What counts as evidence? • According to whom? • Under what conditions? • And for what purposes?

  11-12. Questions • How do we know students are learning? • What evidence supports our claims? • How can we describe to others how we know students are learning? Why should we? • How can we answer these questions from a programmatic perspective? And why should we? • How can we become more curious about the process?

  13-16. Orienting Perspectives • Assessment of student learning should be humane and doable. • Assessment is ideological and implies a comparison. • Classroom assessment of student performance is relevant to programmatic assessment of student learning, but not the same. • We begin where we begin – so build on past experiences and stay open to suggestions.

  17. Assessment Workshop Goals • Build Capacity by developing program leaders of assessment • Recognize Contextual Factors that inform assessment (e.g., national and local contexts) • Learn Assessment Practices for classrooms or programs • Accomplish a Goal toward designing assessment • Communicate Across Boundaries with stakeholders

  18. Building Capacity • EMU needs faculty to plan, implement and report on programmatic assessment of student learning • EMU needs faculty to talk with others about how assessment of student learning improves programs and opportunities for students to learn • EMU needs faculty to engage with stakeholders about assessment of student learning

  19. Recognizing Context: State • “We’ve built [an educational] system that doesn’t work anymore in terms of helping people be successful.” – Governor Rick Snyder, Detroit Free Press, 22 April 2013

  20. Recognizing Context: National • “The new crisis accuses the American university of failing to educate (variously, failing to train the mind and to prepare for the workplace), of losing its place in international competition, of being an institution top-heavy with administrators and pandering to a faculty that does very little, as well as to students who care more about expensive cars and state-of-the-art fitness rooms than about Socrates. Above all, the university has become unjustifiably expensive, inaccessible, and unaccountable.” – Peter Brooks, New York Review of Books (2011)

  21. Recognizing Context: National • “Especially during the past decade there has been a flood of criticism of the American college and university. Much of this has been cleverly, even brilliantly, expressed, but the criticism is often superficial, illogical and unsound. Some of it, on the other hand, has been sanely constructive and helpful. If we were to believe all that the critics say, we should inevitably be forced to the conclusion that little if anything is right with higher education today.” – Walter Crosby Eells, “Criticisms of Higher Education”

  22. Recognizing Context: National Eells, Walter Crosby. “Criticisms of Higher Education.” The Journal of Higher Education 5.4 (April 1934): 187-189.

  23. Recognizing Context: National • “Curricula are too often legacies of the past rather than products of the time to meet the needs of the time” (82). – Lemuel R. Brown

  24. Recognizing Context: National Brown, Lemuel R. “Some Needed Readjustments in the Teaching of English Grammar.” English Journal 2.2 (February 1913): 81-92.

  25. Context • Nationally – alarmist documents (e.g., “A Nation at Risk” (1983); NCLB (2001)); also see rebuttals (e.g., The Manufactured Crisis (1996)) • University Accreditation – Higher Learning Commission (http://www.ncahlc.org/) • Specialized Accreditation • University, college, department and program • Students’ academic needs

  26. Higher Learning Commission (https://www.ncahlc.org/) • HLC is an independent corporation and one of two commission members of the North Central Association of Colleges and Schools (NCA), one of six regional institutional accreditors in the US; it accredits degree-granting post-secondary educational institutions. • EMU’s Accrediting Body – Next Visit is 2017-18 • 5 Criteria

  27. HLC – 5 Criteria • Criterion One: Mission • Criterion Two: Integrity: Ethical and Responsible Conduct • Criterion Three: Teaching and Learning: Quality, Resources, and Support • Criterion Four: Teaching and Learning: Evaluation and Improvement • Criterion Five: Resources, Planning, and Institutional Effectiveness

  28. Teaching and Learning: Evaluation and Improvement • 4.A. The institution demonstrates responsibility for the quality of its educational programs. • 4.B. The institution demonstrates a commitment to educational achievement and improvement through ongoing assessment of student learning. • 4.C. The institution demonstrates a commitment to educational improvement through ongoing attention to retention, persistence, and completion rates in its degree and certificate programs.

  29. Fundamental Questions • How are your stated student learning outcomes appropriate to your mission, programs, degrees, and students? • What evidence do you have that students achieve your stated learning outcomes? • In what ways do you analyze and use evidence of student learning? • How do you ensure shared responsibility for student learning and for assessment of student learning? • How do you evaluate and improve the effectiveness of your efforts to assess and improve student learning? • In what ways do you inform the public and other stakeholders about what students are learning—and how well?

  30. Challenge “We must improve our assessment systems so that they help us enhance student learning, draw upon the best aspects of academic culture, and are sustainable in terms of time and resources. Then we need to explain our assessment systems clearly and without arrogance to our various constituencies.” – Barbara Walvoord (3)

  31. Designing the Assessment Cycle PLANNING • What does your program expect students to learn—what are the student learning outcomes (SLOs)? • When will students have opportunities to meet the SLOs? (Curriculum map) • What student performances will the program assess to see how students are progressing toward meeting the SLOs? • How will the program assess student performances, and against what standard? (Rubric) • How will program instructors meet and discuss findings and make decisions? • Backward design is key (Wiggins & McTighe)

  32. Designing the Assessment Cycle ASSESSING • What assessment methods will the program implement? • How will you collect data (e.g., survey or observation of representative performance)? • Will you use direct and indirect measures? • How will you analyze the data? (What you find becomes your evidence, or the program’s evidence for decision making.)
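  To make the “analyze the data” step concrete: a program might tally rubric scores for one SLO against a benchmark it has chosen. Below is a minimal sketch in Python, assuming a 4-point rubric, an illustrative benchmark mean of 3.0, and a proficiency cutoff of 3; the SLO label and the scores are hypothetical, not data from any actual program.

    from statistics import mean

    # Hypothetical rubric scores (1-4 scale) for one student learning
    # outcome, drawn from a representative sample of student work.
    scores = {
        "SLO 1: adapts traditional practices to the evolving news environment":
            [3, 2, 4, 3, 3, 2, 4, 3],
    }

    BENCHMARK = 3.0   # illustrative program target for the mean score
    PROFICIENT = 3    # illustrative rubric level counted as meeting the SLO

    for slo, values in scores.items():
        avg = mean(values)
        pct = 100 * sum(v >= PROFICIENT for v in values) / len(values)
        status = "meets" if avg >= BENCHMARK else "falls below"
        print(f"{slo}: mean = {avg:.2f} ({status} the {BENCHMARK} benchmark); "
              f"{pct:.0f}% of sampled work scored proficient or higher")

  What the tally shows (mean score, share of proficient work) becomes the evidence the program discusses when it reports findings and closes the loop in the next slides.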

  33. Designing the Assessment Cycle REPORTING • How will the program describe the assessment cycle and report findings? • To whom? • For what purposes? • Under what conditions? • And for what desired outcomes?

  34. Designing the Assessment Cycle ‘CLOSING THE LOOP’ • What will the program do with findings described in the report? • How will the findings lead to decisions that improve the program and opportunities for students to learn?

  35. Communicating Across Boundaries • Instructors within the Program • Department • College & University • Other stakeholders, including students

  36. References • A Nation at Risk: The Imperative for Educational Reform. A Report to the Nation and the Secretary of Education, United States Department of Education, by The National Commission on Excellence in Education. April 1983. http://datacenter.spps.org/uploads/sotw_a_nation_at_risk_1983.pdf • Berliner, David C., and Bruce Biddle. The Manufactured Crisis: Myths, Fraud, and the Attack on America’s Public Schools. New York: Basic Books, 1996. • Brooks, Peter. “Our Universities: How Bad? How Good?” The New York Review of Books, March 24, 2011. Accessed April 30, 2014. • Eells, Walter Crosby. “Criticisms of Higher Education.” The Journal of Higher Education 5.4 (April 1934): 187-189. • Grafton, Anthony. “Our Universities: Why Are They Failing?” The New York Review of Books, November 24, 2011. Accessed April 30, 2014. http://www.nybooks.com/articles/archives/2011/nov/24/our-universities-why-are-they-failing/. • Wiggins, Grant, and Jay McTighe. Understanding by Design, 2nd ed. New Jersey: Pearson, 2005. • Walvoord, Barbara E. Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education, 2nd ed. San Francisco: Jossey-Bass, 2010.
