
Current Debates about Assessing Student Learning

Current Debates about Assessing Student Learning. Natasha Jankowski, PhD. Director, National Institute for Learning Outcomes Assessment; Research Professor, Education Policy, Organization and Leadership, UIUC. @NILOA_web @njankow




Presentation Transcript


  1. Current Debates about Assessing Student Learning Natasha Jankowski, PhD. Director, National Institute for Learning Outcomes Assessment Research Professor, Education Policy, Organization and Leadership, UIUC @NILOA_web @njankow

  2. NILOA NILOA’s mission is to discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts. ● Surveys ● Web Scans ● Case Studies ● Focus Groups ● Occasional Papers ● Website ● Resources ● Newsletter ● Presentations ● Transparency Framework ● Featured Websites ● Accreditation Resources ● Assessment Event Calendar ● Assessment News ● Measuring Quality Inventory ● Policy Analysis ● Environmental Scan ● Degree Qualifications Profile ● Tuning www.learningoutcomesassessment.org

  3. www.learningoutcomesassessment.org

  4. Turn to a neighbor and share • What are you most frustrated about when it comes to assessment? • Now… • What was your favorite teaching or learning moment? • Why are these mutually exclusive: assessment, and teaching and learning?

  5. Debates on the Purpose Why do we do assessment? What is the value and purpose of engaging in assessing student learning?

  6. Value • Institutions of higher education are increasingly asked to show the value of attending, i.e. impact in relation to cost; employment – what is the value of a degree and what does it represent in terms of learning? • Public and policy makers want assurance of the quality of higher education • Regional and specialized accreditors are asking institutions to show evidence of student learning and instances of use • We have questions about our own practice to inform internal improvements

  7. Purpose Statement for BU

  8. Accreditation Under Fire Accreditors are increasingly being questioned on their value and role, and they have their own regulations to which they are held accountable in order to be recognized as an accrediting agency

  9. Principles of Local Practice • Focus on improvement and compliance will take care of itself.

  10. 2017 NILOA Provost Survey • Sample: All regionally accredited, undergraduate degree-granting institutions (n=2,781) • Announced via institutional membership organizations, website, newsletter, mailing • Online and paper • 29% response rate (n=811)

  11. Institution-level 82% of campuses have SLO statements Concrete, clear proficiencies students are to achieve -- reference points for student performance common to all undergraduates across all majors.

  12. Learning Outcomes are Increasingly Aligned At 50% of campuses: all programs have PLOs and align those PLOs with ILOs

  13. Current Activity • 77% of institutions are currently involved in mapping curriculum • 62% facilitating faculty work on the design of aligned assignments

  14. Findings • 1. Institutions are trending towards greater use of authentic measures of student learning, which is consistent with what provosts indicate are most valuable for improving student outcomes. • 2. Majority of changes made and uses of evidence of student learning occur at the program- and course-level.

  15. VALUE Institute

  16. But what about programs? Program-level survey: randomly selected 3-5 departments from all regionally accredited degree-granting institutions; 30% response rate (n=982)

  17. Program survey findings • At the department/program level, the primary driver of assessment activity is faculty interest in improving their programs, followed by accreditation, both institutional and specialized. • A diverse array of assessment methods is employed at the program level across institutions, the most frequently used being capstone experiences, rubrics, final projects, and performance assessments. • The primary use of assessment results at the program level is for program review, followed by instructional improvement and institutional accreditation.

  18. Program Survey Findings Cont. Although differences exist among disciplines in terms of the amount of assessment activity, the most notable variations are in the types of assessment methods they use.

  19. Model Differences

  20. Institutional or Program Improvement Model

  21. Assess, Intervene, Reassess

  22. Student Learning Improvement

  23. Assessment as a Process... • Is trying to get us to think intentionally about our learning design

  24. Assessment Cube of Misunderstandings Uses/Questions Purposes/Value Definitions Levels/Focus

  25. Three Schools of Thought • Measurement • Compliance (Reporting) • Teaching and Learning (Improvement)

  26. Measurement • Built upon scientific principles and empirical research: objectivity, rationality, validity, and reliability • The Multi-State Collaborative: A Preliminary Examination of Convergent Validation Evidence ~Mark Nicholas, John Hathcoat, & Brittany Brown • Testing and standardization • Must be measurable • Critics argue it narrows the curriculum • Goal driven • Focused on process • Interventions • Pre/post • Comparisons

  27. VALUE report

  28. VALUE report

  29. Compliance • Documenting institutional quality assurance through reporting frameworks • Is assessment destroying the liberal arts? ~Karin Brown • Bureaucratic • Laborious • Time consuming • Separated from teaching and learning • Add-on • Accountability and quality assurance • Reporting and archive • Lots of data collection, minimal use

  30. ACCREDITATION/ PROGRAM REVIEW

  31. But where are the students…?

  32. Teaching and Learning • Focus on pedagogy, understanding of student experience, informing program improvement, embedded in curricular design and feedback, builds student agency • Does continuous assessment in higher education support student learning? ~Rosario Hernandez • Driven by faculty questions regarding their praxis – is what I am doing working for my students? • Improvement oriented • Focus on individual students • Students as active participants – not something done to them • Formative • Feedback • Collaborative • Assessment for learning • Adaptive and embedded

  33. Transparency Awareness of Learning Outcome Statements

  34. IMPROVEMENT

  35. Involving students

  36. Involving students Assessment is not something we do to students; it is something we do with students.

  37. Additional Points of Contention

  38. Academic Freedom

  39. Does It Improve Learning?

  40. Cost of Assessment

  41. https://blogs.illinois.edu/view/915/639769
