
LEAP Integrative Learning

Crowd-Sourcing Innovative Practices: Assessing Integrative Learning at Large Research Institutions


Presentation Transcript


  1. Crowd-Sourcing Innovative Practices: Assessing Integrative Learning at Large Research Institutions

  2. Mo Noonan Bischof, Assistant Vice Provost, mabischof@wisc.edu; Amy Goodburn, Associate Vice Chancellor, agoodburn1@unl.edu; Nancy Mitchell, Director, Undergraduate Education, nmitchell1@unl.edu

  3. LEAP Integrative Learning: Synthesis and advanced accomplishment across general and specialized studies, demonstrated through the application of knowledge, skills, and responsibilities to new settings and complex problems.

  4. Challenge: Assessing Integrative Learning • Can/Should the same assessment tools be used for assessing within a course, a unit, and/or institution? • Does it apply to integrating knowledge and skills within a discipline, among disciplines, or both? • How can we align quality improvement levels while respecting disciplinary purposes & values?

  5. UW-Madison Learning Community • 21,615 employees: 2,177 faculty; 1,635 instructional academic staff; 1,261 research academic staff; 5,291 graduate assistants • 42,820 students: 29,118 undergraduates; 9,183 graduate students; 2,774 professional students; 1,745 non-degree students

  6. Annually: 7,400 new undergraduates 29,500 enrolled undergraduates 6,500 Bachelor’s degree graduates

  7. 13 academic schools/colleges with distributed responsibility and governance • ~500 academic programs at all levels • 134 Bachelor's-level degree programs • [Chart: programs binned by annual degrees awarded: more than 300, 200-299, 100-199, 50-99, 1-49]

  8. [Diagram: institutional-level learning goals and assessments (including WI-X and ELOs) linked to program-level learning goals and assessments for each program]

  9. Why pilot the AAC&U VALUE Rubrics? • Identified gap: institutional-level assessment, direct-measure approach • Evaluates student learning across programs • Aligns with AAC&U Essential Learning Outcomes • Aligns with VSA/College Portrait demonstration project • First pilot project summer 2012, second pilot 2013 • Main goal: bring faculty across disciplines together to evaluate student work

  10. AAC&U VALUE Rubric Project • Cohort of 25 faculty • Cross-disciplinary representation • Focus on faculty engagement • “Value-added” approach to compare first-year students and students near graduation • AAC&U VALUE written communication rubric

  11. Written Communication VALUE Rubric • Selected written communication for ease of identifying artifacts across disciplines/programs • Dimensions: Context of and Purpose for Writing • Content Development • Genre and Disciplinary Conventions • Sources and Evidence • Control of Syntax and Mechanics

  12. Artifacts: “Value-added” Approach • Goal was to collect 350 artifacts at each level, first-year (FYR) and near-graduation (NGR) • Identified 52 courses that had high numbers of FYR and NGR students and seemed likely to have a suitable writing assignment • 22 courses (41 instructors) had a suitable assignment and agreed to participate • Invited 2,450 students to submit artifacts • Collected 451 submissions

  13. Scorers: Faculty Engagement • 1.5-day workshop in June 2013 • Set ground rules • 3 structured rounds intended to get faculty familiar with the rubric and to “test” scorer agreement • Asked faculty to think beyond their field/discipline • Each scorer rated about 40 artifacts • Discussion revealed challenges with the 4-point scale and with defining “mastery”

  14. *The Zmw score is from the Mann-Whitney U test. Zmw scores > 1.96 indicate that the two groups are significantly different at p = 0.05.
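The footnote above names the Mann-Whitney U test. As a minimal illustration of where a Zmw score comes from (not the presenters' actual analysis script), the sketch below computes the normal-approximation Z for two groups of rubric scores in pure Python. The function name `mann_whitney_z` is hypothetical, and the tie correction to the variance is omitted for brevity:

```python
import math

def mann_whitney_z(a, b):
    """Normal-approximation Z for the Mann-Whitney U test.

    Hypothetical helper: ties receive average ranks, but the tie
    correction to the variance is omitted for brevity.
    """
    pooled = sorted(a + b)
    # Map each distinct value to its average rank in the pooled sample.
    ranks, i = {}, 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n1, n2 = len(a), len(b)
    r1 = sum(ranks[v] for v in a)       # rank sum for group a
    u1 = r1 - n1 * (n1 + 1) / 2         # U statistic for group a
    mu = n1 * n2 / 2                    # mean of U under the null
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (u1 - mu) / sigma

# Two clearly separated groups give |Z| > 1.96 (significant at p = 0.05).
z = mann_whitney_z([1, 2, 3, 4, 5], [6, 7, 8, 9, 10])  # z ≈ -2.61
```

With heavily tied 4-point rubric data, a real analysis should use the tie-corrected variance (as `scipy.stats.mannwhitneyu` does), which shrinks sigma and raises |Z|.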

  15. Summary Findings • Percent of nearly graduating students judged proficient or better (a score of 3 or 4 on the 4-point scale) was fairly high on each dimension, ranging from 64% to 83%; across all dimensions: 74.7% • Significant differences between first-year and nearly graduating students were weak • Inter-scorer reliability was problematic (the “mastery” issue) • Overall, 67% of scorer pairs showed weak agreement or systematic disagreement
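The slides report weak inter-scorer agreement without naming the statistic used. One common choice for pairs of scorers rating the same artifacts on a categorical 4-point scale is Cohen's kappa; the sketch below is a hypothetical illustration, not the project's actual method:

```python
from collections import Counter

def cohens_kappa(scores1, scores2):
    """Cohen's kappa for two scorers rating the same artifacts.

    Hypothetical illustration: 1.0 is perfect agreement,
    0.0 is agreement no better than chance.
    """
    n = len(scores1)
    observed = sum(s1 == s2 for s1, s2 in zip(scores1, scores2)) / n
    c1, c2 = Counter(scores1), Counter(scores2)
    # Chance agreement from each scorer's marginal score distribution.
    expected = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (observed - expected) / (1 - expected)

# Two scorers who often disagree by one rubric level:
kappa = cohens_kappa([3, 3, 2, 4, 1, 2], [3, 2, 2, 3, 2, 2])  # kappa = 0.25
```

A kappa near zero for many scorer pairs would match the slide's "weak agreement" finding; systematic disagreement (one scorer consistently harsher) shows up as low kappa despite scores differing by a constant offset.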

  16. What did we learn? • Importance of assignment (artifact) development • Adapt rubric: program mix and/or campus culture (language, LOs) • Engagement of faculty = high-quality discussions (ground rules/calibration) • Next steps: continue to engage faculty at program and disciplinary levels. Contact: Mo Noonan Bischof, Assistant Vice Provost, University of Wisconsin-Madison, mabischof@wisc.edu. More about our project: http://apir.wisc.edu/valuerubricproject.htm

  17. University of Nebraska-Lincoln • Research One, Big Ten Conference, Land-Grant • 24,000 students • 8 independent colleges

  18. Achievement-Centered Education (ACE) • 10 Student Learning Outcomes (30 credits) • 600 courses across 67 departments • Transferable across 8 colleges • Requires assessment of collected student work

  19. UNL Assessment Context • Review of each ACE course on 5-year cycle • Biennial review of all undergrad degree programs • 50 disciplinary program accreditations • 10-year North Central/HLC accreditation

  20. ACE 10 Generate a creative or scholarly product that requires broad knowledge, appropriate technical proficiency, information collection, synthesis, interpretation, presentation, and reflection.

  21. HLC Quality Initiative:ACE 10 Project 25 faculty across colleges meet monthly to • Explore methods and tools for assessing work • Develop a community to share ideas • Connect ACE 10 & degree program assessment • Develop process for creating assessment report • Create team of assessment “ambassadors”

  22. Discussing Assessment Practices

  23. A Common Rubric: disciplinary vs. institutional goals

  24. Inquiry Project Results • Abandoned idea to pilot a common rubric • Revised syllabus to focus on processes, not tools • Developed poster session for public sharing • Streamlined ACE & program review processes • Creating process for 5-year ACE program review

  25. Group Discussion • How do you address differences across disciplinary norms and cultures? • How can program/ disciplinary assessments inform institutional assessment and vice versa? • What strategies can you use to develop shared goals and understanding? • What are some effective practices for supporting and sustaining faculty and staff engagement?
