
Multi-State Collaborative to Advance Learning Outcome Assessment


Presentation Transcript


  1. Multi-State Collaborative to Advance Learning Outcome Assessment Pilot Project and PCC

  2. Our Problem • How can we tell that students who complete the general/transfer degrees have met the outcomes of those degrees? • No reliable pattern of course-taking • No capstone in place • No exit exam administered

  3. Our Challenge • NWCCU Recommendation: • 1. The evaluation committee recommends that the College develop indicators of achievement for all of the College's core learning outcomes that are assessable and can be used as a basis for determining that an established target for student performance levels has been achieved and that such achievement contributes to demonstrating mission fulfillment (Standard 1.B) • In other words: figure out the level of achievement you want, and then find a way to determine how well students meet it

  4. Our Approach • Has been very SAC-focused • LDC: How do the core outcomes look in your discipline? • CTE: How do your outcomes align with core outcomes? • This is VERY appropriate, and has encouraged conversations and improvement

  5. Our Hope and Worry • The Hope: • that as students make their way through the courses they need to graduate, they have multiple opportunities to develop and achieve a level of mastery of the Core Outcomes • The Worry: • Will we be able to figure out whether this is really happening?

  6. Good news: • Many SACs have found that the best way to assess core outcomes is to use • Natural (or at least logical) alignment of discipline content with core outcomes • Embedded Assignments (authentic, students will do it, less extra work for faculty) • This is a national trend

  7. We are not alone! • Nine states—Connecticut, Indiana, Kentucky, Massachusetts, Minnesota, Missouri, Oregon, Rhode Island, and Utah—are collaborating to develop and pilot a model that uses authentic student work and embedded assignments to assess broad institutional outcomes • Statewide teams have been working to develop the model and pilot • Active support of SHEEO and AAC&U (and, more recently, the Gates Foundation)

  8. Key features of the model • Focuses on broad, institutional, “essential competency” level outcomes • Is designed to reflect cumulative student knowledge and ability (not focused on the course, instructor, or discipline) • Uses authentic student work derived from embedded assignments • Is based on the use of the Essential Learning Outcomes and associated VALUE Rubrics developed by faculty members under the auspices of AAC&U’s LEAP initiative.

  9. Pilot Project • Collect student work that can be assessed appropriately for two “essential competencies,” using the LEAP VALUE Rubrics for • Written Communication • Quantitative Literacy • From students 75% of the way through their degree (Associate’s or Bachelor’s) • To be scored by faculty assembled from community colleges and universities in all nine states to obtain a statewide “score”

  10. Pilot Project • 6 institutions per state: 3 community colleges, 3 universities • Artifacts to be collected: • “Minimum target” is 75 artifacts per outcome per institution • Ideal is a “representative” distribution across disciplines • Students: 75% of the way through their degree (Associate’s or Bachelor’s) • Each institution figures out how to reach these targets

  11. Pre-Project Professional Development • For this to work, there must be some instructors who have assignments, or are willing to develop or adapt existing ones, so that students are likely to demonstrate competency for multiple elements of the rubric. • Some Phase I pilot funding will go to the states, to be used for assignment development

  12. Artifact collection • Institutions remove identifying information (student, instructor, college, state) • Artifacts will be coded so the scores can be offered back to institutions • Institutions may also code artifacts internally so results can be used in-house and returned to the instructor if desired

  13. Scoring • Centralized face-to-face norming and scoring • Faculty from each of the participating institutions (probably 2, hopefully 3) • Team from each state organized by the state project leadership

  14. Use of Results? • Campuses may use results however they choose • States will provide aggregated holistic and analytic scores among same-sector institutions (e.g., 2-year, 4-year) • Within-state comparisons may be made with benchmarks established for same-sector campuses • Disaggregated comparisons may be made, as possible, by selected student body and institutional characteristics • MSC will request aggregated holistic and analytic scores among same-sector institutions (e.g., 2-year, 4-year)

  15. The Timeline • Phase I • Identification of Institutions: Jan/Feb 2014 • Communication and Systems Development • Faculty Professional Development: May/June 2014 • Artifact Collection: Fall 2014 • Phase II • Scoring: January 2015 • Results returned to states and institutions

  16. Oregon Schools in the Pilot • University of Oregon • Oregon State University • Oregon Tech (OIT) • Portland Community College • Chemeketa CC • Southwestern Oregon CC

  17. How will we do this? • Identify faculty who are “willing to play” • i.e., who have assignments, or are willing to tweak them, to align with expectations of the rubric (can be LDC or CTE) • In Fall, use Banner to identify students in their classes who are at least 75% of the way through their degree (illustrated in the sketch below) • The completed assignments from those students will be collected, coded, redacted if necessary, and submitted • Participating faculty will be offered a chance to be selected for the multi-state norming/scoring event
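  A minimal sketch, in Python, of what the “75% of the way through their degree” filter could look like once credit totals are exported from the student information system. The record layout (student_id, credits_earned, credits_required) is a simplified hypothetical for illustration only, not Banner's actual data model:

      def students_near_completion(records, threshold=0.75):
          # Each record is a hypothetical (student_id, credits_earned, credits_required) tuple.
          # Returns the IDs of students at or beyond the completion threshold.
          return [
              student_id
              for student_id, earned, required in records
              if required > 0 and earned / required >= threshold
          ]

      # Example: for a 90-credit associate degree, a student with 72 credits qualifies (80%),
      # while a student with 45 credits (50%) does not.
      print(students_near_completion([("S1", 72, 90), ("S2", 45, 90)]))  # ['S1']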

  18. A parallel/aligned project for us? • We could norm/score the same artifacts we send forward • Compare with the multi-state score • Allows us to involve all participating faculty and others (depending on need and funding) • We could norm/score work from all of the students in the selected classes • Is there a difference based on the number of credits accumulated? • If SACs participate, we can code their artifacts so the results can be sorted, and they could norm/score them to compare with the statewide/PCC collection and scoring

  19. Oregon LO&A Group • Planning a late spring/early summer workshop focused on: • understanding the rubrics • developing assignments that work • Workshop parameters have yet to be determined, but participating PCC faculty will be encouraged and supported to attend

  20. Why I think this is awesome • Collaboration with faculty from other colleges and universities • Professional development in… • Developing assignments • Scoring with rubrics • Opportunity to view and evaluate work done by students from other colleges, universities and states • What can we learn about our students now? • Could we do this “in house” to meet NWCCU expectations?

  21. More awesomeness… • This does not replace the SAC approach, but complements it • This does not land on SACs as extra work • It is voluntary for individual instructors • SACs may want to “sign on” • They could use this as their 2014-15 assessment • It might help those struggling to figure it out • They might be interested in comparing their results with others • Quantitative Literacy is not one of our Core Outcomes • but maybe it should be (it is at MANY colleges and universities) • This would give us a chance to see what that might look like

  22. Questions?

  23. One more thing…. • Suggestions for how to roll out the “call for participants”? • FT faculty, PT faculty, also SAC chairs • See draft e-mail – suggestions???
