
Assessment Results: The Who, the What, the Where and the How


Presentation Transcript


  1. Assessment Results: The Who, the What, the Where and the How. BRIC Las Positas. Rob Johnstone, Bob Gabriner, Greg Stoup, RP Group. Based on the work of Bob Pacheco & Fred Trapp, RP Group.

  2. What Sparked these Questions? The ACCJC Institutional Effectiveness Rubric (Part III, Student Learning Outcomes), Proficiency level: comprehensive assessment reports exist and are completed on a regular basis.

  3. Where Did We Look? With Whom Did We Consult? The ACCJC Institutional Effectiveness Rubric, 2002 Standards, Themes, and Guide to Evaluating Institutions; professional literature; efforts of national groups/institutes; institutional web sites, listservs and colleagues around the country (4-year, 2-year, for-profit).

  4. Our Review Showed Five Overarching Uses of Assessment Evidence: (1) Help faculty explore the student learning process. (2) Determine the extent to which the curriculum is working. (3) Where can time, energy and/or money be allocated for continuous improvement in learning? (4) Exploit the writing process and dialogue about results to gain broader institutional learning experiences. (5) Help meet the quality assurance pledge to the community.

  5. Area 1: Help faculty explore the student learning process.

  6. 1. Help faculty explore the student learning process. Happens at the Course Level. CCBC (Maryland) • Middle States Commission on Higher Education • Community College Futures Assembly Bellwether Award, 2008, for Instructional Programs & Services for High-Impact Course-Level Assessment • CHEA award winner, 2006, for Institutional Progress in Student Learning Outcomes • National Council on Student Development (NCSD) Exemplary Practice Award winner

  7. 1. Help faculty explore the student learning process. Happens at the Course Level. CCBC (Maryland): Projects are at least three semesters long; individual and high-impact courses (all sections) included; project proposal by a faculty group; measurable objectives; external review & approval in selecting methods/instruments & analyzing results; benchmarking included if possible; controls and sample size considered; course improvements based on data analysis; reassessment expected; results/report shared across the college and posted on the web.

  8. 1. Help faculty explore the student learning process. Happens at the Course Level. CCBC (Maryland) • Learning Outcomes Assessment Final Report Template: design & proposal for the LOA project; implementation of design & data collection; redesign of the course to improve student learning; implementation of course revisions & reassessment of student learning; final analysis and results • Eavesdropping: two-page executive summaries available at http://www.ccbcmd.edu/loa/CrseAssess.html

  9. 1. Help faculty explore the student learning process. Happens at the Course Level. CCBC (Maryland): CHEM 108, an initial “failure” turned to success and collaboration with a four-year school; HLTH 101, addressing an achievement gap with professional development and increased communication with students; CRJU 101 and 202, a statewide group assessment development effort and creativity in the interventions used.

  10. 1. Help faculty explore the student learning process. Happens at the Course Level. Riverside City College (CA): GEG 1 assessment as part of the 2008 program review (GEG 1 appendix, GEG 1L appendix). Eavesdropping: http://www.rcc.edu/administration/academicAffairs/effectiveness/assess/resources.cfm

  11. Area 2: Determine the extent to which the curriculum is working.

  12. 2. Determine the extent to which the curriculum is working. Three levels of curriculum development; assessment informs this improvement. If you do not assess, you lose the opportunity to improve.

  13. This Kind of Analysis Happens at the Program Level. The study of the learning process happens at the course level, where faculty’s natural curiosity about student learning takes place; curriculum analysis occurs at the program level.

  14. What Are the Most Common Things Included? • Assessment focus (course, program, general ed, etc.) • What outcomes were assessed? • How and when were they assessed? • Who was assessed? • What were the results? • Who reviewed the results and made sense of them, and what conclusions were reached? • What are the implications for practice and/or policy or future assessment work? • How were the results used?

  15. Program-level Reporting: Hocking College • North Central Association of Colleges & Schools, Higher Learning Commission • CHEA award winner, 2008, for Institutional Progress in Student Learning Outcomes

  16. Hocking College (OH) Program-level Report: All programs have individual assessment plans, including a mission statement & central objective for assessment; institutional success skills (GE); program exit competencies; criteria for and means of assessment; and reporting of results.

  17. Hocking College (OH): Learning outcomes data collected in a student E-portfolio; direct internal and external evidence (1 to 10 measures); indirect evidence (1 to 4 measures); evidence drawn from samples of student work for faculty to apply an agreed-upon holistic rubric; eight general education outcomes (student success skills); discipline-specific exit competencies or outcomes.

  18. Hocking College (OH) Program-level Report. Eavesdropping: cloud reference (not the college URL, as links are broken there). Various reports available in each program profile: curriculum matrix; criteria statements (exit competencies); Instructional Program Outcomes (assessment plan); trend charts for performance criteria.

  19. Hocking College (OH) Program-level Report • Example reports and analysis: Culinary Arts Technology, Forestry Management Technology, Nursing Technology

  20. Area 3: Where can time, energy and/or money be allocated for continuous improvement in learning?

  21. Where can time, energy and/or money be allocated for continuous improvement in learning? • This tends to happen via general education-level assessment & reports, or • Another method is to create "institution-level" reports, where the outcomes of course-level, program-level, and general education-level work are integrated and summarized.

  22. General-Education Reports: Mesa College (AZ) • North Central Association of Colleges & Schools, Higher Learning Commission • CHEA award winner, 2007, for Institutional Progress in Student Learning Outcomes

  23. Mesa College General Education Report. Multiple outcomes assessed annually. Annual report elements: executive summary; methodology; results & observations (GE & workplace); indirect-measures findings; appendices of past results.

  24. Mesa Community College (AZ) • General education studies completed 2007-08; 2005-06 • Numeracy • Scientific inquiry • Problem solving/critical thinking • Information literacy • Workplace skills (CTE) • General education studies completed 2006-07; 2004-05 • Arts & humanities • Cultural diversity • Oral communication • Written communication

  25. Mesa College (AZ) General Education Reports. Eavesdropping: http://www.mesacc.edu/about/orp/assessment/index.html. Annual reports and summaries available; eight years of history and experience.

  26. General-Education Reports: Capital CC (CT) • New England Association of Schools and Colleges, Commission on Institutions of Higher Education • Cited in The Art and Science of Assessing General Education Outcomes: A Practical Guide (AAC&U, 2005)

  27. Capital CC (CT) General Education Reports. Multiple reports, one per outcome; each study commonly takes a year. Report elements: introduction; methods; results and findings; conclusions and recommendations; implications for future assessments; appendices of assignment, rubric, notes to teachers, etc.

  28. Capital CC (CT) General Education Reports • General education studies completed: Writing, 2001-02; Math, 2002-03; Critical thinking, 2003-04; Global perspective, 2004-05 • Eavesdropping: annual reports and summaries available at http://www.ccc.commnet.edu/slat/

  29. Portland CC (OR) General Education Reports. Northwest Accrediting Commission. Eavesdropping: http://www.pcc.edu/resources/academic/learning-assessment/. One general education theme a year; the Learning Assessment Focus for 2009-10 is Critical Thinking & Problem Solving (Physical Science, Geology and General Science; Bioscience Technology; Management and Supervisory Development; Culinary Assistant Program).

  30. Truman State University (MO), Various Reports. North Central Association of Colleges & Schools, Higher Learning Commission. Eavesdropping: http://assessment.truman.edu/ (assessment work began in 1970; a national leader). Assessment Almanac: a compilation of results from each year’s assessment work (versions from 1997 to 2009 are posted). General education outcomes are assessed in the context of the major field of study. Portfolio Project: required of all seniors to show best work, assessed by faculty for the nature & quality of the liberal arts and sciences learning outcomes (versions from 1997 to 2008 are posted).

  31. Who Writes the Reports? • Course-level, program-level & general education reports: a teaching faculty study team, with technical assistance from institutional research or the assessment committee; no “lone ranger” authors • Institutional summary: an academic administrator, with assistance from the learning outcomes coordinator or assessment committee; a compilation of work accomplished in one or two academic years across the institution

  32. Where Are the Reports? • Course, program and general education comprehensive assessment reports: with teaching faculty study team members, the assessment committee chair, and on the assessment website • Institutional summary reports: with the academic administrator, the assessment committee chair, and on the assessment website • Not in the library basement; actively used to promote a learning organization

  33. Area 4: Exploit the writing process and dialogue about results to gain broader institutional learning experiences.

  34. 4. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. • To all affected participants • Campus committees: curriculum, assessment, resource allocation group, unit (department) leadership, general academic and college leadership • Campus fairs, brown-bag lunches, poster sessions for information sharing • Faculty professional development programs • Accreditation self-study committee work groups • College web site for the public

  35. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. A learning organization is an environment that promotes a culture of learning: individual & group learning enriches & enhances the organization as a whole; systematic problem solving using data for decisions; learning from experiences in assessing organizational performance; comparing yourself to others (benchmarking) and borrowing ideas. (Adriana Kezar, ed., Organizational Learning in Higher Education, New Directions for Higher Education, No. 131, Fall 2005, Jossey-Bass.)

  36. Characteristics of Organizational Learning • Researchers have found some critical features of learning organizations (Lieberman, 2005, pp. 87-98). In particular, a college that is an effective organizational learner: • Maintains a scholarly approach to the questions and problems that the institution faces; • Approaches campus problems as a learner and not as an expert; • Develops a culture of evidence that drives decision-making; • Links organizational learning to the college’s mission; • Makes connections throughout the college and not just within individual units (e.g., faculty, administration); and • Recognizes and rewards the college’s efforts to become a “learner.”

  37. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. • The assessment data sense-making process = a faculty learning experience • Linking results to future interventions = a learning experience for faculty, the assessment committee, academic administration, and planning & resource allocation groups • Using results to inform an intervention, then reassessing = a learning experience (accomplished one or more terms later) for faculty, the assessment committee, and academic administration • A reference for future assessment work and for other groups on campus

  38. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. Hocking College (OH): student E-portfolio; annual summary covering improvements in the program in the previous year brought on by study of assessment results, expenditures of time, money & materials for the assessment program, requests for assistance in implementing assessment, and recommendations for altering the institution’s assessment process; transition from evaluating individual students to assessing groups of students & the curriculum experience.

  39. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. Community College of Baltimore County (MD) Learning Outcomes Assessment Advisory Board: links findings in assessment reports to other college-wide initiatives and professional development opportunities; promotes use of assessment processes and results (findings); challenged faculty to reexamine prompts used in assessment (clarity of the written prompt & the extent to which it supports program goals); common assignment options and a common rubric increase faculty understanding and buy-in; builds faculty unity toward common goals; a public web page enhances communication and accessibility to information.

  40. Exploit the writing process and dialogue about results to gain broader institutional learning experiences. • CHEA award winner, 2010, for Institutional Progress in Student Learning Outcomes

  41. Reports as Institutional Learning & Resource Allocation. Northern Arizona University: Seals of Assessment Achievement & Excellence. Purpose: to recognize programs that have demonstrated significant progress with assessing student learning; to promote “best practices” in assessment by sharing practical experiences; to encourage programs to showcase program-level achievements and to adjust curricula when appropriate.

  42. Reports as Institutional Learning & Resource Allocation. Feedback & recognition: feedback rubric for annual assessment reports; conversations and action; collection and analysis of evidence; implementation of findings; recognition (achievement & excellence).

  43. Reports as Institutional Learning & Resource Allocation. Seal of Assessment Achievement: academic programs earning this recognition have demonstrated in their annual report that learning outcomes have been assessed through two or more methods, and findings have been discussed among the faculty.

  44. Reports as Institutional Learning & Resource Allocation. Seal of Assessment Excellence: academic programs earning this recognition have demonstrated a thorough implementation of assessment plan(s); the reporting of meaningful assessment data; the discussion of findings among faculty and perhaps students; and the use of findings to showcase student achievements and to make curricular adjustments.

  45. Reports as Institutional Learning & Resource Allocation. Mesa College (AZ) Results Outreach Committee: promotes use of outcomes data in relation to faculty development, pedagogy and academic climate; groups of faculty offer a proposal for summer or academic-year work above the course level; the resulting report is placed on the web and used for campus discussion and action.

  46. Area 5: Help meet the quality assurance pledge to the community.

  47. 5. Help meet the quality assurance pledge to the community. • National Institute for Learning Outcomes Assessment (NILOA): assists institutions & others in discovering & adopting promising practices in the assessment of college student learning outcomes • Documenting what students learn, know and can do is of growing interest to colleges and universities, accrediting groups, higher education associations, foundations and others beyond campus, including students, their families, employers, and policy makers.
