Improving a College Through Web-Based Outcomes Assessment Reporting

Presentation Transcript


  1. Improving a College Through Web-Based Outcomes Assessment Reporting ACCT – Dallas, TX Oct. 2011

  2. Presenters • David Peter, Chair, Board of Trustees • Kevin Berthot, Vice Chair, Board of Trustees • Charlie Boaz, Trustee • Pat Griffith, Trustee • Dr. Brian Inbody, President

  3. Neosho County Community College Mission To enrich our communities’ and our students’ lives through: • Student learning • Student success • Ensuring access • Responsiveness to stakeholders • Meeting community needs

  4. Neosho County Community College • Service area: Neosho County, Franklin County, northern half of Anderson County • Two campuses: 90 miles apart—Chanute and Ottawa • Headcount 3,700+ • Credit hours generated—’10-’11—45,000+ • 1,500 Full Time Equivalent (FTE) Students • 60% Transfer courses, 40% Technical Education

  5. On Our Way to “Hitting Bottom”—The Culture of NCCC Before 2003 • Over a number of years, a rift had formed among the Board, the administration, and the faculty • The faculty circumvented the administration • The faculty lost their enthusiasm for teaching and put their energy into regime change

  6. On Our Way to “Hitting Bottom”—The Culture of NCCC Before 2003 • Lack of consistent College leadership • For example – 11 Chief Academic Officers in 16 years • Board micromanaged the administration • Board meetings were the “best show in town” • Community discontent with Board leadership and College leadership—demanded the College close!

  7. On Our Way to “Hitting Bottom”—The Culture of NCCC Before 2003 The victims were students and student learning – no innovation, no new programs, no administrative support for new initiatives, no way of knowing if students were learning.

  8. Dire Situation—“Accredited, On Probation”—October 2002 • Board Governance Issues • Poorly run finances—No fund warrants • No President • No viable outcomes assessment system • Neither Course Outcomes Assessment, nor • Program Outcomes Assessment, nor • General Education Outcomes Assessment. The accrediting agency was due back in Spring 2004!

  9. Immediate Actions Taken—2003 • We put in place a good President and let her do her job. • We fixed the finances with better budgeting and smarter fiscal decisions. • We changed the role of the Board through training and professional development. • We increased and improved communications throughout the College and with the community.

  10. Dilemma Still looming large was a disgruntled faculty, and a “non-existent” outcomes assessment system. How do we create a course/program/general education outcomes assessment system, switch the focus from teaching to learning, and change the culture of a college—all within 15 short months?!?

  11. Neosho County Community College Mission To enrich our communities’ and our students’ lives through: • Student learning • Student success • Ensuring access • Responsiveness to stakeholders • Meeting community needs

  12. Faculty Culture—2002 • Faculty had suffered from a lack of academic leadership. • Most faculty had become stagnant in their teaching methodologies. • Faculty did not communicate across academic departments. • The faculty did not see how the assessment pieces fit together or how assessment was relevant to their teaching.

  13. The Situation with Outcomes Assessment in 2003 • No discussion within the institution about learning objectives for students • No data to back up budget requests (seat-of-the-pants budgeting) and most importantly… • No value placed on assessment, nor knowledge about how to do outcomes assessment • Approaches in place: • Pretest/post-test only, with no correlation to outcomes • Paper report filed each semester that was never read • Faculty focused on inputs, not on results – teaching was emphasized, not learning

  14. Getting Started – Nuts and Bolts • The new President began conversations with faculty about what changes and emphases needed to occur in learning outcomes • New Chief Academic Officer and Assessment Coordinator were installed • They created the Assessment Committee • The committee conducted informational sessions to educate the faculty on learning outcomes and assessment

  15. Getting Started – Shifting the Focus • The Assessment Committee sponsored outcome assessment discussions/meetings (This was the most important step – starting the conversation!) • Instructors who taught the same course worked together to gather input about outcomes from: • Business and industry • Transfer colleges • Discipline expectations

  16. Getting Started – Shifting the Focus WONDERFUL conversations were occurring, many for the first time, between course level instructors as they really examined what they were teaching and—more importantly—what they could prove that students were learning!

  17. The Next Step – Creating the Computerized Assessment Form with Zero Budget • The Assessment Committee created the computerized outcomes assessment form • It’s all done with public-domain software • With that form we gather course outcome-level data from EVERY section and drop it into a database • We can search by course, section, instructor, semester, etc. • We can tell you how students in that section fared on each and every outcome

  18. The Next Step – Creating the Computerized Assessment Form with Zero Budget • The database can link course outcome level assessment results to the budget • It can show learning growth over time • It documents every change instructors are making to their courses and the results of those changes • It forces the faculty to “think” about why students didn’t reach an outcome and what they should do differently next time • The Form
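
To make these two slides concrete: the kind of per-section, per-outcome record such a form could drop into a database looks roughly like the sketch below, written here with SQLite from Python’s standard library. Every table and column name is hypothetical; the presentation does not describe NCCC’s actual schema or the public-domain software it used.

```python
# Minimal sketch of an outcome-level assessment record (hypothetical schema;
# the presentation does not describe NCCC's actual database design).
import sqlite3

conn = sqlite3.connect("assessment.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS outcome_reports (
    id             INTEGER PRIMARY KEY,
    course         TEXT NOT NULL,     -- e.g. 'ENGL 101'
    section        TEXT NOT NULL,     -- e.g. '02'
    instructor     TEXT NOT NULL,
    semester       TEXT NOT NULL,     -- e.g. 'Fall 2010'
    outcome_num    INTEGER NOT NULL,  -- which course outcome was assessed
    n_assessed     INTEGER NOT NULL,  -- students assessed on this outcome
    n_met          INTEGER NOT NULL,  -- students who demonstrated the outcome
    reflection     TEXT,              -- why an outcome was missed; planned changes
    budget_request TEXT               -- resources requested to improve learning
);
""")
conn.commit()
```

One row per outcome per section per semester is what makes the searches described above (by course, section, instructor, or semester) trivial.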

  19. The Assessment Database • The database records student learning goal achievement by section, course, and outcome, all in one database! • It captures how faculty changed their instructional methods • It generates budget requests • It allows for grade distribution/outcomes assessment comparison. 87%–91% of all sections file an outcomes assessment report every single semester!
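
Against a table like the hypothetical one sketched above, the reports this slide describes reduce to simple aggregate queries. A sketch:

```python
# Hypothetical query helper: percent of students meeting each course outcome,
# broken out by section and instructor, for one course in one semester.
def outcome_success(conn, course, semester):
    return conn.execute("""
        SELECT section, instructor, outcome_num,
               ROUND(100.0 * SUM(n_met) / SUM(n_assessed), 1) AS pct_met
        FROM outcome_reports
        WHERE course = ? AND semester = ?
        GROUP BY section, instructor, outcome_num
        ORDER BY section, outcome_num
    """, (course, semester)).fetchall()

# Example call (course and semester invented for illustration):
# outcome_success(conn, "ENGL 101", "Fall 2010")
```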

  20. The Results of the Outcomes Assessment System • The faculty could now prove where learning was occurring and where it wasn’t. • The faculty have discovered that they had been “masking” poor student performance on one or more outcomes when they used only pre/post tests as the assessment process. (Were the students really learning outcome 3 – no!) • The faculty now know that outcomes need to be rewritten so they are measurable. • Instructors now have lively discussions about successes, concerns, student performance, and assessments/outcomes.

  21. Most Importantly… When we, the administration and Trustees, saw poor performance in learning outcomes, we congratulated the instructor for finding the problem and we offered help. We didn’t use the information against the instructor. This is KEY for the system to work, and for assessment to be successful and REAL. This is how the culture changed—we developed an atmosphere that values learning and doesn’t persecute people for finding problems!

  22. On to Program Outcomes! • We, along with many community colleges, were struggling to measure program outcomes, especially in Associate of Arts and Associate of Science programs. • We wanted to better utilize the course assessment data we were already gathering and use it for Program Assessment—but how were we to do this? The Answer – the Program Outcome Matrix!

  23. Creating the Program Outcomes Matrix • MORE WONDERFUL conversations now began across campus as instructors from different disciplines talked with each other about what they covered in various program courses and how all course outcomes could fit together to form program outcomes. • The Program Outcomes Assessment System uses the SAME DATA collected at the course level, so no additional assessment is needed. Instructors are assessing program outcomes in the courses, when the students learn the material, not years later with a capstone course or exam.

  24. How the Program Outcomes Assessment System Works • Once during an academic year, institutional research uses the program matrix to pull course outcome data from the database and produces a report. • The faculty responsible for that program meet and use the report to gauge student learning on each program outcome. • The faculty then create a Program Outcomes Assessment Report detailing actions to be taken to improve student learning in the program, as well as any budget requests needed for that program’s improvement.
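
One way to picture the matrix: a mapping from each program outcome to the course outcomes that feed it, so the yearly report is pure aggregation over data already collected. The sketch below reuses the hypothetical outcome_reports table from earlier; the program outcomes and feeder courses are invented for illustration.

```python
# Hypothetical program outcomes matrix: each program outcome lists the
# (course, outcome number) pairs whose course-level data feed into it.
PROGRAM_MATRIX = {
    "PO1: Communicate effectively in writing": [("ENGL 101", 2), ("ENGL 102", 1)],
    "PO2: Apply quantitative reasoning":       [("MATH 105", 1), ("MATH 105", 3)],
}

def program_report(conn, matrix, semesters):
    """Roll existing course-outcome results up to one rate per program outcome."""
    placeholders = ",".join("?" * len(semesters))
    report = {}
    for program_outcome, feeders in matrix.items():
        met = assessed = 0
        for course, outcome_num in feeders:
            row = conn.execute(
                f"""SELECT COALESCE(SUM(n_met), 0), COALESCE(SUM(n_assessed), 0)
                    FROM outcome_reports
                    WHERE course = ? AND outcome_num = ?
                      AND semester IN ({placeholders})""",
                (course, outcome_num, *semesters)).fetchone()
            met, assessed = met + row[0], assessed + row[1]
        report[program_outcome] = round(100.0 * met / assessed, 1) if assessed else None
    return report
```

Because no new data collection is involved, the whole program report is one pass over the existing database.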

  25. On to General Education Outcomes Assessment! • Step one was to rewrite our general education outcomes from platitudes to REAL measurable outcomes – “What every NCCC graduate ought to know.” • This led to EVEN MORE WONDERFUL conversations among faculty between and within the campuses about how their specific course outcomes contributed to the overall general education of our students.

  26. General Education Outcomes Assessment! • We have created the Master General Education Matrix that pulls together all individual course outcomes that feed into the general education learning outcomes. • We have since verified the outcomes data with the CAAP (Collegiate Assessment of Academic Proficiency) exam.
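
Since the general education rollup draws on the same course-level data, a master matrix in the same hypothetical form shown earlier is all that changes; outcomes and feeder courses below are again invented for illustration.

```python
# Hypothetical master general education matrix; it feeds the same rollup
# function (program_report) sketched above -- only the mapping differs.
GEN_ED_MATRIX = {
    "GE1: Read and think critically":  [("ENGL 101", 1), ("PHIL 110", 2)],
    "GE2: Reason quantitatively":      [("MATH 105", 1)],
}

# gen_ed_report = program_report(conn, GEN_ED_MATRIX, ["Fall 2010", "Spring 2011"])
```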

  27. How the Trustees Use Assessment • Base budgetary decisions on what areas need attention • Confirm that all sections and all instructors are engaged in continual improvement • Ensure consistency across sections of the same course on the learning outcomes, while giving instructors freedom in how to teach and assess those outcomes • Receive succinct information on student learning

  28. Wonderful Results! Change Happens! • A more engaged faculty and student body • A budget that relies on data, not guesses • A transparency of data that is posted to web, used by the faculty for learning improvement, and presented to the Board of Trustees (Are Students Learning?) • A point of pride for an institution very much in need of one

  29. Wonderful Results! Change Happens! • NCCC is seen by the state as the leader in outcomes assessment • NCCC has consulted with many colleges outside the state on outcomes assessment • NCCC won a national award from the National Council of Instructional Administrators in 2008 • NCCC won a Bellwether Finalist Award in 2009

  30. Wonderful Results!! Change Happens! • Increases in retention (Board mandate!) – Fall-to-Fall retention up 16% • CAAP scores that meet or exceed the national average in Reading, Math, and English • Improvement in Nursing Board scores • Increases in enrollment (11th, up from 18th, out of 19 community colleges in Kansas, making us the 2nd-fastest-growing college. This fall, we grew faster than any community college in the state.) • First college in Higher Learning Commission history to go from probation to a full 10-year accreditation!

  31. But Most Importantly… The NCCC computerized outcomes assessment system has become the catalyst in transforming the College’s culture from a dysfunctional, stagnant institution into a premier college that values students and student learning above all else!

  32. Lessons Learned • The system needs to be faculty-driven from the very beginning. • The most important step is to get the faculty talking with each other. • We found it was much better to build our own system and make it simple than to hire some company with an overly robust system that requires extensive training and money. • You need to test your system thoroughly before implementation—which we did not have time to do the first time it was launched. • You should include documentation/instructions to aid faculty.

  33. Lessons Learned • Require faculty to participate in the system, and pay for it. (At NCCC it is part of the full-time instructor’s contract, and we pay the adjunct instructors to fill out the form and submit their results: $25 per credit hour of the course being assessed. This has improved the participation percentage.) • The results need to be tied to budget decisions so that resources go directly toward impacting learning. • The most important lesson is how to treat instructors when educational outcome goals are not met. These discoveries are met with excitement, not condemnation. “Congratulations, you found the problem! Now we can address it! How can we help?”

  34. Conclusions • A simple, elegant idea that had an enormous impact – changing a culture from dysfunctional to engaged. • A HUGE turn-around from probation to premier! • A Board that issued a mandate, stepped back, gave support, and let it happen. • A system that is leading change and transforming learning at area colleges and colleges around the country. • You can do it too! This idea can be transplanted and adapted to any college struggling to create a culture of student learning and a true Outcomes Assessment system — and it’s FREE!

  35. Questions and Answers

  36. Contacts David Peter dpeter@neosho.edu Kevin Berthot kberthot@neosho.edu Charlie Boaz cboaz@neosho.edu Pat Griffith pgriffith@neosho.edu Dr. Brian Inbody binbody@neosho.edu
