
Model System For Assessment: Adding Value To Academic Programs




1. Model System For Assessment: Adding Value To Academic Programs
• Some background, context
• General program assessment model
• Program outcomes technical characteristics
• Course curricular assessment system
• Outcomes, course listing matrix
• Course loads frequency analysis
• Other comments, relationships

Dr. John W. Sinn, Professor
Technology Systems Department
Bowling Green State University
Fall 2004

2. Some Background, Context
• Initial assessment systems developed at CMSU
• Based on work with an advisory committee, 2000
• Basis for design and validation of the master's curriculum
• CMSU work provided the base for the BGSU work
• Ongoing, long-term assessment via GPAM
• Faculty-driven, in-depth assessment process
• First, identified and updated outcomes
• Second, reviewed all courses against the new outcomes
• Further developed via Engineering Management

3. Some Background, Context
• Based on multi-year assessment
• Systems to work with the advisory committee resulted
• A general program assessment model was developed: the General Program Assessment Model (GPAM)
• Various forms and matrices assist in GPAM: POTC, CCAS, OCLM, CLFA
• Designed to "flush out" current status and plans
• GPAM is all about customer satisfaction
• GPAM helps suppliers meet customer demands
• A systematic way to do assessment objectively

4. General Program Assessment Model
[Diagram: the GPAM cycle. Inputs — advisory committee and strategic planning; various internal and external inputs; administrative and university initiatives; faculty expertise and professional bodies; accreditation. These feed program outcomes and continuous improvement, then course assessment (are outcomes being met?), then holistic assessment of the program and curriculum, supporting faculty, student, and alumni success.]

5. General Program Assessment Model
[Diagram: multiple forms and matrices do the mechanics of GPAM, generally moving left to right, though the order may vary:
POTC — Program Outcomes Technical Characteristics, done in various ways
CCAS — Course Curricular Assessment System, faculty assess courses
OCLM — Outcomes, Course Listing Matrix, which courses address which outcomes?
CLFA — Course Loads Frequency Analysis, a planning tool]

6. General Program Assessment Model
• GPAM infrastructure assumes:
• An advisory committee in place and functioning
• An advisory committee well selected and well governed
• Matured, with alumni and friends of the program
• Must move from the political to the objective
• Administration, advisory committee, and faculty
• Dialogue, communicate, share the same agenda
• A respectful working relationship built and in place
• Disciplined systems in place, a robust approach

7. General Program Assessment Model
• GPAM derives learning outcomes; examples:
• Advanced design
• Engineering economy
• Quality systems
• Culture, service
• Technical communication
• Applied research
• Technical projects
• Materials, processes
• Outcomes have 3-5 technical characteristics
• GPAM sets the stage, provides infrastructure
• Assess and rethink outcomes; dialogue for the future
• The advisory committee is key; an iterative process
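The outcome list above, with each outcome carrying 3-5 technical characteristics, could be represented as a simple mapping for record-keeping. This is only a sketch: the characteristic names below are hypothetical placeholders, not taken from the actual EM program.

```python
# Minimal sketch of POTC data: each program outcome maps to its 3-5
# technical characteristics. Characteristic names are invented here.
potc = {
    "Quality systems": [
        "statistical process control",
        "quality auditing",
        "continuous improvement tools",
    ],
    "Technical communication": [
        "written reporting",
        "oral presentation",
        "documentation standards",
    ],
}

def validate_potc(outcomes: dict[str, list[str]]) -> list[str]:
    """Return outcomes whose characteristic count falls outside the 3-5 range."""
    return [name for name, chars in outcomes.items() if not 3 <= len(chars) <= 5]

print(validate_potc(potc))  # outcomes needing revision, if any
```

A check like `validate_potc` would flag outcomes that are under- or over-specified before the advisory committee reviews them.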

8. Program Outcomes Technical Characteristics
• EM learning outcomes use a POTC format of "technical characteristics" to define them further.

9. Program Outcomes Technical Characteristics
• POTC identifies specific outcomes and their characteristics
• POTC relates and connects many program issues
• Alumni may provide important feedback
• Review of the literature
• Various university and educational initiatives
• Accreditation issues, opportunities, challenges
• POTC may be an annual, ongoing process
• Timing and logistics may be critical
• The advisory committee is pivotal

10. Course, Curricular Assessment System
• CCAS: course analysis, faculty self-assessment
• CCAS provides supplier information in GPAM

11. Course, Curricular Assessment System
• CCAS is faculty driven, supplier information
• CCAS performs a critical function in GPAM
• Cannot proceed if faculty resist or do not cooperate
• Faculty may respond if handled via the advisory committee
• This part of GPAM holds many opportunities
• Communication enhancement for faculty
• Set standards for syllabi, format agreements
• "Flush out" the actual work in courses
• Seeking duplication and areas of outcome deficiency
• A baseline to improve from, as CCAS comes into being
• Reflects a change in how "academic freedom" is defined

12. Outcomes, Course Listing Matrix
• The outcomes, course listing matrix (OCLM)
• CCAS supplies course data; POTC supplies customer demands
• Merging CCAS and POTC produces the OCLM results
• The OCLM shows which supplier courses address the outcomes
• Key question: do we meet customer demands?
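The CCAS/POTC merge described above can be sketched as an outcomes-by-courses matrix. The course codes and coverage mappings below are hypothetical, chosen only to show the mechanics of finding outcomes no course addresses.

```python
# Sketch of an Outcomes, Course Listing Matrix (OCLM): for each program
# outcome, which courses (per CCAS faculty self-assessments) address it?
# Course codes and mappings are illustrative, not from the real program.
ccas = {  # course -> outcomes the instructor reports covering
    "TECH 601": {"Quality systems", "Applied research"},
    "TECH 632": {"Engineering economy"},
    "TECH 688": {"Technical communication", "Quality systems"},
}
outcomes = ["Quality systems", "Engineering economy",
            "Technical communication", "Applied research", "Advanced design"]

# Merge: one row per outcome, listing the courses that address it.
oclm = {o: sorted(c for c, covered in ccas.items() if o in covered)
        for o in outcomes}
gaps = [o for o, courses in oclm.items() if not courses]

print(gaps)  # outcomes no course addresses -> unmet customer demands
```

Empty rows in the matrix answer the key question directly: an outcome with no courses is a customer demand the program is not meeting.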

13. Course Loads Frequency Analysis
• Course loads, frequency analysis (CLFA)
• CLFA is derived from overall GPAM work
• CLFA is a planning tool, a strategic base for the future

14. Course Loads Frequency Analysis
• CLFA helps everyone plan for the future and strategize
• Must be able to deliver on customer demands
• Do labs match course delivery?
• Do we have the faculty needed to deliver?
• Capacity planning, other key issues
• Frequency of offerings, and by whom
• New product development, program launches
• A key communication tool with all in GPAM
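The frequency-of-offerings part of a CLFA could be sketched as a simple tally over the offering history. The semester history below is invented for illustration, and the one-offering threshold is an assumed planning cutoff.

```python
# Sketch of a Course Loads Frequency Analysis (CLFA): count how often each
# course has been offered and flag low-frequency offerings for capacity
# planning. The offering history here is invented for illustration.
from collections import Counter

offerings = ["TECH 601", "TECH 601", "TECH 632", "TECH 688",
             "TECH 601", "TECH 632"]  # one entry per semester offered

freq = Counter(offerings)
rarely_offered = sorted(c for c, n in freq.items() if n < 2)

print(freq.most_common())   # frequency of offerings, by course
print(rarely_offered)       # candidates for scheduling/capacity review
```

A tally like this makes the planning conversation concrete: which courses carry the load, and which are offered too rarely to reliably deliver on customer demands.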

15. Other Comments, Relationships
• How best to review courses, the program?
• Faculty have key responsibility and authority
• Must understand customer demands
• How to assure objectivity?
• Use alumni surveys; survey industry, customers
• Listen to current customers; do consulting, projects
• Collect data, document for routine analysis
• Course drops, grades awarded
• Frequency of course changes, updates
• Who does this, and how, systematically and objectively?
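The routine data collection above (course drops, grades awarded) could feed a small, objective check like the one below. The enrollment figures and the 20% drop-rate threshold are invented for illustration.

```python
# Sketch of routine analysis on course-drop data, one objective input to
# course review. Enrollment and drop figures are invented for illustration.
records = [
    {"course": "TECH 601", "enrolled": 25, "dropped": 2},
    {"course": "TECH 632", "enrolled": 18, "dropped": 6},
]

def drop_rate(rec: dict) -> float:
    """Fraction of enrolled students who dropped the course."""
    return rec["dropped"] / rec["enrolled"]

# Flag courses whose drop rate exceeds an assumed 20% review threshold.
flagged = [r["course"] for r in records if drop_rate(r) > 0.2]
print(flagged)  # courses with drop rates worth a closer look
```

Tracking such numbers term over term gives the "systematic, objective" review the slide asks for, rather than relying on impressions.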

16. Other Comments, Relationships
• How best to review courses, the program?
• Look at products produced in courses
• The ultimate test of a course: outcomes in its products
• Portfolios of projects, student applied research work
• Have industrial practitioners review products
• Students and faculty publish and present work
• Shared at conferences
• Published in journals, traditional and creative websites
• Certifications via professional groups
• Are we maximizing student organizations?
• Reflections of mature, robust programs and systems

17. Other Comments, Relationships
• How best to review courses, the program?
• Recruitment data may be useful
• Quality of incoming students, numbers turned away
• Satisfaction of students recruited and graduating
• Understanding where students come from, and where not
• Faculty reputations, scholarship, funding sources
• Do students seek out our programs? How are we found?
• Are we well published, leaders in our fields?
• How are we engaged in professional groups?
• What donations are we attracting? How, why?
• Who tracks this, and how, systematically and routinely?
