
Presentation Transcript


  1. American Society for Engineering Education Annual Conference, St. Louis, MO, June 18-21, 2000

  2. Using Quality Function Deployment to Meet ABET 2000 Requirements for Outcomes Assessment. Prof. Phillip R. Rosenkrantz, Cal Poly Pomona

  3. Outcomes and Assessment Team • ABET 2000 Criteria • 1.5-year project • Faculty involvement • Industry involvement • Alumni involvement

  4. Selection of Assessment Methodology • Strategic Planning • Malcolm Baldrige National Quality Award Criteria • Total Quality Management (TQM) • Quality Function Deployment (QFD) • Customized Approach

  5. Quality Function Deployment Chosen as Primary Methodology • Enthusiastically supported by the full IME faculty. • Adaptations and enhancements using other methodologies • QFD team formed (Dept, IAC, Alumni) • Met regularly for five quarters • “Modified” version of QFD was used.

  6. Phase I: The Voice of the Customer • The IME Department recognized constituencies, or "stakeholders," that need to be considered in all curriculum, scheduling, and program-related decisions. • Identified eighteen stakeholders.

  7. Three Categories of Stakeholders • Those we serve • Those who use our graduates • Those who regulate us • Used a 1-3-9 weighting scale

  8. Most Important (9 points) • Students (& Alumni) • University Administration/CSU • Manufacturing sector companies • ABET (accrediting agency) • State Government

  9. Next Most Important (3 points) • Other faculty/departments • Parents of students • Service companies • Board of Professional Engineers • ASQ (Certification) • SME (Certification)

  10. Least Important (1 point) • Grad schools • General public • Granting agencies • Public sector employers • Information sector companies • WASC • APICS
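
The three tiers above are effectively a weighting table. As a minimal sketch (the original presentation contains no code), the groupings from slides 8-10 could be encoded as data; the names below are transcribed from the slides:

```python
# 1-3-9 stakeholder weights, transcribed from slides 8-10.
# Illustrative Python only; not part of the original presentation.
STAKEHOLDER_WEIGHTS = {
    # Most important (9 points)
    "Students & Alumni": 9,
    "University Administration/CSU": 9,
    "Manufacturing sector companies": 9,
    "ABET": 9,
    "State Government": 9,
    # Next most important (3 points)
    "Other faculty/departments": 3,
    "Parents of students": 3,
    "Service companies": 3,
    "Board of Professional Engineers": 3,
    "ASQ": 3,
    "SME": 3,
    # Least important (1 point)
    "Grad schools": 1,
    "General public": 1,
    "Granting agencies": 1,
    "Public sector employers": 1,
    "Information sector companies": 1,
    "WASC": 1,
    "APICS": 1,
}

assert len(STAKEHOLDER_WEIGHTS) == 18  # the eighteen stakeholders of slide 6
```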

  11. Phase II: Program Objectives and Outcomes (Needs Assessment) • Department Mission Statement • Department Objectives • ABET "a-k" outcomes • SME "Competency Gaps" • "Other" sources • Result: Goals & 24 SKAAs (Skill, Knowledge, Attitude, and Ability areas)

  12. Phase III: QFD Implementation • Five matrices • Iterative process • Results flowed from one matrix to the next • Fast input from many stakeholders • Provided valuable results • Quantifiable

  13. Matrix 1: Stakeholder vs. SKAA • An 18 × 24 matrix was used to evaluate the importance of each SKAA for each stakeholder. • Identified which SKAAs are the most important overall. The result is a ranking that incorporates the importance weighting for each stakeholder.
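
The slide reports the ranking but not the arithmetic behind it. In standard QFD scoring, each SKAA's overall importance is the sum, over all stakeholders, of stakeholder weight times that stakeholder's rating of the SKAA. A minimal sketch under that assumption (the per-stakeholder rating scale is not given in the slides):

```python
def rank_skaas(weights, ratings):
    """Matrix 1: overall SKAA importance across all stakeholders.

    weights: {stakeholder: 1 | 3 | 9}  -- the tiers from slides 8-10.
    ratings: {stakeholder: {skaa: rating}}  -- hypothetical per-stakeholder
             ratings of each SKAA; the scale is an assumption.
    Returns SKAAs sorted by total weighted score, most important first.
    """
    totals = {}
    for stakeholder, weight in weights.items():
        for skaa, rating in ratings.get(stakeholder, {}).items():
            totals[skaa] = totals.get(skaa, 0) + weight * rating
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```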

  14. Matrix 2: SKAA vs. Core Course • Core courses evaluated on current SKAA coverage. • Column totals reveal how much each individual course covers the SKAAs. • Row totals show how much each SKAA is covered in the curriculum. • Rankings of SKAA row totals reveal potential weaknesses in the curriculum.
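
The row and column totals the slide describes are simple sums over the matrix. A sketch, assuming a nested-dictionary layout (the team's actual worksheet format is not shown in the slides):

```python
def coverage_totals(matrix):
    """Matrix 2: SKAA coverage totals.

    matrix: {skaa: {course: points}} -- how strongly each core course
            covers each SKAA (hypothetical layout).
    Row totals (per SKAA) expose potential curriculum weaknesses;
    column totals (per course) show how much each course contributes.
    """
    row_totals = {skaa: sum(by_course.values())
                  for skaa, by_course in matrix.items()}
    col_totals = {}
    for by_course in matrix.values():
        for course, points in by_course.items():
            col_totals[course] = col_totals.get(course, 0) + points
    return row_totals, col_totals
```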

  15. Case Study - IME 415 Quality Control by Statistical Methods • Column total was initially 41 points. • Professionalism/Ethics & Social Responsibility (+8) • Teaming – Team projects (+8) • Employability – Six-Sigma Quality (+2) • Use Skills/Tools – Web, Charts (+3) • Reliability Engineering – Intro (+3) • Quality Standards – ISO/QS 9000 (+0) • Added 24 points to the column, for a new total of 65 points.
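
The updated column total is just the initial score plus the listed increments; as a quick check, with the values taken from the slide:

```python
# Arithmetic check for the IME 415 column total (values from slide 15).
initial_total = 41
increments = {
    "Professionalism/Ethics & Social Responsibility": 8,
    "Teaming (team projects)": 8,
    "Employability (Six-Sigma Quality)": 2,
    "Use Skills/Tools (web, charts)": 3,
    "Reliability Engineering (intro)": 3,
    "Quality Standards (ISO/QS 9000)": 0,
}
points_added = sum(increments.values())   # 24, as stated on the slide
new_total = initial_total + points_added  # 41 + 24 = 65
```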

  16. Matrix 3: SKAA vs. Methodology • Developed list of current and potential teaching methodologies. • Methodologies evaluated against each SKAA for "potential" effectiveness and assessment capability. • Rankings indicate methodologies with the most potential benefit in achieving and evaluating desired outcomes.
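
Slide 12 notes that results flowed from one matrix to the next; one natural reading is that the SKAA importance scores produced by Matrix 1 become the row weights when ranking methodologies here. The slides do not spell out the formula, so the sketch below is a plausible reconstruction consistent with common QFD practice; Matrices 4 and 5 can reuse the same weighted-ranking pattern:

```python
def rank_methodologies(skaa_importance, effectiveness):
    """Matrix 3: rank teaching methodologies by weighted potential benefit.

    skaa_importance: {skaa: score} -- assumed to be carried over from
                     Matrix 1, i.e. how results flow matrix to matrix.
    effectiveness:   {skaa: {methodology: rating}} -- hypothetical ratings
                     of each methodology's potential effectiveness per SKAA.
    Returns methodologies sorted by weighted total, highest first.
    """
    totals = {}
    for skaa, by_method in effectiveness.items():
        weight = skaa_importance.get(skaa, 0)
        for method, rating in by_method.items():
            totals[method] = totals.get(method, 0) + weight * rating
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```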

  17. Matrix 4: SKAA vs. Assessment Tool • List of existing and potential assessment tools. • Presented to the faculty and modified. • Tools rated for potential effectiveness in assessing how well each SKAA has been taught. • Used to decide which tools should be supported at the department level.

  18. Matrix 5: Assessment Tools vs. Core Courses • Each core course rated for the potential effectiveness of each assessment tool. • The matrix gives each faculty member a more complete list of assessment options for the courses they teach.

  19. Phase IV: Action Planning • Timetable • New Industry Survey Instruments • Revised Instructional Assessment Instrument • Exit Interview Process
