
ABET Processes What Needs to Be Done? How Can It All Be Accomplished? What Are the Benefits?

Presentation Transcript


  1. ABET Processes: What Needs to Be Done? How Can It All Be Accomplished? What Are the Benefits? Fred DePiero, Associate Dean & EE Professor

  2. Outline • Overview of ABET Requirements • What needs to be done? • Specific assessment methodologies & program improvements • Suggested approaches • Centralized (most) indirect measures • Decentralized (most) direct measures • Draw from university sources (direct & indirect) • College-wide improvements • College-wide overview • Lessons learned, general comments Fred DePiero fdepiero@calpoly.edu

  3. Organization of Talk • In Order of Criteria (‘C2’, ‘C3’, ‘C4’) • ‘What?’ • What are requirements? • ‘How?’ • Suggestions on how to accomplish tasks • ‘Benefits!’ • What are institutional benefits? C2 What? Fred DePiero fdepiero@calpoly.edu

  4. Limited Dissemination of Materials • Distribute freely within your faculties • Do NOT post on public web sites • A course management system (e.g., Blackboard) is fine Fred DePiero fdepiero@calpoly.edu

  5. Audience? / Status? • Familiar with: • Criteria 2, 3, 4, 5? A-K? • ‘Indirect’ vs ‘Direct’ measure? • Status: • Have PEOs? Reviewing and revising? • Measurable skills identified for A-K? Rubrics? • Any direct measures for A-K? • Curriculum mapping? Fred DePiero fdepiero@calpoly.edu

  6. Recent ABET Visit, ‘Went Well’ • Visit: October 26-28, 2008 • Simultaneous visit (CAC & EAC) • 12 Programs evaluated for reaccreditation • Including 2 outside CENG • 1 Program evaluated for first accreditation • (Specific results confidential) • Biomedical Engr going for 1st accr in ~1.5 years • Five Self Studies selected for display at ABET Symposium 2010 (Las Vegas). • ME, CE, IE, AERO, CSC Fred DePiero fdepiero@calpoly.edu

  7. My Role? • Provide centralized services • On-Going: run surveys of seniors, alumni • Gather data from university-wide sources • Visit Prep: Additional data services (course offerings, teaching assignments), transcripts, schedule, coaching for coordinator, chair, faculty • Specific guidance to programs • Self Study template. Compare with recent Self Study examples • Provide suggestions • Share best practices. Report from BAP, ASEE • Also an ABET Program Evaluator - Invaluable! • 5 team members are PEVs Fred DePiero fdepiero@calpoly.edu

  8. Overview of All Criteria • C1, Students • Advising, transfer credit, grad evaluations • C2, Program Educational Objectives • Career accomplishments ~3-5 years after graduation • Review and revise PEOs • C3, Student Outcomes, A-K • Skills, knowledge, abilities acquired by time of graduation Fred DePiero fdepiero@calpoly.edu

  9. Overview of All Criteria… • C4, Continuous Improvement • Use evaluation of C2, C3 to improve program • C5, Curriculum • Basic math & science, eng science, design • C6, Faculty (Sufficient & Competent) • C7, Facilities • C8, Institutional Support • Program Criteria Fred DePiero fdepiero@calpoly.edu

  10. Criterion 2: Objectives • ‘Objectives’ define career accomplishments of graduates, ~3-5 years after graduation. • Programs define their own Objectives. • Examples: Graduates will… • Continue education (graduate school or other) • Work in an ethical manner • Become leaders on project teams • Be good members of the community, do volunteer work • Examples: MATE, ME, CE, (IE) C2 What? Fred DePiero fdepiero@calpoly.edu

  11. Criterion 2: Objectives • Assessment, Typical Approaches • Use alumni surveys • Feedback from Industrial Advisors • Evaluation • My interpretation, validated by ABET • All alumni should attain one or more Objectives. ‘1-of-N’ requirement. • All Objectives should be attained by at least some alumni. C2 What? Fred DePiero fdepiero@calpoly.edu
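
To make the ‘1-of-N’ evaluation concrete, here is a minimal sketch (Python, with hypothetical data) of how alumni survey results could be checked against the two conditions above: every alumnus attains at least one Objective, and every Objective is attained by at least some alumni. The data structure and names are illustrative, not part of any ABET-prescribed process.

```python
# Hypothetical mapping of alumni respondents to the PEOs they report attaining.
# In practice this would be derived from alumni survey responses.
attained = {
    "alum_01": {"PEO1", "PEO3"},
    "alum_02": {"PEO2"},
    "alum_03": {"PEO1", "PEO2", "PEO4"},
}
all_peos = {"PEO1", "PEO2", "PEO3", "PEO4"}

# '1-of-N' requirement: every alumnus attains one or more Objectives.
alumni_without_peo = [a for a, peos in attained.items() if not peos]

# Complementary check: every Objective is attained by at least some alumni.
unattained_peos = all_peos - set().union(*attained.values())

print("Alumni attaining no Objective:", alumni_without_peo or "none")
print("Objectives attained by no alumni:", unattained_peos or "none")
```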

  12. Improving Survey Processes C2, C3 How? Fred DePiero fdepiero@calpoly.edu

  13. Creating Better Surveys • Three most important aspects: • Short, Short, SHORT! How? Fred DePiero fdepiero@calpoly.edu

  14. Get More Value From Surveys • Alumni surveys – 2 styles of questions • Include Likert-scale questions on each PEO. • Ask specific questions (‘yes/no’) for more accurate data - fewer misinterpretations • On specific career accomplishments of alumni • Drop ‘Leader in field?’ in favor of ‘Award for work?’ • (Examples: MATE, ME, CE, IE) C2 How? Fred DePiero fdepiero@calpoly.edu

  15. Example Alumni Survey (MATE) (PEOs listed on survey) • 1. Apply materials engineering principles to analyze and solve real-world engineering challenges. • 2. Communicate and perform as effective engineering professionals in both individual and team-based project environments. • 3. Develop intellectually through continuous learning. • 4. Work in an individual or team environment in a socially responsible manner. Please rate the extent to which you feel that the career objectives are appropriate for CalPoly MATE graduates? (Accomplished within 5 years of graduation) • Very Appropriate • Somewhat Appropriate • Neutral • Somewhat Inappropriate • Not Appropriate Fred DePiero fdepiero@calpoly.edu

  16. Example Alumni Survey (MATE) (PEOs listed on survey) • 1. Apply materials engineering principles to analyze and solve real-world engineering challenges. • 2. Communicate and perform as effective engineering professionals in both individual and team-based project environments. • 3. Develop intellectually through continuous learning. • 4. Work in an individual or team environment in a socially responsible manner. (The following specific accomplishments could be presented as Yes/No questions) • I routinely analyze and solve real-world engineering challenges; • I have been promoted from an entry-level position; • I have been given a raise and/or a bonus; • I have started or now own a successful business; • I am a team member of a successful organization; • I am part owner of a patent; • I have been given positive performance reviews; • I have management/leadership responsibilities; • I have learned new skills; • I have acquired knowledge beyond my B.S. degree; • I have earned graduate degrees; • I have earned other degrees and/or certifications; Fred DePiero fdepiero@calpoly.edu

  17. Get More Value From Surveys • Alumni surveys – 2 styles of questions • To ensure ABET compliance • Ask seniors the ABET Criteria verbatim • Issue/Limitation: Varied interpretation by respondents and others • For more useful data • Translate ABET language into local / specific terms, e.g. describing skills • Map results into PEOs C2 How? Fred DePiero fdepiero@calpoly.edu
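
As a sketch of the "map results into PEOs" step, the Yes/No accomplishment items from the MATE example (slide 16) could each be tagged with the PEO(s) they evidence; a respondent's 'yes' answers then roll up into PEO attainment. The tagging below is purely illustrative; each program would decide its own mapping.

```python
# Illustrative mapping of Yes/No survey items to the PEO(s) they support.
ITEM_TO_PEOS = {
    "routinely_solve_real_world_problems": {"PEO1"},
    "promoted_from_entry_level": {"PEO2"},
    "team_member_of_successful_org": {"PEO2", "PEO4"},
    "earned_graduate_degree": {"PEO3"},
    "learned_new_skills": {"PEO3"},
}

def peos_attained(yes_answers):
    """Roll a respondent's 'yes' items up into the set of PEOs evidenced."""
    attained = set()
    for item in yes_answers:
        attained |= ITEM_TO_PEOS.get(item, set())
    return attained

# Example respondent who answered 'yes' to two items.
print(peos_attained({"promoted_from_entry_level", "learned_new_skills"}))
# -> {'PEO2', 'PEO3'}
```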

  18. Using Electronic Tools To Help with Surveys • Web-based, via SurveyMonkey.com • Alumni: Social networking site ‘PolyLink’, helps build on-line community • Seniors: Email addresses from univ data sources • SurveyMonkey provides simple statistics • Organize documents (survey instruments) and results electronically… C2, C3 How? Fred DePiero fdepiero@calpoly.edu
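
The "simple statistics" step can also be reproduced offline from an exported results file. The sketch below (standard-library Python, assuming a hypothetical CSV export with one column of 1-5 Likert responses per PEO question) computes the mean and response distribution per question; the column names and file layout are assumptions, not the actual SurveyMonkey export format.

```python
import csv
from collections import Counter
from statistics import mean

def summarize_likert(csv_path, question_columns):
    """Mean and 1-5 response distribution for each Likert question column."""
    responses = {q: [] for q in question_columns}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            for q in question_columns:
                if row.get(q, "").strip():          # skip blank answers
                    responses[q].append(int(row[q]))
    return {
        q: {"n": len(vals), "mean": round(mean(vals), 2), "counts": Counter(vals)}
        for q, vals in responses.items() if vals
    }

# Hypothetical usage: columns named after the PEOs being rated.
# print(summarize_likert("alumni_survey_export.csv", ["PEO1", "PEO2", "PEO3", "PEO4"]))
```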

  19. Encouraging Better Survey Response Rates • Email Prompt / Message – Alumni Surveys • Modified subject line - less likely flagged as spam • An energetic email message • Sender appears to be Dept Chair • Motivating respondents • Either a course requirement for senior surveys, or instructor’s polite request • Prize and gift offered for alumni surveys C2 How? Fred DePiero fdepiero@calpoly.edu

  20. Survey Return Rates Improved • Alumni: • 20%-40% (2x - 3x improvement) • Seniors: • 100%, if a strict course requirement • 75% if requested & encouraged by instructor, nagged via re-prompts C2, C3 How? Fred DePiero fdepiero@calpoly.edu

  21. Criterion 3: Student Outcomes • Outcomes define students’ abilities by graduation. • Recommend: Use ABET’s standard 3A-3K • Requirements: (Challenging!) • Programs define more specific measurable skills. (Action verb - Preferably high on Bloom’s scale) • Curricula must ensure that every student has an opportunity to learn these skills. • Recommend both direct & indirect measures in a sustainable process. C3 What? Fred DePiero fdepiero@calpoly.edu

  22. Criterion 3: Outcomes ‘A-K’ (a) an ability to apply knowledge of mathematics, science, and engineering (b) an ability to design and conduct experiments, as well as to analyze and interpret data (c) an ability to design a system, component, or process to meet desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability (d) an ability to function on multidisciplinary teams (e) an ability to identify, formulate, and solve engineering problems (f) an understanding of professional and ethical responsibility (g) an ability to communicate effectively (h) the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context (i) a recognition of the need for, and an ability to engage in life-long learning (j) a knowledge of contemporary issues (k) an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice. Note: Over half are non-technical, non-traditional for engineering programs! Engage GE Community for a better curricular mapping (win-win)! C3 What?
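
Since Criterion 3 requires that the curriculum give every student the opportunity to develop each outcome, a curricular map is typically a course-by-outcome table. A minimal sketch of such a map, with hypothetical course numbers, and a check that no outcome a-k goes uncovered by required courses:

```python
# Hypothetical curricular map: required course -> outcomes (a-k) it develops.
CURRICULUM_MAP = {
    "ENGR101": {"a", "e", "k"},
    "ENGR250": {"b", "g"},
    "CAPSTONE": {"c", "d", "f", "g", "h", "i", "j"},
}
ALL_OUTCOMES = set("abcdefghijk")

covered = set().union(*CURRICULUM_MAP.values())
missing = ALL_OUTCOMES - covered
print("Outcomes not covered by any required course:", sorted(missing) or "none")
```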

  23. ‘Measurable’ Skills - 3D • 3D) Multidisciplinary Teams • Students will be able to… • Recognize the value of an MD team: broad skill set. • Communicate effectively with colleagues in other disciplines and listen well. • Employ flexible styles and behaviors and recognize the style of another. • Identify when problems occur due to poor team member interaction. Identify ways to improve team dynamics. • Assess: Via reflection after (or mid) project. C3 What? Fred DePiero fdepiero@calpoly.edu

  24. ‘Measurable’ Skills - 3H • 3H) Societal Impact • Students will be able to… • Identify societal impact associated with a project • Demonstrate awareness of how science and technology respond to needs driven by society, health & safety, or the environment. • Demonstrate awareness of the potential for a negative impact of science and technology on society, perhaps resulting from an unintended consequence. • Assess: Short essay, or ‘Give an example’ C3 What? Fred DePiero fdepiero@calpoly.edu

  25. ‘Measurable’ Skills - 3J • 3J) Contemporary Issues • Students will be able to… • Demonstrate awareness of current or recent events in the nation or world that could influence a student's career path or field. • Identify sustainability issues associated with a project • Identify a disruptive technology. • Demonstrate awareness of the way technology impacts and is impacted by public policy making [local, state, federal]. • Assess: Short essay, or ‘Give an example’ C3 What? Fred DePiero fdepiero@calpoly.edu

  26. ‘Measurable’ Skills - 3I • 3I) Life-Long-Learning • Students will be able to… • Learn new techniques, tools, or devices outside the classroom environment. • Find appropriate technical resources, compare quality and availability of sources • Organize info, and evaluate applicability / usefulness. • Assess: Via reflection after (or mid) project. C3 What? Fred DePiero fdepiero@calpoly.edu

  27. ‘Measurable’ Skills - Examples • For IE, See SS pg 44 3b) Students will attain an ability to design and conduct experiments, as well as to analyze and interpret data • SKILLS 1. Students will be able to identify the experimental and control variables to conduct DOE 2. Students will be able to check various data collection assumptions 3. Students will be able to select proper DOE models to correctly analyze the obtained sample data 4. Students will be able to conduct experiments (prepare materials and equipment, use and specify measuring and test equipment, collect experimental data for analysis) 5. Students will be able to perform inferential statistical analysis through confidence intervals, test of hypothesis, ANOVA, Chi-squared, and regression 6. Students will be able to assess the risk of making wrong statistical recommendations 7. Students will be able to interpret (explain) data and make recommendations based on those interpretations • For ME, See SS Criterion 3 C3 What? Fred DePiero fdepiero@calpoly.edu
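
As an illustration of the kind of inferential analysis named in skill 5 above (not the IE program's actual instrument), a one-way ANOVA comparing three hypothetical treatment groups from a designed experiment might look like this, assuming SciPy is available:

```python
from scipy import stats

# Hypothetical response measurements from a three-level, single-factor DOE.
level_a = [12.1, 11.8, 12.5, 12.0, 11.9]
level_b = [12.9, 13.2, 12.7, 13.0, 13.1]
level_c = [12.2, 12.4, 12.1, 12.6, 12.3]

# One-way ANOVA: does at least one factor level differ in mean response?
f_stat, p_value = stats.f_oneway(level_a, level_b, level_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the factor level affects the response.
```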

  28. Effort to Identify Measurable Skills - Helpful • Finding common ground on what is most important in your discipline • Provide focus & reinforcement • Provide closer ties to curriculum. • Get better traction with faculty • Skills seem trite? • Make more meaningful! • Raise the bar, as Program improves C3 How? Fred DePiero fdepiero@calpoly.edu

  29. Broad Spectrum Of Outcomes Valuable • Example: MD Team Skills (Help creative process) • A. Black (1990) identified 32 traits of creative people: divergent thinking, flexible, risk taker, sensitive, severely critical… • Less likely to have all 32 traits in a homogeneous team. • Hence MD teams are more likely to include qualities which are in opposition. • Csikszentmihalyi, M. (1999) • MD teams foster creativity, which plays a crucial role in concept generation and eng’g design. • Would broad, non-traditional Outcomes be included without the ABET requirement? C3 Benefits! Fred DePiero fdepiero@calpoly.edu

  30. Direct Measures… • Some centralized, some decentralized • Some methods scalable C3 How? Fred DePiero fdepiero@calpoly.edu

  31. Scalability Improves Sustainability for Direct Measures • Decentralized Efforts • Involving course projects / assignments / exams • Faculty Assessors examine sampling of student work, using rubrics. (Common approach). Requires rubric… All A-K? • Capstone projects, group presentations, peer evals (rubric) • Embedded questions on common finals • Student Analysis of Own Senior Project • Project styles may vary, but analysis can be consistent. All A-K! • LLL: ‘Identify new tools, abilities learned outside classroom’. C3 How? Fred DePiero fdepiero@calpoly.edu
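
For the rubric-based sampling of student work described above, the scores typically roll up into a per-outcome attainment figure. A minimal sketch of that roll-up, assuming a hypothetical 4-point rubric and a threshold of 3 for "meets expectations":

```python
from statistics import mean

# Hypothetical rubric scores (1-4) assigned by faculty assessors to sampled
# student work, grouped by the outcome each rubric targets.
rubric_scores = {
    "3b": [3, 4, 2, 3, 4, 3],
    "3d": [2, 3, 3, 2, 4, 3],
    "3g": [4, 4, 3, 3, 3, 4],
}
THRESHOLD = 3  # minimum rubric score counted as "meets expectations"

for outcome, scores in rubric_scores.items():
    pct_meeting = 100 * sum(s >= THRESHOLD for s in scores) / len(scores)
    print(f"{outcome}: mean {mean(scores):.2f}, {pct_meeting:.0f}% meet threshold")
```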

  32. Scalability Improves Sustainability for Direct Measures • Centralized Efforts • Writing Proficiency Exam • Graduation requirement • Institutionalized process. Results in student’s record • Rubric/norming used by faculty, campus-wide • Fundamentals of Engineering (FE) Exam • A-K coverage limited. Used by some programs. C3 How? Fred DePiero fdepiero@calpoly.edu

  33. Assessment Instruments - Consider Worst-Case Measures • “Locally-developed” exams • Inspired by ABET & IAB comments • Multiple-choice (scalable) or essay. • Required in course, contributes ~5% to grade • Summative, 4 years. All Outcomes covered. • Yields new information, beyond coursework. • Results for Local Exams • Students: Fine or Bomb. No need to split hairs. C3 How? Fred DePiero fdepiero@calpoly.edu

  34. Assessment Instruments - Consider Worst-Case Measures • Shock of summative direct measures! • Faculty provided questions that students failed to answer, in a summative setting. • Results unexpected, hence faculty engaged. • Student learning can be compartmentalized. • Often faculty view is in terms of courses. Need verification of summative abilities. • Industrial Advisors raised this concern (disciplinary skills, quiz given during interviews). C3 How? Fred DePiero fdepiero@calpoly.edu

  35. Criterion 4: Improvement • (C4-ABET for 2010/11-Proposed) The program must regularly use appropriate, documented processes for evaluating the extent to which both the PEOs and student outcomes are being attained. The results of these evaluations must be utilized as input for the continuous improvement of the program. Other available information may also be used to assist in the continuous improvement of the program. • My read: • Assessment data used to justify improvements • Continuously measure and improve • Other info OK (Good!) • Does not say… • A given change must positively impact the measure that had originally inspired the change C4 What? Fred DePiero fdepiero@calpoly.edu

  36. Criterion 4: Improvement • Logic of Justification: Many-to-Many Mapping • Based on many sources of info (typical, across univ) • Justification could include: • Measurements AND • Faculty observation, IAB observation, surveys / exit interviews (open ended) • Don’t put the ‘cart before the horse’ regarding changes. • Look for a control group and a validating measurement • Deploy a quiz/survey for validation purposes C4 What? Fred DePiero fdepiero@calpoly.edu

  37. C4: Example from IE • Improvement Area #1: Statistics sequences: prerequisites and redundancy • Opportunity: Data from the 2006 senior exit interview showed that the statistics sequence (STAT312 --> IME326 --> IME430) had substantial overlap. Instructors who teach IME326 also found that the IME326 course description was similar to STAT312's. • Responsible Party: John ... • Root Causes: The prior curriculum had a flaw in that IME326 and STAT312 had similar course outlines. Lack of a curricular flow chart showing the length of sequences: students realize too late that sequences of prerequisite courses are long. • Planned Activities and approximate timeline: Action #1. Change the prerequisite of IME326 from STAT312 to STAT321 in the 2009-2011 catalog. Action #2. Compare students’ performance in IME326 between those who took STAT312 and those who took STAT321 as the prerequisite, through a prerequisite quiz in the first week of IME326. Action #3. Flow charts for IE and MfgE to be developed and made available to students. Action #4. Prerequisites to be enforced by computer. Action #5. Review the Exit Exam and modify Stat questions, if necessary. Fred DePiero fdepiero@calpoly.edu

  38. C4: Example from IE • Improvement Area #1: Statistics sequences: prerequisites and redundancy… • Actions Completed: Action #1. Completed. Action #2. Two prerequisite quizzes were conducted in Fall 2007 and Winter 2008. The effect of the prerequisite course (STAT312 vs. STAT321) on student performance in IME326 was studied in the Fall 2007 and Winter 2008 quarters. Action #3. Completed May 2007 (Pan). Action #4. Completed November 2007. Action #5. Exit exam reviewed and several additional questions on stats knowledge added November 2007. • Evidence of Improvements: Action #1. Student performance in IME326 appears slightly better for those who took STAT321 than for those who took STAT312, though the improvement is not statistically significant. Note that this was an observational study and the STAT321 sample size was very small. Further evidence regarding consistency of statistics background is expected from exit exams in 2008-2009, after the catalog change takes effect. • Other Examples: IE (pg 77), Iowa State (pg 58), ME (pg 131) Fred DePiero fdepiero@calpoly.edu
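
The slide does not say which statistical test the IE faculty used; as a sketch of how such an observational comparison might be run (hypothetical quiz scores, Welch's t-test via SciPy), keeping in mind the small-sample and observational-study caveats noted above:

```python
from scipy import stats

# Hypothetical prerequisite-quiz scores for students entering IME326.
stat312_scores = [68, 72, 75, 70, 66, 74, 71, 69, 73, 70]  # took STAT312
stat321_scores = [74, 78, 71, 76, 73]                       # took STAT321 (small n)

# Welch's t-test (unequal variances); observational data, so no causal claim.
t_stat, p_value = stats.ttest_ind(stat321_scores, stat312_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p-value above 0.05 would be consistent with "not statistically significant".
```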

  39. C4: Example from IE • Evaluating the level of attainment of Outcome 3A • 3a) Students will attain an ability to apply knowledge of mathematics, science, and engineering. Alumni and industrial advisory board members both agree that 3(a) applying math/science/engineering knowledge is a fairly important outcome (ranked 6th out of 11 with a composite importance score of 4.12). Advisory board members judge this outcome to be achieved at an acceptable level (3.9 out of 5). This is also considered an appropriate level based on its relative importance as rated by alumni and IAB members. Employers of IE graduates also see the application of math/science/engineering knowledge as being achieved at an acceptable (4.09) and appropriate (4.4 for importance) level. In addition, “general engineering skills” are identified as one of the most desirable skills from employer job postings. Senior exam scores related to this outcome show that our graduates achieved a better-than-acceptable level (62.37% correct answers on the exit exam). Overall our students’ performance related to 3(a) is acceptable and at an appropriate level given its importance. Graduating seniors consistently rate their level of attainment of this outcome as well above an acceptable level (4.31 out of 5 on average). They are more confident with general math/science/engineering knowledge than with materials knowledge. Relatively, the scores are somewhat low given their importance, but they do not stand out from the other Outcomes based on the survey responses. Senior Exit Interviews show that the curriculum may have an opportunity for improvement. Several comments concerned how engineering statics and dynamics courses relate to the curriculum. It is not clear, however, whether this was primarily an issue for Industrial Engineering majors. The issue will continue to be monitored. Although few Industrial Engineering students take the FE/EIT exam, the exam results will continue to be monitored for signs of obvious trends in performance. Based on these findings, this outcome appears to be achieved at an acceptable level … continuous improvement efforts are better focused elsewhere. Fred DePiero fdepiero@calpoly.edu
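
A sketch of the underlying evaluation logic: each outcome is judged against thresholds using both indirect measures (survey and IAB ratings on a 5-point scale) and a direct measure (exit-exam percentage). The thresholds and the 3b figures below are illustrative only, loosely echoing the 3a numbers quoted in the IE example.

```python
# Illustrative per-outcome evidence: indirect ratings (out of 5) and a direct
# exit-exam score (percent correct). Thresholds are a program's own choice.
evidence = {
    "3a": {"iab_rating": 3.9, "senior_rating": 4.31, "exam_pct": 62.4},
    "3b": {"iab_rating": 3.4, "senior_rating": 3.8,  "exam_pct": 48.0},
}
RATING_OK = 3.5   # indirect measures: acceptable if >= 3.5 / 5
EXAM_OK = 55.0    # direct measure: acceptable if >= 55% correct

for outcome, e in evidence.items():
    acceptable = (e["iab_rating"] >= RATING_OK
                  and e["senior_rating"] >= RATING_OK
                  and e["exam_pct"] >= EXAM_OK)
    print(outcome, "acceptable" if acceptable else "flag for improvement")
```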

  40. Criterion 4: Use Dedicated Survey to Close Loop • To help the program improvement process, for quick loop closure! • Ask directly about impact in the classroom • Possibilities? Priorities? Benefits? • Students, Faculty, Industry Advisors all partake & appreciate giving input - Rank importance of changes. • Impact of some improvements is diffuse; assess locally with a control group rather than trying to measure the benefit in a summative fashion (with seniors)! C4 How? Fred DePiero fdepiero@calpoly.edu

  41. (My) Rubric Describing Quality of Improvements • Very Strong Many improvements with clear ties to assessment data and at least one follow-on measurement to close the loop using identical assessment instruments before and after the change • Strong Many improvements with clear ties to assessment data and at least one follow-on measurement to close the loop • Good Improvements with ties to assessment data and some loose efforts to close the loop • OK (Still ‘OK’, but thin ice) Improvements with some ties to assessment data C4 What? Fred DePiero fdepiero@calpoly.edu

  42. Examples of Improvements • Added numerical methods based on FE Exam (direct, computers area) and other indications. We have since observed a (slight) improvement in these scores. A similar change was justified/validated in the area of ethics. • Increased emphasis on design in curriculum based on senior survey input. Surveys have since indicated better abilities in this area. • Stepped up undergrad advising by assigning individual faculty to students, based on survey input. Surveys have since shown an improvement. • Adjusted support courses and prerequisites related to statistics area, based on exit interviews. • Improved advising of students regarding benefits of a graduate degree, based on input from Senior Surveys. C4 What? Fred DePiero fdepiero@calpoly.edu

  43. C4: Centralize Common Improvements? • One common improvement is gold! • Students ‘Multidisciplinary Certified’ • Pools co-curricular experiences to handle throughput and establish solid mapping • Tech Writing, Psychology • Strengthened curricular mapping C4 How? Fred DePiero fdepiero@calpoly.edu

  44. Criterion 5: Curriculum • Prerequisite Issues • ‘Basic science’ vs ‘Engineering Science’ vs ‘Engineering Design’ • Watch prereqs of capstone courses • Identify standards and constraints in Self Study C5 What? Fred DePiero fdepiero@calpoly.edu

  45. Culture Change: Data-Driven Decision Making • “Others are doing X. Let’s do that too!” • What determines success? Use an anecdotal start & finish? Or use multi-year measures. • How to prioritize possible options for changes? • “We know water is wet. Why measure?” • Consistent benchmarks over many years. • Avoid anecdotal decisions • For ABET Evaluator: Prove water is wet. • Assessment evidence beneficial when preparing publication on innovative teaching methods for peer review. • Goal: Data with ‘pedagogical significance’ rather than emphasizing statistical significance. Use judgment. Benefits! Fred DePiero fdepiero@calpoly.edu

  46. Use Technology Appropriately • “Where can I put all this info?” • Document mgmt - (e.g. via BlackBoard) • One Bb course per Program • Create template with slots to fill • Posted items document time line (e.g. surveys) • Bb system supported by university, with training for users • Provided PEVs with access prior to visit How? Fred DePiero fdepiero@calpoly.edu

  47. Use Technology Appropriately • Wish List • New survey tool: • Link to campus data for email addresses • Program-specific questions & deployment • Tag questions to outcomes (ABET & university) • Agglomerate results at program, college & university levels (for ABET & WASC) How? Fred DePiero fdepiero@calpoly.edu
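
The "tag questions to outcomes" and "agglomerate results" items in the wish list amount to a two-level roll-up. A minimal sketch (hypothetical tags, programs, and scores) of aggregating tagged question results from program level up to college level:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey results: (program, question_id, score 1-5) triples,
# plus a tag table linking each question to an ABET outcome.
results = [
    ("EE", "q1", 4), ("EE", "q2", 3), ("ME", "q1", 5), ("ME", "q3", 4),
]
question_outcome = {"q1": "3a", "q2": "3g", "q3": "3b"}

# Roll up to program level, then to college level, keyed by outcome.
program_scores = defaultdict(lambda: defaultdict(list))
college_scores = defaultdict(list)
for program, qid, score in results:
    outcome = question_outcome[qid]
    program_scores[program][outcome].append(score)
    college_scores[outcome].append(score)

for program, by_outcome in program_scores.items():
    for outcome, scores in by_outcome.items():
        print(f"{program} {outcome}: mean {mean(scores):.2f}")
for outcome, scores in college_scores.items():
    print(f"College {outcome}: mean {mean(scores):.2f}")
```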

  48. Seek Partners Across Campus • Career Services: Employer surveys (on C3) • New Likert scale (*) • Not important for entry-level • Weak – but no concern • Weak – needs strengthening • Sufficient. No Concerns • Strong • Strong – needs more emphasis • Example provided • Careful: ‘Abilities at start of employment…’ • (*) From Bob Warrington How? Fred DePiero fdepiero@calpoly.edu

  49. Seek Partners Across Campus • Institutional Planning & Assessment • Writing Proficiency Exam, grad requirement • NSSE + FSSE… (via WASC) • A 4th, 5th leg of stool? • Might reveal a problem worth exploring? • Might confirm an improvement under consideration? How? Fred DePiero fdepiero@calpoly.edu

  50. Seek Partners Across Campus • GE Community • Help firm up curricular mapping (Benefits all programs!) • Win-win: liberal arts instructors emphasize the need for these skills to engineering students • Math/Science: Understand common needs and expectations. • Who has ‘quantitative reasoning’ on their syllabus? (None) • Who helps introduce, develop & master this? (Many) • Liberal Arts: ‘Transferable skills’ How? Fred DePiero fdepiero@calpoly.edu
