
ABET Program Evaluator Re-Training – Materials Engineering



  1. ABET Program Evaluator Re-Training – Materials Engineering

  2. Objectives of Accreditation (1) Assure that graduates of an accredited program are adequately prepared to enter and continue the practice of engineering. (2) Stimulate the improvement of engineering education. (3) Encourage new and innovative approaches to engineering education and its assessment. (4) Identify accredited programs to the public.

  3. Role of ABET Accreditation The role of ABET accreditation is to provide periodic external assessment and evaluation in support of the continuous quality improvement program of the institution.

  4. Evaluator Role • Evaluators are selected by their professional society (TMS or NICE) to participate in the accreditation of engineering programs. In this capacity they: • Represent the Materials Community and ABET on campus visits. • Work with a team of colleagues from other professional societies.

  5. Evaluators Provide • As an ABET team member evaluators provide knowledge concerning: • Professional practice • Professional preparation • Continuous quality improvement

  6. Evaluator Commitment • Commitment is critical to a successful visit. • Exercise sound judgment. • Base all decisions on the criteria. • Do careful pre-visit preparation. • Perform the on-site visit thoroughly. • Develop and communicate succinct conclusions. • Participate in team decisions. • Promptly submit all reports and forms. • When in doubt or when a question arises, use the team chair as a resource.

  7. What the Visit Team Produces At the end of the visit… • Team submits a Visit Report • Basis for the Draft Statement to Institution • Criteria-based • All visit forms feed into this document • Includes statements of program strengths, deficiencies, weaknesses, and concerns for each program based on documentation and observations from pre-visit and on-site

  8. Important Expectations • Evaluators represent TMS/NICE and the EAC of ABET. • ABET accreditation attests to the state and to the public that programs satisfy the criteria. • Team effort—team decisions. • Confidentiality. • Conflict of interest—every visitor should have signed a conflict-of-interest statement. • Observers: make no evaluative statements to the institution; at the exit interview, a thank-you only.

  9. Conflict of Interest • ABET representatives will: • Behave in an ethical and professional manner. • Disclose real or perceived conflicts of interest. • Recuse themselves from discussions or decisions related to real or perceived conflicts of interest.

  10. Confidentiality • Do not discuss final conclusions with faculty, students, or others. • Keep all materials until the July EAC meeting. • At the conclusion of the accreditation process (August after the visit), materials are to be destroyed. • Information specific to the institution is to remain confidential without time limit. • Institutional data are confidential except with written authorization of the institution. • ABET materials are released only by ABET staff.

  11. Communication • Maintain an open line of communication with the program head. • Identify deficiencies as soon as possible. • Discuss all issues with the program head at the debriefing. • Do not discuss the recommended accreditation action with anyone except team members.

  12. General Visit Outline • Pre-visit • Communicate with the Program Head. • Try not to surprise the program; let them know about issues as soon as possible. • Give the program an opportunity to respond prior to the visit. • Pose pre-visit questions in an appropriate manner: clear questions, not dictatorial or confrontational ones. • Keep the Team Chair in the loop on these communications. • Complete the transcript analysis well before the visit. • Complete the curriculum analysis well before the visit. • Create a draft of the exit statement before the visit.

  13. General Visit Outline (cont.) • Sunday – • Visit campus, review curriculum materials. • Evening: Share what was learned and plan for the rest of the visit. • Monday – • AM: Dean’s overview, PEVs in departments. • Lunch with institutional officials. • PM: Support areas, PEVs in departments. • Evening: Share findings, discuss issues, reach tentative conclusions; draft exit statements. • Tuesday – • Tie up loose ends. • Draft exit statements by late morning. • Debrief department heads before lunch; discuss all issues. • Exit interview early afternoon; read the statement.

  14. Visit Report and Forms • Important forms – done by Tues AM • Program Evaluator Report—soft copy to Team Chair • Transcript & curriculum analyses (in Evaluator Report) • Level of Implementation (in Evaluator Report) • Exit Statement (in Evaluator Report) • Program Audit Form + Explanation of Shortcoming (two hard copies to Team Chair—one will be left with Dean—one soft copy to Team Chair) • Program Evaluator Worksheet—soft copy to Team Chair • Send copies of these forms to TMS or NICE

  15. Program Evaluation: Campus Visit Details • Consistency—things to look for • Evaluation of the extent to which objectives are attained • Assessment process with results that demonstrate and measure outcomes • Program improvement (closing the loop) • Curricular and program issues • Faculty and students • As they relate to undergraduate education

  16. Deliverables • From each PEV, the team chair must have: • A recommended action relative to the program under review that is consistent with the team’s conclusions (remember, it is a TEAM recommendation). • A PAF that accurately reflects the team findings. • An Exit Statement that will: • Be read verbatim in the Exit Meeting. • Be used as the basis for construction of the draft statement to the institution.

  17. Program Audit Form (for GR or New) (We provide a copy to the institution at the Exit Meeting.) If the heading does not show the current visit year, you are using an old form. Do not use old forms!

  18. Program Audit Form (for Interim Visit Evaluations) (We provide a copy to the institution at the Exit Meeting for IV only.) If the heading does not say INTERIM VISIT, do not use it for an IV evaluation.

  19. Working Definition of Key Terms • Deficiency: assigned to any criterion, policy, or procedure that is totally or largely unmet. • Weakness: criterion, policy, or procedure is met to some meaningful extent, but compliance is insufficient to fully satisfy requirements. • Concern: criterion, policy, or procedure is fully met, but there is potential for non-compliance in the near future. • Observation: general commentary possibly, but not necessarily, related to criteria.

  20. Limit Use of Key Terms Use a Key Term only in reference to the overall evaluation of each criterion. The Key Term (defined in the previous slide) is the overall assessment for the criterion as a whole, not the worst finding among the sub-areas on the worksheet. Do not, for example, give a deficiency to a program that lacks only a measurement for outcome 3(e).

  21. Criteria Emphasis • Practice of continuous improvement. • Input of constituencies • Process focus • Outcomes and assessment linked to objectives • Knowledge required for entry into the engineering profession. • Student, faculty, facilities, institutional support, and financial resources linked to program objectives.

  22. Program Evaluation: Things to Look for • Evaluation of the extent to which objectives are attained • Assessment process with results that demonstrate and measure outcomes • Program improvement (closing the loop) • Curricular and program issues • Faculty and students • As they relate to undergraduate education

  23. Programs with Shortcomings Prior to Due Process (2006/7) Data reflect 260 programs involved in general reviews.

  24. Materials Programs Shortcomings Prior to Due Process (2003 to 2008) Data reflect review of 86 programs over 6 years. Note: improvement coincides with the new criteria in 2008.

  25. Materials Programs Strengths (2003 to 2008) Data reflect review of 86 programs over 6 years. Note: improvement coincides with the new criteria in 2008.

  26. Consistency Checks (hierarchy, top down) • EAC Consistency Committee: final check at the EAC Meeting. • ABET HQ (Accreditation Director): the Director checks higher-level consistency. • Editors 1 and 2: each editor checks among all reports received. • Professional Societies (Team Chairs): Team Chairs check among evaluators. • Teams.

  27. What’s New in 2008 • Training process - PAVE (Partnership to Advance Volunteer Excellence) required for new evaluators • No institutional shortcomings • Dual-level accreditation • Criteria renumbering

  28. Applicable Criteria • In 2009-2010, the “Renumbered Criteria” apply – there are some differences from the Criteria that were applicable in 2007-2008!

  29. Criteria Renumbering

  30. Criterion 1: Students The program must: • evaluate student performance, advise students, and monitor students’ progress – no change • have and enforce policies for acceptance of transfer students and validation of courses taken elsewhere – no change • have and enforce procedures to assure that all students meet all program requirements – no change The Criterion is effectively unchanged – in 2007-2008, it was the institution that had responsibility.

  31. Criterion 2: Program Educational Objectives The program must have in place: • published PEOs consistent with mission and these Criteria – no change • a process that periodically documents and demonstrates that the PEOs are based on the needs of the program’s constituencies – change here • an assessment and evaluation process that periodically documents and demonstrates the extent to which these objectives are attained – change here

  32. Applicable Criteria • Use the Criteria for the 2009-2010 Accreditation Cycle. These are the same as the renumbered criteria used in 2008-2009. • If this is an interim visit, the criteria in effect at the time the prior shortcomings were identified should be used, unless it is to the program’s benefit to use the currently applicable ones, and it is the program’s choice.

  33. Application & Interpretation of Criteria In 2008-09, EAC editors observed that issues arose related to: • the linkage between Criterion 2, Criterion 3, and Criterion 4 • Criterion 2 and new programs • Criterion 4 and new programs

  34. What Does Criterion 2 Say? The program must have in place: • Published PEOs consistent with mission and these Criteria • A process that periodically documents and demonstrates that the PEOs are based on the needs of the program’s constituencies • An assessment and evaluation process that periodically documents and demonstrates the extent to which these objectives are attained

  35. Criterion 2 Highlights • The process needs to document and demonstrate that the PEOs are based on constituent needs – NOT “a process based on the constituents needs in which PEO’s are determined and evaluated” (the old language). • The requirement for assessment and evaluation is the same, but there is no longer language in Criterion 2 that requires that the results of the assessment and evaluation process be used for program improvement. (Program improvement is now in Criterion 4.)

  36. Consistency Issues Criterion 2: • Do the published PEOs meet the definition? • Does the program convince the team that the PEOs are based on constituent needs? Notice that there is no language that insists on constituent approval or involvement! The program does need to convince the team that it has a way of determining what the needs of its constituencies are. • Is there an assessment and evaluation process in place that gives information about the extent to which PEOs are attained by graduates?

  37. Criterion 2 Problem in 08-09 There were several instances in which programs were being required to show how results of C2 or C3 assessment and evaluation processes were being used to improve the program. This reflects application of the old criteria; that requirement is no longer in C2 or C3. C4 refers to results of C2 and C3 processes as possible sources of information upon which continuous improvement could be based.

  38. Criterion 2 FAQs • What if the PEOs really sound like outcomes (instead of objectives)? If the PEOs are not PEOs, there is a C2 shortcoming. • What if the PEOs are ambiguous or reflect outcomes retooled to apply after graduation? Team judgment – do they meet the intent of the Criterion? • Is an assessment process for PEOs that considers predominantly data based on accomplishments of current undergraduates adequate? Probably not.

  39. What Does Criterion 3 Say? • The program must demonstrate that (a) – (k) are attained. • Program outcomes are defined as (a) – (k) plus any additional ones articulated by the program. • Program outcomes must foster attainment of the PEOs. • There must be an assessment and evaluation process that periodically documents and demonstrates the degree to which outcomes are attained.

  40. Important – for Criterion 3 • Note the definition of program outcomes as being (a) – (k) plus locally articulated ones. • The program might not have expressed its outcomes as (a) – (k) plus others; it may simply have identified a set of outcomes. As long as the program has demonstrated attainment of (a) – (k) and its own outcomes, this element of the criterion is met. • The requirement for assessment and evaluation is the same, but there is no language in Criterion 3 requiring that results of the assessment process be applied to further development of the program. (Program improvement is now in Criterion 4.)

  41. Criterion 3 Consistency Issues • Be sure to apply this criterion in a holistic sense. • The process of assessment and evaluation needs to demonstrate the degree to which outcomes are attained, but: • There is no language that says all outcomes must be attained to the same degree. • There is no language that says anything about a numeric scale measuring degree of attainment. • There is no language that says the outcomes must be measured. • There is nothing in Criterion 3 that says anything about use of the assessment and evaluation information for program improvement.

  42. Criterion 3 FAQs • What about assessment data? What is adequate data? • Does it all have to be objective/direct? (No.) • Can it be subjective? (Some of it may be; nothing says it has to be.) • Is the observation or conclusion of the course instructor adequate? (What was his or her basis for the observation?) • Does evidence for each outcome have to be in the form of work the student has produced? (No, but the team needs to be convinced that outcome attainment has been demonstrated.)

  43. What Does Criterion 4 Say? “Each program must show evidence of actions to improve the program. These actions should be based on available information, such as results from Criterion 2 and Criterion 3 processes.” The improvements can be based on any available information!

  44. Consistency Issues The language of Criterion 4 simply insists on evidence of action to improve the program. Such actions could be stimulated by results of the C2 and C3 assessment and evaluation processes But they could also be stimulated by other information The language of this Criterion does not require that the C2 and C3 information be used as the basis for program improvement. It suggests use of the results of C2 and C3 processes as sources of information for program improvement.

  45. Exit Statement Format INTRODUCTION—USEFUL PROGRAM STATISTICS PROGRAM: • Strengths (special, unique, or particularly conspicuous strengths) 1. 2. • Deficiencies (in order, only for those criteria where deficiencies exist) 1. XXX 2. etc. • Weaknesses (in order, only for those criteria where weaknesses exist) 1. YYY 2. etc. • Concerns (in order, where concerns exist) 1. ZZZ 2. etc. • Observations (do not have to relate to criteria) 1. etc.

  46. Writing the Exit Statement • For each shortcoming, use the following structure for your statement of finding: • What is required: • Describe what criterion or policy is applicable. • Describe what that criterion or policy requires. • What was observed: • Describe what was observed. • Describe how your finding differs from the criterion or policy. • Negative impact: • Describe the negative impact it has on the students or the program (for D or W) or the potential future impact on the students or program (for C).

  47. What does your TC need in the exit statement you write? The language must be clear. • “There is inadequate assessment of outcome 3(j).” How does the program know why its assessment of student knowledge of contemporary issues is inadequate? • “There is incomplete evidence that students attain outcome 3(i).” What is incomplete about the evidence that students have a recognition of the need for, and the ability to engage in, life-long learning? • “It appears that outcome 3(h) is not fully assessed.” Why did the team find that the broad education necessary to understand the impact of engineering solutions in a global, economic, environmental, and societal context is not fully assessed? What does “full assessment” mean? The program has to understand what the shortcoming really is in order to fix it, the TC has to be able to explain the team recommendation, and later teams have to understand the issues.

  48. What Would Make it Clearer? • Instead of “There is inadequate assessment of outcome 3(j).”: The only mechanism used for assessment of student knowledge of contemporary issues is administration of a survey instrument asking students whether they have knowledge of contemporary issues. This mechanism does not provide adequate information for determining the degree to which this program outcome is attained. • Instead of “There is incomplete evidence that students attain outcome 3(i).”: Although student grades in the “Introduction to the Profession” course were claimed as documentation and demonstration that students have recognition of the need for and an ability to engage in life-long learning, no clear links between the student grades and this outcome were established. • Instead of “It appears that outcome 3(h) is not fully assessed.”: Even though some anecdotal assessment of whether students have the requisite broad general education was described in the self-study report, evidence shows only preliminary plans for the development of a systematic process for assessment and evaluation of this program outcome.

  49. The language must support the team recommendation. If it is a D, the words have to clearly say that the criterion is totally or largely unmet: • “there is no evidence that” • “there is no assessment and evaluation process” • “not all students are required to engage in a major design experience”
