
Venue: M038  Date: Monday, April 18, 2011  Time: 10:00 AM

JIC ABET WORKSHOP No. 3: Guidelines on Criterion 4: Continuous Improvement. I PEOs Assessment, II SOs Assessment, III Performance Indicators, Attributes and Course Learning Objectives, IV Student Exit Survey Questionnaire, V Alumni Survey Questionnaire, VI Employers Survey Questionnaire


Presentation Transcript


  1. JIC ABET WORKSHOP No. 3 • Guidelines on • Criterion 4: Continuous Improvement • I PEOs Assessment • II SOs Assessment • III Performance Indicators, Attributes and Course Learning Objectives • IV Student Exit Survey Questionnaire • V Alumni Survey Questionnaire • VI Employers Survey Questionnaire • Presented by: • JIC ABET COMMITTEE Venue: M038 Date: Monday, April 18, 2011 Time: 10:00 AM

  2. I- PEOs ASSESSMENT

  3. 1. A listing and description of the assessment processes used to gather the data upon which the evaluation of each program educational objective is based • 2. Examples of data collection processes may include, but are not limited to, employer surveys, graduate surveys, focus groups, industrial advisory committee meetings, or other processes that are relevant and appropriate to the program • 3. The frequency with which these assessment processes are carried out • 4. The expected level of attainment for each of the program educational objectives • 5. Summaries of the results of the evaluation processes and an analysis illustrating the extent to which each of the program educational objectives is being attained • 6. How the results are documented and maintained

  4. Performance targets • Performance targets: the target criteria for the outcome indicators. • Examples: The [average score, score earned by at least 80%] of the program graduates on the [standardized test, standardized test item, capstone design report, portfolio evaluation] must be at least 75/100. • The [median rating for, rating earned by at least 80% of] the program graduates on the [self-rating sheet, peer rating sheet, senior survey, alumni survey, employer survey, final oral presentation] must be at least [75/100, 4.0 on a 1–5 Likert scale, “Very good”].
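To make the template above concrete, here is a minimal Python sketch (with made-up scores and a hypothetical helper name) of how a target such as "the score earned by at least 80% of the program graduates must be at least 75/100" could be checked:

```python
# Minimal sketch, assuming made-up capstone scores, of checking a target of the form
# "the score earned by at least 80% of program graduates must be at least 75/100".

def target_met(scores, min_score=75, min_fraction=0.80):
    """Return True if at least min_fraction of graduates scored min_score or higher."""
    if not scores:
        return False
    passing = sum(1 for s in scores if s >= min_score)
    return passing / len(scores) >= min_fraction

capstone_scores = [82, 79, 91, 74, 88, 77, 95, 70, 85, 80]  # hypothetical data
print(target_met(capstone_scores))  # True: 8 of 10 scores (80%) are 75 or higher
```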

  5. Program Educational Objectives for the 2006-2011 ABET Cycle

  6. 5. Summaries of the results of the evaluation processes and an analysis illustrating the extent to which each of the program educational objectives is being attained

  7. Results 2007: All students who had graduated in 2002-06 were surveyed. There were 308 graduates, of whom we were able to locate email addresses for 225 (73%). There were 98 respondents (44%). Of this number, 88 (89%) were practicing engineering technology, 8 (8%) were in graduate school, and the remainder were in other fields. The survey asked the alumni whether or not they had had an opportunity to demonstrate each of the objectives. The results are presented in Table 4.2. Table 4.2. 2007 Alumni Survey Results – Percent of Graduates Who Indicated That They Were Prepared
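The percentages in the paragraph above follow directly from the raw counts; the short Python sketch below reproduces the arithmetic, with small differences possible from rounding:

```python
# Recomputing the 2007 alumni-survey rates from the counts quoted above.
graduates   = 308  # 2002-06 graduates
located     = 225  # email addresses found
respondents = 98   # completed surveys
practicing  = 88   # practicing engineering technology
grad_school = 8    # in graduate school

print(f"located:     {located / graduates:.0%}")       # ~73% of graduates
print(f"responded:   {respondents / located:.0%}")     # ~44% of those located
print(f"practicing:  {practicing / respondents:.0%}")  # ~90%; the slide reports 89%
print(f"grad school: {grad_school / respondents:.0%}") # ~8% of respondents
```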

  8. 2007 Evaluation of Alumni Survey results: This was the first cycle where we used an electronic survey format (Survey Monkey) to poll our alumni on their achievement of the program educational objectives. We attribute the positive response rate to the fact that we were able to streamline the assessment process and better track who had responded and who had not. The overall survey results indicate that the alumni are meeting the program educational objectives. However, there was some concern that the recently graduated classes (2005 and 2006) were not as positive in their responses as the alumni who had been out three years or more. On further analysis and a review of the written comments, it is clear that the quality of the work experience increases with time, and many of the recent graduates had not had experiences which provided them with an opportunity to experience some of the program educational objectives (e.g., work in cross-functional teams, confront an ethical issue, get involved in service activities). 2007 Actions taken: The faculty were satisfied with the results and concluded that the alumni were meeting the program educational objectives and that there was no need to take any action at this time. However, there was some concern about the engagement of early graduates in service activities. This is an area that we will continue to monitor.

  9. Results 2010: All students who had graduated in 2005-09 were surveyed. There were 312 graduates, of whom we were able to locate email addresses for 240 (77%). There were 96 respondents (40%). Of this number, 89 (93%) were practicing engineering technology, 5 (5%) were in graduate school, and the remainder were in other fields. The survey asked the alumni whether or not they had had an opportunity to demonstrate each of the objectives. The results are presented in Table 4.4.

  10. 2010 Evaluation of Alumni Survey results: Overall, the survey results indicate that the alumni continue to meet the program educational objectives. In the 2007 evaluation there was some concern that the most recently graduated classes (2005 and 2006) were not as positive in their responses as the alumni who had been out three years or more. In this survey we again surveyed the 2005 and 2006 graduates and were able to validate our belief that some objectives are best demonstrated after graduates have been out for two or more years. This analysis clearly demonstrates that as graduates gain more work-related experience, their responses become more positive. This is demonstrated by looking at the 2005 and 2006 graduates when surveyed in 2007 and again in 2010. This comparison is shown in Table 4.5.

  11. 2010 Actions taken: Based on the survey results, the performance targets were either met or exceeded, so no actions were taken at this time. Copies of all surveys and the survey methodology will be available in the ABET resource room at the time of the visit.

  12. Summary of Advisory Committee Discussions: Every other year the Engineering Technology Advisory Committee reviews and discusses the program educational objectives and the attributes that are demonstrated by the program graduates. The advisory committee is made up of employers (over half of whom are also alumni) and graduates. They meet with faculty yearly to discuss curricular and resource issues as well as current trends and issues in the discipline. In the even-numbered years, they discuss their personal experiences, or their experiences with the program graduates, as they relate to the program educational objectives. The objectives that are the primary focus are the application of engineering technology principles (Obj. 1), the ability to work in cross-functional teams (Obj. 3), continued learning (Obj. 5), and ethical conduct (Obj. 3). They have consistently reached consensus that the program educational objectives are being met. The minutes from their meetings are summarized and available for review in the ABET resource room and will be available at the time of the visit.

  13. 6. How the results are documented and maintained

  14. Documentation: The assessment and evaluation documentation is in digital format and is maintained by the office administrator. It is accessible on the intranet and all faculty can review and comment on any of the continuous quality improvement (CQI) processes. All comments are reviewed annually as a part of the program educational objectives and student outcomes review processes. The CQI website will be made available to the team at the time of the ABET visit.

  15. II- SOs ASSESSMENT

  16. 1. A listing and description of the assessment processes used to gather the data upon which the evaluation of each student outcome is based • 2. Examples of data collection processes may include, but are not limited to, specific exam questions, student portfolios, internally developed assessment exams, senior project presentations, nationally-normed exams, oral exams, focus groups, industrial advisory committee meetings, or other processes that are relevant and appropriate to the program • 3. The frequency with which these assessment processes are carried out • 4. The expected level of attainment for each of the student outcomes • 5. Summaries of the results of the evaluation process and an analysis illustrating the extent to which each of the student outcomes is being attained • 6. How the results are documented and maintained

  17. The assessment of student outcomes is done on a six-year cycle. The schedule used for the current ABET cycle is illustrated in Table 4.6.

  18. Although data are only collected every three years, activities take place on each outcome every year. The cycle of activity is shown in Table 4.7. Each outcome has been mapped to the engineering technology courses as depicted in Table 4.8. This map was used to make decisions about where the summative data would be collected.
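Table 4.8 itself is not reproduced in this transcript, but the idea of an outcome-to-course map can be sketched as a simple data structure (Python; the course numbers other than ET 4090 and ET 4092 are hypothetical):

```python
# Hypothetical sketch of an outcome-to-course map in the spirit of Table 4.8.
# Each outcome lists the courses where it is addressed and the course(s) chosen
# for summative data collection. Only ET 4090 / ET 4092 appear in the slides;
# the other course numbers are invented for illustration.
curriculum_map = {
    "Outcome 1": {"courses": ["ET 2010", "ET 3050", "ET 4090", "ET 4092"],
                  "summative": ["ET 4090", "ET 4092"]},
    "Outcome 2": {"courses": ["ET 2200", "ET 3310", "ET 4090"],
                  "summative": ["ET 4090"]},
}

for outcome, info in curriculum_map.items():
    print(f"{outcome}: summative data collected in {', '.join(info['summative'])}")
```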

  19. Table 4.8. Mapping of student outcomes to engineering technology courses (used to decide where summative data would be collected)

  20. Results for each student outcome are reported separately in the following tables, and all supporting documentation will be available in the ABET resource room at the time of the visit. Each table represents the activity for the current ABET accreditation cycle. Each outcome table includes performance indicators, courses and/or co-curricular activities (educational strategies) that provide students an opportunity to demonstrate the indicator, where summative data are collected, the timetable, the method of assessment, and the performance target. Each table is followed by a graph showing the results with a three-cycle trend line. Student Outcome #1: ability to identify, analyze, and solve engineering technology problems

  21. Assessment Results (direct measures) 2007: For summative assessment (end of program), the decision was made to focus on direct assessment for all indicators. Summative data for Indicators #1 and #2 were collected in the Engineering Technology Design I course (ET 4090), where students are asked to develop their statement of the problem and project planning documentation. For Indicator #3 the assessment was completed in the second-semester design course (ET 4092) as part of the final assessment of the course. The percentages of students who demonstrated each of the criteria were as follows: Indicator #1: 80%; Indicator #2: 80%; and Indicator #3: 84%. Evaluation and Actions 2008: The assessment results were reviewed by the faculty who are responsible for the Senior Design sequence. A presentation was made at the faculty retreat held in August 2008. Although the students had made progress on Indicator #1 since the previous assessment in 2004 (up from 74%), there was still concern that their problem statements did not reflect an adequate understanding of what was expected. The decision was made to provide them with examples of both poorly written and well-written problem statements and require them to do an analysis of the difference. They would then be asked to do a self-assessment of how well their problem statements reflected what they identified in the well-written statements and submit their analysis with their problem statement. In a review of the results for Indicator #2, it was determined that the students were performing significantly better than in the previous assessment (68%) and that the faculty would continue to monitor the students' progress in the following year (2008-09). This improvement was attributed to the fact that the faculty had implemented a two-session sequence in ET 4090 on project planning, with direct feedback to students in the planning process using the rubric used to assess Indicator #2. Faculty members are satisfied that students are meeting the expectations for Indicator #3. The use of industry-based problems with industry mentors has improved the quality of students' solutions and their ability to recognize the constraints that affect their solutions.

  22. Second-Cycle Results (direct measures) 2010: This cycle of summative data was taken in the same courses as the 2007 cycle. Based on the actions taken as a result of the 2008 evaluation process, the following results were found: Indicator #1 up 14 percentage points (94%); Indicator #2 up 4 percentage points (84%); and Indicator #3 unchanged (84%). Faculty will discuss their findings at the August 2010 faculty retreat and report the findings at the time of the ABET site visit. Figure 4.9. Trend line for Student Outcome #2: ability to identify, formulate, and solve engineering problems
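The cycle-over-cycle changes quoted above are percentage-point differences; a short Python sketch using the indicator values from the 2007 and 2010 slides makes the comparison explicit:

```python
# Comparing the 2007 and 2010 summative results for Outcome #1 (values taken from the slides).
results_2007 = {"Indicator #1": 80, "Indicator #2": 80, "Indicator #3": 84}
results_2010 = {"Indicator #1": 94, "Indicator #2": 84, "Indicator #3": 84}

for indicator, current in results_2010.items():
    change = current - results_2007[indicator]
    print(f"{indicator}: {current}% ({change:+d} percentage points vs. 2007)")
# Output: Indicator #1 is up 14 points, Indicator #2 is up 4 points, Indicator #3 is unchanged.
```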

  23. Display materials available at time of visit in the ABET resource room: • Rubrics used by faculty to assess the indicators • Indicator #1 sample problem statements documentation • Indicator #2 project planning guide • Senior survey questions with results and faculty evaluation of results • Minutes of faculty retreat where actions were taken in 2008 and 2011

  24. III- Performance Indicators, Attributes and Course Learning Objectives

  25. BLOOM’S TAXONOMY The cognitive domain (Bloom, 1956) involves knowledge and the development of intellectual skills. This includes the recall or recognition of specific facts, procedural patterns, and concepts that serve in the development of intellectual abilities and skills. There are six major categories, listed in order below from the simplest behavior to the most complex. The categories can be thought of as degrees of difficulty; that is, the first ones must normally be mastered before the next ones can take place.

  26. Sample behavior • Knowledge: Name Bloom’s six levels of the cognitive domain • Comprehension: Explain each cognitive level • Application: Write course learning outcomes using Bloom’s Taxonomy • Analysis: Categorize the course learning outcomes into the six levels • Synthesis: Develop a course plan using Bloom’s six cognitive levels • Evaluation: Critique the effectiveness of each cognitive level in implementing the course plan.

  27. Outcome Elements (Performance Indicators) • Outcome elements: different abilities specified in a single outcome that would generally require different assessment measures.

  28. Outcome Attributes (Measures) • Outcome attributes: actions that explicitly demonstrate mastery of the abilities specified in an outcome or outcome element.

  29. Course learning objectives (CLOs) are instructional objectives: statements of observable student actions that serve as evidence of the knowledge, skills, and attitudes acquired in a course. • Examples: The students will be able to: • Solve a second-order ordinary differential equation with specified initial conditions using Matlab • Design and carry out an experiment to measure tensile strength and determine a 95% confidence interval for its true value • Define the four stages of team functioning and outline the responsibilities of a team coordinator, recorder, checker, and process monitor

  30. Course learning objectives (CLOs) are instructional objectives: statements of observable student actions that serve as evidence of the knowledge, skills, and attitudes acquired in a course. Learning objectives should begin with observable action words (such as explain, outline, calculate, model, design, and evaluate) and should be as specific as possible, so that an observer would have no trouble determining whether and how well students have accomplished the specified task. Words like “know,” “learn,” “understand,” and “appreciate” may be suitable for use in educational objectives or program or course outcomes, but not in learning objectives.
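As an illustration only (a hypothetical helper, not part of the workshop materials), the verb guidance above can be turned into a simple check on draft learning objectives:

```python
# Hypothetical check that flags course learning objectives that do not begin with an
# observable action verb. The verb lists come from the guidance above and are not exhaustive.
OBSERVABLE = {"explain", "outline", "calculate", "model", "design", "evaluate",
              "solve", "define", "critique", "categorize", "develop"}
NOT_OBSERVABLE = {"know", "learn", "understand", "appreciate"}

def check_clo(objective: str) -> str:
    first_word = objective.split()[0].lower()
    if first_word in NOT_OBSERVABLE:
        return "revise: verb is not observable"
    if first_word in OBSERVABLE:
        return "ok"
    return "review: verb not in either list"

print(check_clo("Understand Bloom's taxonomy"))         # revise: verb is not observable
print(check_clo("Design and carry out an experiment"))  # ok
```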

  31. IV- Student Exit Survey Questionnaire

  32. Student Exit Survey Template Jubail Industrial College is striving to monitor and improve the quality of its academic programs. Therefore, we would appreciate receiving your opinion about your program of study during the period that you spent at the college. Your views and opinions are crucial for future improvement of the quality of the program, and will be treated with confidentiality. We hope that your answers are both honest and constructive, and will add value to the learning experience at JIC. Please rate each statement from 5 = “Strongly Agree” to 1 = “Strongly Disagree”.
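Purely as a hypothetical illustration (item wording and ratings invented), responses on the 5-to-1 scale described above might be summarized per item like this:

```python
# Hypothetical summary of exit-survey responses on the 5 ("Strongly Agree") to
# 1 ("Strongly Disagree") scale described above. Items and ratings are made up.
from statistics import mean

responses = {
    "The program prepared me to solve engineering technology problems": [5, 4, 4, 5, 3, 4],
    "Laboratory facilities supported my learning": [4, 3, 4, 4, 5, 3],
}

for item, ratings in responses.items():
    agree = sum(1 for r in ratings if r >= 4) / len(ratings)
    print(f"{item}: mean {mean(ratings):.2f}, {agree:.0%} agree or strongly agree")
```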

  33. V- Alumni Survey Questionnaire
