
Dr. Chell Roberts, Executive Dean

College of Technology and Innovation, Arizona State University. VULII Quality Assurance Institute, February 28, 2013. Dr. Scott Danielson, P.E., Associate Dean of Academic Programs. Dr. Kathy Wigal, Associate Director of Curricular Innovation.


Presentation Transcript


  1. College of Technology and Innovation, Arizona State University. VULII Quality Assurance Institute, February 28, 2013. Dr. Chell Roberts, Executive Dean; Dr. Scott Danielson, P.E., Associate Dean of Academic Programs; Dr. Kathy Wigal, Associate Director of Curricular Innovation

  2. College of Technology and Innovation, Arizona State University. VULII Quality Assurance Institute, February 28, 2013 • CTI’s pursuit of specialized accreditation • CTI program and course-level assessment practices • How CTI handles course evaluations • How CTI collects and/or uses data for planning and decision-making

  3. Specialized Accreditation Specialized and professional accreditors, like ABET or the Aviation Accreditation Board International (AABI), accredit degree-granting programs in particular disciplines or program areas.

  4. Value of Accreditation Accreditation assures students and prospective employers that an educational degree program has met stringent quality standards. Accreditation helps ensure, via assessment and continuous improvement processes, that graduates have received a quality education and are capable of performing a broad range of professional responsibilities.

  5. ABET ABET, previously known as the Accreditation Board for Engineering and Technology, is a premier specialized accreditation organization. ABET accredits programs in the US and other countries. It accredits applied science, computing, engineering, and engineering technology programs via its four Commissions. • Applied Science Accreditation Commission  • Computing Accreditation Commission  • Engineering Accreditation Commission  • Engineering Technology Accreditation Commission 

  6. International Accords and Engineering Accreditation • The Dublin Accord focuses on accreditation of 2-year engineering technician programs. • The Sydney Accord focuses on accreditation of 4-year (B.S.) engineering technology programs. • The Washington Accord focuses on accreditation of engineering programs. • The program characteristics related to these accords are described in more detail in the International Engineering Alliance (IEA) documents. For instance, the IEA’s Graduate Attributes and Professional Competencies Version 2, dated 18 June 2009, describes the program attributes of engineering technology programs.

  7. Engineering Accreditation (EAC) General Criteria for Engineering Programs • Criterion 1. Students • Criterion 2. Program Educational Objectives • Criterion 3. Student Outcomes • Criterion 4. Continuous Improvement • Criterion 5. Curriculum • Criterion 6. Faculty • Criterion 7. Facilities • Criterion 8. Institutional Support

  8. Engineering Technology Accreditation • ABET accredits 4-year degrees (e.g., a B.S. in Mechanical Engineering Technology) and 2-year degrees (technician degrees) in engineering technology. • ABET’s Engineering Technology Accreditation Commission is responsible for these accreditation actions.

  9. Characteristics of US-based Engineering Technology Programs • Typically taught by faculty with engineering experience in industry. • Applied-engineering programs (at the B.S. levels). • Require math (calculus) and science (physics) but are focused on the skills and abilities used by engineers in industry. • Often include more manufacturing content.

  10. Global Accreditation by ABET of Computing, Engineering and Engineering Technology ABET has accredited programs in the Middle-East, South America, and Southeast Asia (Philippines, Singapore, Indonesia). More international programs are being accredited by ABET every year.

  11. Other examples of specialized accreditation organizations • Aviation Accreditation Board International (AABI) • Association of Technology, Management, and Applied Engineering (ATMAE) • Various countries also have, or are forming, their own specialized accreditation bodies, e.g., ICACIT in Peru.

  12. Program and Course Level Assessment Practices • Student Experience as Learners: to improve the quality of teaching and learning that takes place in your classroom • Student Performance / Outcomes: to evaluate student mastery of course-level objectives or outcomes

  13. Program and Course Level Assessment Practices • Graded Homework • Portfolios • Normed Exams • Advisory Boards • Lab Reports How do we choose? Use multiple methods, direct and indirect, qualitative and quantitative, with triangulation, covering both the student experience as learners and student performance / outcomes.

  14. Course Level Assessment Instruments for Student Performance / Outcomes DEAAR teacher . . .

  15. Program Assessment Practices Industrial Advisory Boards (IABs) Each engineering education program should have an engaged Industrial Advisory Board with representation from organizations being served by the program graduates. The IAB should periodically review the program’s curriculum and provide advisement on current and future aspects of the technical fields for which the graduates are being prepared.

  16. Program Assessment Practices Discipline-specific standard or normed exams are used within the USA to provide assessment data. For instance, an electronics engineering technology assessment tool developed by engineering technology educators is available via the Society of Manufacturing Engineers (SME); see http://www.sme.org/eetexam/. The SME also provides manufacturing technologist and manufacturing engineering certification exams that can provide assessment data and a professional certification.

  17. Portfolios and Other Techniques

  18. Program and Course Level Assessment Practices • Student Experience as Learners: to improve the quality of teaching and learning that takes place in your classroom • Student Performance / Outcomes: to evaluate student mastery of course-level objectives or outcomes

  19. Course Level Assessment Instruments for Student Experience as Learners • Improve the quality of classroom teaching and learning • Systematic data collection so that all students contribute • Ungraded and anonymous: students provide useful information that won't be used to evaluate them • Calls for a response from the instructor: are you prepared to change? • Regular, ongoing feedback . . . DEAAR Teacher . . .

  20. Course Level Assessment Instruments for Student Experience as Learners • In-Class Techniques • After Course Completion

  21. Faculty / Course Evaluations: the CTI Process

  22. Faculty / Course Evaluations: educational research shows . . . • Course evaluation instruments provide consistent and stable measures for specific items • Students as evaluators (some good news, some bad news) • Some elements are difficult for students to assess: the level, amount, and accuracy of course content, and an instructor’s knowledge of, or competency in, his or her discipline • To make valid inferences about student ratings of instruction, the rating items must be relevant to and representative of the processes, strategies, and knowledge domain of teaching quality

  23. Faculty / Course Evaluations: educational research shows . . . • Potential negative effects from delineating specific characteristics of effective teaching (e.g., enthusiasm, organization, warmth), because teaching effectiveness can be achieved in many ways • Grouping teaching behaviors into dimensions assists with reading and interpreting data, and is therefore more likely to lead to improvement

  24. Faculty / Course Evaluations: educational research shows . . . • A major challenge to the validity of student ratings is the minimal facility many administrators have in interpreting the results they receive and the lack of training available to them to improve these skills • Institutions should improve efforts to assist students to become better evaluators of teaching and to better train administrators in the interpretation and use of ratings data for personnel decisions • To improve the decision-making process, institutions should determine their evaluation strategy in advance, choosing either norm-referenced or criterion-referenced evaluations
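The norm-referenced versus criterion-referenced distinction above can be made concrete with a minimal sketch. The course names, ratings, and the 3.5 cutoff below are all invented for illustration:

```python
# Hypothetical course-evaluation ratings on a 1-5 scale.
ratings = {"ENG101": 4.2, "ENG205": 3.6, "ENG310": 4.8, "ENG420": 3.1}

# Criterion-referenced: judge each course against a fixed standard.
CRITERION = 3.5  # arbitrary example threshold
criterion_result = {course: ("meets" if score >= CRITERION else "below")
                    for course, score in ratings.items()}

# Norm-referenced: judge each course against the group mean.
mean = sum(ratings.values()) / len(ratings)
norm_result = {course: ("above norm" if score > mean else "at/below norm")
               for course, score in ratings.items()}
```

Note that the two strategies can disagree: a course can meet a fixed criterion yet still fall below the group norm, which is why the strategy should be chosen in advance.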

  25. Faculty / Course Evaluations: Recommendations • Set clear evaluation goals, including clear definitions of what constitutes effective teaching at your institution, and ensure that questions reflect these goals • Design and test instruments according to rigorous theoretical and psychometric standards • Establish appropriate and standardized policies and processes for the administration of course evaluations • Provide sufficient information to students about the administration and use of evaluations • Provide students with access to appropriate evaluation results • Offer students other means to provide feedback • Request an accompanying narrative from faculty • Use evaluation data as a means of providing formative feedback • Encourage and provide the infrastructure for consultation on teaching evaluations

  26. Faculty / Course Evaluations: Recommendations • Provide an opportunity for instructors to receive individualized assessment • Provide faculty with information about evaluation data collection and use • Use evaluation data for summative purposes • Educate and train administrators • Present data so that it can be easily and accurately interpreted • Include appropriate supplementary evidence with evaluation data • Test and review instruments when institutional priorities or teaching practices change • Establish policy frameworks for the collection, administration, and use of student course evaluation systems • Establish clear administrative practices • Articulate evaluation goals and purpose • Develop educational materials and support networks for users

  27. Faculty / Course Evaluations: Process & Literature • The usual purpose of an evaluation system is to positively impact behavior; clarify the purpose of the system • A positive impact requires including faculty in the development of the system • Measurement is a process of assigning numbers (or something that equates to numbers); faculty need to understand this early • Measurement (and evaluation) should come from several different sources (triangulation) • Some triangulation ideas: peer, student, chair, and expert • Evaluation is the process of interpreting measurement data, not simply reporting the measurement; a successful system will be both summative and formative • Don’t measure and evaluate on criteria for which there are no resources to enable faculty to gain expertise (otherwise the criteria will be viewed as punitive) • Establish a facilitative reward structure • Tie promotion and merit to the evaluation and professional enrichment program • Define expected roles

  28. Faculty / Course Evaluations: Examples of Faculty Roles • Information Technology • Graphic Design • Public Speaking • Communication Styles • Conflict Management • Group Process / Team Building • Resource Management • Personnel Management • Budget Development • Dissemination of Scholarship • Content Expertise • Instructional Delivery • Instructional Design • Instructional Assessment • Course Management • Instructional Research • Psychometrics • Epistemology • Learning Theory • Human Development • Scholarship Areas: Discovery, Integration, Application, Teaching

  29. Faculty / Course Evaluations: Interpreting Results Data resulting from student evaluation of instruction with a valid instrument need to be provided to faculty and their supervisors. These data should be compared to a departmental average or norm and tracked over multiple semesters, so that faculty can compare themselves to their peers.

  30. Departmental Averages Example
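The example data from this slide is not reproduced in the transcript. As a minimal sketch of the comparison described above, with invented instructor names, semesters, and scores:

```python
# Hypothetical evaluation scores (1-5 scale) per instructor per semester.
scores = {
    ("Fall 2011", "Instructor A"): 4.1,
    ("Fall 2011", "Instructor B"): 3.7,
    ("Spring 2012", "Instructor A"): 4.3,
    ("Spring 2012", "Instructor B"): 3.9,
}

def department_norm(semester):
    """Mean score across all instructors in the department for one semester."""
    vals = [s for (sem, _), s in scores.items() if sem == semester]
    return sum(vals) / len(vals)

def relative_to_norm(semester, instructor):
    """Instructor's score minus the departmental norm for that semester."""
    return scores[(semester, instructor)] - department_norm(semester)
```

Tracking `relative_to_norm` over multiple semesters, rather than raw scores, separates an individual instructor's trend from department-wide shifts.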

  31. Data for planning and decision-making

  32. Data for planning and decision-making • Enrollment/graduation over multiple semesters: by academic department, by degree program, low/high enrollment • Faculty workload assessment: teaching load, research, committee work, service • Student Credit Hours (SCH): faculty load, funding, by department • Demographic trends: enrollment of women, by program • Capacity and growth: load, funding, enrollment projections, program planning, and more . . .
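The kinds of aggregation listed above can be sketched in a few lines. The departments, programs, headcounts, and the low-enrollment threshold below are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical enrollment records: (semester, department, program, headcount).
records = [
    ("Fall 2012", "Engineering Technology", "Mechanical ET", 120),
    ("Fall 2012", "Engineering Technology", "Electronics ET", 45),
    ("Fall 2012", "Aviation", "Air Traffic Mgmt", 60),
    ("Spring 2013", "Engineering Technology", "Mechanical ET", 132),
    ("Spring 2013", "Engineering Technology", "Electronics ET", 38),
    ("Spring 2013", "Aviation", "Air Traffic Mgmt", 65),
]

# Aggregate by department per semester to track enrollment trends.
by_dept = defaultdict(int)
for semester, dept, program, n in records:
    by_dept[(semester, dept)] += n

# Flag low-enrollment programs (threshold is an arbitrary example).
LOW = 50
low_programs = [(sem, prog) for sem, dept, prog, n in records if n < LOW]
```

The same grouping pattern extends to SCH, demographic, and workload data; the point is to compare the same measure across semesters rather than read a single term in isolation.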

  33. Data for planning and decision-making Your turn to practice! Using the data report (handouts) provided during the session, try your hand at interpretations and analysis in order to answer the question posed.
