
ABET Assessing Program Outcomes






  1. ABET: Assessing Program Outcomes. Amir Rezaei

  2. Outline • Context of assessment • Process of assessment • Similarities and differences between classroom and program assessment • Developing measurable outcomes • Developing scoring rubrics • Assessment methods • Direct measures / indirect measures • Data collection • Closing the loop • Writing the report

  3. Context of Assessment
  • Inputs: What comes into the system? How many?
  • Process: What are we doing with the inputs?
  • Outputs/Outcomes: What is the effect?
  • Assessment of outcomes provides a direct measurement of effectiveness; assessment of outputs serves only as an indirect measure.
  • Assessment of inputs and process establishes only the capability or capacity of the program.

  4. Process of Assessment
  • Can we demonstrate that students have learned outcome xx to an appropriate level by the time of graduation? (Collect data when they graduate.)
  • Can we demonstrate that we have added value to student learning of outcome xx to an appropriate level by the time of graduation? (Collect pre- and post-data.)

  5. ABET uses these terms; use the same language to reduce confusion.

  6. Similarities and differences between classroom and program assessment • Degree of complexity • Time span • Accountability for the assessment process • Cost • Level of faculty buy-in • Level of precision of the measure

  7. Classroom Assessment
  • Timeline: one quarter
  • Context: subject matter, faculty member, pedagogy, student, facility
  • Subject: Statics
  • Topics: statics of particles; equivalent systems of forces; equilibrium of rigid bodies; structures; friction
  • Concepts: forces in 2D & 3D; moment of a force about a point or axis; equilibrium in 2D & 3D; free-body diagrams (FBD); trusses, frames, and machines; friction
  • Assessment focus: evaluate individual student performance (grades); evaluate teaching/learning

  8. Program Assessment
  • Timeline: xx years
  • Student pre-college traits
  • Institutional context and environmental factors
  • Classroom experience: pedagogy; facility; faculty and student characteristics
  • Coursework & curricular patterns: classes chosen; major
  • Out-of-class experiences: co-curricular activities; co-ops; internships
  • Educational objectives

  9. Developing Measurable Outcomes
  • Objective: ability to function on multidisciplinary teams
  • Outcome: works effectively with others
  • Performance criteria: researches and gathers information; fulfills the duties of team roles; shares work equally; listens to teammates; makes contributions; takes responsibility; values other viewpoints
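The hierarchy above runs from a broad objective down to observable performance criteria. A minimal Python sketch of one way to record it (the class names and structure are illustrative assumptions, not from the slides):

```python
# Hypothetical sketch, not from the slides: recording the
# objective -> outcome -> performance-criteria hierarchy so that each
# criterion is a concrete, observable behavior that can be scored later.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    name: str
    performance_criteria: list = field(default_factory=list)

@dataclass
class Objective:
    name: str
    outcomes: list = field(default_factory=list)

teamwork = Objective(
    name="Ability to function on multidisciplinary teams",
    outcomes=[
        Outcome(
            name="Works effectively with others",
            performance_criteria=[
                "Researches and gathers information",
                "Fulfills the duties of team roles",
                "Shares work equally",
                "Listens to teammates",
                "Makes contributions",
                "Takes responsibility",
                "Values other viewpoints",
            ],
        )
    ],
)

for outcome in teamwork.outcomes:
    print(outcome.name, "->", len(outcome.performance_criteria), "criteria")
```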

  10. Developing scoring rubrics • A rubric is a set of categories that define and describe the important components of the work being completed, critiqued, or assessed • Purposes: • Information to/about individual student competence (analytic) • Overall examination of the status of the performance of a group of students (holistic)

  11. Developing scoring rubrics • Generic: big-picture approach; element of subjectivity • Task-specific: single task; focused approach; less subjective • Note: you don't have to develop a rubric for every outcome (see the rubric template)
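To make the analytic vs. holistic purposes from slide 10 concrete, here is a hedged sketch (the rubric categories, the 0-3 scale, and the scoring functions are illustrative assumptions, not from the slides):

```python
# Hypothetical sketch, not from the slides: one way to apply a generic
# teamwork rubric analytically (per-criterion scores for one student)
# and roll those scores up into a group-level (holistic-style) summary.
from statistics import mean

# Rubric categories with score descriptors, indexed 0 (lowest) to 3 (highest).
RUBRIC = {
    "Shares work equally": ("never", "rarely", "usually", "always"),
    "Listens to teammates": ("never", "rarely", "usually", "always"),
    "Makes contributions": ("none", "few", "some", "many"),
}

def analytic_score(ratings):
    """Analytic use: average the per-criterion ratings for one student."""
    return mean(ratings[criterion] for criterion in RUBRIC)

def group_summary(all_ratings):
    """Group-level summary: average the analytic scores across students."""
    return mean(analytic_score(r) for r in all_ratings)

alice = {"Shares work equally": 3, "Listens to teammates": 2, "Makes contributions": 2}
bob = {"Shares work equally": 1, "Listens to teammates": 2, "Makes contributions": 3}
print(analytic_score(alice))        # ~2.33 on the 0-3 scale
print(group_summary([alice, bob]))  # ~2.17 across the group
```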

  12. Assessment methods (some scored with a rubric, others without)
  • Written surveys and questionnaires
  • Exit and other interviews
  • Commercial, standardized exams
  • Locally developed exams
  • Archival records
  • Focus groups
  • Portfolios
  • Simulations
  • Performance appraisals
  • External examiners
  • Oral exams
  • Behavioral observations

  13. Direct and Indirect Measures • Direct measures provide for the direct examination or observation of student knowledge or skills against measurable learning outcomes • Indirect measures ascertain the opinion or self-report of the extent or value of learning experiences

  14. Direct vs. indirect measures
  • Direct: standardized exams; locally developed exams; portfolios; simulations; performance appraisals; external examiners; oral exams; behavioral observations
  • Indirect: written surveys and questionnaires; exit and other interviews; archival records; focus groups

  15. Sampling and Data Collection
  • For program assessment, sampling is acceptable and desirable for programs of sufficient size
  • Year 1: define outcomes / map curriculum
  • Year 2: data collection
  • Year 3: evaluation & design of implementation
  • Year 4: implement improvements & data collection
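As a hedged illustration of the sampling point, a short sketch of drawing a reproducible random sample of student work (the 25% fraction and the floor of 10 artifacts are illustrative assumptions, not from the slides):

```python
# Hypothetical sketch, not from the slides: randomly sampling student work
# for program-level scoring instead of assessing every artifact.
import random

def sample_artifacts(artifact_ids, fraction=0.25, minimum=10, seed=None):
    """Draw a reproducible random sample, with a floor for small cohorts."""
    rng = random.Random(seed)
    n = max(minimum, round(len(artifact_ids) * fraction))
    n = min(n, len(artifact_ids))  # never request more than exist
    return rng.sample(artifact_ids, n)

# Example: sample 25% of 120 senior-design reports (30 artifacts).
reports = [f"report_{i:03d}" for i in range(120)]
print(sample_artifacts(reports, seed=42))
```

Fixing the seed keeps the sample auditable between assessment cycles; the floor guards against samples too small to say anything about a cohort.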

  16. Closing the Loop
  • The institute assessment committee prepares reports of the collected data (e.g., survey results, e-portfolio ratings) for submission to department heads
  • The evaluation committee receives and evaluates all data, writes a report, and refers recommendations to the appropriate areas
  • The institute acts on the recommendations of the evaluation committee
  • A report of the actions taken by the institute and the targeted areas is returned to the evaluation committee for iterative evaluation
