
Best Assessment Practices


Presentation Transcript


  1. Best Assessment Practices ERAU Assessment Workshop for Faculty, October 2007

  2. Assessment Role Best Practices
  • Dean/Dept. Chairs
    • Ensure assessment is meaningful, robust, and incorporates best practices
    • Love and support assessment coordinators
    • Provide opportunities for faculty to collaboratively reflect on assessment
    • Approve program-level assessment plans

  3. Assessment Role Best Practices
  • Program Assessment Coordinators
    • Coordinate assessment / request resource support
    • Involve colleagues
    • Submit program assessment plans to Dept. Chair/Dean for approval

  4. Assessment Role Best Practices
  • Other Faculty
    • Be aware of course contribution to program outcomes
    • Communicate program outcomes to students
    • Get involved in program assessment efforts

  5. Assessment at ERAU
  • ERAU uses a “5-step” assessment process
  • Academic programs and administrative / support departments do assessment
  • The annual cycle begins in fall and ends in fall of the following year
  • Close out the current cycle by submitting Steps 4-5; ALSO launch the new cycle with Steps 1-3
  • Document online through ERPP (http://spa.erau.edu)

  6. Areas of Progress
  • Departments / Colleges Planning Better Assessment
  • Validating Outcomes and Curriculum Alignment
  • Assessment Mini-Grants
  • Direct Assessment
    • External exams (ETS, FL Board of Prof. Engineers)
    • Evaluating student work w/ faculty juries
    • Rubrics
    • Department-created comprehensive exam
    • Assessing work in capstone courses
    • Identifying ‘indicator’ courses from which to draw assessment material and feed back results

  7. Areas to Improve
  • Grades Used for Program Assessment
  • Over-abundance of Indirect Evidence
    • Senior exit interviews
    • Alumni survey results
  • Anemic Documentation of Methods, Results and Use of Results
  • Very Little Evidence of Improvements

  8. 5-Step Assessment Method
  • Step 1: Convey Mission Statement
  • Step 2: Define Outcomes
    • Discipline-Specific Student Learning Outcomes
    • General Education Learning Outcomes
    • Program Outcomes (job placement, grad. rates, etc.)
  • Step 3: Identify How to Measure Attainment of Outcomes
    • Select Methods
    • Set Performance Criteria
  • Step 4: Interpret and Report Data
  • Step 5: Implement and Document Improvements

  9. Step 1: Program Mission Statement
  • Does your mission statement convey program uniqueness?
  • Is uniqueness carrying through into assessment? (“tweaking” AABI outcomes)

  10. Step 2: Develop Learning Outcomes
  • Reflect any program accreditation requirements
  • Validated with external constituencies
  • Comprehensive – are A through J enough?
  • Outcomes and their importance communicated to students (alumni / employer survey results showing importance)

  11. Step 3: Methods and Performance Criteria
  • Where in students’ progression to assess?
    • Just before / at graduation
    • After graduation? (ABET)
    • Midpoint?
    • Adequate pre-requisite knowledge?
  • Appropriate level (Bloom’s taxonomy, etc.)
  • Select methods
    • Multiple methods (triangulation)
    • Direct evidence
  • Set expected performance criteria – what is ‘good enough?’

  12. Step 4: Data Interpretation / Reporting
  • Gather data
  • Share results among faculty and interpret collaboratively
    • Is performance acceptable?
    • Further inquiry / analysis?
    • Improvements to be made?
  • Good documentation

  13. Step 5: Implement Improvements
  • Make improvements
    • Changes to course content
    • Changes to course sequencing
    • Pedagogical changes
    • Pre-requisites
  • Share improvements – with faculty and students
  • Document well
  • Mission-Critical Budget Request Form
  • Check to see if intended improvements worked in the next assessment cycle

  14. Step 5: Implement Improvements
  Poor Example:
  • “Pedagogical modifications”
  Good Example: BS Human Factors & Psychology (DB)
  • “We have changed our core curriculum to integrate a broader range of system modeling skills in the HFI - HFIV series. Specifically, HFIII was changed to incorporate more system skills that had been absent from the sequence. The old HFIII content (Ergonomics and Bioengineering) was given a new course number and is still required in our core curriculum.”

  15. Next Steps
  • Assessment plans due to Dept. Chairs/Dean by end of November 2007
    • Steps 4-5 to complete 2006-07 cycle
    • Steps 1-3 to launch 2007-08 cycle
  • Final plans reviewed and approved by Dept. Chairs/Dean by end of December 2007
  • New Program Coordinators / Dept. Chairs – Set up Training for Plan Input and Approval with Tiffany Phagan
