
ASSESSMENT for Academic Degree Programs, Prescott, September 2002



  1. ASSESSMENT for Academic Degree Programs
     Prescott, September 2002

  2. Overview
     • What’s New?
       • Latest developments in assessment
       • Review new deadlines
     • What’s Next?
       • Finish old business (complete 01-02 reports)
       • Prepare for next assessment cycle
     • Questions / Resources

  3. Timing of Assessment Cycle
     • Fall too busy – move cycle to spring
     • Finish current (01-02) assessment cycle now
       • Columns/steps 4 & 5 of current assessment reports due Oct. 15, 2002
     • Slight reprieve
       • Delay start of next cycle (03-04) until spring
       • Steps 1-3 due spring 2003
       • Steps 4 & 5 due spring 2004

  4. Planning and Assessment Portal
     • Created a single point of entry for strategic planning and assessment activities
     • Access through ERAU Online / Blackboard
     • Overview of planning and assessment processes
     • Directory of assessment and strategic planning units
     • University Planning and Assessment Policy (APPM 4.3)

  5. Other Changes
     • “Best practices” adopted from institutions with successful assessment processes:
       • Use committees to guide ongoing assessment
       • Set up a “peer review” process for unit reports
       • Annual summary of university-wide assessment
       • Incentives for doing assessment well
       • Establish centralized support services

  6. Assessment Committees
     • Committee structure pushes ownership of the assessment process further down into ERAU’s “foundation” (faculty/staff)
     • 4 Campus Assessment Committees (CACs): DB, EC, Prescott, and University Admin (UA)
     • Faculty & staff serve on each CAC (UA – staff only); each CAC is co-chaired by a faculty and a staff member
     • The CAC co-chairs together make up the University Assessment Committee (UAC)

  7. CAC Duties
     • In broad terms, the CAC will…
       • Work with individual units to develop effective assessment reports
       • Facilitate communication across departments and programs
       • Be a vehicle for feedback to / from the Chancellor
       • Nominate exemplary assessment reports

  8. CAC Duties
     • Work with individual units to develop effective assessment reports
       • Guidance on the 5-Step model / assessment process
       • Peer review of assessment reports at two points in time, using a checklist of assessment guidelines:
         • Upon draft submission of Steps 1-3 at the start of the assessment cycle
         • Upon submission of Steps 4 & 5 at the end of the assessment cycle
     • Peer review is meant to ensure the PROCESS; interpretation of results and the decision about what to do with them is YOUR call – this is NOT a prescriptive review!

  9. CAC Duties
     • Facilitate communication across departments and programs about assessment and continuous improvement
       • Encourage sharing of best practices and useful assessment measures
       • Avoid duplication of efforts
       • Summarize campus assessment activities

  10. CAC Duties
     • Nominate exemplary assessment reports as part of the incentive program
       • Submit “best assessment report” nominees to the UAC
       • The UAC will pick one “best report” per CAC (4 total)
       • “Best reports” selected by the UAC will receive $500 to use toward assessment-related activities

  11. CAC-Prescott Members
     • Chuck Cone
     • Chuck Ahlstrand
     • Brian Nordstrom
     • Mary DeWitt
     • Wesley Stanfield
     • David Hall
     • Sarah Thomas

  12. UAC Duties
     • Unify campus-level assessment activities
     • Facilitate communication about assessment and continuous improvement across campuses and UA; encourage sharing of best practices
     • Produce an annual summary of assessment activities at the university level
     • Serve as a vehicle for feedback to / from the Cabinet
     • Vote on “best assessment reports” nominated by the CACs

  13. Institutional Research Duties
     • Institutional Research (IR) moves to a supporting role
     • An assessment coordinator is available to assist with:
       • Development of assessment measures
       • Administration of surveys for assessment
       • Identifying existing sources of data (IR and external)
       • Research on assessment techniques
     • The IR website has survey data and a project calendar
     • IR houses archived assessment reports

  14. Complete Current Assessment
     • Close out the 01-02 assessment report
       • Download the current report from the assessment website
       • Steps 1-3 were submitted last October
       • Word format this cycle; web-based next cycle
     • Complete Steps 4 & 5
       • Summary of Data Collected
       • Use of Results
     • Submit completed reports to Wes Stanfield by Oct. 15, 2002

  15. Complete Current Assessment
     • Steps 4 & 5 are straightforward IF…
       • Assessment data were actually gathered
       • The data provided enough information to determine whether outcomes were actually met
       • Thought was put into how the various results might be used

  16. Possible Scenarios
     • “Winning” scenarios
       • Criteria for success were met
       • Criteria for success were NOT met and the results were used to make improvements (even better?)
     • “Problematic” scenarios
       • No use of results is shown
       • Insufficient data, with no “fix” offered

  17. Winning Scenarios
     • Criteria for success were met
       • Step 4: Summarize the assessment data collected
       • Step 5: State that the criteria were met and indicate the future of the intended outcome
         • No further action required – retire the outcome from the next assessment cycle
         • Re-assess next cycle using different criteria / measures

  18. Winning Scenarios
     • Criteria were NOT met / results were used
       • Step 4: Summarize the assessment data collected
       • Step 5: State that the criteria were not met and explain how the results have been used to make improvements
         • Changes made to the program
         • Change the criteria (were the criteria too strict?)
         • Use a different assessment method (to corroborate)
       • Step 5: Indicate the future of the outcome
         • No further action required (?)
         • Assess again using different criteria / assessment method?
         • Sparked a new initiative for the strategic plan / re-assess at a later date?

  19. Problematic Scenario Solution
     • Potential problem: Results have not been used yet; can’t “close the loop” by the end of the assessment cycle
     • Solution:
       • Not a problem IF new / strategic initiatives must be taken in order to make improvements – write these into the strategic plan and reference the assessment report
       • Otherwise, word the report carefully so that “will” statements become past tense: hold meetings / make plans prior to submitting the report so that the decided actions can be stated in the past tense

  20. Problematic Scenario Solution
     • Problematic: “We are planning a retreat to discuss results” …or… “We will…”
     • Preferred: “Assessment results revealed insufficient student access to the internet. See the new initiative regarding additional workstations in the 03-04 strategic plan.” …or… “We have met and have agreed that these are the actions to take… (outline a plan)”

  21. Problematic Scenario Solution
     • Potential problem: Insufficient data
     • Solution: Explain why (be specific about the nature of the problem) and state what will be done differently next time to obtain the data

  22. Problematic Scenario Solution
     • Problematic: “No data available” …or… “Sample size too small”
     • Preferred: “Survey administration was delayed; no data were collected. The same outcome/criteria will be carried over to the next cycle, when the survey is to be administered” …or… “The sample size (n=3) was too small to determine whether the criteria for success were met. The outcome is carried over to the next assessment cycle; three years of survey data will be combined to ensure a sufficient sample size”

  23. Peer Observations
     • Typical uses of results at peer institutions
       • “What is taught”
         • Closer alignment of coursework with the “world of work”
         • Change in the sequence of courses
         • Additional courses required for degree completion
       • “How it is taught”
         • Methodology / technology
         • Active participation

  24. Preparing for Next Cycle
     • There is an expectation that the assessment process will evolve and mature
     • As you close out the current cycle (Oct. 15, 2002):
       • Complete Steps 4 & 5 – no need for re-writes now
       • Use the troubleshooting tips if useful
       • Start thinking about the Steps 1-3 that will guide your assessment activities in the next cycle
     • Some areas that could use improvement…

  25. Preparing for Next Cycle
     • Step 1: Expanded Statement of Institutional Purpose
       • Academic programs probably won’t change purpose often
       • Program purpose “links” into the university mission & goals
     • Step 2: Intended Educational Outcomes
       • Fairly straightforward for academic programs
       • What should students know (cognitive), think (affective), or do (behavioral) as a result of this academic program?
     • Step 3: Criteria for Success & Means of Assessment
       • Predominant use of indirect criteria
       • Limited use of assessment means

  26. Diversify Criteria
     • Consider using more direct measures
       • Use indirect measures (student / alumni opinion about their own skill preparation) as secondary criteria; employer feedback can be primary
     • Examples of direct measures (a sketch of checking such criteria follows this slide):
       • “75% of internship supervisors rate intern communication skills as good or excellent”
       • Percentage passing certification / licensure exams on the 1st try
       • “As part of the final-semester capstone course, students will critique a short draft essay - 80% will correctly identify 70% of the mistakes”
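     To make the arithmetic behind such direct-measure criteria concrete, here is a minimal Python sketch, assuming hypothetical first-attempt exam results and internship supervisor ratings. Only the 75% supervisor-rating threshold comes from the bullet above; the 80% pass-rate target, the data, and the variable names are illustrative assumptions.

     # Hypothetical direct-measure records for one assessment cycle.
     first_try_passed = [True, True, False, True, True, True, False, True]            # licensure exam, 1st attempt
     supervisor_ratings = ["excellent", "good", "fair", "good", "excellent", "good"]   # internship evaluations

     pass_rate = 100.0 * sum(first_try_passed) / len(first_try_passed)
     good_or_better = 100.0 * sum(r in ("good", "excellent") for r in supervisor_ratings) / len(supervisor_ratings)

     # The 75% supervisor-rating threshold is from the slide; the 80% pass-rate target is illustrative.
     print(f"1st-try pass rate: {pass_rate:.0f}% (80% target {'met' if pass_rate >= 80 else 'not met'})")
     print(f"Good/excellent supervisor ratings: {good_or_better:.0f}% (75% target {'met' if good_or_better >= 75 else 'not met'})")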

  27. Diversify Criteria
     • Consider using subscale scores
       • It is easier to formulate a specific response for the use of results with subscale scores than with overall scores only
     • Example of subscale use (a sketch of checking this rule follows this slide): “Overall, at least 75% of employers responding to the Employer Feedback Survey agree or strongly agree that the education of their ERAU graduate meets their company’s needs, and on no degree-specific skill of the 10 listed for this program will 25% or more of employers rate competence as poor or fair.”
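     The compound criterion above pairs an overall threshold with a per-skill floor. A minimal Python sketch of how such a rule could be checked, assuming hypothetical Employer Feedback Survey responses coded on a 1-5 agreement/competence scale; the skill names, data, and coding are illustrative assumptions, not the actual survey.

     # Hypothetical Employer Feedback Survey responses, coded 1-5
     # (1 = strongly disagree / poor ... 5 = strongly agree / excellent).
     overall = [5, 4, 4, 5, 3, 4, 5, 4]                    # "education meets company needs"
     skills = {                                            # the 10 degree-specific skills would each get a list
         "technical writing": [4, 4, 3, 2, 5, 4, 4, 3],
         "problem solving":   [5, 4, 4, 4, 3, 5, 4, 4],
     }

     def pct(responses, predicate):
         """Percentage of responses that satisfy the predicate."""
         return 100.0 * sum(predicate(r) for r in responses) / len(responses)

     # At least 75% agree (4) or strongly agree (5) overall...
     overall_ok = pct(overall, lambda r: r >= 4) >= 75.0
     # ...and on no listed skill do 25% or more rate competence poor (1) or fair (2).
     skills_ok = all(pct(ratings, lambda r: r <= 2) < 25.0 for ratings in skills.values())

     print("Criteria for success met:", overall_ok and skills_ok)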

  28. Broaden Means of Assessment
     • Consider using rubrics
       • Capstone course project, senior thesis, or student portfolio
       • Assessing the degree program, not courses or individuals
       • May need multiple or external evaluators for objectivity

  29. Broaden Means of Assessment
     • Example of a critical thinking rubric (rubric shown as an image on the original slide)

  30. Broaden Means of Assessment
     • Example of sub-score and rubric use (see the sketch after this slide): “Average score of 3.5 or higher, with no component average score less than 3.0”
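     A minimal Python sketch of checking this sub-score rule, assuming hypothetical capstone rubric scores on a 1-5 scale with three illustrative components; the component names, scale, and data are assumptions, and only the 3.5 overall and 3.0 component thresholds come from the slide.

     # Hypothetical capstone-project rubric scores on a 1-5 scale:
     # one dict per student project, one entry per rubric component.
     scores = [
         {"analysis": 4, "evidence": 4, "conclusions": 5},
         {"analysis": 3, "evidence": 4, "conclusions": 4},
         {"analysis": 4, "evidence": 3, "conclusions": 4},
     ]

     components = scores[0].keys()
     component_avgs = {c: sum(s[c] for s in scores) / len(scores) for c in components}
     overall_avg = sum(component_avgs.values()) / len(component_avgs)

     # Criterion from the slide: overall average of 3.5 or higher,
     # with no component average score below 3.0.
     met = overall_avg >= 3.5 and all(avg >= 3.0 for avg in component_avgs.values())
     print(component_avgs, f"overall={overall_avg:.2f}", "met:", met)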

  31. Broaden Means of Assessment
     • Consider external evaluators
       • Reciprocal agreements with peer evaluators
       • Example: “A jury of computer engineering department faculty from an institution comparable to ERAU will judge at least 80% of sampled senior projects to be acceptable according to the attached, agreed-upon set of standards”
       • Tests / rubrics jointly developed by faculty and the Industry Advisory Board

  32. Broaden Means of Assessment
     • Qualitative methods (focus groups, etc.)
       • Instead of a survey
       • To clarify survey results
     • References to feedback from course evaluators, Industry Advisory Board members, or students are also qualitative evidence

  33. Review
     • This is a learning process
     • Incorporate existing assessment; don’t duplicate efforts (IR, ABET, CAA, ACBSP, grants)
     • Establish a non-threatening, non-accusatory environment; use results only for improvement
     • USE the results

  34. Resources
     • Campus Assessment Committee
     • Institutional Research
       • Assessment support office
       • Provides logistical means for conducting and processing surveys
       • Website contains survey data and a calendar of projects
       • URL: http://irweb.erau.edu
       • Assessment coordinator, Tiffany Phagan – contact via phone 386-226-6224 or via email phagant@erau.edu
     • Assessment Website
       • http://irweb.erau.edu, then click on Assessment Planning
       • Forms, training materials
       • Archived assessment reports

  35. Questions and Discussion
