PROGRAM REVIEW & ASSESSMENT



  1. PROGRAM REVIEW & ASSESSMENT Overview SERC Program Assessment

  2. Why assess your instructional program? • For all the reasons Cathy reviewed! • May be required in program performance reviews or curricular reform proposals • Part I: Program Performance Reviews • Part II: Instructional Program Assessment • Assessment strategies • Data collection

  3. PROGRAM REVIEW & ASSESSMENT Part 1: Program Performance Reviews (PPR)

  4. Common Elements of a PPR • Department Context • SWOT Analysis • Learning Assessment • Goals/strategies/actions for next review period

  5. Background Work • A. Department Context • Mission statement • What we are • Vision statement • What we strive to be • Student Learning Outcomes (SLO) statement • What our graduates are capable of • Examples of these statements are available at • Assessment Planning Documents • Case Study: Cal State Fullerton’s Department of Geological Sciences

  6. CSUF Mission Statement

  7. Vision Statement • Details from the Mission-crafting discussion were distilled into a Vision:

  8. Student Learning Outcomes Statement

  9. Student Learning Outcomes Statement • Carleton's SLO

  10. A. Department Context • Mission statement • What we are • Vision statement • What we strive to be • Student Learning Outcomes (SLO) statement • What our graduates are capable of • What is the basis for these claims? • The scientific “urge” to back up Mission/SLO claims with data motivates the program assessment. • How do we attain our Vision? • The problem-solving “urge” motivates the identification of goals/strategies/action plans in response to the Vision.

  11. B. SWOT Analysis • An exercise in hearing all voices: • Arm yourself with a pen and sticky notes • IN SILENCE, visit all 4 stations (S-W-O-T) and post your ideas about OUR department • You may revisit stations and reflect on notes, but do so in silence • Each group of 3-4 should gather in front of one of the stations and, IN SILENCE, take turns grouping/rearranging notes into themes. • Once done, construct statements that summarize the various themes at your station. • Report back to whole group.

  12. B. SWOT Analysis: example

  13. B. SWOT Analysis • Suggestion: solicit alumni input prior to the exercise regarding their perceptions of strengths/weaknesses. • Ask one member of each group to volunteer for a SWOT subcommittee that finalizes the SWOT statement. • The subcommittee can also perform the following analyses: • Leverage (O+S) • Vulnerability (T+S) • Constraints (O+W) • Problems (T+W)

  14. C. Assessment of Student Learning • Develop an assessment plan to gauge the degree to which the SLOs and the teaching-related elements of the Vision are achieved, e.g.

  15. C. Assessment of Student Learning • UT Austin site on Assessment of Instructional Programs Assessment Resources • Planning steps (with worksheets!): • Describe program context • Identify stakeholders and their central questions • Students: Will this degree prepare me/help me get a job? • Determine the evaluation purpose • Identify intended uses of data • Create an evaluation plan

  16. D. Goals • Goals can be identified by inspecting Vision statements and asking • What are we currently doing to embody this Vision? • Is there something we could do differently to more fully realize our vision? • WHAT IS YOUR THEORY OF CHANGE?? • Make a logic model! • Prioritize goals (1-3 years; 3-6 years) • Identify strategies to achieve goals • Develop an action plan • Assign something to everyone!

  17. INSTRUCTIONAL PROGRAM ASSESSMENT (IPA) Part 2: Assessment Strategies

  18. Common elements of an IPA • Assessment of Student Learning • Define learning objectives for individual courses • Analysis of degree curriculum • Benchmark exams • Capstone experiences • Student surveys • Employer surveys • There are LOTS of examples of these on the SERC site... • Program Metrics and Instruments

  19. Student Learning: Individual Courses • Define learning objectives for individual courses • “Successful completion satisfactorily demonstrates student mastery of SLO” • assessed by pass/fail rate • Student ratings of instruction • Annual student achievement data for high-enrollment, “standardized” classes • Performance Exams • Pre-/Post- Testing • Assessed by changes in scores
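
The slide doesn't specify how "changes in scores" should be summarized. One common metric for pre-/post-testing in science education is Hake's normalized gain, g = (post - pre) / (100 - pre), the fraction of the possible improvement a student actually achieved. A minimal Python sketch, using hypothetical score lists rather than data from any real class:

    # Hake's normalized gain for pre-/post-test pairs (scores in percent).
    # The score lists below are hypothetical placeholders.

    def normalized_gain(pre_pct, post_pct):
        """Fraction of the possible improvement actually achieved."""
        if pre_pct >= 100:            # at ceiling on the pre-test; gain undefined
            return None
        return (post_pct - pre_pct) / (100 - pre_pct)

    pre_scores = [45, 60, 30, 75]     # hypothetical pre-test percentages
    post_scores = [70, 85, 55, 90]    # hypothetical post-test percentages

    gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
    gains = [g for g in gains if g is not None]
    print(f"mean normalized gain: {sum(gains) / len(gains):.2f}")

For context, Hake's original survey treated a class-average gain of about 0.3 as the rough boundary between "low-gain" and "medium-gain" courses.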

  20. Analysis of degree curriculum • “map” courses to the degree’s SLO statement using SLOs for individual courses • assessment consists of demonstration that SLO are met by students passing required coursework • Pt. #4, Carleton College's Assessment Plan • “maps” can also be made to the institution’s Mission and Goals • This is a useful springboard for redesigning the curriculum…
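
One low-tech way to build and check such a map is a small coverage matrix; below is a minimal sketch, with hypothetical course codes and SLO labels standing in for a real catalog:

    # Map required courses to the SLOs they claim to address, then flag
    # any SLO that no required course covers. All names are hypothetical.

    slos = ["SLO1: field skills", "SLO2: quantitative reasoning",
            "SLO3: written communication", "SLO4: lab techniques"]

    course_map = {                    # course -> SLOs it addresses
        "GEOL 101": {"SLO2: quantitative reasoning"},
        "GEOL 310": {"SLO1: field skills", "SLO3: written communication"},
        "GEOL 480": {"SLO1: field skills"},
    }

    for slo in slos:
        courses = [c for c, covered in course_map.items() if slo in covered]
        print(f"{slo}: {', '.join(courses) if courses else 'NOT COVERED'}")

SLOs flagged "NOT COVERED" by the map are exactly the springboard for curriculum redesign that the slide mentions.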

  21. Analysis of degree curriculum: CSUF

  22. Student Learning: Benchmark exams • Annual exams • e.g., Winona State • http://serc.carleton.edu/departments/assessment/instruments/Winona_annual_exam.html • How to keep students serious about taking a demanding, ungraded exam? • Exit exams • e.g., ASBOG • Does the content reflect your degree’s SLO?

  23. Student Learning: Capstone Experiences • Research experiences/theses, assessed by • satisfactory completion • Rubrics used by advisors • presentation at professional meetings • In-house oral presentation • Research Days http://serc.carleton.edu/departments/assessment/instruments.html • Judged by industry and/or colleagues from neighboring institutions • Field camp, assessed by • satisfactory completion • final field exam

  24. PROGRAM ASSESSMENT Part 3: Collecting Data

  25. Student Surveys • Exit surveys and interviews • Surveys given in classes populated by seniors • or by mail • Interviews conducted by Chair or by advisor • Alumni surveys • Collect similar data over extended time period • May want to separate data by graduation date • Can add one-time questions to guide curriculum development, scheduling, outreach, etc. • Collect data re graduate programs for use in advising seniors • Lists of Exit/Alumni Survey Questions
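
Separating alumni data by graduation date is straightforward once responses are in a flat file. A minimal sketch, assuming a hypothetical CSV export with "grad_year" and "prep_rating" columns (adapt the names to whatever your survey tool produces):

    # Group alumni survey responses into graduation-year cohorts and
    # report a per-cohort mean. File and column names are hypothetical.
    import csv
    from collections import defaultdict

    ratings_by_cohort = defaultdict(list)
    with open("alumni_survey.csv", newline="") as f:
        for row in csv.DictReader(f):
            ratings_by_cohort[row["grad_year"]].append(int(row["prep_rating"]))

    for year in sorted(ratings_by_cohort):
        scores = ratings_by_cohort[year]
        print(f"class of {year}: n={len(scores)}, "
              f"mean preparation rating {sum(scores) / len(scores):.1f}")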

  26. Employer Surveys • Brainstorm re possible survey questions • How many graduates employed? • Strengths/weaknesses? • Compare to graduates of other programs? • Incentives to participate? • Possibility of improving the “product”? • Opportunity to give expert advice • Role of Advisory Boards

  27. Web-based Survey Instruments • http://serc.carleton.edu/departments/assessment/survey_tools.html • SurveyMethods, Zoomerang, SurveyMonkey • Note factors to consider • Cost (group rate for institution?) • Number of expected responses • Types of questions you’d like to use • Desired access to raw data • How to “code” qualitative data • http://www.utexas.edu/academic/diia/assessment/iar/programs/report/focus-QualCode.php?task=programs
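
"Coding" qualitative data means tagging free-text responses with themes and tallying the tags (the UT Austin link above describes the process). A minimal first-pass sketch, with a hypothetical codebook and hypothetical responses; real coding is iterative and usually cross-checked by multiple readers:

    # Tag open-ended survey responses with themes via keyword matching,
    # then tally the themes. Codebook and responses are hypothetical.
    from collections import Counter

    codebook = {                      # theme -> trigger keywords
        "field experience":  ["field camp", "mapping", "outcrop"],
        "quantitative prep": ["math", "statistics", "modeling"],
        "career prep":       ["job", "career", "interview"],
    }

    responses = [
        "Field camp was the highlight; more mapping please.",
        "I wish we had more statistics and numerical modeling.",
        "Help preparing for job interviews would have been great.",
    ]

    tally = Counter()
    for text in responses:
        lowered = text.lower()
        for theme, keywords in codebook.items():
            if any(k in lowered for k in keywords):
                tally[theme] += 1

    for theme, count in tally.most_common():
        print(f"{theme}: {count} response(s)")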

  28. Web-based Survey Instruments • Brainstorm: How could you get acceptable response rates? • Consider low-/no-cost options • Response rates (link the evaluation program with the surveys) • Note: “Guidelines for maximizing response rates” • http://www.utexas.edu/academic/diia/assessment/iar/programs/gather/method/survey.php • On-line surveys: forewarn; remind; provide incentives

  29. Closing Thoughts • A possible key to obtaining faculty “buy-in” to Program Reviews/Learning Assessments is to craft the Mission, Vision, and SLO statements together • Get their claims of success on record, then let the scientist’s urge to collect supporting data and to solve problems kick in! • Hold a retreat, and hire a professional facilitator • Successful implementation of a quantitative assessment plan will increase the integration of your educational and research missions.

  30. PROGRAM REVIEW & ASSESSMENT Overview SERC Program Assessment
