Connecting Student Learning and Assessment to Program Review

Presentation Transcript


  1. Connecting Student Learning and Assessment to Program Review
  Marilee J. Bresciani, Ph.D.
  Professor, Postsecondary Education and Co-Director of the Center for Educational Leadership, Innovation, and Policy
  San Diego State University
  3590 Camino Del Rio North, San Diego, California, U.S.A.
  619-594-8318 | Marilee.Bresciani@mail.sdsu.edu

  2. Presentation Overview
  • Overview of Outcomes-Based Assessment (OBA)
  • Ways in Which Results Can Be Used
  • Elements of Outcomes-Based Program Review (OBPR)
  • Documentation and Feedback
  • Questions and Discussion

  3. Ask Yourself These Questions
  • How would you describe the purpose of assessment to your colleagues?
  • How would you describe the purpose of program review to your colleagues?

  4. The Assessment Cycle (Bresciani, 2006)
  The key questions…
  • What are we trying to do and why? or
    • What is my program supposed to accomplish? or
    • What do I want students to be able to do and/or know as a result of my course/workshop/orientation/program?
  • How well are we doing it?
  • How do we know?
  • How do we use the information to improve or celebrate successes?
  • Do the improvements we make contribute to our intended end results?

  5. The Iterative, Systematic OBPR Cycle (adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.)
  [Cycle diagram: Mission/Purposes → Goals → Outcomes → Implement Methods to Deliver Outcomes (Action Planning) and Methods to Gather Data → Gather Data → Interpret Evidence → Make decisions to improve programs, enhance student learning and development, and inform institutional decision-making, planning, budgeting, policy, and public accountability → back to Mission/Purposes. The cycle is supported by Strategic Planning/Inputs/Capacity and External Review.]

  6. Frame Shift (Jenefsky et al., 2009)
  • From traditional input-based model to outcomes-based model
  • Heightened attention to improving the quality of student learning
  • From description & advocacy to evidence-based analyses and planning
  • From audit to collective inquiry & reflection
  • From focus on conducting effective program review to using the results effectively

  7. What are you already doing that could be considered outcomes-based assessment? How could you readily incorporate that into your program review process (e.g., curriculum alignment and professional accreditation)?

  8. Report Out
  How do your intended purpose for OBPR and your current process support your intended use of the data generated from OBPR?

  9. NEASC Standard Two
  2.2 The institution undertakes short- and long-term planning, including realistic analyses of internal and external opportunities and constraints. The institution systematically collects and uses data necessary to support its planning efforts and to enhance institutional effectiveness. It plans for and responds to financial and other contingencies, establishes feasible priorities, and develops a realistic course of action to achieve identified objectives. Institutional decision-making, particularly the allocation of resources, is consistent with planning priorities.

  11. NEASC Standard Two
  2.5 The institution has a system of periodic review of academic and other programs that includes the use of external perspectives.

  12. NEASC Standard Four
  4.45 The institution’s approach to understanding student learning focuses on the course, program, and institutional level. Data and other evidence generated through this approach are considered at the appropriate level of focus, with the results being a demonstrable factor in improving the learning opportunities and results for students.

  13. NEASC Standard Four
  4.49 The institution ensures that students have systematic, substantial, and sequential opportunities to learn important skills and understandings and actively engage in important problems of their discipline or profession, and that they are provided with regular and constructive feedback designed to help them improve their achievement.

  14. Uses of Assessment Results (WASC Program Review Guidelines, 2009)
  • Developing program learning outcomes and identifying appropriate means for assessing their achievement
  • Better aligning department, college, and institutional goals
  • Refining departmental access and other interventions to improve retention/attrition and graduation rates

  15. Uses, Cont. (WASC Program Review Guidelines, 2009)
  • Designing needed professional development programs, especially for faculty to learn how to develop and assess learning outcomes
  • Reorganizing or refocusing resources to advance specific research agendas
  • Re-assigning faculty/staff or requesting new lines

  16. Uses, Cont. (WASC Program Review Guidelines, 2009)
  • Illuminating potential intra-institutional synergies
  • Developing specific plans for modifications and improvements
  • Informing decision making, planning, and budgeting, including resource re/allocation
  • Linking and, as appropriate, aggregating program review results to the institution’s broader quality assurance/improvement efforts

  17. In Order for These Uses to Occur, an Institution Needs… (Bresciani, 2006)
  • Set priorities around institutional values
  • Communicate a shared conceptual framework and common language
  • Systematically gather data that actually evaluates outcomes
  • Document how information gets used to actually inform decisions

  18. In Order for These Uses to Occur, an Institution Needs… (Bresciani, 2006)
  • Provide professional development and support for faculty and staff
  • Demonstrate leadership commitment to support the process and use the data to improve programs, re-allocate resources, and reinforce institutional priorities

  19. In Order for These Uses to Occur, an Institution Needs… (Bresciani, 2006)
  • Commit to re-allocating time for intentional reflection on, and systematic documentation of, student learning as well as research
  • Centralize coordination of data/report management
  • Manage a way to systematically engage in documentation
  • Conduct a meta-assessment of the process

  20. How do you see using the results of Outcomes-Based Program Review (OBPR)? What do you need to do differently with your process in order to utilize the results?

  21. Report Out
  Design the program review process so that you can use the results to achieve your purpose.

  22. So, what do we need to document? Well… (insert technical disclaimer)

  23. Typical Components of OBA (Bresciani, 2006)
  • Program Name
  • Program Mission or Purpose
  • Goals
    • Align with your strategic plan, strategic initiatives, institutional goals, division goals, or department goals
  • Outcomes
    • Student Learning and Program
  • Planning for Delivery of Outcomes
    • Concept Mapping/Curriculum Alignment Matrix
    • Course/Workshop Design (e.g., syllabus for the workshop)

  24. Typical Components of OBA (Bresciani, 2006)
  • Evaluation Methods/Tools
    • Link the method/tool directly to the outcome
    • Include criteria for each method as it relates to each outcome
    • Add limitations, if necessary
    • Include division, institutional, or state indicators
    • Determine the acceptable level of performance and why

  25. Typical Components of OBA
  • Implementation of Assessment Process
    • Identify who is responsible for doing each step in the evaluation process (list all of the people involved in the assessment process at each step)
    • Outline the timeline for implementation
    • Identify who will be evaluated
    • Identify other programs that are assisting with the evaluation
    • Identify who will be participating in interpreting the data and making recommendations and decisions

  26. Typical Components of OBA
  • Program Name
  • Outcomes
  • Results
    • Summarize the results for each outcome
    • Summarize the process to verify/validate the results
    • Summarize how the results link with performance indicators

  27. Typical Components of OBA
  • Decisions and Recommendations
    • Summarize the decisions/recommendations made for each outcome
    • Identify the groups who participated in the discussion of the evidence that led to the recommendations and decisions
    • Summarize how the decisions/recommendations may improve performance indicators
    • Identify how intended improvements enhance strategic initiatives, if applicable

  28. Typical Components of OBA
  • Decisions and Recommendations, Cont.
    • Summarize the suggestions for improving the assessment process
    • Identify when each outcome will be evaluated again (if the outcome is to be retained)
    • Identify those responsible for implementing the recommended changes
    • Identify the resources needed to make the necessary improvements, if applicable

  29. In Addition…
  • Link to professional accreditation when possible
  • Organize an external review
    • Reviewers can be external to the department if a review external to the institution is not feasible
  • Explain the level of expected performance (student learning and research) and how it was derived/decided
  • Document decisions made and resources re-allocated, if applicable

  30. Differentiate the Program Process from the Institutional Process
  • What do you need to document as a program in order to provide the institution with its required information?
  • What are the roles and responsibilities of program personnel versus institutional personnel?
  • What are appropriate guiding questions for program officials, external reviewers, and higher-level administrators?

  31. For Example - Program
  • Program student learning outcomes and results
  • Program enrollment data and other program-specific data, and their contribution to understanding whether program goals are met
  • Program decisions, resource re-allocations, and practice and policy changes

  32. For Example - Institution
  • Whether the program is meeting or not meeting institutional goals
  • Required review of evidence-based recommendations that affect other parts of the institution
  • Re-allocation of resources
  • Articulating priorities

  33. Examine Your Institutional Guidelines and Templates…
  • Which portions of your guidelines and templates help you use the data to achieve the purpose of OBPR?
  • What portions align with your professional accreditation process?
  • What templates can be combined/aligned to decrease documentation efforts?

  34. Report Out

  35. Prioritize
  • Institutional learning outcomes and strategic initiatives
  • Resources to improve those values
  • Time allocated to the data collection, reflection, and improvements you desire

  36. Process for Reviewing and Using Data
  • Be sure the process for reviewing and using the data is clear
  • Specify the roles and responsibilities of everyone involved
  • Articulate how decisions will be documented and approved
  • Provide guiding questions to those reviewing the reports, including guidelines for external reviewers

  37. Who do you want looking at these reports in order to make the best-informed decisions? In other words, who should see these reports (differentiate between content and process), and on what criteria should they be reviewed?

  38. Report Out

  39. Take-Home Messages
  • You do not have to assess everything you do every year
  • You don’t have to do everything at once; start with 2 or 3 learning outcomes
  • Prioritize your goals/outcomes
  • Think baby steps and be flexible
  • Acknowledge and use what you have already done
  • Assessment expertise is available to help, not to evaluate your program
  • Borrow examples from other institutions and modify them as appropriate
  • Time for this must be re-allocated; we allocate time according to our priorities

  40. Resources
  • Each Other
  • AAC&U, WASC, NASPA, and ACPA
  • University Planning and Analysis (UPA) Assessment website: http://www2.acs.ncsu.edu/UPA/assmt/

  41. Questions?

  42. One-Minute Evaluation
  • What is the most valuable lesson that you learned from this workshop?
  • What is one question that you still have?
  • What do you think is the next step that your division/program needs to take in order to implement systematic program assessment?

  43. References
  • Jenefsky, C., Bresciani, M.J., Buckley, L., Farris, D., & Kasimatis, M. (2009). WASC Guidelines for Program Review. Oakland, CA: WASC.
  • Bresciani, M.J., Zelna, C.L., & Anderson, J.A. (2004). Techniques for Assessing Student Learning and Development in Academic and Student Support Services. Washington, D.C.: NASPA.

  44. References, Cont.
  • Bresciani, M.J. (2006). Outcomes-Based Undergraduate Academic Program Review: A Compilation of Institutional Good Practices. Sterling, VA: Stylus Publishing.
  • Bresciani, M.J., Gardner, M.M., & Hickmott, J. (2010). Demonstrating Student Success in Student Affairs. Sterling, VA: Stylus Publishing.
  • NEASC Standards for Accreditation: http://cihe.neasc.org/standards_policies/standards/
