
Student Services Assessment



Presentation Transcript


  1. Student Services Assessment Lee Gordon Assistant Vice President for Student Services Purdue University

  2. Our Common Goal… “Continue to promote a culture of assessment that expects rigorous internal and external review of programs and services for continuous improvement, to increase competitiveness, and engage in best practices for increased effectiveness and efficiency.” New Synergies – Purdue University’s Strategic Plan 2008-2014

  3. Why Assessment in Student Services? • A Matter of Survival: Questions of accountability, cost, quality, access, equity, and accreditation are a fundamental necessity • Quality: Do we have high-quality programs, services, and facilities? • Affordability: Does the cost-benefit justify the service offering? • Strategic Planning: How do we achieve our goals? • Decision Making and Policy Development: Are we using good data for making decisions? • Learning: Do we contribute to student learning? • Political Evaluation: What evidence do we have that programs should be funded?

  4. The Assessment Cycle (Bresciani, 2006) The key questions… • What are we trying to do and why? or • What is my program supposed to accomplish? or • What do I want students to be able to do and/or know as a result of my course/workshop/orientation/program? • How well are we doing it? • How do we know? • How do we use the information to improve or celebrate successes? • Do the improvements we make contribute to our intended end results? Bresciani, M.J.

  5. The Iterative Systematic Assessment Cycle (adapted from Peggy Maki, Ph.D., by Marilee J. Bresciani, Ph.D.) • Mission/Purposes • Goals • Outcomes • Implement methods to deliver outcomes, and methods to gather data • Gather data • Interpret evidence • Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, and public accountability

  6. Assessment Strategies [diagram; elements include:] • Mission & Vision • Student Access & Success • Partnerships • Learning Outcomes Surveys • Student Importance & Satisfaction Survey • Improve Transfer Gateways • Synergetic Values • Competencies & Advantages • Streamline Processes • Student Services Hub • Benchmarks • CAS Self-Assessment

  7. AAHE Assessment Principles • The assessment of student learning begins with educational values. • Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time. • Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes.

  8. AAHE Assessment Principles • Assessment requires attention to outcomes but also and equally to the experiences that lead to those outcomes. • Assessment works best when it is ongoing, not episodic. • Assessment fosters wider improvement when representatives from across the educational community are involved.

  9. AAHE Assessment Principles • Assessment makes a difference when it begins with issues of use and illuminates questions people really care about. • Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change. • Through assessment, educators meet responsibilities to students and to the public.

  10. Assessment Methods • Quantitative • Qualitative • Combination of both

  11. Comparison of Assessment Methods

  12. Techniques • See http://www.purdue.edu/dp/are/tools.html

  13. Why Student Affairs Needs a Comprehensive Approach to Assessment • Tracking use of programs, services, and facilities • Assessing student needs • Assessing student importance and satisfaction • Assessing environments and student cultures • Assessing program and service outcomes (costs and benefits) • Assessing student learning • Benchmarking • Measuring effectiveness against professional standards

  14. Dimensions and Assessments Currently Being Used

  15. 2005 CSS Survey Data

  16. 2007 NSSE Data

  17. Graduating Student Learning Outcomes

  18. Other Purdue Resources • Inventory of Assessment Surveys in Student Services • Office of Institutional Research Survey Resources • Office of Institutional Research Student Experience Resources

  19. Inventory of Student Surveys

  20. “Assessing” our Assessments • Student Learning Outcomes and Benchmarking are the two dimensions least mature in development across VPSS • Weaknesses and disadvantages of the three current SLO assessment surveys: • Graduating Student Learning Outcomes Survey – the Provost Office coordinates it with academic officers; VPSS may not have much influence in adding questions to the survey. • College Senior Survey – a national assessment, so no Purdue-customized questions; the last one was conducted in 2005. • 2010 National Survey of Student Engagement – a national assessment, so no Purdue-customized questions; the survey is conducted every three years.

  21. “Assessing” our Assessments • Student Services has several departmentally run assessments, but many need to be reviewed for effectiveness and efficiency • Several assessment tools are available: • Cognos for student data mining and analysis – however, many staff have only moderate skills, and some departments have no report-writing staff resources; new reports will need to be written since the last strategic plan. • Qualtrics for surveys. • Blackboard for learning outcome assessment.

  22. Areas for Caution • Survey fatigue • Sustainability • Knowledgeable resources are scarce • We may not be fully utilizing all of the data collected from NSSE, CSS, and GSLO assessments for making decisions

  23. Recommendations • Ask the Assessment Work Group (responsible for the 2010 Imp/Sat Survey) to update the inventory of assessment activities. • Encourage Qualtrics and Blackboard training, and provide consulting as needed; SSTA can help. • Establish a Student Learning Outcomes Committee with the goals of a) identifying VPSS learning outcomes and b) approaching the Provost Office with a request to expand the Graduating Student Learning Outcomes survey. • Spend more time reviewing results from assessments and making data-driven decisions.

  24. Recommendations • Evaluate options for addressing Cognos reporting needs in Student Services to determine baseline data and future metrics. • Develop common standards and techniques for practicing effective benchmarking. • Taking action on the above can lead to a perpetual, systematic assessment process.

  25. Questions?
