
ASSESSMENT BOOTCAMP



Presentation Transcript


  1. ASSESSMENT BOOTCAMP Orinthia Montague, Associate Vice Provost/Dean of Students; Toby Shorts, Assistant Director of Assessment

  2. Top-down Expectations: • System: shared governance; independent performance audit; accountability to stakeholders • Accreditation: assessing, confirming, and improving student learning; annual unit-level reports; building a collaborative network • Student Affairs: increased recruitment and retention; excellence in customer service; measurable student satisfaction

  3. Assessment Research Council (ARC) expectations: • Division: provide assessment training and workshops; support external assessment training; support professional development opportunities in assessment • ARC Coordinator: coordinate cross-departmental assessment activities; implement assessment training (Bootcamp and workshops); serve as a resource for all units in Student Affairs; maintain and disseminate assessment findings • ARC Members: complete Assessment Bootcamp; attend two workshops and two consultations per semester; coordinate assessment within the host unit; maintain an open dialogue about assessment within the host unit

  4. WHAT IS ASSESSMENT? • “Assessment is any effort to gather, analyze, and interpret evidence which describes institutional, departmental, divisional or agency effectiveness.” • “Evaluation is any effort to use assessment evidence to improve institutional, departmental, divisional, or agency effectiveness.” Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.

  5. ADDRESSING INDIVIDUAL ASSESSMENT BARRIERS: • Fear of judgment • Fear of the consequences of evaluation • Lack of knowledge about what constitutes good assessment • Lack of confidence in conducting assessment well • Disbelief in the use and benefits of assessment • Unwillingness to invest time and resources • Disbelief in division-wide support • Distrust in avenues of shared evaluation and governance • Fear of over-dependence on statistical analysis

  6. THE PURPOSE OF ASSESSMENT • To articulate what we do • To support why we do what we do • To provide evidence of how well we do it • To improve or change as needed • To understand if the improvements we have made work • To align what we do with the institution and division’s missions

  7. WE ALL DO ASSESSMENT • What decisions did you make about your programs or services last year? • Did you use some form of evidence to inform that decision? • Many people already capitalize on their innate intellectual curiosity to find out what works • Many people just don’t articulate their intended end results (e.g., outcomes) ahead of time • Most people don’t document the decisions made based on their results • Most people don’t follow up later to see if their decisions made the intended improvement

  8. REASONS TO ASSESS

  9. SURVIVAL • To answer the questions of accountability, cost, quality, access, equity, and accreditation • In an era of declining resources, are student services really necessary? • Declining resources can lead to narrowing the focus to academics. • Bottom line – student affairs needs to demonstrate its central role in the academic success of students and to meet the institution’s needs for basic services.

  10. QUALITY • Even if we have demonstrated that student services, programs and facilities are essential and needed, are they of high quality? • Determining quality involves comparison against standards or comparable organizations • What evidence do we have of quality? • How do we know if we have improved it? • Assessment links goals to outcomes and helps define quality and confirms if it exists in our programs, activities and services

  11. STRATEGIC PLANNING • Planning and assessment have a cyclical and reciprocal relationship • Departmental and Division planning should support University planning • Assessment helps us to decide what to do • Define goals and objectives • Point to critical issues or problems • Provide baseline data • Point out our strengths, weaknesses, threats and opportunities • Assessment helps us know how well we have planned

  12. POLICY DEVELOPMENT • Do we have the evidence required to make the decisions we make? • Do we have the evidence needed for the programs we develop or discontinue? • Are our policies developed with particular attention to our shareholders?

  13. POLITICS • Someone of importance/influence wants information that we need to produce • Helps develop a rationale for the existence of student affairs as essential to the institution’s mission • Provides evidence about the quality of our efforts and their integration with student learning and development • Evidence needs to be campus-specific; what the literature says is not enough. Are our students like those the theory and literature describe?

  14. ACCREDITATION • Accreditors are looking for evidence of student learning and quality enhancement • They expect that you not only DO assessment, but that you use assessment to make evidence-based decisions in policy and program development and evolution

  15. STUDENT DEVELOPMENT • “Our mission is to provide high quality programs and services that complement the academic experience and to provide opportunities that enhance the development of our students” • We recognize the worth and dignity of all students and their diverse needs • We believe students who are broadly educated will make a contribution to society • We serve our customers in an ethical, effective, and friendly manner • We work as partners with the campus and the community

  16. PERSONAL SATISFACTION • Asks and answers questions that you really want to know • Continues your own individual learning experience as an eternal student • Provides evidence of the positive impact you have on the lives of thousands of students • Answers the most fundamental questions: • Who are we? • What do we do? • Why do we do it? • How well do we do it?

  17. HOW ASSESSMENT CAN HELP • Justify resource allocation and reallocation (human and financial capital) • Track how your resources are used and by whom • Be able to clearly articulate student and stakeholder needs • Provide evidence of satisfaction • Articulate program and learning outcomes • Understand how different populations experience campus climate

  18.–20. EXAMPLE: (image-only slides; content not included in the transcript)

  21. THE ASSESSMENT CYCLE Mission/Purpose → Goals → Outcomes → Implement methods to deliver outcomes and methods to gather data → Gather and analyze data → Interpret evidence → Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, and public accountability

  22. KEEP IN MIND: • The assessment of Student Affairs begins with educational values • Assessment is most effective when it is ongoing, not episodic • Assessment should be part of a larger set of conditions that promote change • Don’t reinvent the wheel. Use literature and theory when available, but remember that University of Missouri – St. Louis students are not likely to fit any mold

  23. KEEP IN MIND: • Use one another. Collaborate and use ARC as a resource • Collaborate with faculty as often as possible • Use a variety of methods to collect your data; dependence on one tool invites criticism of your methodology • Don’t do assessment for the sake of assessment. Do it for students and to enhance their learning experience.

  24. The Assessment Process: At a predominantly white institution, African American students complained that they did not feel comfortable seeking counseling services from the counseling center. Through institutional records and counseling records, the counseling center was able to confirm that while African American students represented 12 percent of the student population, they represented only 2 percent of students receiving counseling services. The use of counseling services by other minority populations was approximately equal to their proportion of the total student population. This was an easily identified problem because: • The counseling services mission includes meeting the needs of underrepresented groups. • The university was very concerned about an excessively high attrition rate among undergraduate African American students. • The African American Student Association had charged that the counseling center was insensitive to the needs of African American students.
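The under-use described in this example can be made concrete with simple arithmetic: compare each group's share of counseling clients against its share of the student body. A minimal sketch, using hypothetical head counts chosen only to match the 12%/2% figures on the slide (the enrollment and caseload totals are assumptions, not data from the case):

```python
# Hypothetical counts consistent with the slide's percentages.
total_students = 10_000   # assumed enrollment
aa_students = 1_200       # 12% of the student body
total_clients = 500       # assumed counseling caseload for the year
aa_clients = 10           # 2% of counseling clients

population_share = aa_students / total_students   # 0.12
client_share = aa_clients / total_clients         # 0.02

# How many clients would we expect if use matched population share?
expected_clients = population_share * total_clients

print(f"population share: {population_share:.0%}")   # 12%
print(f"client share:     {client_share:.0%}")       # 2%
print(f"expected at parity: {expected_clients:.0f}, actual: {aa_clients}")
```

A gap this large (10 actual clients against roughly 60 expected) is what made the problem "easily identified" before any formal study began.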

  25. Why are we doing this assessment? • Why do we need this information? • Is our goal to improve a service? • Are we trying to justify a program? • Do we need information to help formulate a policy? • Are we trying to determine budget priorities? To determine why counseling services are underused by African American undergraduate students and what steps must be taken to increase use.

  26. What will we assess? • Is a program or service not meeting departmental expectations and goals? • Is the cost of a service, program, or facility prohibitive? • Are stakeholders dissatisfied with a program, service, or facility? • Do existing policies require review or do new policies need to be formulated? • Are clientele needs not being met? The needs and satisfaction of African American students as related to counseling services.

  27. How will we assess? • Qualitative - is a case-by-case understanding of data required to make appropriate decisions? • Quantitative - is it important to apply the results of the study to a larger population? • A combination of both qualitative and quantitative Both. Phone and walk-in interviews: the institution needs to fully understand why African American students were not using counseling services and how their needs differed from the majority population. E-mail survey: further, the institution needs to apply the results received from interviewed students to the greater African American student population.

  28. Who will we assess? • What demographic information is pertinent? • How broad or narrow should the population be? • How large should the sample be? • What sampling method fits best?

  29. QUANTITATIVE For the quantitative study, a computer-generated random sampling of full-time African American undergraduate students (by student number) was implemented with the expectation of receiving at least 300 responses (a minimum of 35 from each class standing), with male and female representation proportionate to the institution's population. This yielded 1200 student names to be surveyed (25% return expectation). Upcraft and Schuh, in Assessment in Student Affairs, suggest a minimum of 35 subjects per variable (class standing) and no fewer than 300 responses for audience credibility.
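The sampling arithmetic on this slide can be checked directly: 300 required responses at a 25% expected return rate implies 1200 names, and the 35-per-class-standing floor must fit inside the 300-response total. A minimal sketch (the four class standings are the usual undergraduate ones, assumed here):

```python
import math

target_responses = 300   # minimum for audience credibility (Upcraft & Schuh)
response_rate = 0.25     # expected return rate on the e-mail survey
per_stratum_min = 35     # minimum responses per class standing
strata = 4               # freshman, sophomore, junior, senior (assumed)

# Names to survey: round up so the expected returns still reach the target.
names_to_survey = math.ceil(target_responses / response_rate)
print(names_to_survey)  # 1200

# Sanity check: the per-stratum floor is achievable within the total.
assert per_stratum_min * strata <= target_responses  # 140 <= 300
```

The same two lines generalize: raise `target_responses` or lower `response_rate` and the required mailing list grows proportionally.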

  30. QUALITATIVE For the qualitative study, a computer-generated random sampling of full-time African American undergraduate students was used to identify forty potential interviewees (ten students from each class standing, with proportionate representation of male and female students). While qualitative inquiry expects no minimum sample size, counseling services weighed resources and human capital against the importance of gaining information-rich cases and determined that forty interviews provided sufficient data and credibility.
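Drawing ten interviewees at random from each class standing is a stratified random sample. A minimal sketch, assuming a hypothetical roster of eligible students; the roster shape and field names are illustrative, not from the case:

```python
import random

CLASS_STANDINGS = ("freshman", "sophomore", "junior", "senior")

# Hypothetical roster of eligible students: (student_id, class_standing).
roster = [(f"S{standing[0]}{i:03d}", standing)
          for standing in CLASS_STANDINGS
          for i in range(100)]

def stratified_sample(roster, per_stratum=10, seed=0):
    """Randomly select `per_stratum` students from each class standing."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    selected = []
    for standing in CLASS_STANDINGS:
        stratum = [s for s in roster if s[1] == standing]
        selected.extend(rng.sample(stratum, per_stratum))  # without replacement
    return selected

interviewees = stratified_sample(roster)
print(len(interviewees))  # 40 potential interviews
```

Stratifying first, then sampling within each stratum, is what guarantees the ten-per-class-standing quota that a simple random draw of forty names would not.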

  31. How will the results be communicated and to whom? The data collected was analyzed by an institutional research officer and interpreted by a committee of counseling center staff members. Data comparisons and contrasts by gender and grade level were posted on the counseling services website along with a truncated summary of the findings. This report was made available to the African American Student Association. A complete summary, including all collected data and findings, was reported to administrative officials with a vested interest in African American student attrition rates. The same report was shared with all counseling center staff. Upon request, the same report was made available to the OEO, the Student Recruitment Office, and faculty in the Minority Studies program.

  32. How will we follow up? • Based on the evidence collected, implement program or policy changes to address the initial concern. • Announce changes to appropriate administrators and stakeholders • Assess client reaction to changes made • Compare initial assessment with post-change assessment and determine if changes were appropriate and if more work is required to make the program successful.

  33. http://www.educationadvisoryboard.com/salc/library.asp http://www.umsl.edu/~ir/ http://www.qualtrics.com/
