Student Affairs Assessment Boot Camp Ongoing Improvement Session


Presentation Transcript


  1. Student Affairs Assessment Boot Camp: Ongoing Improvement Session. Bill Knight, Institutional Effectiveness

  2. Plan for Today
  - Feedback and General Discussion (1:00-1:30)
  - Developing an Effective Assessment Plan (1:30-1:45)
  - Developing Effective Student Learning Outcomes (1:45-2:30)
  - Making Assessment More Manageable (2:30-3:00)
  - Analyzing, Interpreting, and Communicating Results (3:00-3:30)
  - Resources/Aligning with Other Assessment Efforts (3:30-4:00)
  - Available for Individual Consultations (4:00+)

  3. Bill’s Overall Feedback The 2011-2012 proposals and reports are excellent. The general approach, including the timeline, the committee, the Division learning outcomes, the format of the plans and reports, and the glossy brochure, is great. Ball State Student Affairs has come a long way with this in a short period of time, and you should be proud of where you are in this process!

  4. Collaboration Are there programs/services/efforts that span multiple SA units where the assessment could be done collaboratively?
  - For example, are there multiple alcohol abuse prevention, diversity, or engagement efforts whose assessment could be strengthened through collaboration?

  5. Outcomes I think that some of the program learning outcomes need to be improved. These should be phrased very specifically in terms of what students should be able to do at the conclusion of the program. Good examples include:
  Health Education:
  - BSU students will be able to name 3 potential medical risks of taking prescription medications not prescribed to them.
  - BSU students will be able to name 3 potential legal risks of taking prescription medication not prescribed to them, or of sharing prescription medication with others.
  Career Center Roll Out the Red:
  - After completing Module 1, student employees will be able to identify and explain the 10 professional customer service behaviors essential to customer satisfaction.

  6. Outcomes Examples of learning outcome statements that do not work well include: “The ___ program will expose students to . . .” Exposure to programs/services/opportunities does not necessarily equate to learning.

  7. Assessment vs. Other Evaluation/Feedback It may be worth acknowledging in plans and reports that not all forms of feedback or evaluation involve assessment of student learning. Tracking participation and gathering satisfaction information are perfectly acceptable if the program/service/resource is not designed to affect student learning.

  8. Types of Outcomes For programs/services/resources that are designed to improve student learning, it is important to recognize different types of outcomes, such as cognitive vs. affective growth. [The slide’s table of outcome types is not reproduced in the transcript.] Based upon: Astin, A. W. (1993). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education. Phoenix, AZ: ACE/Oryx Press.

  9. Direct vs. Indirect Measures Units need to move toward greater use of direct measures when the goals are cognitive and affective growth (learning) rather than participation, satisfaction, retention, GPA, etc. (Perhaps change item 1b in the report to ask for the goals of the program, not of the assessment, so that item 1c becomes a subset of 1b.) Surveys are a form of indirect assessment: they ask students what they have learned rather than measuring it directly. Results are self-reported, and response rates are often low. Surveys can still be used as a supplemental measure to inform how and why students learned what they did (or did not learn).

  10. Direct vs. Indirect Measures The Career Center is a good example of using direct assessment by staff and employers along with a student survey as supplemental indirect feedback. The Excellence in Leadership program’s use of the LPI (Leadership Practices Inventory) is an excellent example of direct assessment. Could the Counseling Center, for example, move from a survey to a pre-/post-test of diversity outcomes (e.g., the Miville-Guzman Universality-Diversity Scale)?
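A pre-/post-test compares the same students before and after a program, so the gain itself is the evidence of learning. As a rough illustration (not from the original slides, and with invented scores), a minimal Python sketch of such a comparison using a paired t-test might look like this:

```python
# A sketch of a direct pre-/post-test comparison, e.g. a diversity
# outcome measured before and after a program. All scores below are
# hypothetical; real data would come from the administered instrument.
from scipy import stats

pre = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 2.7, 3.2]    # pre-program scores
post = [3.4, 3.0, 3.6, 3.5, 3.1, 3.6, 3.0, 3.4]   # same students, post

t, p = stats.ttest_rel(post, pre)  # paired t-test on matched scores
gain = sum(post) / len(post) - sum(pre) / len(pre)
print(f"mean gain = {gain:.2f}, t = {t:.2f}, p = {p:.3f}")
```

Because each student serves as their own baseline, this design sidesteps many of the self-report problems noted above.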

  11. Caution Concerning Survey Incentives We cannot use lottery-type incentives in Indiana, per Business Affairs’ interpretation of Indiana Gaming Commission regulations. Think about ways to move from surveys to direct assessments that are “built in” to the program, so that student participation is harder to avoid.

  12. Analysis of Results Frequencies are fine, but they often lead to the “so what” reaction. Unless all students are achieving everything we want them to, we need to look deeper: why do some students improve more than others, and what patterns are evident? The Office of Institutional Effectiveness can help you add student demographics (gender, race, first-generation status, SAT, high school GPA, family income) and other experiences we know about (major, class level, other SA programs/services/facilities, etc.) to your datasets, allowing for greater context and more actionable results.
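To make that concrete, here is a minimal sketch (not from the original slides) of disaggregating results once demographics have been merged in; the file name and all column names are hypothetical placeholders:

```python
# A sketch of moving beyond overall frequencies by disaggregating an
# assessment outcome by student characteristics. The file name and
# column names are invented for illustration.
import pandas as pd

df = pd.read_csv("assessment_results.csv")  # one row per student

# Overall frequencies answer "how many?" but invite the "so what" reaction.
print(df["outcome_met"].value_counts(normalize=True))

# Disaggregating shows which students improve more than others.
for group in ["gender", "first_generation", "class_level"]:
    print(df.groupby(group)["outcome_met"].mean())
```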

  13. Analysis of Results The IE office can help you control for differences in entering students’ abilities and backgrounds, and for other experiences besides your program, as you examine outcomes; this is Astin’s I-E-O (Inputs-Environment-Outcomes) model. If appropriate, the IE office can also provide you with comparison groups. Beginning next academic year, we will be able to merge participation lists with retention, GPA, student credit hour (SCH), graduation, and time-to-degree data, as well as survey results (Orientation Survey, MAP-Works, National Survey of Student Engagement, Senior Survey, Alumni Survey). This is not that hard, and we are here to help!
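For readers who want to see the I-E-O logic in code, here is a hedged sketch (not from the original slides) of a regression that estimates a program's association with an outcome while controlling for inputs; the dataset and column names are invented for illustration:

```python
# A sketch of Astin's I-E-O model as a regression: predict the Outcome
# from the Environment (program participation) while controlling for
# Inputs (entering ability/background). All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("merged_outcomes.csv")  # participation list merged
                                         # with institutional data

# 'participated' is 1 for program participants, 0 for a comparison
# group; SAT and high school GPA stand in for entering Inputs.
model = smf.ols("outcome ~ participated + sat + hs_gpa", data=df).fit()
print(model.summary())

# The coefficient on 'participated' is the program's association with
# the outcome net of entering differences -- the E in I-E-O.
```

The same merged dataset supports adding comparison groups: include their rows with participated = 0 and the regression handles the contrast.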

  14. Discussion What aspects of assessment have been difficult? What could be improved? Where would you like to do more?
