
Robert Stempel College of Public Health and Social Work Assessment Retreat



  1. Robert Stempel College of Public Health and Social Work Assessment Retreat October 22, 2012 Office of Academic Planning and Accountability (APA) Florida International University

  2. Introduction

  3. Susan Himburg, Director of Accreditation, APA • Mercedes Ponce, Director of Assessment, APA • Katherine Perez, Associate Director of Assessment, APA • Bridgette Cram, Assessment Coordinator • Barbara Anderson & Claudia Grigorescu, GAs, APA

  4. Assessment in FIU

  5. Retreat Agenda

  6. Overview of Assessment: Cycle

  7. Overview of Assessment: Timeline Cycle B

  8. Student Learning Outcomes (SLOs) • Student Learning Outcomes (SLOs) are program-related outcomes • SLOs focus on the knowledge and skills students are expected to demonstrate upon completion of an academic degree program • “A learning outcome is a stated expectation of what someone will have learned” (Driscoll & Wood, 2007, p. 5) • “A learning outcome statement describes what students should be able to demonstrate, represent, or produce based on their learning histories” (Maki, 2004, p. 60) • “A learning outcome describes our intentions about what students should know, understand, and be able to do with their knowledge when they graduate” (Huba & Freed, 2000, pp. 9-10) • What should my students know or be able to do at the time of graduation?

  9. Program Outcomes (POs) • Program Outcomes (POs) focus on expected programmatic changes that will improve overall program quality for all stakeholders (students, faculty, staff) • Program outcomes illustrate what you want your program to do. These outcomes differ from learning outcomes in that you discuss what it is that you want your program to accomplish. (Bresciani, n.d., p. 3) • Program outcomes assist in determining whether the services, activities, and experiences of and within a program positively impact the individuals it seeks to serve. • They emphasize areas such as recruitment, professional development, advising, hiring processes, and/or satisfaction rates. • How can I make this program more efficient?

  10. Administrative Assessment (AAs) • Administrative Areas • Dean’s Office • Centers/Institutes • Outcomes aligned to: • Unit mission/vision • Annual goals • University mission/vision • Strategic plan • Outcomes focus on each of the following areas (all 4 required for Dean’s Office): • Administrative Support Services • Educational Support Services • Research • Community Service • Student learning is also assessed for units providing learning services to students (e.g., workshops, seminars, etc.)

  11. Matrices I: Effective Outcomes

  12. Student Learning Outcomes (SLOs)

  13. Student Learning Outcomes (SLOs)

  14. Student Learning Outcomes (SLOs)

  15. Program Outcomes (POs)

  16. Program Outcomes (POs)

  17. Streamlining Outcomes with Program Goals

  18. Aligning Accreditation Competencies with the FIU Assessment Process • Why Alignment Matters • Assessment can seem like a burdensome process, but if all of your assessment needs are aligned, the assessment process only needs to be completed once. • How APA Can Help • The competencies required by your specialized accrediting agencies can easily be combined with the SACS requirements. APA can assist in developing outcomes that are relevant for both purposes. • TracDat is fully customizable, and the APA staff can include custom alignments according to your specialized accreditation needs • A “one-stop shop” for all of your assessment needs • Serves as a “data warehouse,” storing all of your assessment data in one place! • Reports can be run based on the alignments you are looking for, for example: outcomes aligned with your specialized accreditation, SACS, or both

  19. Getting Started with Alignment • Identify the core competencies that are required by your specialized accrediting agency • Do any of these outcomes fulfill the SACS requirements? (Content Knowledge, Critical Thinking, Communication (Oral/Written), Technology) • For competencies that do not fall into one of the above categories, you should develop outcomes for them and enter them into TracDat accordingly.

  20. Curriculum Mapping • A good curriculum map ensures that all program stakeholders understand how your outcomes align with certain courses throughout the curriculum. • For specialized accreditation alignment purposes, your curriculum map should include all required competencies, not just those that are being used for FIU assessment purposes

  21. Tying Outcomes to Curriculum: Curriculum Maps

  22. Tying Outcomes to Curriculum: Curriculum Maps • Introduced = indicates that students are introduced to a particular outcome • Reinforced = indicates the outcome is reinforced and certain courses allow students to practice it more • Mastered = indicates that students have mastered a particular outcome • Assessed = indicates that evidence/data is collected, analyzed, and evaluated for program-level assessment • *Adapted from University of West Florida, Writing Behavioral, Measurable Student Learning Outcomes, CUTLA Workshop, May 16, 2007.
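For programs that track their maps electronically, the Introduced/Reinforced/Mastered/Assessed levels above can be represented as a simple lookup structure. This is a hypothetical sketch only; the outcome name and course numbers are illustrative and do not come from the retreat materials.

```python
# Hypothetical curriculum map: each outcome maps courses to the
# Introduced/Reinforced/Mastered/Assessed level from the slide above.
# The SLO label and course numbers are made up for illustration.
curriculum_map = {
    "SLO 1: Written communication": {
        "PHC 6000": "Introduced",
        "PHC 6410": "Reinforced",
        "PHC 6946": "Mastered/Assessed",
    },
}

# Find the courses where SLO 1 is Assessed, i.e., where program-level
# evidence/data is actually collected.
assessed = [
    course
    for course, level in curriculum_map["SLO 1: Written communication"].items()
    if "Assessed" in level
]
print(assessed)  # ['PHC 6946']
```

A structure like this makes it easy to check that every outcome is introduced before it is assessed, and that at least one course collects program-level data for each outcome.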

  23. Matrices II: Effective Methods

  24. Choosing Assessment Measures/Instruments

  25. Understanding Types of Measurements • Direct versus Indirect Measures • Direct Measure: Learning assessed using tools that directly observe learning, such as assignments, exams, and portfolios; precise and effective at determining whether students have learned the competencies defined in outcomes • Indirect Measure: Learning assessed using tools that capture perspectives and opinions about learning, such as surveys, interviews, and evaluations; provide supplemental details that may help a program/department understand how students think about learning and the strengths/weaknesses of a program • Program versus Course Measures • Program Measure: Provides data at the program level and enables the department to understand the overall learning experience; includes data from exit exams and graduation surveys • Course Measure: Provides data at the course level and enables professors to determine competencies achieved at the end of courses; includes data from final projects/presentations and pre-post exams • Formative versus Summative Measures • Formative Measures: Assess learning over a specific timeline, generally throughout the academic semester or year • Summative Measures: Assess learning at the end of a semester, a year, or at graduation

  26. Examples of Measures/Instruments • Program Level: Portfolios, Exit exams, Graduation surveys, Discipline-specific national exams • Course Level: Essays, Presentations, Minute papers, Embedded questions, Pre-post tests

  27. Institution-Level Assessments • NSSE • FSSE • Graduating Master’s and Doctoral Student Survey • Graduating Senior Survey • Student Satisfaction Survey • Global Learning Perspectives Inventory • Alumni Survey • Proficiency Profile • Case Response Assessment (Kuh & Ikenberry, 2009, p. 10)

  28. Introduction to Rubrics

  29. Steps for Developing Rubrics

  30. Rubric Template

  31. Reporting Results • Summary of Results • Format • Narrative • Tables or charts • Analysis/Interpretation of results • Explain results in a narrative form by interpreting results or using qualitative analysis of the data. • Every student learning outcome must have at least: • One set of results • One student learning improvement strategy (use of results)

  32. Reporting Results • Non-Examples: • Our students passed the dissertation defense on the first attempt. • All the students passed the national exam. • Criteria met. • Examples: • 75% of the students (n=15) achieved a 3 or better on the 5 rubric categories for the capstone course research paper. Average score was: 3.45 • Overall, 60% of students met the criteria (n=20) with a 2.65 total average. The rubric’s 4 criteria scores were as follows: • Grammar: 3.10 (80% met minimum criteria) • Research Questions: 2.55 (65% met minimum criteria) • Knowledge of Topic: 2.50 (55% met minimum criteria) • Application of Content Theories: 2.45 (60% met minimum criteria)
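The percentages and averages in the examples above are simple descriptive statistics computed from raw rubric scores. A minimal sketch of that arithmetic, with a made-up set of ten scores on a 1-4 rubric (the function name and scores are illustrative, not from the retreat):

```python
def summarize_criterion(scores, minimum=3):
    """Return (average score, percent of students meeting the minimum)
    for one rubric criterion, as reported on the slide above."""
    average = sum(scores) / len(scores)
    met = sum(1 for s in scores if s >= minimum)
    percent_met = 100 * met / len(scores)
    return round(average, 2), round(percent_met)

# Illustrative scores for one criterion ("Grammar") from 10 students
grammar_scores = [4, 3, 3, 2, 4, 3, 3, 2, 4, 3]
avg, pct = summarize_criterion(grammar_scores)
print(f"Grammar: {avg} ({pct}% met minimum criteria)")  # Grammar: 3.1 (80% met minimum criteria)
```

Running the same function per criterion, plus once over the combined scores, yields both the per-criterion lines and the overall average shown in the example report.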

  33. Reporting Results

  34. Reporting Results: Formulas

  35. Using Results for Improvements

  36. Using Results for Improvements: Student Learning

  37. Using Results for Improvements: Student Learning

  38. Using Results for Improvements: Program Outcomes

  39. Using Results for Improvements: Program Outcomes

  40. Q & A Session

  41. References for Cover Pictures
  1. http://www.superscholar.org/rankings/online/best-public-health-degrees/
  2. http://www.collegeonline.com/news-about-online-colleges/social-work-online/
  3. http://greensandberries.squarespace.com/greens-and-berries/2010/7/28/civic-dietetics.html

  42. Thank you for attending.
  Contact Us:
  Katherine Perez, kathpere@fiu.edu, 305-348-1418
  Bridgette Cram, bcram@fiu.edu, 305-348-1367
  Departmental Information: ie@fiu.edu, 305-348-1796, PC 112
