
Programmatic Assessment for an Undergraduate Statistics Major


Presentation Transcript


  1. Programmatic Assessment for an Undergraduate Statistics Major
  MASTER'S THESIS DEFENSE
  By: Allison Moore
  Advisor: Dr. Jennifer Kaplan
  April 11, 2014

  2. Outline
  • Introduction
  • Literature Review
    2.1 Purpose of Programmatic Assessment
    2.2 Theory of Student Learning Outcomes
    2.3 Theory of Curriculum Mapping
    2.4 Theory of Designing an Assessment Plan
    2.5 Student Learning Outcomes in Statistics
  • Local Situation
  • Current Process
  • Future Directions

  3. Introduction
  • Program assessment
    • Overall status of student learning
    • Required by all accreditation agencies
  • Especially important for statistics departments
    • Rapid increase in undergraduate enrollment
    • Industry demand for students trained in statistics
  • "Wide disparity in the curriculum at various institutions and confusion on the part of employers as to the skills and abilities that can be expected of a bachelor's level statistician" (Bryce et al., 2000, p. 6)

  4. Programmatic Assessment
  • "The systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development" (Palomba & Banta, 1999, p. 4)
  • Benefits:
    • Informs course design (focus on core areas and fit within the curriculum)
    • Opens dialogue
    • Provides data on strengths and weaknesses
    • Improves teaching without relying on student evaluations

  5. Program vs. Classroom Assessment

  6. Assessment Cycle

  7. Student Learning Outcomes
  • "The knowledge, skills, abilities, and habits of mind that students take with them from a learning experience" (Suskie, 2009, p. 117)
  • Program versus course outcomes
  • Benefits:
    • Students, faculty, and administrators understand what is important and expected
    • Supports a "learner-centered" approach
    • Improves the student experience and the effectiveness of the program

  8. Writing SLOs
  • 4 to 8 outcomes
  • Focus on an action or product
  • Measurable
  • No passive or "fuzzy" verbs (learn, understand, master)
  • ABCD Method

  9. Writing SLOs – Example
  Before: "Students will understand statistical programming."
  After: "Students will be able to write SAS code, without errors, to analyze messy data."
  • Audience
  • Behavior
  • Condition
  • Degree of competence
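The "no fuzzy verbs" rule lends itself to a simple automated screen. Below is a minimal Python sketch (an illustration, not part of the original slides; the verb list and helper name are assumptions) that flags unmeasurable verbs in draft outcome statements:

    # Hypothetical screen for "fuzzy" verbs that make an SLO hard to measure.
    FUZZY_VERBS = {"understand", "learn", "master", "appreciate", "know"}

    def flag_fuzzy(outcome):
        """Return any fuzzy verbs found in a draft outcome statement."""
        words = {w.strip(".,;").lower() for w in outcome.split()}
        return words & FUZZY_VERBS

    drafts = [
        "Students will understand statistical programming.",
        "Students will be able to write SAS code, without errors, "
        "to analyze messy data.",
    ]
    for d in drafts:
        verbs = flag_fuzzy(d)
        print("FLAG" if verbs else "OK  ", sorted(verbs), "-", d)

Run on the two example statements above, the first is flagged for "understand" and the revised version passes.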

  10. Curriculum Mapping
  • Checks the alignment of SLOs and courses
    • The link holds regardless of instructor
  • Identifies gaps in the curriculum
    • "Orphan outcome": an SLO addressed by no course
  • Shows the progression of student learning
  • Three stages:
    • Introduce: basic knowledge, skills, and abilities
    • Develop: opportunity to strengthen and practice
    • Assess: master or emphasize
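A curriculum map is essentially a table from courses to outcome/stage pairs, which makes gap-finding mechanical. Below is a minimal Python sketch under that assumption; the course numbers and outcome names are hypothetical:

    # Hypothetical curriculum map: course -> {outcome: stage}, where stage
    # is "I" (introduce), "D" (develop), or "A" (assess).
    curriculum_map = {
        "STAT 4210": {"theory": "I", "data analysis": "I"},
        "STAT 4230": {"data analysis": "D", "computing": "I"},
        "STAT 4510": {"theory": "D"},
        "STAT 5010": {"theory": "A", "data analysis": "A", "computing": "D"},
    }
    program_slos = {"theory", "data analysis", "computing", "communication"}

    # An outcome covered by no course at any stage is an "orphan outcome".
    covered = {slo for stages in curriculum_map.values() for slo in stages}
    print("Orphan outcomes:", program_slos - covered)   # {'communication'}

    # An outcome that is developed or assessed but never introduced also
    # signals a gap in the progression of student learning.
    introduced = {slo for stages in curriculum_map.values()
                  for slo, stage in stages.items() if stage == "I"}
    print("Never introduced:", covered - introduced)    # set()

In this made-up map, "communication" is the orphan outcome: it appears in the program's SLO list but in no course.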

  11. Assessment Cycle

  12. Assessment Types
  • Midterm exam: direct, formative, embedded, quantitative
  • Alumni survey: indirect, summative, add-on, both quantitative and qualitative

  13. Popular Assessment Methods
  Indirect Methods:
  • Records of student data
  • Syllabus analysis
  • Exit interviews
  • Focus groups
  • Alumni surveys
  • Employer surveys
  • Current student surveys
  • Grades*
  Direct Methods:
  • In-class assignments
  • Objective tests
  • Posters
  • Presentations
  • Papers
  • Capstone projects
  • Portfolios
  • Senior theses
  • Clicker questions
  • Rubrics

  14. Rubrics or "Checklists"
  • Scoring tools with specific criteria
  • Grade and assess SLOs at the same time
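One way to see how a rubric "grades and assesses at the same time": tie each criterion to an SLO and described performance levels, then reuse one scoring pass for both the course grade and program-level evidence. A minimal Python sketch with hypothetical criteria, levels, and scores:

    # Hypothetical analytic rubric: each criterion maps to one SLO and is
    # scored 1-4, with descriptors for the top and bottom levels.
    rubric = {
        "correct analysis": {"slo": "data analysis",
                             "levels": {4: "appropriate method, sound conclusions",
                                        1: "method does not fit the question"}},
        "working code":     {"slo": "computing",
                             "levels": {4: "runs without errors, documented",
                                        1: "does not run"}},
        "written report":   {"slo": "communication",
                             "levels": {4: "clear and audience-appropriate",
                                        1: "findings not interpretable"}},
    }

    # Hypothetical scores for one student's project.
    scores = {"correct analysis": 3, "working code": 4, "written report": 2}

    # One scoring pass yields both a course grade for the student and,
    # aggregated across students by SLO, evidence for program assessment.
    grade = sum(scores.values()) / (4 * len(scores))
    evidence = {rubric[c]["slo"]: s for c, s in scores.items()}
    print(f"grade: {grade:.0%}")          # grade: 75%
    print("evidence by SLO:", evidence)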

  15. Deciding on Methods
  • Triangulation: selecting multiple types of measures
    • Variety of opportunities for students to display learning
    • Ideas are reinforced and challenged
  • Objectives-by-Measures matrix
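An objectives-by-measures matrix can likewise be checked mechanically for triangulation: every outcome should draw evidence from more than one measure. A minimal Python sketch, with hypothetical outcomes and measures:

    # Hypothetical objectives-by-measures matrix: rows are SLOs, columns are
    # assessment methods; True means the method yields evidence for that SLO.
    measures = ["embedded midterm", "capstone portfolio", "alumni survey"]
    matrix = {
        "theory":        [True,  False, False],
        "data analysis": [True,  True,  False],
        "computing":     [False, True,  False],
        "communication": [False, True,  True],
    }

    # Triangulation check: each outcome should be covered by at least two
    # measures, ideally mixing direct and indirect evidence.
    for slo, row in matrix.items():
        hits = [m for m, used in zip(measures, row) if used]
        note = "" if len(hits) >= 2 else "  <- needs a second measure"
        print(f"{slo}: {hits}{note}")

In this toy matrix, "theory" and "computing" each rest on a single measure, so the check flags them for a second source of evidence.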

  16. Assessment Cycle

  17. Closing the Loop
  • Using assessment data and results
    • Identify patterns
    • Make changes based on those patterns
  • Curricular reform
  • Educational objectives (program and course)
  • Departmental policies and advising methods

  18. SLOs in Statistics
  • ASA curriculum guidelines for undergraduate programs in statistical science (2000)
  • New guidelines to be released at JSM 2014

  19. UGA Assessment Process
  PRAC Criteria:
  • Defined and measurable SLOs
  • Degree to which SLOs are attained
  • Assessment independent of grades and course evaluations
  • Multiple methods used
  • Data collected over time
  • Data lead to relevant findings
  • Improvements are planned

  20. Our Department
  • 500+ bachelor's degrees awarded since 1964
  • 75–80 majors in 2012
  • Approximately 100 majors in 2013

  21. Previous Work by Our Dept.
  1. Student performance in core courses
    • Roster of grades and assignments
    • Strengths and weaknesses of each student
  2. Student portfolios
    • Work from 3 to 5 courses
    • Data analysis projects, essays, and explanations
  3. Surveys and exit interviews
    • Alumni survey
    • Current student survey
    • Exit interview
  4. Faculty analysis of the data

  22. Current Process – SLOs
  • Office of Academic Planning workshops
  • LOA Committee
    • Four core areas: theory, data analysis, computing, and communication
    • Cumulative levels from B.S. to M.S. to Ph.D.
  • January 2014 faculty meeting
    • Presented SLOs and curriculum map
    • Clarified the distinction between course and program assessment
    • Discussed future assessment methods and the assessment plan

  23. Current Process – SLOs

  24. Current Process – Mapping

  25. Recommended Methods
  Additions:
  • Indirect: syllabus analysis or employer survey
  • Direct: portfolios or embedded assignments

  26. Recommended Timetable

  27. Future Work
  • Committee creates an Objectives-by-Measures matrix
  • Decide on assessment methods
  • Establish a timetable
  • Develop rubrics
  • Collect data
  • Discuss the results
  • Make changes to the program

  28. Conclusion
  • Committee to complete steps 3 and 4 of the assessment cycle
  • Program review in 2019–2020
  • Overall assessment experience:
    • Started valuable conversations
    • Cohesive outcomes regardless of instructor
    • Adding rubrics to the curriculum
    • Stronger undergraduate program
    • Improvements to student learning

  29. Questions? THANK YOU!!

  30. References
  American Association for Higher Education. (1996). 9 principles of good practice for assessing student learning. Retrieved from http://www.academicprograms.calpoly.edu/pdfs/assess/nine_principles_good_practice.pdf
  American Statistical Association. (2000). Curriculum guidelines for undergraduate programs in statistical science. Retrieved from http://www.amstat.org/education/curriculumguidelines.cfm
  Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals, by a committee of college and university examiners. Handbook 1: Cognitive domain. New York: Longmans.
  Bryce, G. R., Gould, R., Notz, W. I., & Peck, R. L. (2000). Curriculum guidelines for bachelor of science degrees in statistical science. Retrieved from http://www.amstat.org/meetings/jsm/2000/usei/curriculum.PDF
  Cal Poly Pomona. (2007). Department of mathematics and statistics assessment plan. Retrieved from http://www.csupomona.edu/~math/Documents/Homepage/DepartmentAssessmentPlan.pdf
  Carr, J. W., & Hardin, S. (2010). The key to effective assessment: Writing measurable student learning outcomes. Recreational Sports Journal, 34, 138–144.
  Clark, D. R. (1999). The three types of learning. Bloom's Taxonomy of Learning Domains. Retrieved from http://www.nwlink.com/~donclark/hrd/bloom.html
  Department of Statistics. (n.d.). Statistics major. Statistics: Franklin College. Retrieved from http://www.stat.uga.edu/statistics-major
  Franklin, C. (2003). Major assessment plan of the undergraduate statistics major. Unpublished manuscript, Department of Statistics, University of Georgia, Athens, GA.
  Franklin, C. (2012). Full review of the undergraduate statistics major. Unpublished manuscript, Department of Statistics, University of Georgia, Athens, GA.
  Groeneveld, R. A., & Stephenson, R. W. (1999). Outcomes assessment in the B.S. statistics program at Iowa State University. In B. Gold, S. Z. Keith, & W. A. Marion (Eds.), Assessment practices in undergraduate mathematics (pp. 49–53). USA: The Mathematical Association of America.
  Gordon, L. (2013a). Workshop: Writing student learning outcomes [PowerPoint slides]. Retrieved from http://oap.uga.edu/uploads/assess/Outcomes_Workshop_Power_Point_-_9.20.13.pdf
  Gordon, L. (2013b). Workshop: Choosing assessment measures [PowerPoint slides]. Retrieved from http://oap.uga.edu/uploads/assess/Assessment_Measures_Wkshp_2013_for_webpage.pdf
  Goucher College. (n.d.). Purpose of student learning outcomes. Student Learning Goals and Outcomes. Retrieved from http://www.goucher.edu/academics/office-of-the-provost/student-learning-goals-and-outcomes/purpose-of-student-learning-outcomes

  31. References
  Hatfield, S. (2009). Idea paper #45: Assessing your program-level assessment plan. The Idea Center. Retrieved from http://www.theideacenter.org/sites/default/files/IDEA_Paper_45.pdf
  Horton, N. J. (2013). Updating the guidelines for undergraduate programs in statistics [Webinar]. In CAUSE Teaching and Learning Webinar Series. Retrieved from https://www.causeweb.org/webinar/teaching/2013-11/
  Jonson, J. (2006). Guidebook for programmatic assessment of student learning outcomes. University-wide Assessment. Retrieved from University of Nebraska–Lincoln: http://www.unl.edu/svcaa/assessment/learningoutcomes_guidebook.pdf
  Jordan, D., DeGraaf, D., & DeGraaf, K. (2005). Programming for parks, recreation, and leisure services: A servant leadership approach (2nd ed.). State College, PA: Venture Publishing.
  Lazar, N. A., Reeves, J., & Franklin, C. (2011). A capstone course for undergraduate statistics majors. The American Statistician, 65(3), 183–189.
  Lindholm, J. A. (2009). Guidelines for developing and assessing student learning outcomes for undergraduate majors. Learning Outcomes Assessment for Undergraduate Academic Programs (1st ed.). Retrieved from University of California, Los Angeles: http://www.wasc.ucla.edu/eer_endnotes/Learning_Outcomes_Guidelines.pdf
  Macalester College. (2012). The academic program. College Catalog. Retrieved from http://catalog.macalester.edu/content.php?catoid=4&navoid=619
  Maki, P. (2010). Assessing for learning (2nd ed.). Sterling, VA: Stylus Publishing.
  Middle States Commission on Higher Education. (2007). Student learning assessment: Options and resources (2nd ed.). Retrieved from http://www.msche.org/publications/SLA_Book_0808080728085320.pdf
  Mullinix, B. B. (2009). Rubrics. The TLT Group. Retrieved from http://tltgroup.org/resources/Rubrics.htm
  Office of Academic Planning. (n.d.). Student learning outcomes assessment. Retrieved from http://oap.uga.edu/assessment/sloa/
  Office of Academic Planning. (2012). Department of marketing and distribution: Undergraduate major assessment plan. Retrieved from http://oap.uga.edu/assessment/plan/
  Office of Assessment. (2014). Program assessment plans: Department of mathematics. Retrieved from University of Rochester: Arts, Sciences, and Engineering: http://www.rochester.edu/college/assessment/assets/pdf/undergraduate/LearningObjectives-MTH-v2.pdf
  Office of Experiential Learning. (n.d.). Writing SMART learning objectives. Retrieved from University of Central Florida: http://explearning.ucf.edu/registered-students/tips-for-success/writing-smart-learning-objectives/195

  32. References
  Office of Institutional Assessment. (n.d.). Assessment resources. Retrieved from Texas A&M University: http://assessment.tamu.edu/resources/resources_index.html
  Palomba, C., & Banta, T. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.
  Peck, R., & Chance, B. L. (2007). Assessment at the program level: Using assessment to improve undergraduate statistics programs. In L. Weldon & B. Phillips (Eds.), Proceedings of the ISI/IASE Satellite on Assessing Student Learning in Statistics (pp. 1–6). Lisbon, Portugal.
  Public Affairs Division. (2013). UGA by the numbers. The University of Georgia. Retrieved from http://www.uga.edu/images/uploads/2013UGA_ByNumbers.pdf
  Quick, P. S. (2014). It looks like an 'A' but I don't know why: How to construct and use rubrics for assessment and learning [PowerPoint slides].
  Ritter, M. A., Starbuck, R. R., & Hogg, R. V. (2001). Advice from prospective employers on training BS statisticians. The American Statistician, 55, 14.
  Stassen, M., Doherty, K., & Poe, M. (2001). Program-based review and assessment: Tools and techniques for program improvement. Retrieved from Office of Academic Planning & Assessment, University of Massachusetts Amherst: http://www.umass.edu/oapa/oapa/publications/online_handbooks/program_based.pdf
  Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics (p. 82). Sterling, VA: Stylus.
  Stufken, J., & Taylor, R. L. (2013). A brief history of the department of statistics. In A. Agresti & X. Meng (Eds.), Strength in numbers: The rising of academic statistics departments in the U.S. (pp. 381–393). New York: Springer.
  Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: Jossey-Bass.
  The University of Rhode Island. (2012). Designing a meaningful curriculum map: Reflecting on the level of learning expected. Student Learning, Outcomes Assessment and Accreditation. Retrieved from http://web.uri.edu/assessment/files/Understanding_IRE_2012.pdf
  Schuh, J. H., & Upcraft, M. L. (2001). Assessment practice in student affairs: An applications manual. San Francisco: Jossey-Bass Publishers.
  Washington State University. (2009). Validity, reliability, and triangulation in program assessment. Center for Teaching, Learning, and Technology. Retrieved from http://ai.vancouver.wsu.edu/~nwdcsd/wiki/images/9/9c/Validity,_Reliability,_and_Triangulation_in_Program_Assessment_Spring09.pdf
