
Developing Assessment Plans


Presentation Transcript


  1. Developing Assessment Plans • Mi-Suk Shim, Ph.D. • Spring 2006 • DIIA

  2. Outline of Workshop • Review of previous workshop • Assessment methods overview & resources • Syllabus, exam, & assignment analysis for each course • Assessment map, matrix, & assessment plan at program level

  3. SACS Criteria • CS 3.3.1 Institutional Effectiveness: The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

  4. UT SCHEDULE • Program Educational Objectives and Program Outcomes • Complete by end of Spring 2006 and document • Assessment Plan • Complete by end of Spring 2006 and document • Progress Toward Completion of One Assessment Cycle and Closing the Loop • Complete documentation by Spring Break 2007

  5. Learning Outcomes • Statements that describe what students are expected to know, think, and be able to do by the time of graduation (knowledge, attitudes, and skills)

  6. Learning Outcomes Students will DO WHAT (how)

  7. Matrix

  8. Assessment Plan • University of Texas at Austin Academic Unit Assessment Plans Format (tentative version): I. School and Degree Program (school name and college; degrees awarded; contact person; date); II. Program Mission Statement; III. Program Educational Objectives; IV. Program Learning Outcomes; V. Strategies, Methods, and Level of Competence; VI. Implementation Plan; VII. Assessment of Results; VIII. Evaluation of Results; IX. Recommendations; X. Actions

  9. Assessment Methods • Multiple methods & sources recommended (increase validity) • One method does NOT fit ALL (each has pros & cons) • Practicality? Time, effort, money • Do not have to measure everything or everybody (sampling) • Capitalize on what you are already doing • Quantity of data does not equate to Quality

  10. Course grades • Components (points): Assignment 1 (10), Assignment 2 (10), Homework (10), Quizzes (10), Exam 1 (20), Exam 2 (20), Attendance (10), Participation (10), Extra credit (2) • Course grade scale: 90 & up: A; 80-89: B; 70-79: C; 60-69: D; 59 & below: F • Learning outcome: Communicate information effectively in writing using appropriate business writing format
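A minimal sketch of how a course grade like this is tallied (the component scores below are hypothetical, entered as points earned toward the weights on the slide); note that only some components bear directly on the stated writing outcome, which is why a composite course grade is at best an indirect measure of it:

```python
# Hypothetical points earned per component (out of the weights on the slide).
earned = {
    "Assignment 1": 8,    # out of 10
    "Assignment 2": 9,    # out of 10
    "Homework": 7,        # out of 10
    "Quizzes": 8,         # out of 10
    "Exam 1": 17,         # out of 20
    "Exam 2": 15,         # out of 20
    "Attendance": 10,     # out of 10
    "Participation": 9,   # out of 10
    "Extra credit": 2,    # out of 2 (bonus)
}

total = sum(earned.values())  # out of 100, plus up to 2 bonus points

# Letter-grade bands from the slide.
if total >= 90:
    letter = "A"
elif total >= 80:
    letter = "B"
elif total >= 70:
    letter = "C"
elif total >= 60:
    letter = "D"
else:
    letter = "F"

print(total, letter)  # here: 85 B
```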

  11. Direct vs. indirect • Direct measures: assess student knowledge or skills, that is, student learning outcomes • Indirect measures: assess students’ learning experiences or perceptions of their learning

  12. Inventory of assessment methods • Direct (required): class assignments (paper, presentation, report, ...), capstone project, performance project, direct observation, portfolios, external examiner, standardized exam, locally developed exam, certification and licensure exams, simulations, theses/senior papers • Indirect (supplemental): surveys (student survey, alumni survey, employer survey, national survey), interview, focus group, case study

  13. Guiding Questions for Methods • Does the method measure your learning outcomes? • Does it measure them accurately? • Does it provide useful information (implications for educational evaluation and improvement)? If you answered YES to all of the above, it can be used to demonstrate Institutional Effectiveness.

  14. Level of competence • Your decision • What do you consider a success? Example: 90% of students will meet “acceptable” level of competence using a rubric
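A minimal sketch of checking that criterion, assuming hypothetical rubric ratings for a sample of students and counting "acceptable" or better as meeting the standard:

```python
# Hypothetical rubric ratings for a sample of student work.
ratings = ["exemplary", "acceptable", "acceptable", "developing", "acceptable",
           "exemplary", "acceptable", "acceptable", "acceptable", "exemplary"]

# Count students at or above the "acceptable" level.
meets = sum(r in ("acceptable", "exemplary") for r in ratings)
percent = 100 * meets / len(ratings)

target = 90  # the program's chosen level of competence, per the example above
print(f"{percent:.0f}% of students met the criterion (target: {target}%)")
print("Target met" if percent >= target else "Target not met")
```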

  15. Resources • UT SACS website https://www.utexas.edu/provost/planning/assessment/sacs/resources.html • Gloria Roger’s materials from October workshop (handouts) • DIIA Instructional Assessment Resources (IAR) Website http://www.utexas.edu/academic/diia/assessment/iar/how_to/methods/index.php

  16. Where to start? • Course related: • Course descriptions • Syllabi • Course objectives • Course assignments • Course exams • Other activities: • Student exit survey • Alumni survey • Employer survey • National Standardized Exams **Key is to “Make use of existing sources”

  17. Assessment Map: Program

  18. What can individual faculty do? • Syllabus analysis • Exam analysis • Assignment analysis • For more detailed information: http://www.utexas.edu/academic/mec/research/workshopsummary.html

  19. Syllabus analysis • Identify course objectives • Document those objectives in a table • Faculty complete the table for each of their courses

  20. Performance criteria: create appropriate business documents; write logically; deliver a logical oral presentation; illustrate with graphics; interpret graphical information correctly • Syllabus objectives (BA 314): analyze communication situations and audiences; write business documents that are grammatically correct and use appropriate business style; deliver effective business presentations; develop effective interpersonal communication skills; use communication technology appropriately and effectively; conduct research and use it to complete written and oral reports • LO: communicate information in the business field effectively in writing, orally, and graphically

  21. Syllabus Analysis • table

  22. Syllabus Analysis • table

  23. Exam analysis • Identify test items that match course objectives • Calculate overall student performance for each item • Calculate the average performance for items assessing same objective • Determine the level of competence
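As a sketch of those steps (the item scores, the item-to-objective mapping, and the competence threshold below are all hypothetical):

```python
from collections import defaultdict

# Hypothetical data: fraction of students answering each exam item correctly,
# and the course objective each item assesses.
item_performance = {"Q1": 0.85, "Q2": 0.70, "Q3": 0.92, "Q4": 0.60, "Q5": 0.78}
item_objective = {"Q1": "Objective A", "Q2": "Objective A",
                  "Q3": "Objective B", "Q4": "Objective B", "Q5": "Objective B"}

# Average performance across the items that assess the same objective.
by_objective = defaultdict(list)
for item, objective in item_objective.items():
    by_objective[objective].append(item_performance[item])

threshold = 0.75  # assumed level of competence chosen by the program
for objective, values in sorted(by_objective.items()):
    average = sum(values) / len(values)
    status = "meets" if average >= threshold else "falls below"
    print(f"{objective}: average {average:.2f} {status} the threshold of {threshold:.2f}")
```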

  24. Exam Analysis • Table

  25. Assignment analysis • Identify assignment components that match course objectives • Assess student performance for each component • Determine the level of competence using rubrics

  26. Assignment Analysis • Table

  27. Rubrics • Scoring guidelines • A set of categories that describe the important components of the work being assessed

  28. Rubrics • Scale • Descriptors: criteria (with indicators), i.e., the things to look for, and standards, i.e., the description of the degree at each level • Type: holistic or analytic
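A minimal sketch of an analytic rubric applied to one piece of student work (the criteria, scale labels, and scores are hypothetical):

```python
# Hypothetical 4-point scale shared by all criteria.
scale = {1: "beginning", 2: "developing", 3: "acceptable", 4: "exemplary"}

# Criteria with indicators ("things to look for").
rubric = {
    "Organization": "Ideas follow a logical order with clear transitions",
    "Business format": "Document uses the appropriate business format",
    "Grammar and style": "Writing is grammatically correct and concise",
}

# Analytic scoring: one score per criterion. (A holistic rubric would instead
# assign a single overall level to the work as a whole.)
scores = {"Organization": 3, "Business format": 4, "Grammar and style": 2}

for criterion, level in scores.items():
    print(f"{criterion}: {level} ({scale[level]}) -- {rubric[criterion]}")

average = sum(scores.values()) / len(scores)
print(f"Average level: {average:.1f}")
```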

  29. Resources for Rubrics • Sample handouts from Relearning by Design Inc. http://www.relearning.org/resources/PDF/rubric_sampler.pdf • DIIA workshop material http://www.utexas.edu/academic/mec/research/pdf/rubricshandout.pdf

  30. Compile Info at Program level:Assessment Map

  31. Assessment Map: focused

  32. Assessment Map exercise • Pick a Learning Outcome with performance criteria • Write them in 2 left columns • List courses or activities in your program in the first row -start with your own courses • Write down assignments that are relevant to the chosen learning outcome for each course or activity • Among them, choose the methods you intend to use
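One way to picture the result of this exercise (the outcome, most course names, and assignments below are made up for illustration) is as a simple map from a learning outcome to the courses and methods that assess it:

```python
# Hypothetical assessment map for one learning outcome.
assessment_map = {
    "learning_outcome": "Communicate information effectively in writing",
    "performance_criteria": ["Use appropriate business format", "Write logically"],
    # Courses or activities in the program, with assignments relevant to the outcome.
    "courses": {
        "BA 314": ["Written report", "Exam 2 essay question"],
        "BA 320": ["Capstone project"],
        "Exit survey": ["Self-reported writing confidence item"],
    },
    # Of those, the methods the program intends to use.
    "chosen_methods": ["Written report (BA 314)", "Capstone project (BA 320)"],
}

for course, assignments in assessment_map["courses"].items():
    print(f"{course}: {', '.join(assignments)}")
```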

  33. Matrix Example page

  34. Where, when, who • Where: context for assessment (sample) • When: time of data collection • Who: person responsible; who interprets the results?

  35. Results/Recommendation/Action • State in future tense • What do you expect as results?

  36. Assessment Plan example

  37. Further assistance • Dr. Neal Armstrong, Vice Provost for Faculty Affairs • Office: MAI 201 • Email: neal_armstrong@mail.utexas.edu • Phone: (512) 232-3305; (512) 471-4716
