
Developing Assessment Plans

Mi-Suk Shim, Ph.D.

DIIA, Spring 2006


Outline of Workshop
  • Review of previous workshop
  • Assessment methods overview & resources
  • Syllabus, exam, & assignment analysis for each course
  • Assessment map, matrix, & assessment plan at program level
SACS Criteria

CS 3.3.1 Institutional Effectiveness

The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

UT Schedule
  • Program Educational Objectives and Program Outcomes
    • Complete by end of Spring 2006 and document
  • Assessment Plan
    • Complete by end of Spring 2006 and document
  • Progress Toward Completion of One Assessment Cycle and Closing the Loop
    • Complete documentation by Spring Break 2007
Learning Outcomes
  • Statements that describe what students are expected to know, think, and be able to do by the time of graduation
Learning Outcomes

Students will DO WHAT (how)

Assessment Plan

University of Texas at Austin

Academic Unit Assessment Plans Format (tentative version)

I. School and Degree Program

School Name and College:

Degrees awarded:

Contact person:


II. Program Mission Statement

III. Program Educational Objectives

IV. Program Learning Outcomes

V. Strategies, Methods, and Level of Competence

VI. Implementation Plan

VII. Assessment of Results

VIII. Evaluation of Results

IX. Recommendations

X. Actions

Assessment Methods
  • Multiple methods & sources recommended (increases validity)

  • One method does NOT fit ALL (each has pros & cons)
  • Practicality: time, effort, money
  • Do not have to measure everything or everybody (sampling)
  • Capitalize on what you are already doing
  • Quantity of data does not equate to Quality
Direct vs. indirect
  • Direct measures: assess student knowledge or skills, that is, student learning outcomes
  • Indirect measures: Assess students’ learning experiences or perceptions of their learning
Inventory of assessment methods

Direct (Required)
  • Class assignments (paper, presentation, report…)
  • Capstone project
  • Performance project
  • Direct observation
  • External examiner
  • Standardized exam
  • Locally developed exam
  • Certification and licensure exams
  • Theses/Senior papers

Indirect (Supplemental)
  • Student survey
  • Alumni survey
  • Employer survey
  • National survey
  • Focus group
  • Case study
Guiding Questions for Methods

Does the method…

  • Measure your learning outcomes?
  • Measure your learning outcomes accurately?
  • Provide useful information (implications for educational evaluation and improvement)?

If you answered YES to all of the above, it can be used to demonstrate Institutional Effectiveness

Level of competence
  • Your decision
  • What do you consider a success?


e.g., 90% of students will meet the “acceptable” level of competence using a rubric

  • UT SACS website

  • Gloria Rogers’s materials from October workshop (handouts)
  • DIIA Instructional Assessment Resources (IAR) Website

Where to start?
  • Course related:
    • Course descriptions
    • Syllabi
    • Course objectives
    • Course assignments
    • Course exams
  • Other activities:
    • Student exit survey
    • Alumni survey
    • Employer survey
    • National Standardized Exams

The key is to “make use of existing sources.”

What can individual faculty do?
  • Syllabus analysis
  • Exam analysis
  • Assignment analysis
  • For more detailed information:

Syllabus analysis
  • Identify course objectives
  • Document those objectives in a table
  • Faculty complete table for each of their courses
Exam analysis
  • Identify test items that match course objectives
  • Calculate overall student performance for each item
  • Calculate the average performance for items assessing same objective
  • Determine the level of competence
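The exam-analysis steps above are simple arithmetic, and a short script can sketch them. The item scores, the item-to-objective mapping, and the 0.80 threshold below are all hypothetical; as the workshop notes, the level of competence is each program's own decision:

```python
# Exam analysis sketch: map test items to course objectives, compute
# per-item and per-objective performance, and compare each objective's
# average against a chosen level of competence.

# Hypothetical data: fraction of students answering each item correctly.
item_scores = {"q1": 0.92, "q2": 0.78, "q3": 0.95, "q4": 0.60}

# Hypothetical mapping of test items to the course objective they assess.
item_to_objective = {"q1": "obj1", "q2": "obj1", "q3": "obj2", "q4": "obj2"}

# Average performance across items assessing the same objective.
by_objective = {}
for item, score in item_scores.items():
    by_objective.setdefault(item_to_objective[item], []).append(score)
objective_averages = {obj: sum(s) / len(s) for obj, s in by_objective.items()}

# The competence threshold is the program's decision; 0.80 is illustrative.
COMPETENCE = 0.80
for obj, avg in sorted(objective_averages.items()):
    status = "meets" if avg >= COMPETENCE else "below"
    print(f"{obj}: average {avg:.2f} ({status} competence level)")
```

With this sample data, items q1 and q2 both assess obj1, so its average is the mean of their two scores; the same per-objective averaging applies however many items map to an objective.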
Assignment analysis
  • Identify assignment components that match course objectives
  • Assess student performance for each component
  • Determine the level of competence

Using rubrics
  • Scoring guidelines: a set of categories which describe the important components of the work assessed
  • Scale
  • Descriptors
  • Criteria (with indicators): things to look for
  • Standard: description of the degree of each level
  • Type
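One way to sketch the rubric elements above (criteria as things to look for, a scale with standard descriptors for each level) is as a small data structure plus a scoring function. The criteria, the three-level scale, and the averaging rule are all hypothetical illustrations, not prescribed by the workshop:

```python
# Rubric sketch: each criterion is rated on a shared scale; the overall
# score is the average of the per-criterion ratings.

# Hypothetical 3-level scale with a standard descriptor per level.
SCALE = {1: "developing", 2: "acceptable", 3: "exemplary"}

# Hypothetical criteria for a written assignment.
CRITERIA = ["thesis clarity", "use of evidence", "organization"]

def score_assignment(ratings):
    """ratings: dict mapping each criterion to a level on the scale."""
    assert set(ratings) == set(CRITERIA), "rate every criterion"
    return sum(ratings.values()) / len(ratings)

# Example: one student's work rated against the rubric.
ratings = {"thesis clarity": 3, "use of evidence": 2, "organization": 2}
overall = score_assignment(ratings)
print(f"overall: {overall:.2f} ({SCALE[round(overall)]})")
```

Averaging is only one possible rule; a program might instead weight criteria differently or require a minimum level on every criterion, depending on its own definition of competence.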



Resources for Rubrics
  • Sample handouts from Relearning by Design Inc.

  • DIIA workshop material
Where, When, Who
  • Where: context for assessment (sample)
  • When: time of data collection
  • Who: responsible person; who interprets results?

Results, Recommendations, Actions
  • State in future tense
  • What do you expect as results?
Further assistance
  • Dr. Neal Armstrong

Vice Provost for Faculty Affairs

Office: MAI 201


Phone: (512) 232-3305; (512) 471-4716