
Program Outcomes Assessment Techniques: Overview and Examples


Presentation Transcript


  1. Program Outcomes Assessment Techniques: Overview and Examples “I have never let my schooling interfere with my education.” - Mark Twain Steve Steele, Consultant, Faculty Online Technology Training Consortium (FOTTC); Applied Sociologist, Anne Arundel Community College sfsteele@mail.aacc.cc.md.us 410-541-2369

  2. Common Conceptualizations of Evaluation/assessment • Is it viewed as “a reign of terror”? • Is it “stuck on like a barnacle” — an afterthought? • Is it an ongoing process in itself? • An organized entity in itself? • Part of a management information system?

  3. Where are We Going Today? • Evaluation? Assessment? Definitions • Reasons For Doing This • Placement of Evaluation/assessment • The Dynamic Fit Between Planning, Evaluation/assessment and Doing In the Improvement Process • Examples • Summary: Strengths and Weaknesses

  4. Evaluation? Assessment? Definitions • The systematic assessment of the operation and/or outcomes of a program or policy, compared to explicit or implicit standards, in order to contribute to the improvement of the program or policy. (Weiss, 1998) • “The key sense of the term... refers to the process of determining the merit, worth, or value of something...” (Scriven, 1991, pp. 139-141) • “Evaluation/assessment research... refers to research purpose rather than research method.” (Babbie, 1989, p. 326)

  5. Critical Components of Definitions • A systematic process • A judgment will likely be made • Standards for judgment, for success • Intervention • Orientation toward Improvement?

  6. Reasons For Doing This • Internal Clients/Stakeholders: The President, The Dean, Your Chairman, Your students, You! • External Clients/Stakeholders: Societal, Middle States, Professional, Granting agencies

  7. Reasons For Doing This • Have you considered the client’s “corporate culture” when constructing this project/design? • Are you sure that you have focused on the client’s needs for ALL relevant clients and stakeholders?

  8. Placement of Evaluation/assessment • Think of evaluation/assessment before the program, not after. • Integrate evaluation/assessment into the grant proposal, plan or program. • Use evaluative data “for all it’s worth!” • Consider evaluation/assessment as constructive criticism, rather than “doing time!”

  9. The Dynamic Fit Between Planning, Evaluation/assessment and Doing In the Improvement Process HERE!! Based on E. Deming’s work

  10. “Plan” and Define it.

  11. What is it? Intervention Definitions

  12. Example: A Distance Learning Business Plan http://www.aacc.cc.md.us/ola/ http://www.aacc.cc.md.us/ola/Assessment/BusPlanNov98.html

  13. Goals and Objectives... http://www.aacc.cc.md.us/ola/Assessment/BusPlanNov98.html#To implement

  14. “Check” • Some Approaches to “Checking” • Keeping track... is it operating (not necessarily successfully!)? • Forming something, improving something • Summing up, impact… did it work? Based on Michael Scriven’s Formative & Summative Evaluation

  15. Some Tools for Tracking, Forming and Summing • Frequency of something • Rates or percentages of something • Level of Need • Opinions, behaviors, feelings • Indicators, Estimates and Forecasts • Survey • Use patterns • Focus groups
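The first two tools on this slide — frequencies and rates or percentages of something — amount to simple tabulation of responses. A minimal sketch, using hypothetical ratings rather than the actual AACC survey data:

```python
from collections import Counter

# Hypothetical 5-point ratings of a module's learning value
# (illustrative only; not the actual Soc111 survey results).
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

freq = Counter(ratings)          # frequency of each rating value
n = len(ratings)
# Rates expressed as percentages of all responses
percentages = {r: 100 * c / n for r, c in freq.items()}

for rating in sorted(freq):
    print(f"Rating {rating}: {freq[rating]} responses ({percentages[rating]:.0f}%)")
```

The same tabulation scales directly to the N=188 module surveys discussed later in the talk; only the input list changes.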

  16. A Plan Level Example #1 Wells, Kepner, Barnes, Steele (AACC)

  17. Toward an Evaluation/assessment Plan http://www.aacc.cc.md.us/ola/Assessment/Baldrige.html

  18. Scroll http://www.aacc.cc.md.us/ola/Assessment/assessmentStudentStakeholder.htm

  19. Measures…. http://www.aacc.cc.md.us/ola/Assessment/studentsurveyresults.htm

  20. A Plan Level Example #2 Barnes, Warner, Steele (AACC)

  21. Some Tools for Tracking, Forming and Summing • Frequency of something • Rates or percentages of something • Level of Need • Opinions, behaviors, feelings • Indicators, Estimates and Forecasts • Survey • Use patterns • Focus groups

  22. Online Enrollment

  23. A Course Level Example #3 Steele (AACC)

  24. Some Formative Checks on Soc111 Online

  25. How Many Links on the Module Did You Visit? (all modules, N=188 responses)

  26. Rate this Module's Learning Value (all modules, N=188 responses)

  27. Rate this Module's Interest Value (all modules, N=188 responses)

  28. Summary: Strengths and Weaknesses • Strengths: Opportunity for continuous growth • Possibility for continuous improvement • Makes planning useful • Improves program and course design • Weaknesses: Resource and infrastructure commitment • Requires a variety of skills • Difficult to stay “tangent” • Demands attention
