Systematic Evaluation Plans: Developing a Roadmap for Program Evaluation and Accreditation

Presentation Transcript

  1. Systematic Evaluation Plans: Developing a Roadmap for Program Evaluation and Accreditation Karin K. Roberts, PhD, RN, CNE, Assessment Technologies Institute, Leawood, KS

  2. Objectives • Analyze the relationship between program evaluation and continuous quality improvement. • Review the components of a systematic evaluation plan. • Evaluate the role of benchmarks/levels of achievement in establishing program standards. • Discuss methods used to collect and analyze program-related data. • Evaluate the relationship between program evaluation and accreditation requirements.

  3. Program Evaluation • “Systematic assessment of all components of a program through the application of evaluation approaches, techniques, and knowledge in order to improve the planning, implementation, and effectiveness of programs” (Chen, 2005). • A continuous and systematic process that helps organizations gain information and make decisions for quality improvement (Escallier & Fullerton, 2012).

  4. Key Words • Continuous • Systematic • Information (data)-based decisions • Improvement/quality improvement

  5. Continuous Quality Improvement • A systematic and continuous, data-based decision-making process with the goal of improving program quality. • Standards → Excellence

  6. Master Plan of Evaluation • Blueprint for program evaluation • Terms used to describe plan • Systematic Evaluation Plan (SEP) • Program Evaluation Plan (PEP) • Comprehensive Evaluation Plan (CEP)

  7. Systematic Evaluation Plan • Offers formal process for ongoing evaluation • Sets standards by which the program will be evaluated • Describes how each program component is to be evaluated • Provides evidence that standards are being met and program changes are based on evidence.

  8. Systematic Evaluation Plan • Required by ACEN • 6.1 – The systematic plan for evaluation of the nursing education unit emphasizes the ongoing assessment and evaluation of each of the following: • Student learning outcomes • Program outcomes • Role-specific graduate competencies • ACEN Standards

  9. Systematic Evaluation Plan • Required by CCNE • IV-A. A systematic process is used to determine program effectiveness. Elaboration: • Written, ongoing, and exists to determine achievement of program outcomes • Comprehensive • Identifies qualitative and quantitative data to be collected • Includes timelines for collection • Periodically reviewed and revised

  10. Systematic Evaluation Plan • Required by some State Boards of Nursing (example: Kansas) • Written plan that provides evidence of program evaluation and effectiveness and is used for ongoing program improvement. • Program Evaluation Plan developed by faculty along with evidence of data (collected, aggregated, trended, and analyzed) and actions taken (revise, develop, maintain).

  11. Systematic Evaluation Plan • Required for: Continuous Quality Improvement (Standards → Excellence)

  12. Responsibility for Evaluation • Faculty • Committee(s) • Administration

  13. Format of SEP • “Provides a roadmap for organizing and tracking evaluation activities” (Billings & Halstead, 2012).

  14. SEP Format • ACEN Standards and Criteria • CCNE Standards and Key Elements • Program Components

  15. Standards

  16. Program Components

  17. SEP Matrix Format • ACEN – http://www.acenursing.net/resources/SampleSEP.pdf • Plan • Component • Expected Level of Achievement • Frequency of Assessment • Assessment Method/s • Implementation • Results of Data Collection and Analysis (including actual level of achievement) • Actions (for Program Development, Maintenance, or Revision)
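As an illustration of the matrix just listed, the sketch below models one row as a Python data structure. This is a minimal sketch, not part of the ACEN sample plan; the class name, field names, and example values are assumptions chosen only to mirror the column headings on this slide.

```python
# Minimal sketch: one row of an SEP matrix as a data structure, using
# column names patterned on the ACEN sample plan linked above.
# All field values below are hypothetical examples.
from dataclasses import dataclass

@dataclass
class SEPRow:
    component: str
    expected_level_of_achievement: str
    frequency_of_assessment: str
    assessment_methods: str
    results: str   # data collection/analysis, incl. actual level of achievement
    actions: str   # program development, maintenance, or revision

row = SEPRow(
    component="6.4.1 Licensure exam pass rate",
    expected_level_of_achievement="3-year mean at or above the national mean",
    frequency_of_assessment="Annually",
    assessment_methods="First-time licensure exam pass rates",
    results="3-year mean above national mean (expected level met)",
    actions="Maintain current testing and remediation policies",
)
print(row)
```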

  18. SEP Matrix Format • CCNE or Generic

  19. Example

  20. Responsible Entity/Schedule • Committees • Evaluation • SEP • Schedule • Curriculum • Other Committees • Administration/Faculty • Staff • Course and faculty evaluations

  21. Responsible Entity/Schedule • Formative • Collected during the program of study • Indicates progress/lack of progress toward meeting outcomes • Allows ongoing changes • Summative • Collected at the end of or after the program of study • Determines whether outcomes were met • NOTE: Collect satisfaction surveys at a time when student responses are least likely to be biased by recent frustration

  22. Responsible Entity/Schedule • Annually • Every 3 to 5 years • After substantive change

  23. Sources/Activities • Standardized assessments (ATI CMS/CP) • Interviews/Focus groups (qualitative data) • Attitudinal scales (Likert, semantic differential) • Rubrics (target-specific scoring tool)

  24. Sources/Activities • Standardized assessments (ATI CMS/CP) • Review aggregate, trended group data – 3 yrs • Interviews/Focus groups • Conduct a content analysis • Attitudinal scales (Likert, semantic differential) • Review Likert and semantic differential group means • Rubrics • Review group means
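To illustrate the group-mean review described on this slide, here is a minimal Python sketch. The survey items and scores are hypothetical placeholders.

```python
# Minimal sketch: computing group means for Likert-scale survey items.
# Item names and response data are hypothetical placeholders.
from statistics import mean

# Responses on a 1-5 Likert scale, keyed by survey item
responses = {
    "Program prepared me for practice": [4, 5, 3, 4, 5, 4],
    "Faculty were accessible":          [5, 4, 4, 3, 5, 5],
}

for item, scores in responses.items():
    print(f"{item}: group mean = {mean(scores):.2f} (n={len(scores)})")
```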

  25. Example

  26. SEP Matrix Format

  27. Benchmarks and Outcomes • Benchmark • Specifies desired level of achievement or excellence • Used to measure quality • CCNE Standard II and IV • ACEN Standard 4 • Outcome/“Level of Achievement” • Measures attainment of benchmark • “Expected” and “Actual”

  28. Outcomes • Outcomes specified by CCNE • IV-B. Completion rate 70% or higher past 3 yrs. • IV-C. NCLEX pass rate 80% or higher past 3 yrs. • IV-D. Employment rate 70% or higher over 12 mos. • Outcomes determined by program • II-A. Fiscal and physical resources are sufficient… • IV-E. Other outcomes (student learning outcomes; student/alumni achievement; student/alumni/employer satisfaction data)
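The CCNE thresholds named on this slide lend themselves to a simple automated check. Below is a minimal Python sketch; the benchmark figures come from the bullets above, while the actual rates are hypothetical sample values.

```python
# Minimal sketch: checking program outcomes against the CCNE thresholds
# named on this slide. The "actual" rates are hypothetical sample data.
benchmarks = {
    "completion_rate": 70.0,   # IV-B: 70% or higher
    "nclex_pass_rate": 80.0,   # IV-C: 80% or higher
    "employment_rate": 70.0,   # IV-D: 70% or higher
}

actual = {"completion_rate": 74.2, "nclex_pass_rate": 78.5, "employment_rate": 91.0}

for outcome, expected in benchmarks.items():
    status = "met" if actual[outcome] >= expected else "NOT met -> action plan"
    print(f"{outcome}: expected >= {expected}%, actual {actual[outcome]}% ({status})")
```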

  29. Outcomes • Outcomes/“Level of achievement” determined by ACEN • 6.4.1 Licensure exam: Three-year mean...will be at or above the national mean... • Outcomes/“Level of achievement” determined by program • 6.4.2 Program completion rates • 6.4.5 Job placement rates

  30. Example

  31. Example

  32. SEP Matrix Format

  33. Data Summary • Most important part of the SEP • Data are collected, aggregated, trended, and analyzed • Usually required for a minimum of 3 years • Decisions are made based on the data analysis
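A minimal Python sketch of the collect/aggregate/trend step described above, assuming three years of licensure pass-rate data; the year labels and values are hypothetical.

```python
# Minimal sketch: aggregating and trending three years of outcome data.
# Year labels and pass rates are hypothetical placeholders.
from statistics import mean

nclex_pass_rates = {2021: 82.0, 2022: 85.5, 2023: 88.0}  # percent

three_year_mean = mean(nclex_pass_rates.values())
years = sorted(nclex_pass_rates)
trend = nclex_pass_rates[years[-1]] - nclex_pass_rates[years[0]]

print(f"3-year mean: {three_year_mean:.1f}%")
print(f"Trend over period: {trend:+.1f} points")
```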

  34. Examples • Compare TEAS scores to completion rates • Compare admission GPA to course grades • Compare nursing GPA with CMS and Comprehensive Predictor scores • Compare use of online assessments with proctored assessment scores • Compare time spent on remediation with CMS and Comprehensive Predictor retake scores • Other examples?
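As one concrete example of these comparisons, the sketch below computes a Pearson correlation between admission GPA and course grades. The data are hypothetical, and statistics.correlation requires Python 3.10 or later.

```python
# Minimal sketch: comparing admission GPA to course grades with a Pearson
# correlation. Values below are hypothetical placeholders.
from statistics import correlation  # available in Python 3.10+

admission_gpa = [3.2, 3.8, 2.9, 3.5, 3.9, 3.1]
course_grade  = [82,  91,  75,  88,  94,  80]   # final course percentage

r = correlation(admission_gpa, course_grade)
print(f"Pearson r between admission GPA and course grade: {r:.2f}")
```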

  35. Plan for Maintenance/Improvement • Decisions are made based on data analysis • Maintenance • Improvement • Revision • Faculty MUST be involved with data analysis and decision making

  36. SEP Matrix Format • CCNE or Generic

  37. Disposition of Data • Rigorous records of ALL decisions made must be kept • Maintenance, improvement, revision • Action plan • Plan for re-evaluation after change implemented • Meeting minutes • Hard or e-copy • Committee • Month, date, year

  38. Feedback Loop • Triggered when the Actual Outcome does not meet the Expected Outcome

  39. Example

  40. Road Blocks • Distance from the process • Provide regular updates on the review process • Schedule a formal annual review • Lack of ownership • Rotate educators on the Evaluation Committee • Faculty turnover • New faculty orientation • Place new faculty on the Evaluation Committee

  41. Using the SEP as a Road Map Is program evaluation a destination or a journey? Why?

  42. References • Billings, D.M., & Halstead, J.A. (2012). Teaching in nursing: A guide for faculty. St. Louis, MO: Elsevier. • Commission on Collegiate Nursing Education (2013). Standards for accreditation of baccalaureate and graduate nursing programs. Washington, DC: CCNE. • Chen, H. (2005). Theory-driven evaluations. Newbury Park, CA: Sage. • Escallier, L.A., & Fullerton, J.T. (2012). An innovation in design of a school of nursing evaluation protocol. Nurse Educator, 37(5), 187-191. • Accreditation Commission for Education in Nursing (2013). Accreditation manual. Atlanta, GA: ACEN. • Stavropoulou, A., & Kelesi, M. (2013). Concepts and methods of evaluation in nursing education – a methodological challenge. Health Science Journal, 6(1), 11-23.