Systematic Evaluation Plans: Developing a Roadmap for Program Evaluation and Accreditation Karin K Roberts, PhD, RN, CNE Assessment Technologies Institute Leawood, KS
Objectives • Analyze the relationship between program evaluation and continuous quality improvement. • Review the components of a systematic evaluation plan. • Evaluate the role of benchmarks/levels of achievement in establishing program standards. • Discuss methods used to collect and analyze program-related data. • Evaluate the relationship between program evaluation and accreditation requirements.
Program Evaluation • “Systematic assessment of all components of a program through the application of evaluation approaches, techniques, and knowledge in order to improve the planning, implementation, and effectiveness of programs.” (Chen, 2005) • Continuous and systematic process that helps organizations gain information and make decisions for quality improvements. (Escallier & Fullerton, 2012)
Key Words • Continuous • Systematic • Information (data) based decisions • Improvement/Quality improvement
Continuous Quality Improvement • Systematic and continuous, data-based decision-making process, with the goal of improving program quality. • Standards → Excellence
Master Plan of Evaluation • Blueprint for program evaluation • Terms used to describe plan • Systematic Evaluation Plan (SEP) • Program Evaluation Plan (PEP) • Comprehensive Evaluation Plan (CEP)
Systematic Evaluation Plan • Offers formal process for ongoing evaluation • Sets standards by which the program will be evaluated • Describes how each program component is to be evaluated • Provides evidence that standards are being met and program changes are based on evidence.
Systematic Evaluation Plan • Required by ACEN • 6.1 – The systematic plan for evaluation of the nursing education unit emphasizes the ongoing assessment and evaluation of each of the following: • Student learning outcomes • Program outcomes • Role-specific graduate competencies • ACEN Standards
Systematic Evaluation Plan • Required by CCNE • IV-A. A systematic process is used to determine program effectiveness. Elaboration: • Written, ongoing, exists to determine achievement of program outcomes • Comprehensive • Identifies qualitative and quantitative data to be collected • Includes timelines for collection • Periodically reviewed and revised
Systematic Evaluation Plan • Required by some State Boards of Nursing (example: Kansas) • Written plan that provides evidence of program evaluation and effectiveness and is used for ongoing program improvement. • Program Evaluation Plan developed by faculty along with evidence of data (collected, aggregated, trended, and analyzed) and actions taken (revise, develop, maintain).
Systematic Evaluation Plan • Required for: Continuous Quality Improvement • Standards → Excellence
Responsibility for Evaluation • Faculty • Committee(s) • Administration
Format of SEP • “Provides a roadmap for organizing and tracking evaluation activities” (Billings & Halstead, 2012)
SEP Format • ACEN Standards and Criteria • CCNE Standards and Key Elements • Program Components
SEP Matrix Format ACEN – http://www.acenursing.net/resources/SampleSEP.pdf • Plan • Component • Expected Level of Achievement • Frequency of Assessment • Assessment Method(s) • Implementation • Results of Data Collection and Analysis (including actual level of achievement) • Actions (for Program Development, Maintenance, or Revision)
SEP Matrix Format • CCNE or Generic
Responsible Entity/Schedule • Committees • Evaluation • SEP • Schedule • Curriculum • Other Committees • Administration/Faculty • Staff • Course and faculty evaluations
Responsible Entity/Schedule • Formative • Collected during program of study • Indicates progress/lack of progress toward meeting outcomes • Allows ongoing changes • Summative • Collected at end of or after program of study • Determines if outcomes were met • NOTE: Time satisfaction surveys so responses are candid rather than colored by a recent frustration
Responsible Entity/Schedule • Annually • Every 3 to 5 years • After substantive change
Sources/Activities • Standardized assessments (ATI CMS/CP) • Interviews/Focus groups (qualitative data) • Attitudinal scales (Likert, semantic differential) • Rubrics (target-specific scoring tool)
Sources/Activities • Standardized assessments (ATI CMS/CP) • Review aggregate, trended group data – 3 yrs • Interviews/Focus groups • Conduct a content analysis • Attitudinal scales (Likert, Semantic Differential) • Review Likert and Semantic group means • Rubrics • Review group means
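The slide above calls for reviewing Likert-scale group means on aggregated, trended data. A minimal sketch of that computation, using entirely hypothetical cohort names and response values for illustration:

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree)
# for one satisfaction-survey item, one list per graduating cohort.
cohort_responses = {
    "2021": [4, 5, 3, 4, 4],
    "2022": [3, 4, 4, 5, 4],
    "2023": [5, 4, 4, 4, 5],
}

# Aggregate each cohort to a group mean, then trend across the three cohorts.
group_means = {year: round(mean(scores), 2)
               for year, scores in cohort_responses.items()}
three_year_mean = round(mean(mean(s) for s in cohort_responses.values()), 2)
```

The same aggregate-then-trend pattern applies to rubric scores; in practice the faculty would compare `three_year_mean` against the expected level of achievement recorded in the SEP matrix.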
Benchmarks and Outcomes • Benchmark • Specifies desired level of achievement or excellence • Used to measure quality • CCNE Standard II and IV • ACEN Standard 4 • Outcome/“Level of Achievement” • Measures attainment of benchmark • “Expected” and “Actual”
Outcomes • Outcomes specified by CCNE • IV-B. Completion rate 70% or higher past 3 yrs. • IV-C. NCLEX pass rate 80% or higher past 3 yrs. • IV-D. Employment rate 70% or higher over 12 mos. • Outcomes determined by program • II-A. Fiscal and physical resources are sufficient… • IV-E. Other outcomes (student learning outcomes; student/alumni achievement; student/alumni/employer satisfaction data)
Outcomes • Outcomes/“Level of achievement” determined by ACEN • 6.4.1 Licensure exam: Three-year mean...will be at or above the national mean... • Outcomes/“Level of achievement” determined by program • 6.4.2 Program completion rates • 6.4.5 Job placement rates
Data Summary • Most important part of SEP • Data is collected, aggregated, trended, and analyzed • Usually required for a minimum of 3 years • Decisions are made based on data analysis
Examples • Compare TEAS scores to completion rates • Compare admission GPA to course grades • Compare nursing GPA with CMS and Comprehensive Predictor scores • Compare use of online practice assessments with proctored assessment scores • Compare time spent on remediation with CMS and Comprehensive Predictor retake scores • Other examples?
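Each example above pairs an admission or process variable with an outcome variable, which faculty can examine with a simple correlation. A minimal sketch with hypothetical paired data (the GPA and grade values are invented for illustration):

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical paired data: admission GPA vs. first-semester course grade (%)
admission_gpa = [3.2, 3.8, 2.9, 3.5, 3.9, 3.1]
course_grade  = [78,  90,  72,  84,  93,  75]

r = pearson(admission_gpa, course_grade)  # close to 1.0 = strong positive association
```

A strong positive `r` would support using admission GPA as an admission benchmark; a weak one would prompt faculty to reexamine that criterion during data analysis.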
Plan for Maintenance/Improvement • Decisions are made based on data analysis • Maintenance • Improvement • Revision • Faculty MUST be involved with data analysis and decision making
SEP Matrix Format • CCNE or Generic
Disposition of Data • Rigorous records of ALL decisions made must be kept • Maintenance, improvement, revision • Action plan • Plan for re-evaluation after change implemented • Meeting minutes • Hard or e-copy • Committee • Month, date, year
Feedback Loop • Initiated when an Actual Outcome does not meet the Expected Outcome
Road Blocks • Distance from process • Provide regular updates of review process • Schedule formal annual review • Lack of ownership • Rotate educators on Evaluation Committee • Faculty turnover • New faculty orientation • New faculty on Evaluation Committee
Using the SEP as a Road Map Is program evaluation a destination or a journey? Why?
References • Billings, D.M., & Halstead, J.A. (2012). Teaching in nursing: A guide for faculty. St. Louis, MO: Elsevier. • Commission on Collegiate Nursing Education (2013). Standards for accreditation of baccalaureate and graduate nursing programs. Washington, DC: CCNE. • Chen, H. (2005). Theory-driven evaluations. Newbury Park, CA: Sage. • Escallier, L.A., & Fullerton, J.T. (2012). An innovation in design of a school of nursing evaluation protocol. Nurse Educator, 37(5), 187-191. • Accreditation Commission for Education in Nursing (2013). Accreditation manual. Atlanta, GA: ACEN. • Stavropoulou, A., & Kelesi, M. (2013). Concepts and methods of evaluation in nursing education – a methodological challenge. Health Science Journal, 6(1), 11-23.