Using Performance Measurements to Evaluate an Education Leadership Master’s Program

Presentation Transcript


  1. Using Performance Measurements to Evaluate an Education Leadership Master’s Program Presented at: Northern Nevada Higher Education Assessment Conference 2/6/04 By: Bill Thornton, UNR Department of Education Leadership Gus Hill, UNR Department of Education Leadership Tara Shepperson, UNR Department of Education Leadership

  2. Why Do We Evaluate? “... The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought, with some reason, that there was no punishment more severe than eternally futile labor....” The Myth of Sisyphus

  3. Defining Evaluation • Evaluation is the systematic investigation of the merit, worth, or significance of an “object.” – Michael Scriven

  4. Avoid: Forgetting Intermediate Outcomes

  5. Framework for Program Evaluation 1. Engage stakeholders 2. Describe the program 3. Focus on the evaluation design 4. Collect credible data from multiple sources 5. Develop data-based conclusions 6. Feedback and improvement • Standards applied throughout: utility, feasibility, validity, reliability

  6. 1: Engage Stakeholders • Pre-Evaluation: Early identification of disagreements in… • Definition of the problem • Priority activities • Priority outcomes • What constitutes “proof” of success • Post-Evaluation: Get their help with… • Credibility of findings • Access to key players • Follow-up • Dissemination of results

  7. 1: Ed Leadership Stakeholders • Faculty, College of Ed., UNR • Students • School districts, students, teachers, administrators • The public: business, parents, community

  8. 2: Describe the Program • How do logic models structure evaluations and promote continuous improvement? (A structural sketch follows below.) • Clarity for department & stakeholders on: • What activities • Intended effects, outcomes, and relationships • Long-term vs. short-term effects • Helps focus decision making • Outcome evaluation (effects) • Process evaluation (activities) • Program improvements
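
The sketch below is only a rough illustration of the structure a logic model imposes; every entry is hypothetical and not the department's actual model. It chains program activities to intended short-term and long-term outcomes, which is what gives process evaluation (activities) and outcome evaluation (effects) explicit targets.

```python
# Hypothetical logic-model entries, shown only to illustrate the structure:
# activities feed short-term outcomes, which feed long-term outcomes.
logic_model = {
    "activities": ["ISLLC-aligned coursework", "field experience", "internship portfolio"],
    "short-term outcomes": ["course expectations met", "passing Praxis II score"],
    "long-term outcomes": ["administrator licensure", "effective school leadership practice"],
}

# Process evaluation asks whether the activities happened as intended;
# outcome evaluation asks whether the short- and long-term effects followed.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```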

  9. 2: EL Master’s Program • Approximately 60 students • State requirements for certification • Program of studies • Exit exam

  10. 3: Focus on the Evaluation Design What questions are being asked? • What intervention was actually delivered? • Were impacts and outcomes achieved? • Was the intervention responsible for the impacts and outcomes? • What does the research indicate?

  11. 3: Evaluation of EL Program • Expectations • Master knowledge • Exhibit specific performances • Assessment aligned with standards • Validity • Reliability • Basis for data-based decisions • Program improvement • NCATE

  12. 3: Education Leadership Program Assessment • Student performance by course • Praxis exam results • Internship portfolio • Program completers survey

  13. You Get What You Measure… “In Poland in the 1970s, furniture factories were rewarded based on pounds of product shipped … today Poles have the world’s heaviest furniture.” (New York Times, 3/4/99)

  14. 4: Collect Credible Data from Multiple Sources Choosing data collection methods • Typical factors might be: • Time • Cost • Sensitivity of the issue • “Hawthorne effect” • Ethics • Validity • Reliability • Utility, accuracy, & feasibility • Usually trade-offs

  15. 4: For Example, ISLLC Standards (Validity) The Department of Education Leadership has adopted the standards of the Interstate School Leaders Licensure Consortium (ISLLC). • Vision of learning. • School culture & instructional program. • Management and operations. • Collaboration with families and community. • Integrity, fairness, and ethics. • Response to the larger political, social, and legal context.

  16. 5: Develop data-based conclusions Evaluators must justify conclusions & recommendations. • Can we consider the project a success? • Examples: • Changes can be attributed to the project • The project reduced disparities • The project leaves a “legacy” • The project can be sustained long-term • Conclusions are data-based

  17. 5: Develop data-based conclusions Actual Results • Performance vs. a comparison/control group • Time sequence • Plausible mechanisms (or pathways toward change) • Accounting for alternative explanations • Similar effects observed in similar contexts

  18. 5: Initial results for evaluation • Student performance by course • Praxis exam results • Internship portfolio • Program completers survey

  19. 5A: Student performance assessment Progress assessment • Students meet course expectations • Student field experience • Courses mapped to ISLLC standards (a mapping sketch follows below)
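
As a rough illustration of what a course-to-standards map looks like and how it can be checked, here is a minimal sketch. The course titles, numbers, and standard assignments are hypothetical, not the department's actual curriculum map; the point is only the coverage check at the end.

```python
# Hypothetical curriculum map: each course lists the ISLLC standards (1-6) it addresses.
# Course titles, numbers, and assignments are illustrative only.
ISLLC_STANDARDS = set(range(1, 7))  # 1 = vision ... 6 = political/social/legal context

curriculum_map = {
    "EL 700 Introduction to School Leadership": {1, 5},
    "EL 710 Instructional Leadership": {2},
    "EL 720 School Management and Operations": {3},
    "EL 730 School, Family, and Community Relations": {4, 6},
    "EL 740 Internship / Field Experience": {1, 2, 3, 4, 5, 6},
}

def uncovered_standards(course_map, standards=ISLLC_STANDARDS):
    """Return any standards not addressed by at least one course."""
    covered = set().union(*course_map.values())
    return sorted(standards - covered)

missing = uncovered_standards(curriculum_map)
print("Uncovered ISLLC standards:", missing or "none")
```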

  20. 5B: Praxis exam Comparison to a national standard • Over 16,000 people took the Praxis II exam over the last year; the middle 50% of scores fell between 640 and 740. • How have our program completers fared? (Through summer 2003; a comparison sketch follows below.)
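
A minimal sketch of how the national comparison could be computed. The completer scores below are invented for illustration; the program's actual results appeared on the slide and are not reproduced here. Only the 640-740 national middle-50% band comes from the slide.

```python
# Hypothetical completer scores for illustration only; the program's actual
# Praxis II results are not reproduced in this transcript.
completer_scores = [650, 655, 680, 690, 700, 710, 720, 735, 745, 760]

NATIONAL_Q1, NATIONAL_Q3 = 640, 740  # middle 50% of roughly 16,000 national test takers

n = len(completer_scores)
within = sum(NATIONAL_Q1 <= s <= NATIONAL_Q3 for s in completer_scores)
above = sum(s > NATIONAL_Q3 for s in completer_scores)

print(f"{within / n:.0%} of completers scored within the national middle-50% band (640-740)")
print(f"{above / n:.0%} scored above it")
```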

  21. What does the Praxis Cover? • Determining pupil & community needs (9%) • Curriculum design & instructional leadership (13%) • Development of staff & program evaluation (15%) • School management (34%) • Individual & group leadership skills (29%)

  22. 5C: Internship portfolio • Demonstrate skill and knowledge • Aligned with curriculum map • Produce a student portfolio

  23. 5D: Program Completers Survey • Mapping Standards • Critical Incidents • Interviews

  24. 6: Feedback and Improvement Maximizing use of results • Provide feedback • Major outcomes (data-based) • Explain results clearly • Implement continuous improvement plan • Strategic planning process • Vision, mission, goals • Remember the audience • How will they use the information provided? • How much time will they be willing to spend reading and assimilating the material?

  25. 6: Improvements to EL Master’s Program • Refined curriculum map • Required field experience for each course • Added courses • Data-based decision making • SPED Law

  26. Ending Quote The establishment of an orthodox evaluation methodology is no different from the establishment of a state religion. Officially telling you what method to use is only one step removed from officially telling you what results to find. At that point, utilization of findings will cease to be an issue - there will be nothing to use, only orders to follow. - Patton (1990)

  27. Questions?
