  1. Using Portfolios to Evaluate Leadership Competence: Can reflective learning be combined with assessment? National Health Systems Education Scotland 2007

  2. Reflective Learning.

  3. Reflective Learning – Educational Rationale (adapted from Kolb, 1984): [learning-cycle diagram: Concrete Experience → Reflection → Conceptualizing → Active Experimentation, with the Learner at the centre]

  4. Why use portfolios to evaluate evidence of leadership competence? • Strengths: authenticity. • Portfolio development charts growth over time → a more authentic form of assessment that accurately represents learner ability, e.g. by drawing on more than one piece of evidence (Chang, 2001). • Within leadership education, portfolios are often perceived as evaluation tools that can be used to assess performance in authentic contexts (Driessen et al., 2005). • Duque (2003): the strength of portfolios lies in the assessment of skills and attitudes, which are difficult to measure with more traditional assessment methods and tools.

  5. Miller’s Assessment Triangle

  6. Portfolios – Weaknesses: • Debate revolves around issues of reliability and validity, and around what constitutes ‘good’ evidence. • Does portfolio assessment simply measure the ability to write about professional practice rather than the standard of practice itself? (e.g. McMullan et al., 2003) • The time and effort required for portfolio construction, plus uncertainty about what to include as evidence, are also germane issues.

  7. Portfolio structure • As recommended by Abrami and Barrett (2005) and Klenowski, Askew and Carnell (2006), the NES Leadership Module portfolios comprise: • Experiential evidence of leadership/supervisory competence; • Reasons for selecting that evidence; • What the portfolio creator learned. Throughout the Module, learners must: • Monitor progress against each competence statement listed; • Gather evidence to support their judgement in the portfolio; • Seek guidance where appropriate.

  8. Competency framework for portfolio evidence: Example

  9. Portfolio evaluation / Review process • Criteria for review: • A new process; still evolving. The facilitator will consider – does the evidence offered: • Adequately illustrate the specific statement of leadership/supervisory competence? • Explicitly link leadership/supervisory theory and practice? • Adequately illustrate reflective learning (e.g. what happened, how did I deal with it, what might I do differently next time)?

  10. Portfolio evaluation / Review process II • Answers to the foregoing questions will be used to classify portfolios as: • “Excellent/highly satisfactory”, • “Satisfactory”, or • “Need for revision/resubmission”.

  11. Can portfolios combine reflective learning & assessment? • McMullan et al. (2003) expressed concern over the impact of assessment purposes on the selection of portfolio evidence. • How do we address this? • NES offers learners control over portfolio content by separating: • Private reflective learning – the online Reflective Journal; • “Shared” evidence of progress – the Portfolio. • Advantages: • The Reflective Journal remains confidential to its author, so reflection is less inhibited; • Having to provide a rationale for the selection of evidence helps consolidate reflective learning; • Combining evidence from different sources (e.g. Reflective Journal and Supervision Notes) integrates learning and practice.

  12. Feedback from pilot external reviewers: • Quotations from personal reflective logs and specific, well-grounded examples of good practice worked particularly well. • Valuable evidence was often provided on progression and theory-practice links. • Participants seem to have benefited from the guidance and examples of good practice offered through Blackboard by the facilitator. • In general, a manageable and credible system of assessing competence is evolving.

  13. Conclusions: • Feedback from systematic piloting suggests the developing process is viable for both learners and facilitators. • The ability to “cut and paste” selected material from the private online Reflective Journal enables us to combine reflective learning with valid assessment.

  14. Issues to consider • Weaknesses in the current method: • The portfolio review process is based on self-report; • Portfolios only inform us about … ‘competencies in an indirect way – there is no observation’ (Delandshere and Arens, 2003). • Future: • Triangulation with additional “evidence” is needed, e.g.: • Observation of leadership/supervision in practice; • Feedback from the subordinates of the leader/supervisor (360-degree assessment); • Other?

  15. References • Abrami, P.C. and Barrett, H. (2005). ‘Directions for research and development on electronic portfolios’. Canadian Journal of Learning and Technology. 31(3). Online version. • Chang, C. (2001). ‘Construction and evaluation of a web-based learning portfolio system: An electronic assessment tool’. Innovations in Education and Teaching International. 38(2): 144-155. • Delandshere, G. and Arens, S.A. (2003). ‘Examining the quality of the evidence in preservice teacher portfolios’. Journal of Teacher Education. 54(1): 57-73. • Driessen, E., van der Vleuten, C., Schuwirth, L., van Tartwijk, J. and Vermunt, J. (2005). ‘The use of qualitative research criteria for portfolio assessment as an alternative to reliability evaluation: A case study’. Medical Education. 39(2): 214-220. • Duque, G. (2003). ‘Web-based evaluation of medical clerkships: A new approach to immediacy and efficacy of feedback and assessment’. Medical Teacher. 25(5): 510-514. • Hall-Marley, S. (2001). ‘Supervisor Feedback Form’. Available online at: www.cfalendar.com [Last accessed April 2006].

  16. References – Cont’d • Klenowski, V., Askew, S. and Carnell, E. (2006). ‘Portfolios for learning, assessment and professional development in higher education’. Assessment and Evaluation in Higher Education. 31(3): 267-286. • McMullan, M., Endacott, R., Gray, M., Jasper, M., Miller, C. and Scholes, J. (2003). ‘Portfolios and assessment of competence: A review of the literature’. Journal of Advanced Nursing. 41(3): 283-294. • Miller, G.E. (1990). ‘The assessment of clinical skills/competence/performance’. Academic Medicine (supplement). 65: S63-S67. • Rees, C. and Sheard, C. (2004). ‘The reliability of assessment criteria for undergraduate medical students’ communication skills portfolios: the Nottingham experience’. Medical Education. 38(2): 138-144.
