
Where are we with assessment and where are we going? Cees van der Vleuten

Presentation Transcript


  1. Where are we with assessment and where are we going? Cees van der Vleuten University of Maastricht This presentation can be found at: www.fdg.unimaas.nl/educ/cees/amee

  2. Overview of presentation • Where is education going? • Where are we with assessment? • Where are we going with assessment? • Conclusions

  3. Where is education going? • School-based learning • Discipline-based curricula • (Systems) integrated curricula • Problem-based curricula • Outcome/competency-based curricula

  4. Where is education going? • Underlying educational principles: • Continuous learning of, or practicing with, authentic tasks (in steps of complexity; with constant attention to transfer) • Integration of cognitive, behavioural and affective skills • Active, self-directed learning & in collaboration with others • Fostering domain-independent skills, competencies (e.g. team work, communication, presentation, science orientation, leadership, professional behaviour…).

  5. Where is education going? • Underlying educational principles: • Continuous learning of, or practicing with, authentic tasks (in steps of complexity; with constant attention to transfer) • Integration of cognitive, behavioural and affective skills • Active, self-directed learning & in collaboration with others • Fostering domain-independent skills, competencies (e.g. team work, communication, presentation, science orientation, leadership, professional behaviour…) • Underpinned by constructivism, cognitive psychology, collaborative learning theory, cognitive load theory and empirical evidence.

  6. Where is education going? • Work-based learning • Practice, practice, practice…. • Optimising learning by: • More reflective practice • More structure in the haphazard learning process • More feedback, monitoring, guiding, reflection, role modelling • Fostering of learning culture or climate • Fostering of domain-independent skills (professional behaviour, team skills, etc).

  7. Where is education going? • Work-based learning • Practice, practice, practice… • Optimising learning by: • More reflective practice • More structure in the haphazard learning process • More feedback, monitoring, guiding, reflection, role modelling • Fostering of learning culture or climate • Fostering of domain-independent skills (professional behaviour, team skills, etc.) • Underpinned by deliberate practice, emerging work-based learning theories and empirical evidence.

  8. Where is education going? • Educational reform is on the agenda everywhere • Education is professionalizing rapidly • A lot of ‘educational technology’ is available • How about assessment?

  9. Overview of presentation • Where is education going? • Where are we with assessment? • Where are we going with assessment? • Conclusions

  10. Expanding our toolbox… • Established technology of efficient written or computer-based high fidelity simulations (MCQ, Key Feature, Script Concordance Test, MEQs…) [Miller's pyramid ('knows', 'knows how', 'shows how', 'does'): the 'knows' and 'knows how' levels]

  11. Expanding our toolbox… • Established technology of structured high fidelity in vitro simulations requiring behavioural performance (OSCE, SP-based testing, OSPE…) [Miller's pyramid: the 'knows how' and 'shows how' levels]

  12. Expanding our toolbox… • Emerging technology of appraising in vivo performance (Work-based assessment: Clinical work-sampling, Mini-CEX, Portfolio, practice visits, case orals…) [Miller's pyramid: the 'shows how' and 'does' levels]

  13. Expanding our toolbox… • Emerging technology of appraising in vivo performance (self-, peer and co-assessment, portfolio, multisource feedback, learning process evaluations…) [Miller's pyramid covering “domain specific” skills, with “domain independent” skills shown alongside]

  14. What have we learned? • Competence is specific, not generic

  15. Reliability as a function of testing time

    Testing time in hours:            1      2      4      8
    MCQ (1)                          0.62   0.76   0.93   0.93
    PMP (1)                          0.36   0.53   0.69   0.82
    Case-Based Short Essay (2)       0.68   0.73   0.84   0.82
    Oral Exam (3)                    0.50   0.69   0.82   0.90
    Long Case (4)                    0.60   0.75   0.86   0.90
    OSCE (5)                         0.47   0.64   0.78   0.88
    Mini-CEX (6)                     0.73   0.84   0.92   0.96
    Practice Video Assessment (7)    0.62   0.76   0.93   0.93
    Incognito SPs (8)                0.61   0.76   0.92   0.93

    (1) Norcini et al., 1985; (2) Stalenhoef-Halling et al., 1990; (3) Swanson, 1987; (4) Wass et al., 2001; (5) Petrusa, 2002; (6) Norcini et al., 1999; (7) Ram et al., 1999; (8) Gorter, 2002
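The rise of reliability with testing time in this table broadly follows the classical Spearman-Brown pattern (the tabled coefficients come from the cited generalizability studies, so the match is only approximate). A worked example using the 1-hour MCQ coefficient:

```latex
% Spearman-Brown prophecy formula: reliability after lengthening a test by a factor k
\rho_k = \frac{k\,\rho_1}{1 + (k-1)\,\rho_1}
% e.g. MCQ with \rho_1 = 0.62 at 1 hour, doubled to 2 hours (k = 2):
\rho_2 = \frac{2 \times 0.62}{1 + 0.62} \approx 0.77 \quad (\text{0.76 in the table})
```

The same arithmetic also shows why short samples cannot rescue a noisy measure: starting from the PMP's 1-hour value of 0.36, even quadrupling the testing time only brings reliability to about 0.69, as the table reports.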

  16. What have we learned? • Competence is specific, not generic • Any single point measure is flawed • One measure is no measure • No method is inherently superior • Subjective, unstandardised conditions are not something to be afraid of.

  17. What have we learned? • Competence is specific, not generic • One method can’t do it all

  18. Magic expectations… [Miller's pyramid ('knows', 'knows how', 'shows how', 'does') with individual methods mapped onto its levels: Key features (short cases), OSCEs, direct observation methods, Portfolio]

  19. What have we learned? • Competence is specific, not generic • One method can’t do it all • One measure is no measure • We need a mixture of methods to cover the entire pyramid • We can choose from a rich toolbox!

  20. What have we learned? • Competence is specific, not generic • One method can’t do it all • Assessment drives learning

  21. Assessment and learning “The in-training assessment programme was perceived to be of benefit in making goals and objectives clear and in structuring training and learning. In addition, and not surprisingly, this study demonstrated that assessment fosters teaching and learning.….” (Govaerts et al, 2004, p. 774)

  22. Assessment and learning “Feedback generally inconsistent with and lower than self-perceptions elicited negative emotions. They were often strong, pervasive and long-lasting….” (Sargeant et al., under editorial review)

  23. Assessment and learning “You just try and cram - try and get as many of those facts into your head just that you can pass the exam and it involves… sadly it involves very little understanding because when they come to the test, when they come to the exam, they’re not testing your understanding of the concept. They test whether you can recall ten facts in this way? ” (Student quote from Cilliers et al., in preparation)

  24. The continuous struggle [diagram: the learner between the curriculum and the assessment - content, format, programming/scheduling, regulations, standards, examiners…]

  25. What do we know? • Competence is specific, not generic • One method can’t do it all • Assessment drives learning • Verify the consequences • Use the effect strategically • Educational reforms are only as good as the assessment allows them to be.

  26. What do we know? • Competence is specific, not generic • One method can’t do it all • Assessment drives learning • Verify the consequences • Use the effect strategically • Educational reforms are only as good as the assessment allows them to be.

  27. Overview of presentation • Where is education going? • Where are we with assessment? • Where are we going with assessment? • Conclusions

  28. My assumptions • Innovation in education programmes can only be as successful as the assessment programme is • Assessment should reinforce the direction of education that we are going • Future directions should use our existing evidence on what matters in assessment.

  29. The Big Challenge • Established assessment technologies have been developed in the conventional psychometric tradition of standardisation, objectification & structuring • Emerging technologies are in vivo and by nature less standardized, unstructured, noisy, heterogeneous, subjective • Finding an assessment answer beyond the classic psychometric solutions is The Big Challenge for the future.

  30. Design requirements for future assessment • Dealing with real life: • In vivo assessment cannot and should not be (fully) standardized, structured and objectified • Includes quantitative AND qualitative information • Professional and expert judgement play a central role.

  31. Design requirements for future assessment • Dealing with learning: • All assessment should be meaningful to learning, thus information-rich • Assessment should be connected to learning (the framework of the curriculum and the assessment are identical) • Assessment is ‘embedded’ in learning (the ‘in vivo’ of educational practice, which adds significantly to the complexity).

  32. Design requirements for future assessment • Dealing with sampling: • Assessment is programmatic • Comprehensive, includes domain-specific and domain-independent skills • Combines sampling across many information sources, methods, examiners/judges/occasions… • Is planned, coordinated, implemented, evaluated, revised (just like a curriculum design); a sketch of this sampling structure follows below.
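As a purely illustrative sketch (not part of the presentation, and with all names invented), the programmatic sampling idea above can be pictured as data in which every observation records its method, assessor, occasion and competency domain, so that decisions aggregate across all of them:

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class DataPoint:
    """One assessment observation in a hypothetical programmatic assessment plan."""
    method: str     # e.g. "Mini-CEX", "MCQ", "multisource feedback"
    domain: str     # competency domain, e.g. "communication", "clinical reasoning"
    assessor: str   # examiner / judge / peer providing the judgement
    occasion: str   # when and where the observation was made
    score: float    # quantitative result on a common 0-1 scale
    narrative: str  # qualitative feedback accompanying the score

def aggregate(points: list[DataPoint]) -> dict[str, dict]:
    """Aggregate per competency domain, keeping track of how broadly we sampled."""
    by_domain: dict[str, list[DataPoint]] = defaultdict(list)
    for p in points:
        by_domain[p.domain].append(p)
    summary = {}
    for domain, obs in by_domain.items():
        summary[domain] = {
            "mean_score": round(mean(o.score for o in obs), 2),
            "n_observations": len(obs),
            "n_methods": len({o.method for o in obs}),
            "n_assessors": len({o.assessor for o in obs}),
            "narratives": [o.narrative for o in obs],  # preserved for expert judgement
        }
    return summary

# A (tiny) example sample across methods, assessors and occasions
points = [
    DataPoint("Mini-CEX", "clinical reasoning", "Dr. A", "ward, week 3", 0.8, "structured, slightly slow"),
    DataPoint("MCQ", "clinical reasoning", "exam board", "end of block", 0.7, ""),
    DataPoint("Multisource feedback", "communication", "nurse B", "ward, week 4", 0.9, "clear with patients"),
]
print(aggregate(points))
```

The structure, not the code, is the point: any defensible decision rests on many data points per domain, sampled across methods, judges and occasions, with narratives kept alongside scores rather than reduced to a single number.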

  33. Challenges we face • Dealing with real life: • How to use professional judgement? Do we understand judgment? • How to elicit, structure and record qualitative information? • How to use (flexible) standards? • What strategies for sampling should we use? When is enough enough? • How to demonstrate rigour? What (psychometric, statistical, qualitative) models are appropriate?

  34. Challenges we face • Dealing with learning: • What are methodologies for embedding assessment (e.g. Wilson & Sloane, 2000)? • How to deal with the confounding of the teaching and assessor role? • How to combine formative and summative assessment? • How to involve stakeholders? • How to educate stakeholders?

  35. Challenges we face • Dealing with sampling at the programme level: • What strategies are useful in designing a sampling plan or structure of an assessment programme? • How to combine qualitative and quantitative information? • How to use professional judgement in decision making on aggregated information? • How to longitudinally monitor competence development? • What are (new) strategies for demonstrating rigour in decision making? What formal models are helpful?

  36. Contrasting views in approach

    Conventional assessment            | Programmatic embedded assessment
    Assessment separate from learning  | Assessment as part of learning
    Method-centred                     | Programme-centred (based on overarching cohesive structure)
    Context free                       | Context matters (dynamic relation between an ability, a task and a context in which the task occurs - Epstein & Hundert, 2002)

  37. Contrasting approaches in research

    Conventional assessment                                 | Programmatic embedded assessment
    Rigour defined in direct (statistical) outcome measures | Rigour defined by evidence on the trustworthiness or credibility of the assessment process
    Reliability/validity                                    | Saturation of information, triangulation
    Benchmarking                                            | Accounting

  38. Confused… [the contrasting views in approach - conventional assessment vs. programmatic embedded assessment - revisited]

  39. Overview of presentation • Where is education going? • Where are we with assessment? • Where are we going with assessment? • Conclusions

  40. Conclusions • Assessment has made tremendous progress • Good assessment practices based on established technology are implemented widely • Sharing of high quality assessment material has begun (IDEAL, UMAP, Dutch consortium)

  41. Conclusions • We are facing a major next step in assessment • We have to deal with the real world • The real world is not only the work-based setting but also the educational training setting

  42. Conclusions • To make that step: • We need to think out of the box • New methodologies to support assessment strategies • New methodologies to validate the assessment

  43. Conclusions • There is a lot at stake: • Educational reform depends on it (cartoon caption: “I’m here because I couldn’t change the assessment”)

  44. Conclusions • Let’s join forces to make that next step!

  45. This presentation can be found on: www.fdg.unimaas.nl/educ/cees/amee
