
Formative assessment the EASiHE way



  1. Formative assessment the EASiHE way Outputs from EASiHE HE Research Group 22nd January 2010 Bill Warburton David Bacigalupo

  2. What's the problem? *DDA = dumbed-down assessment. Three recent UK surveys of CAA: • 1993 (Stephens & Mascia) • 1999 CAA Centre (Bull, Hesketh & McKenna) • 2003-04 (Warburton & Conole). Perennial concerns: • “Everybody knows that CAA can’t test higher-level outcomes (HLOs)” • CAA = DDA*

  3. What are HLOs? Bloom’s (1956) taxonomy of the cognitive domain: a way of categorizing the level of abstraction of learning outcomes in educational settings. In increasing order of abstraction (from LLOs to HLOs): KNOWLEDGE/RECALL → COMPREHENSION → APPLICATION → ANALYSIS → SYNTHESIS → EVALUATION

  4. How ‘high’ can we go? • Most educationists agree that, with care, objective items can be used to test knowledge (and maybe comprehension) • Increasing levels of abstraction are progressively more difficult to test: KNOWLEDGE/RECALL → COMPREHENSION → APPLICATION → ANALYSIS → SYNTHESIS → EVALUATION

  5. A formative taxonomy • For FORMATIVE evaluation, Bloom et al. (1971) distinguished a hierarchy of levels of behaviour parallel to the cognitive taxonomy • Added value to the cognitive taxonomy by making it easier for teachers to identify ‘mal-rules’ – flaws in reasoning. In increasing order of abstraction: KNOWLEDGE OF TERMS → KNOWLEDGE OF FACTS → KNOWLEDGE OF RULES & PRINCIPLES → SKILL IN USING PROCESSES & PROCEDURES → ABILITY TO MAKE TRANSLATIONS → ABILITY TO MAKE APPLICATIONS

  6. The Case Studies Bill Warburton David Bacigalupo EASiHE/iSolutions

  7. Humanities: Modern Languages • Students watch video in Spanish • Questions include comprehension and transcription • Feedback includes exercises based on live BBC Mundo websites • Application written in Questionmark Perception & accessed via Blackboard

  8. Humanities: Modern Languages

  9. Humanities: Modern Languages • Lessons learned: • Lack of confidence in central support for commonly used tools makes tutors nervous about how to invest their time • Difficulty in motivating students to use the resources • Need to maximise usability of eAssessment • Lack of subject-specific exemplars

  10. Bournemouth & Poole College: Sign Language • Benefits realisation project • Lecturers edit eAssessments using new QTI editor
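For context, the sketch below shows roughly what a single multiple-choice item looks like in IMS QTI 2.x, the kind of format such an editor works with; it is built with Python's standard library. The element and attribute names follow the QTI 2.1 schema, but the identifiers and question content are hypothetical placeholders, not items from the project.

```python
# A minimal sketch of a QTI 2.1 multiple-choice item, built with the
# Python standard library. Element names follow the IMS QTI 2.1 schema;
# the item id and question content are hypothetical placeholders.
import xml.etree.ElementTree as ET

item = ET.Element("assessmentItem", {
    "identifier": "sign-video-q1",  # hypothetical item id
    "title": "Sign language video question",
    "adaptive": "false",
    "timeDependent": "false",
})

# Declare the expected response and its correct value.
decl = ET.SubElement(item, "responseDeclaration", {
    "identifier": "RESPONSE", "cardinality": "single", "baseType": "identifier",
})
correct = ET.SubElement(decl, "correctResponse")
ET.SubElement(correct, "value").text = "B"

# The visible body: a single-selection choice interaction.
body = ET.SubElement(item, "itemBody")
interaction = ET.SubElement(body, "choiceInteraction", {
    "responseIdentifier": "RESPONSE", "shuffle": "true", "maxChoices": "1",
})
ET.SubElement(interaction, "prompt").text = "What was signed in the video?"
for ident, text in [("A", "A greeting"), ("B", "A question"), ("C", "A farewell")]:
    choice = ET.SubElement(interaction, "simpleChoice", {"identifier": ident})
    choice.text = text

print(ET.tostring(item, encoding="unicode"))
```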

  11. Bournemouth & Poole College: Sign Language • eAssessments involve students watching a video • Students are then asked questions about the video • Learning Societies Lab/JISC TechDis Toolbar used to improve accessibility: • Changing colours & fonts • Changing the background • Questions being put into a customised version of EdShare

  12. Civil Engineering • Creating Improved Versions of Industry Standard eAssessments to Help Save Lives

  13. Engineering Sciences: Civil Engineering • Students comment on and rate questions

  14. Medicine: Royal College of Surgeons exam • “Web 2.0”: students write their own questions and feedback • Very detailed feedback with lots of images

  15. Health Sciences: A Serious Game • How nursing students can allocate resources to patients • Managing risk • Interactive simulation • Under development

  16. ECS: Web 2.0 tools - Peer Assessment

  17. Feedback • Positive feedback from: • tutors • initial student evaluations • University senior managers, via the project’s Advisory Board and JISC advisors

  18. Summary of findings so far HE Research Group 22nd January 2010 Bill Warburton David Bacigalupo EASiHE/iSolutions

  19. Lessons Learnt • Overcoming cultural/institutional obstacles: • (1) lack of confidence in university eAssessment support for Hot Potatoes (not the university-recommended system); • (2) difficulty motivating students to take full advantage of eAssessments; • (3) the need to improve eAssessment usability, especially for students with learning difficulties; and • (4) lack of examples of good and bad subject-specific eAssessment questions, and associated guidance

  20. Lessons Learnt • Overcoming technical obstacles: • (5) interoperability of different eAssessment software; • (6) how to choose which technologies to use; • (7) increasing the usability of eAssessment systems for the lecturer; and • (8) how to present results from a significant number of eAssessments in a form that allows them to be understood and made use of in a short period of time (see the sketch below).
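Point (8) is essentially an aggregation problem. Below is a minimal, hypothetical Python sketch of one way to summarise many attempts into per-question facility values (the proportion answered correctly) that a tutor can scan quickly; the (student, question, correct) record format and the 50% review threshold are assumptions, not the project's actual reporting format.

```python
# A minimal sketch of summarising many eAssessment attempts into
# per-question facility values (proportion answered correctly).
# The record format is an assumption for illustration.
from collections import defaultdict

def facility_by_question(records):
    """records: iterable of (student_id, question_id, is_correct) tuples."""
    attempts = defaultdict(int)
    correct = defaultdict(int)
    for _student, question, is_correct in records:
        attempts[question] += 1
        correct[question] += int(is_correct)
    return {q: correct[q] / attempts[q] for q in attempts}

# Hypothetical results from three attempts across two questions.
results = [("s1", "q1", True), ("s2", "q1", False), ("s1", "q2", True)]
for question, facility in sorted(facility_by_question(results).items()):
    flag = "  <-- review" if facility < 0.5 else ""
    print(f"{question}: {facility:.0%}{flag}")
```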

  21. Formative eAssessment isn’t a trivial pursuit • Significant investment for tutors and support staff • The time required to generate high-quality eAssessment resources is always greater than imagined • Producing high-quality formative eAssessment materials benefits greatly from the interdisciplinary involvement of ‘high value’ Subject Experts and Technologists

  22. Formative eAssessment is a trivial pursuit • Students too often don’t see the value • Real-world formative eAssessment is too often ‘shovelware’, which devalues the entire enterprise

  23. How to foster real innovation • The right intervention • With the right individuals • At the right time • Students make good use of formative resources where there is an obvious & well-defined point to them

  24. How to preserve the investment? • Innovations in HE are often the ‘property’ of enthusiasts • Enthusiasts move on, leaving the innovation to wither away • Patterns Workshops are one way to capture the essence of innovations • Once the core value of an innovation is established, it can be applied more widely

  25. Appendix – items at various levels of the formative Bloom’s taxonomy HE Research Group 22nd January 2010 Bill Warburton David Bacigalupo EASiHE/iSolutions

  26. KNOWLEDGE OF RULES & PRINCIPLES • Memorise and recall a general rule • What, when, where, how ...? • Describe… • Describe interrelationships among many items • Memorise and recall applications of a rule • Memorise and recall exceptions to a rule • Does *not* deal with Application of a rule

  27. Testing KNOWLEDGE OF RULES & PRINCIPLES (Bloom 1956: COMPREHENSION) Example: In order to write a chemical formula, you have to know a. only the symbols of the elements that are in the compound b. only the proportions in which the atoms of elements combine c. both the symbols of the elements that are in the compound and the proportions in which the atoms of elements combine d. the atomic weight of the elements that form the compound (Bloom et al. 1971)
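An objective item like this becomes more useful formatively when each distractor carries feedback that names the mal-rule behind it (cf. slide 5). The Python sketch below is a hypothetical illustration of that idea; the feedback strings and the mark() helper are assumptions, not part of the original item bank.

```python
# A sketch of attaching diagnostic feedback to each option, so a wrong
# answer surfaces the underlying mal-rule (cf. slide 5). Feedback strings
# are illustrative, not taken from the original item.
ITEM = {
    "stem": "In order to write a chemical formula, you have to know...",
    "correct": "c",
    "feedback": {
        "a": "Symbols alone are not enough: you also need the proportions.",
        "b": "Proportions alone are not enough: you also need the symbols.",
        "c": "Correct: both the symbols and the combining proportions are needed.",
        "d": "Atomic weights are not required to write a formula.",
    },
}

def mark(response):
    """Return (score, feedback) for a single-letter response."""
    score = 1 if response == ITEM["correct"] else 0
    return score, ITEM["feedback"].get(response, "Unrecognised response.")

print(mark("d"))  # (0, 'Atomic weights are not required to write a formula.')
```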

  28. Skill in using Processes & Procedures • Steps along the route to Mastery • The difference between ‘Knowing That’ and ‘Knowhow’? • Accuracy in use • Practice makes perfect • Expression of (justified) confidence • Not formally represented in Bloom’s Taxonomy of the Cognitive domain, but important for assessing Mastery • Formative drills

  29. Skill in using Processes & Procedures Example of a process-based item (JISC/REAQ Report, 2009)

  30. ABILITY TO MAKE TRANSLATIONS • Put an idea in your own words or use new examples of what is learned • Transform a term, fact, rule, principle, process or procedure from one form to another • Take a phenomenon presented in one mode/form and represent it by an equivalent form/mode • Move from a verbal to a symbolic form • Determine when a new illustration is appropriate or not • Move from a concrete to a more abstract form, or from a general to a more specific illustration, and vice versa.

  31. Testing ABILITY TO MAKE TRANSLATIONS (Bloom 1956: COMPREHENSION) Example: testing a participant’s ability to • Interpret (native Spanish speaker) or • Translate (Second language speaker) parts of speech from English to Spanish (School of Humanities, University of Southampton)

  32. ABILITY TO MAKE APPLICATIONS • Recognise the essentials of the problem • Use a rule/principle learned in one context to solve a problem presented in a new or unfamiliar context • Identify rules/principles/generalisations relevant to a problem • Use ideas to solve a problem which is different from those previously encountered in the instruction or instructional materials • The most complex of the Formative categories – depends on the other classifications, but requires application of ideas in new situations or problems

  33. Testing ABILITY TO MAKE APPLICATIONS (Bloom 1956: APPLICATION) Example: testing a participant’s ability to (i) solve problems (in this case, using Wien’s Law) … (ii) use facts, rules and principles (School of Physics and Astronomy, University of Southampton)

  34. Testing APPLICATION skills • Solve a problem • Apply information to produce a (reasonable) result • Apply facts, rules and principles: • How is...an example of...? • How is...related to...? • Why is...significant?

  35. Testing ANALYTICAL skills • Subdivide something to show how it is put together • Find the underlying structure of a communication • Identify motives • Separate something into component parts: • What are the parts or features of...? • Classify...according to... • Outline/diagram... • How does...compare/contrast with...? • What evidence can you list for...?

  36. Testing ANALYTICAL skills This example tests a participant’s ability to classify data according to specific criteria. Q. Which countries' statistics are being reported in A, B and C? 1. A=South Korea; B=Kenya; C=Canada 2. A=Sri Lanka; B=Germany; C=Thailand 3. A=Sri Lanka; B=Thailand; C=Sweden* 4. A=Namibia; B=Portugal; C=Botswana (CASTLE project, University of Leicester)

  37. Testing ANALYTICAL skills Example: testing a participant’s ability to • classify a set of designs … • according to specific criteria (Winchester School of Art, University of Southampton)

  38. Testing SYNTHETIC skills • Create original products in verbal or physical form • Combine ideas to form a new whole: • What would you predict/infer from...? • What ideas can you add to...? • How would you create/design a new...? • What might happen if you combined...? • What solutions would you suggest for...?

  39. Testing SYNTHETIC skills Q. The picture shows a cube that I have made. Which one of the shapes below, if cut out and folded, could make a cube the same as mine? Example: testing a participant’s ability to predict/infer the 3D appearance of a 2D net. Requires the abstract abilities to accurately reconstruct solids, rotate them about three axes and combine the results with a predicted model. (Thinking Skills Admission Tests, University of Cambridge)

  40. Testing SYNTHETIC skills Example: testing a participant’s ability to • Create or design an … • improved and more practical design Note the differences from the previous example (School of Art, University of Southampton)

  41. Testing EVALUATION skills • Make value decisions about issues • Resolve controversies or differences of opinion • Develop opinions, judgements or decisions: • Do you agree that...? • What do you think about...? • Place the following in order of priority... • What criteria would you use to assess...? • What are the most important aspects of...?

  42. Testing EVALUATION skills Example: testing a participant’s ability to evaluate the link between cause and effect in terms of predefined criteria. "The United States took part in the Gulf War against Iraq BECAUSE of the lack of civil liberties imposed on the Kurds by Saddam Hussein's regime." • The assertion and the reason are both correct, and the reason is valid • The assertion and the reason are both correct, but the reason is invalid* • The assertion is correct but the reason is incorrect • The assertion is incorrect but the reason is correct • Both the assertion and the reason are incorrect. (CASTLE toolkit, University of Leicester)

  43. Testing EVALUATION skills Example: testing a participant’s ability to • make value decisions and • develop judgements (School of Art, University of Southampton)

  44. Testing EVALUATION skills Q. Briefly list and explain the various stages of the creative process. A student wrote the following: “The creative process is believed to take place in five stages, in the following order: orientation when the problem must be identified and defined, preparation when all the possible information about the problem is collected, incubation when no solution seems in sight and the person is often busy with other tasks, illumination when the person experiences a general idea of how to arrive at a solution to the problem and finally verification when the person determines whether the solution is the right one for the problem.” How would you evaluate this answer? A.* EXCELLENT (all stages correct, in order, with clear AND correct explanations) B. GOOD (all stages correct, in right order BUT explanations unclear) C. MEDIOCRE (stages missing/in wrong order OR explanations unclear/irrelevant) D. UNACCEPTABLE (more than two stages missing AND the order is incorrect AND the explanations are unclear AND/OR irrelevant) (University of Oregon)
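Items like this keep their formative value when the grading bands are treated as data, so the same criteria drive both the on-screen options and the feedback shown after a response. The Python sketch below is a hypothetical illustration; the band texts are paraphrased from the slide, and the representation is an assumption rather than the original system's format.

```python
# A sketch of encoding the slide's grading bands as data so a response
# can be met with feedback contrasting it against the keyed band.
# Band texts paraphrase the slide; the structure itself is an assumption.
RUBRIC = {
    "A": ("EXCELLENT", "all stages correct, in order, with clear and correct explanations"),
    "B": ("GOOD", "all stages correct, in the right order, but explanations unclear"),
    "C": ("MEDIOCRE", "stages missing/in wrong order, or explanations unclear/irrelevant"),
    "D": ("UNACCEPTABLE", "more than two stages missing, wrong order, explanations unclear/irrelevant"),
}
CORRECT = "A"  # the keyed band for this sample answer

def feedback(choice):
    """Return feedback contrasting the chosen band with the keyed one."""
    label, criteria = RUBRIC[choice]
    if choice == CORRECT:
        return f"Correct: {label} – {criteria}."
    key_label, key_criteria = RUBRIC[CORRECT]
    return f"You chose {label}, but the keyed band is {key_label}: {key_criteria}."

print(feedback("C"))
```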

  45. Mastering the art of item authoring It is important to distinguish between • Reducing the scope for guessing • Testing learning outcomes at relevant levels of abstraction • Varying the difficulty of items

  46. Varying the difficulty of items Some tried and tested ways to increase the difficulty of four commonly used problem types: • Premise – Consequence: provide more than one premise • Case Study/extended matching: increase the sophistication of the Case Study • Incomplete Scenarios: increase the sophistication of the Scenario, create additional gaps • Problem/Solution Evaluations: increase the complexity of the problem AND/OR solutions

  47. Some questions to finish with • In what ways might you test LLOs with CAA tools in your subject? • What are the drawbacks of using CAA tools to test LLOs? • In what ways could you test HLOs with CAA tools in your subject? • What are the drawbacks of using CAA tools to test HLOs? • How might you compensate for the limitations of basic CAA tools? • Are higher levels of abstraction intrinsically more difficult to test? • Is it possible to have a generic process by which paper-based questions testing HLOs could be converted to online delivery?
