Assessment Design and E-learning

Geoffrey Crisp

ALTC National Teaching Fellow

Director, Centre for Learning and Professional Development

University of Adelaide

Assessment 2.0 examples from ALTC Fellowship

http://www.transformingassessment.com

4/09/2014

Outline of presentation

introduction to learning and assessment

e-assessment design

evaluating your assessment questions

future assessment tasks

Typical learning and assessment today?

http://www.flickr.com/photos/cristic/359572656/

http://www.pharmtox.utoronto.ca/Assets/Photos/pcl473+classroom+editted.JPG

“Big problem” learning and assessment

http://www.nasaimages.org/luna/servlet/detail/nasaNAS~9~9~58363~162207:

Assessment tasks should be worth doing

if students can answer questions by copying from the web, they are being set the wrong questions

if students can answer questions by using Google, they are being set the wrong questions

if students can answer questions by guessing, they are being set the wrong questions

Why the hell am I doing this course?

Students’ perception of learning strategies?

http://blog.oregonlive.com/pdxgreen/2008/02/chimp.jpg

Operationalising assessment for current and future learning

Authentic tasks

Authentic tools

Meaningful feedback

Self-review and critique

Standards

Life-long learning

Learning-oriented assessment

Outline of presentation

introduction to learning and assessment

e-assessment design

evaluating your assessment questions

future assessment tasks

What role for (virtual) learning management systems?

http://4.bp.blogspot.com/_hBiBaUg_1rA/SJTQE0ymK7I/AAAAAAAABxI/OIKhMiaaQ2E/s400/confusing_signs2.jpg

http://www.teach-ict.com/ecdl/module_1/workbook15/miniweb/images/stressed.jpg

Effective assessment design

Graham Gibbs, David Nicol and David Boud

Nicol, D E-assessment by design: using multiple-choice tests to good effect. Journal of Further and Higher Education, 31(1) 2007, 53–64

JISC - Reports and papers on (e)-assessment

http://www.jisc.ac.uk/whatwedo/programmes/elearning/assessment/digiassess.aspx

Report on Summative E-Assessment Quality (REAQ)

MCQ (selected response) effort is front loaded

start with easier questions and make later questions more difficult

checking assessments with subject matter experts and high performers

identifying ‘weak’ questions and improving or eliminating them

http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf

Report on Summative E-Assessment Quality (REAQ)

reviewing question content to ensure syllabus coverage

assisting academics who may have limited experience of psychometrics

attending to security

using accessibility guidelines

http://www.jisc.ac.uk/media/documents/projects/reaqfinalreport.pdf

Why do we assess students?

it encourages (current and future) learning

it provides feedback on learning to both the student and the teacher

it documents competency and skill development

it allows students to be graded or ranked

it validates certification and licence procedures for professional practice

it allows benchmarks to be established for standards

hasn’t changed much for decades in some cases

Types of assessment responses

convergent type, in which one ‘correct’ response is expected, and divergent type, in which the response depends on opinion or analysis

assessment requiring convergent responses has its origins in mastery-learning models and involves assessment of the learner by the master-teacher

assessment requiring divergent responses is associated with a constructivist view of learning, where the teacher and learner engage collaboratively within Vygotsky’s zone of proximal development

You need to think about:

whether a norm-referenced or criterion-referenced assessment scheme is more appropriate for the particular learning outcomes

whether the process of solving a problem and the product of solving a problem are both assessed, and what the relative weighting for the two components is

whether constructed or selected responses are appropriate
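The norm- vs criterion-referenced distinction can be made concrete in a few lines of code. This is an illustrative sketch only; the student names, grade cut-offs and pass fraction are invented for the example, not taken from the presentation:

```python
# Hypothetical cohort of raw scores for illustration.
scores = {"alice": 82, "bob": 67, "carol": 74, "dan": 59}

def criterion_grade(score, cutoffs=(85, 75, 65, 50)):
    """Criterion-referenced: grade against fixed standards,
    independent of how the rest of the cohort performed."""
    for grade, cutoff in zip("ABCD", cutoffs):
        if score >= cutoff:
            return grade
    return "F"

def norm_grades(scores, pass_fraction=0.5):
    """Norm-referenced: outcome depends on rank within the cohort
    (here, an assumed 'top half passes' rule)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n_pass = round(len(ranked) * pass_fraction)
    return {s: ("pass" if i < n_pass else "fail") for i, s in enumerate(ranked)}
```

The same raw score can earn a different outcome under the two schemes, which is exactly why the choice should follow from the learning outcomes rather than habit.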

Why use e-assessment for selected responses?

flexibility in test delivery

providing timely feedback

easy reporting and analysis of student responses

construction of questions is straightforward, but designing good assessment items is difficult

can reduce overall workload for academics, but effort is frontloaded

Design for MCQ exams for summative use

use of question banks

security

guessing
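The guessing problem is often handled with the classic correction-for-guessing formula, score = R − W/(k−1) for k-option items; a minimal sketch, not something prescribed by the presentation:

```python
def corrected_score(n_right, n_wrong, n_options):
    """Correction for guessing on MCQs: each wrong answer costs
    1/(k-1) marks, so blind guessing has an expected score of zero."""
    return n_right - n_wrong / (n_options - 1)

# e.g. 10 right and 6 wrong on 4-option items: 10 - 6/3 = 8 marks
```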


Certainty-Based Marking (CBM)

Tony Gardner-Medwin - Physiology (NPP), UCL

  • CBM rewards thinking:
    • identification of uncertainty
    • or of justification
  • Highlights misconceptions
    • negative marks hurt!
  • Engages students more
  • Enhances reliability & validity
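The mark scheme used in Gardner-Medwin's LAPT system awards 1/2/3 marks for a correct answer at certainty levels C = 1/2/3, and 0/−2/−6 for a wrong one. A short sketch (function names are mine) shows why reporting your true confidence maximises the expected mark:

```python
# Certainty level -> (mark if correct, mark if wrong), per the LAPT scheme.
CBM_MARKS = {1: (1, 0), 2: (2, -2), 3: (3, -6)}

def cbm_mark(correct, certainty):
    """Mark for a single answer under certainty-based marking."""
    right, wrong = CBM_MARKS[certainty]
    return right if correct else wrong

def best_certainty(p):
    """Certainty level that maximises the expected mark for a student
    who believes they are right with probability p."""
    return max(CBM_MARKS, key=lambda c: p * CBM_MARKS[c][0] + (1 - p) * CBM_MARKS[c][1])
```

The break-even points fall at p = 2/3 (between C=1 and C=2) and p = 0.8 (between C=2 and C=3), so overstating confidence on shaky knowledge is penalised; this is the "negative marks hurt" point above.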

Nuggets of knowledge / Networks of understanding

[Diagram: scattered "nuggets" of knowledge are linked by inference from evidence into networks of understanding, which in turn support confidence (degree of belief) and choice.]

Thinking about uncertainty and justification stimulates understanding; confidence-based marking places greater demands on justification, stimulating understanding.

To understand = to link correctly the facts that bear on an issue.

Outline of presentation

introduction to learning and assessment

e-assessment design

evaluating your assessment questions

future assessment tasks

Report on Summative E-Assessment Quality (REAQ)

The design principles for preparing quality assessment tasks in higher education have been well documented (Biggs, 2002; Bull & McKenna, 2003; Case & Swanson, 2001; Dunn, Morgan, Parry & O'Reilly, 2004; James, McInnis & Devlin, 2002; McAlpine, 2002a; PASS-IT)

There is also an extensive body of work in the discipline of validity and reliability testing for assessments and there are numerous descriptions that are readily available for academics on how to apply both psychometric principles and statistical analyses based on probability theories in the form of Classical Test Theory and Item Response Theory, particularly the Rasch Model (Baker, 2001; Downing, 2003; McAlpine, 2002b; Wright, 1977)
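As a concrete illustration, the Rasch Model mentioned above gives the probability of a correct response as a logistic function of the gap between person ability θ and item difficulty b; a minimal sketch:

```python
import math

def rasch_p(theta, b):
    """Rasch (one-parameter logistic) model: probability that a person
    of ability theta answers an item of difficulty b correctly,
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When ability equals difficulty the probability is exactly 0.5; each additional logit of ability multiplies the odds of success by a factor of e.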

Report on Summative E-Assessment Quality (REAQ)

Thus, there is no shortage of literature examples for academics to follow on preparing and analysing selected response questions; academics and academic developers should be in a position to continuously improve the quality of assessment tasks and student learning outcomes

However, the literature evidence for academics and academic developers generally using these readily available tools and theories is sparse (Knight, 2006)

Analysing student responses

Crisp, G.T. & Palmer, E.J. Engaging academics with a simplified analysis of their multiple-choice question (MCQ) assessment results. Journal of University Teaching & Learning Practice, 4(2), 2007, Article 4

Survey of academics

Academics were familiar with common statistical terms such as mean, median, standard deviation and percentiles

Some were familiar with the different types of terms used to describe validity, but very few were aware of the formal psychometric approaches associated with Classical Test Theory, the Rasch Model or Item Response Theory

How effective are distracters?

as a rule of thumb, each distracter should attract roughly 20-30% of the incorrect responses; a distracter that almost nobody selects adds nothing to the question
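Checking distracter effectiveness is a simple frequency count over the options students actually chose; a sketch (the option labels are illustrative):

```python
from collections import Counter

def distracter_rates(choices, key):
    """Fraction of all responses attracted by each option other than
    the keyed answer. A distracter no one picks is doing no work."""
    counts = Counter(choices)
    return {opt: n / len(choices) for opt, n in counts.items() if opt != key}
```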

Outline of presentation

introduction to learning and assessment

e-assessment design

evaluating your assessment questions

future assessment tasks

Process of problem solving - IMMEX

http://www.immex.ucla.edu

IMMEX output

Kong, S.C., Ogata, H., Arnseth, H.C., Chan, C.K.K., Hirashima, T., Klett, F., Lee, J.H.M., Liu, C.C., Looi, C.K., Milrad, M., Mitrovic,A., Nakabayashi, K., Wong, S.L., Yang, S.J.H. (eds.) (2009). Proceedings of the 17th International Conference on Computers in Education

Role Plays

http://www.ucalgary.ca/fp/MGST609/simulation.htm

http://www.roleplaysim.org/papers/default.asp?Topic=toc9

What Happens in a Role Play?

[Cycle: Adopt a role → Interaction & debate → Issues & problems occur → Reflection & learning]

Scenario-based learning

http://www.pblinteractive.org

Future assessments?
  • Will we see universal development of immersive and authentic learning and assessment environments?
  • Will assessments measure approaches to problem solving and student responses in terms of efficiency, ethical considerations and the involvement of others?
  • Will teachers be able to construct future assessments or will this be a specialty activity?

Assessment 1.0 v Assessment 2.0

Assessment 1.0 → Assessment 2.0

Given → Negotiated
Done alone → Done collaboratively
Descriptive → Researched/Deep
Text → Text/audio/video
Closed book → Open web
Done in class → Done anywhere
Teacher assessed → Self- and peer-assessed

http://www.scribd.com/doc/461041/Assessment-20