Evaluating audio feedback for summative and formative assessment

Derek France ([email protected])

Kenny Lynch ([email protected])


Outline

  • Objectives

  • Brief context

  • Chester examples

  • Gloucestershire examples

  • Examples

  • Drawing it all together


  • In small groups assess your expectations of the benefits of podcasted feedback for staff and students

    In 2 minutes

  • In small groups assess your expectations of the challenges for staff and students

    In 2 minutes


Objectives

  • To evaluate podcasting for summative, formative and generic feedback

  • To provide an evidence base for colleagues on how to integrate podcasted feedback into the curriculum


Brief context: assessment and feedback

  • Assessment – central to the student experience:

  • “frames learning, creates learning activity and orients all aspects of learning behaviour” (Gibbs, 2006, 23).

  • Feedback – central to learning from assessment:

  • “feedback quantity and quality are probably the most important factors in enhancing students’ learning” (Race, 1999, 27).

  • However:

  • “the literature on student experiences of feedback tells a sorry tale” (Handley et al, 2007, 1).

  • “many students commented on ‘cryptic’ feedback which often posed questions, but gave no indication of where they went wrong” (GfK, 2008, 8).


Brief Context: Literature

  • The modern-day undergraduate entering university is more technologically capable than ever before and has been defined as a ‘digital native’ who has grown up with digital technology and is able to perform multiple tasks simultaneously (Prensky, 2001).

  • Oblinger and Oblinger (2005) characterise modern students as the ‘net generation’: digitally literate, highly familiar with the Internet, highly social, craving interactivity in image-rich environments, and thinking not in terms of technology but in terms of the activity that technology enables.


Brief Context: Literature (continued)

  • ‘Greater focus on technology will produce real benefits for all’ (Department for Education and Skills, UK, 2005, p.2)

  • HEFCE, UK (2009, p.6) more cautiously states that ‘focus should be on student learning rather than on developments in technology per se, enabling students to learn through, and be supported by technology’

  • Prensky (2009) now advocates ‘Digital Wisdom’ and ‘Digital Enhancement’


Models of Podcasting

(Nie, 2007)

  • Model 1: Support lectures – screencasting and podcasting lectures; lecture summaries; pre-lecture listening materials (complex concepts); lecture recordings

  • Model 2: Support fieldwork – “iWalk” location-based information; instruction on technique and equipment use; video footage to prepare for field trips; digital story-telling

  • Model 3: Support 3-dimensional learning – anatomical specimens (structures, tissues, dissections)

  • Model 4: Support practical-based learning – software teaching and learning (replacing text-based instructions)

  • Model 5: Assessment tool – student-created podcasts based on field trips; student-created podcasts addressing climate change

  • Model 6: Provide feedback

  • Model 7: Supplement lectures – bring in topical issues; guidance and tips on assessment tasks; skills development; supplement online teaching


A Framework for Developing Podcast Content

  • Purpose: extension to lectures; support practical work; support fieldwork; bring topical issues; supplement online teaching; develop students’ study skills; assessment; …

  • Convergence: integrated with VLE; stand-alone

  • Developer: lecturers; tutors; students; senior students; others (experts)

  • Medium: audio; video

  • Reusability: temporal (immediacy, “alive”); reusable

  • Structure: single session; multiple sessions

  • Length: short (10 minutes or less); longer (10+ minutes)

  • Style: formal (lecture); informal (conversation, discussion)

  • Capacity: large student cohorts; small groups of students

  • Frequency: weekly; fortnightly; monthly; regularly

(Nie, 2007)


Chester examples


The case study

  • One year, 2008 – 2009:

  • Two modules – Level 4 (69 students); Level 6 (34 students).

  • One formative and summative assessment exercise (L6) and four generic large-group feedback opportunities (L4).

  • For each assignment:

  • Summative (Sm): generic overview commentary combined with bespoke feedback on the group presentation

  • Formative (Fm): informal podcast based on the e-postcard

  • Sm and Fm sent to the feedback section of each student’s VLE-based e-portfolio

  • Large-group generic feedback on four coursework assessments was placed in the online module space.


Feedback Uploading & Tracking

  • Feedback portal within the institutional VLE

  • Upload via modular e-learning areas



Methods of evaluation

1. Pre-feedback questionnaire:

Experience of podcasts; current views about feedback and expectations.

(L4, 58, 90% response rate.) (L6, 28, 82% response rate.)

2. Post-feedback questionnaire:

Engagement and perceptions.

(L4, 30, 46% response rate.) (L6, 29, 85% response rate.)

3. Focus group discussions:

Exploring emerging themes in more detail.

(one at L6: 6 students; one at L4: 8 students.)


Prior experiences

  • ‘Confidence’ in using IT was high (over 90% of students)

  • Pre-university podcasting experience was relatively low (37%) compared with 82% among final-year students

  • Prior negative feedback experiences: L4, 17%; L6, 13%



Summative Feedback

[Results chart: N = 29]


Generic Large Group Feedback

[Results chart: N = 30]


Summative versus Formative versus Generic

  • All three forms of podcasted feedback were valued by students

  • Formative feedback was generally more appreciated than summative because of its immediacy and potential to improve grades

  • Feed-forward issues with summative feedback were also highlighted

  • Large-group generic feedback was appreciated; students recommended that it continue and saw it as better than front-of-class feedback (less embarrassment).



Project aims

GEES-funded small project, November 2008 – March 2009, with the aims to:

  • develop a straightforward procedure for creating and delivering audio feedback;

  • follow a group of academics through the process of introducing audio feedback in a range of modules; and

  • evaluate the experience


Project members

  • Bill Burford (Landscape)

  • James Kirwan (CCRI)

  • Dave Milan (Geography)

  • Chris Short (Geography)

  • Claire Simmonds (Broadcast Journalism)

  • Elisabeth Skinner (Community Development)

  • Alan Howe (Social Work)


The project activities

  • Levels 1 to 3 and Masters (M) included

  • On-campus and distance

  • Class sizes ranged from 12 to 45

  • Essays, team-based papers, TV journalism package

  • E-mail, WebCT, Pebblepad

  • All used for feeding back on summative assessments

  • Purchased Sony ICD-UX80 Recorders


Staff responses

  • Initially added to workload, but as staff became used to it, generally perceived as neutral [maximum?]

  • Initial concern about content preparation led to scripting, but staff gradually moved towards notes/marking sheets and spontaneous recording [skill development and confidence]

  • Concerns about accuracy of delivery – mistakes were made in sending files to students

  • Need for careful management of the medium – tone of voice, intimacy, trust


Issues

  • Quality – FASQ, mark moderating

  • Security, privacy & identity – misdirected files, archive, anonymous marking, team-based feedback

  • Handling grades – on recording or on work?


Future development

  • More detailed capture of student responses – in relation to different experiences e.g. discipline, location (VLE, e-mail, e-portfolio), level

  • Spread the approach – other disciplines, dissertation feedback?

  • Possible audio template (lower entry barrier)

  • Procedures for minimising misdirection


Student responses

  • Overwhelmingly positive from the students – especially distance learners

  • Even a profoundly hearing-impaired student responded positively

  • Students described it as personal, intimate and well thought out


Engagement with the feedback

  • Responsiveness to receiving information verbally:

  • “Don’t just briefly read it, you actually listen to it and take it in.”

  • “Novel, hearing voice 'goes in' better than just reading.”

  • “Better, goes in more. Can remember feedback from podcast but not from written.”

  • Greater sensitivity to the spoken word:

  • ‘I liked the feedback for what it was, but I also found it a bit depressing. It was very personal… I felt I let you down’.

  • “Any criticism will hit home more.”

  • “May be harder to hear a poor mark, rather than in writing.”

  • [I am least looking forward to] “hearing disappointment in their voices.”


Nature and content of the feedback

  • The potential for more depth and detail:

  • Over 70% of students commented on this…

  • ‘it felt really long. If you’d written this out it would have felt like a whole book. I really got a lot out of it, though’.

  • Hearing your voice seems to make the course seem closer, less distance.

  • More personalised:

  • “This feedback felt that the work had really been looked at and evaluated personally.”

  • ‘I listened to this at home and it felt like you were in the room with me and I wasn’t totally comfortable with that’.

  • More understandable?

  • You get “the tone of voice with the words so you could understand the importance of the different bits of feedback.”


Action Plan

  • What have I learnt?

  • What am I going to do next?

  • What 3 things can you feed back to colleagues?


Potential to do more harm than good?

Accepted characteristics of good feedback (irrespective of method of delivery)…

  • Facilitates the development of self assessment (reflection) in learning

  • Encourages teacher and peer dialogue around learning

  • Helps clarify what good performance is (goals, criteria, expected standards).

  • Provides opportunities to close the gap between current and desired performance

  • Delivers high quality information to student about their learning

  • Encourages positive motivational beliefs and self esteem

  • Provides information to teachers that can be used to help shape the teaching.

    Juwah et al (2004)


Conclusion

  • Opportunity to diversify assessment feedback strategies.

  • Adherence to well-established guidance on assessment design/timing and feedback content/style remains critical.

  • If used strategically, potential to enhance learning from assessment.

  • The potential to engage students with podcasted feedback irrespective of group size.



References

  • Gibbs, G. (2006). How assessment frames student learning. In C. Bryan and K. Clegg (Eds.), Innovative Assessment in Higher Education (pp 23-36). London: Routledge.

  • GfK (2008). NUS/HSBC Students Research. GfK Financial, London. Study Number 154021.

  • Handley, K., Szwelnik, A., Ujma, D., Lawrence, L., Millar, J. & Price, M. (2007). When less is more: Students’ experiences of assessment feedback. Paper presented at the Higher Education Academy Annual Conference, July 2007. Retrieved June 5, 2008 from http://www.heacademy.ac.uk/assets/York/documents/events/conference/E5.doc

  • Juwah, C., Macfarlane-Dick, D., Matthew, B., Nicol, D., Ross, D. & Smith, B. (2004). Enhancing student learning through effective formative feedback. York: Higher Education Academy. www.heacademy.ac.uk/assets/York/documents/resources/resourcedatabase/id353_senlef_guide.pdf

  • Nie, M. (2007). Podcasting for GEES Subjects. Paper presented at the IMPALA 2 workshop, Dec 2007. Retrieved June 5, 2008 from http://www2.le.ac.uk/projects/impala2/presentation/2nd%20Workshop/Presentations/Ming%20Nie

  • Race, P. (1999). Enhancing student learning. Birmingham: SEDA.

  • Salmon, G. & Edirisingha, P. (Eds.) (2008). Podcasting for Learning in Universities. Maidenhead: Open University Press. Including companion website: http://www.atimod.com/podcasting/index.shtml

