
GALA 14th INTERNATIONAL CONFERENCE

Advances in Research on Language Acquisition and Teaching. Thessaloniki, 14-16 December 2007. “Reflections in the mirror: the contribution of self and peer assessment in the teaching of speaking skills”. AEGINITOU V., NTELIOU E., VLAHOYANNI N.


Presentation Transcript


  1. GALA 14th INTERNATIONAL CONFERENCE Advances in Research on Language Acquisition and Teaching Thessaloniki, 14-16 December 2007

  2. “Reflections in the mirror: the contribution of self and peer assessment in the teaching of speaking skills” AEGINITOU V., NTELIOU E., VLAHOYANNI N. CHAROKOPIO UNIVERSITY, ATHENS

  3. Overview
  1. Introduction
  1.1 Peer assessment: Benefits
  1.2 Self-assessment
  2. Our study
  2.1 Background
  2.2 Context and purpose
  2.3 Methodology
  2.4 Discussion and results
  2.4.1 Students’ presentations
  2.4.2 Questionnaires
  2.4.3 Tutorials
  3. Implications – Conclusions
  4. Bibliography

  4. 1.1 Peer assessment: Benefits
  a. development of professional skills
  b. students’ involvement in the learning process
  c. better rapport between speaker and audience
  d. increased objectivity of results
  (Boud & Holmes, 1995; Stefani, 1998; Lejk et al., 1999; Magin & Helmore, 2001; Falchikov, 1986, 1995; Magin & Churches, 1989; Mockford, 1994; Lynch, 1988)

  5. 1.2 Self assessment: Benefits
  • Monitoring of learning and progress
  • Setting goals for the future
  • Encouraging responsibility for learning
  • Promoting critical thinking
  • Constructing and reconstructing knowledge
  • Bridging the gap between high and low achievers
  (Carr, 2002; Harlen & Winter, 2004)

  6. 2. Our study 2.1 Background: Pilot study 2005 • Subjects: EAP/ESP undergraduates • Purpose

  7. 2.2 Context and purpose Research Questions • Is there a significant level of agreement between the tutors’ and the students’ assessment of oral presentation skills? • Is peer evaluation motivating and useful? • To what extent is self assessment enhanced by peer assessment?

  8. 2.3 Methodology
  • Prior training: presentation of their own strengths and weaknesses; assessment checklists (different for tutors and students); audio-taped sample presentations
  • Students’ presentations & questionnaire completion
  • Tutorials
  • Statistical tools (SPSS, Matlab): Cohen’s Kappa statistic, Spearman correlation, McNemar–Bowker test
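Slide 8 names the statistical tools but the slides do not show how they were run. As a minimal, hypothetical sketch (not the authors' SPSS/Matlab code, and with made-up ratings), Spearman's rank correlation between two raters' ordinal scores can be computed in pure Python:

```python
def rankdata(xs):
    """Assign 1-based ranks to xs, averaging ranks over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over the run of tied values starting at position i.
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical ordinal ratings coded No=0, Quite=1, Yes=2:
students = [2, 1, 2, 0, 1]
tutors = [1, 1, 2, 0, 2]
print(round(spearman(students, tutors), 3))  # 0.556
```

The tie-averaged ranks matter here because three-level checklist scores (No/Quite/Yes) are heavily tied, so the shortcut formula based on squared rank differences would be biased.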

  9. Assessment criteria
  • CONTENT: content relevant to title / clear central idea / topic well supported / proper use of sources
  • ORGANIZATION: clear introduction / main points coherently stated / main points cohesively stated / relevant conclusion
  • LANGUAGE: accurate and clear / vocabulary appropriate to topic / technical vocabulary clearly explained / use of transitions / comprehensible pronunciation
  • PRESENTATION TECHNIQUES: speed / loudness of voice / eye contact
  • VISUAL AIDS: clarity / length
  Adapted from: Rignall, M. & Fourneaux, C. 1997. Speaking (English for Academic Studies Series). UK: Prentice Hall.

  10. 2.4. Discussion and results 2.4.1. Analysis of students’ presentations

  11. Intermediate level

  12. Advanced Level

  13. Problematic variables

  14. Sources (Intermediate)

  15. Sources (Intermediate)

  16. Sources (Intermediate)

  17. Sources (Advanced)

  18. Sources (Advanced)
  Sources (students) * Sources (professors) Crosstabulation (Count)

                     Professors: No   Quite   Yes   Total
  Students: No                    5       0     0       5
  Students: Quite                 5       5     3      13
  Students: Yes                   1      10     4      15
  Total                          11      15     7      33
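To make the agreement computation on such a crosstab concrete, here is a minimal pure-Python sketch of unweighted Cohen's kappa applied to the Sources (Advanced) counts from this slide (the function name and the choice of unweighted rather than weighted kappa are my assumptions; the slides do not specify which variant was run in SPSS):

```python
def cohen_kappa(table):
    """Unweighted Cohen's kappa for a square k x k agreement table
    (rows: rater 1 categories, columns: rater 2 categories)."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal.
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    # Chance agreement: product of the marginal proportions.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    p_exp = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Sources (Advanced) crosstab: rows = students (No/Quite/Yes), cols = professors
sources = [[5, 0, 0],
           [5, 5, 3],
           [1, 10, 4]]
print(round(cohen_kappa(sources), 3))  # 0.146
```

On these counts kappa is about 0.15, "slight" agreement on the Landis & Koch (1977) scale cited in the bibliography, which is consistent with the slide's framing of sources as a problematic variable.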

  19. Sources (Advanced)

  20. Technical vocabulary (Intermediate)

  21. Technical vocabulary (Intermediate)
  Technical Vocabulary (students) * Technical Vocabulary (professors) Crosstabulation (Count)

                     Professors: No   Quite   Yes   Total
  Students: Quite                10       5     3      18
  Students: Yes                   4       4     7      15
  Total                          14       9    10      33

  (No student rated themselves “No”, so that row is absent.)

  22. Technical vocabulary (Advanced)

  23. Content relevant to title (Intermediate)

  24. Content relevant to title (Advanced)

  25. Cohesion (Advanced)

  26. Loudness (Advanced)

  27. Eye contact (Advanced)

  28. Eye contact (Advanced)
  Eye contact (students) * Eye contact (professors) Crosstabulation (Count)

                     Professors: No   Quite   Yes   Total
  Students: No                    4       0     0       4
  Students: Quite                 7       7     2      16
  Students: Yes                   2       7     4      13
  Total                          13      14     6      33
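The McNemar–Bowker test listed in the methodology checks whether such a table is symmetric, i.e. whether students and professors distribute their ratings in the same way. A minimal pure-Python sketch applied to the eye-contact crosstab on this slide (my own illustration, not the authors' SPSS output):

```python
def bowker(table):
    """McNemar-Bowker symmetry statistic for a square k x k table.
    Sums (n_ij - n_ji)^2 / (n_ij + n_ji) over off-diagonal pairs;
    pairs with n_ij + n_ji == 0 are skipped (a common convention),
    reducing the degrees of freedom accordingly."""
    k = len(table)
    stat, df = 0.0, 0
    for i in range(k):
        for j in range(i + 1, k):
            d = table[i][j] + table[j][i]
            if d:
                stat += (table[i][j] - table[j][i]) ** 2 / d
                df += 1
    return stat, df

# Eye contact (Advanced): rows = students, columns = professors
eye = [[4, 0, 0],
       [7, 7, 2],
       [2, 7, 4]]
stat, df = bowker(eye)
print(round(stat, 2), df)  # 11.78 3
```

With 3 degrees of freedom the 5% chi-squared critical value is 7.81, so on these counts the statistic points to asymmetric ratings, matching the visible pattern that students rate their own eye contact lower than the professors do in some cells and higher in others.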

  29. Eye contact (Advanced)

  30. 2.4.2. Questionnaire analysis

  31. 1. While listening to the presentations of your classmates, have you learned anything new on the topics under discussion?

  32. 2. Has the organization of the presentations helped you in the way you will organize your future presentations?

  33. 3. Have you learned useful words / expressions in your subject area?

  34. 4. Were the visual aids helpful in your future selection of relevant graphics?

  35. 5. Did you find evaluating your classmates interesting?

  36. 2.4.3. Tutorials

  37. Questions asked in the tutorials • Were you satisfied with your presentation? • In which aspects of your presentation do you feel you need more practice? Why? • In which aspects of your presentation do you feel you performed well and would not want to change anything? • Is it easy for you to assess yourself?

  38. Students’ comments “I did not do any presentations at school. I am not quite sure what I have to do”. “This is not my job. The teacher should do that”. “Before the training I did not know how to assess myself or my classmates. Now I think I can”.

  39. Analysis of students' comments • Weaknesses more easily identified than strengths • Participation in self-assessment procedures can facilitate better judgement on performance levels • More realistic goals are set for future presentations Bachman & Palmer 1989; Ready-Morfitt, 1991; Dickinson, 1987; Oscarson, 1997

  40. 3. Implications • Number of subjects • Absence from training session • Future design of self-assessment practice

  41. 3. Conclusions • Prior training positively modified the results • The beneficial combination of peer and self-assessment processes • Two problematic areas: technical vocabulary and reference to sources • Self-reflective practices should be introduced in the early stages of instruction • Future research

  42. 4. Bibliography
  • Altman, D.G. 1991. Practical Statistics for Medical Research. London: Chapman & Hall.
  • Bland, J.M. & Altman, D.G. 1986. “Statistical methods for assessing agreement between two methods of clinical measurement”. Lancet, 327, pp. 307-10.
  • Boud, D. & Holmes, H. 1995. “Peer and self marking in a large technical subject”. In: D. Boud (Ed.) Enhancing Learning through Self Assessment. London: Kogan Page, pp. 63-78.
  • Brindley, G. 2001. “Assessment”. In: Carter, R. & Nunan, D. (Eds.) The Cambridge Guide to Teaching English to Speakers of Other Languages. Cambridge: CUP, pp. 137-143.
  • Carr, S.C. 2002. “Self-evaluation: involving students in their own learning”. Reading and Writing Quarterly, 18, pp. 195-199.
  • Cohen, J.A. 1968. “Weighted kappa: nominal scale agreement with provision for scaled disagreement or partial credit”. Psychological Bulletin, 70, pp. 213-20.
  • Falchikov, N. 1986. “Product comparisons and process benefits of collaborative peer group and self assessments”. Assessment and Evaluation in Higher Education, 11, pp. 146-166.
  • Falchikov, N. 1995. “Peer feedback marking: developing peer assessment”. Innovations in Education and Training International, 32, pp. 175-187.

  43. 4. Bibliography
  • Fleiss, J.L. & Cohen, J.A. 1973. “The equivalence of weighted kappa and the intraclass correlation coefficient as measures of reliability”. Educational and Psychological Measurement, 33, pp. 613-9.
  • Harlen, W. & Winter, J. 2004. “The development of assessment for learning: learning from the case of science and mathematics”. Language Testing, 21(3), pp. 390-408.
  • Hughes, I.E. & Large, B.J. 1993. “Staff and peer-group assessment of oral communication skills”. Studies in Higher Education, 18(3), pp. 379-385.
  • Landis, J.R. & Koch, G.G. 1977. “The measurement of observer agreement for categorical data”. Biometrics, 33, pp. 159-74.
  • Lejk, M. et al. 1999. “Group assessment in systems analysis and design: a comparison of the performance of streamed and mixed-ability groups”. Assessment and Evaluation in Higher Education, 24, pp. 5-14.
  • Lynch, T. 1988. “Peer evaluation in practice”. In: A. Brookes & P. Grundy (Eds.) Individualisation and Autonomy in Language Learning. ELT Documents 131.
  • Magin, D. & Churches, A. 1989. “Using self and peer assessment in teaching design”. Proceedings, World Conference on Engineering Education for Advancing Technology, Institution of Engineers, Australia, 89/1, pp. 640-644.
  • Magin, D. & Helmore, P. 2001. “Peer and teacher assessments of oral presentation skills: how reliable are they?”. Studies in Higher Education, 26(3), pp. 287-298.
  • Mockford, C. 1994. “The use of peer group review in the assessment of project work in higher education”. Mentoring and Tutoring, 2, pp. 45-52.
  • Rignall, M. & Fourneaux, C. 1997. Speaking (English for Academic Studies Series). UK: Prentice Hall.
  • Stefani, L. 1998. “Assessment in partnership with learners”. Assessment and Evaluation in Higher Education, 23, pp. 339-350.
  • Streiner, D.L. & Norman, G.R. 1995. Health Measurement Scales: A Practical Guide to their Development and Use, 2nd edn. Oxford: Oxford University Press.

  44. MERRY CHRISTMAS
