Assessment and writing reports

Assessment and Writing Reports

HEA Psychology Network Workshop: Teaching Qualitative Research Methods To Undergraduate Students, York University, 23-24 April 2008

Overview


A1. Assessing students

A2. Getting students to quality check

B. Objectivity, reliability and related questions of representativeness?

C. Similarities and differences between a qualitative and quantitative write-up

D. Marking criteria for qualitative practical work

A1: Assessing Students

Alasdair Gordon-Finlayson, Liverpool John Moores University

A1. Assessing students - Research Reports

  • Pro

    • Students already used to report writing

    • Get involved in all (most) of research process

    • Not difficult to assess

  • Contra

    • Qual work can be confusing

    • Time & planning

    • Ethics

A1. Assessing students - Labs

  • Data gathering

    • Interviewing / Others

    • Transcription

  • Data analysis

    • Break down into steps

  • Pro

    • Comparable between students

    • Assessing specific tasks

    • Managed schedules

  • Contra

    • Hard to assess (?)

    • Doesn’t give whole process

A1. Assessing students - Examinations

  • Unseen questions

    • ‘Gold standard’

    • Can ask about specifics

    • Con: Markers’ expertise

  • Seen questions

    • Markers can be helped

    • For asking broader questions

    • Con: Benefits students who do well at coursework, not exams

  • Others

    • MCQs?

    • Hands-on tasks?

A1. Assessing students - Other possibilities

  • Critical reviews

    • Gets students reading qual research

    • See qual research ‘in the wild’

    • But… reading isn’t doing

  • Methodology / Epistemology papers

    • Requires advanced understanding

    • Again… isn’t doing

  • Other suggestions?

A2. Getting students to quality check

Alasdair Gordon-Finlayson, Liverpool John Moores University

A2. Getting students to quality check

  • Qualitative research: a reflexive process

    • Need to get students to reflect on process as well as results

    • Need to slow analysis down

  • Debates about reliability & validity etc.

  • Qual research already has mechanisms:

    • Triangulation

    • Trustworthiness

    • Reflexivity

    • Quality checklists

A2. Triangulation

  • Between-method triangulation

    • Methodological triangulation

  • Within-method triangulation

    • Data triangulation

    • Investigator triangulation

    • Theory triangulation

A2: Trustworthiness

  • Lincoln & Guba (1985 – Naturalistic Inquiry) identify four components of ‘trustworthiness’:

    • Credibility

    • Transferability

    • Dependability

    • Confirmability

      “This paradigm, while disavowing… postpositivism, sustains, at one level, Strauss & Corbin’s commitment to the canons of good science” (Guba & Lincoln 1998, p.331)

A2: Reflexivity

Tindall (1994) usefully differentiates between:

  • Personal reflexivity:

    • Revealing, rather than concealing, our level of personal involvement and engagement

    • Reflexivity allows us a critical subjectivity, helping to ensure that our findings do not stem from unexamined prejudice

  • Functional reflexivity:

    • Critical examination of the research process itself

    • Monitoring our role as researchers and our impact on the research process

A2: Quality checklists - Example: Elliott, Fischer & Rennie (1999)

  • Owning one’s perspective

  • Situating the sample

  • Grounding in examples

  • Providing credibility checks

  • Coherence

  • Accomplishing general vs. specific research tasks

  • Resonating with readers

B: Objectivity, reliability and related questions of representativeness?

Dr Mike Forrester, University of Kent

B. Objectivity, reliability and related questions…?

  • B1. Science, social science and processes and procedures of validation

  • B2. Contemporary picture – consider the case of the methodological assumptions in conversation analysis

  • B3. Models appropriate for qualitative methods?

B1. Science, social science and processes and procedures of validation

  • (i) What constitutes a ‘reasonable’ argument supported by this or that research method?

  • (ii) Disciplinary allegiances and a focus on specific methods deemed realisable and defensible within a particular sub-domain (e.g., compare clinical psychology with face-recognition research)

B1. Science, social science and processes and procedures of validation

  • (iii) Quantitative and qualitative methods in psychology share the same orientation to being (or having):

    • Methodic – agreed set of principles and procedures regarding how research should take place

    • Formal – practices surrounding their implementation are (should be) unambiguous and clear

    • Focus on provision of data in service of a research question

    • Reliance on processes of interpretation at some level

    • Display an orientation to the importance of selection, generalisation, replication, validity and reliability

B1. Science, social science and processes and procedures of validation

  • What about the importance of selection, generalisation, replication, validity and reliability?

  • Take the case of conversation analysis:

B2. CA: Methodological assumptions and presuppositions

  • Empirical discovery, not analytic rules

  • CA presents itself as discovering [those kinds of patterns] empirically. It doesn't start with a theory about what conversationalists are up to; in fact, CA objects to other linguists' theorising before the facts. These facts are regularities, from which the CA analyst will draw inferences about the rules to which the participants are playing.

B2. CA: Methodological assumptions and presuppositions

  • Once those rules are understood by the analysts, they can look at a particular play and make a guess at what function it served the speaker. A reasonable metaphor would be learning the rules of a game by watching it over a period. You pick up certain regularities (for example, the play is stopped if the ball is passed to a striker who is behind the opposition's players) and you suppose that the players knowingly orient to the rule that underpins the regularity. Once you know that, you see why, for example, a team tries to make sure it has all its players in front of the opposition's leading player so that the ball can't be passed to them without 'off-side' being called (Antaki, 2004). 

B2. CA: Methodological assumptions and presuppositions

  • On generalisation and selection of samples for analysis:

  • From Wooffitt (2001):

  • It is only when such a collection (of sequence examples with similar properties to the one being discussed or examined) has been assembled that the analyst can begin to make stronger claims about the sequence being investigated.

  • This does not mean, however, that conversation analytic claims rest on showing that an interactional phenomenon has occurred ten, twenty or a hundred times in a corpus of data. It is not the case that CA seeks some sort of statistical or numerical measure by which to validate its analytic claims.

In CA, a collection of data is taken to be a series of (candidate) instances of a specific phenomenon, each of which is considered worthy of detailed analysis to discover how its features were interactionally produced by the participants. The objective is to identify the recurrent organizational properties exhibited by the instances in the collection. Consequently it is useful to have a collection of instances taken from a range of sources and settings. …

B2. CA: Methodological assumptions and presuppositions

  • Reliability and validity in CA.

  • Regarding the question of reliability, Seedhouse (2005) and Peräkylä (1997) point out that often the key factors surrounding reliability concern the selection of what is recorded, the technical quality of the recordings and the adequacy of the transcripts. However, what is central is the fact that the process of analysis itself depends on the reproduction of the talk as transcript. As Seedhouse (2005) puts it,

    • “because CA studies display their analyses, they make transparent the process of analysis for the reader. This enables the reader to analyse the data themselves, to test the analytical procedures which the author has followed and the validity of his/her analysis and claims… the data and the analysis are publicly available for challenge by any reader; in many other research methodologies readers do not have access to these” (p. 254).

B2. CA: Methodological assumptions and presuppositions

  • Validity in CA.

  • Here the concern is with the soundness, integrity and credibility of the findings. Seedhouse (2005) makes the point that many CA procedures are based on a concern for ensuring internal validity whilst developing an emic perspective, that is, one always grounded in the participants’ perspectives and orientations.

B2. CA: Methodological assumptions and presuppositions

  • Validity in CA.

  • Three associated points bearing on internal validity are requirements for CA analysts that they:

  • (i) pay detailed attention to the micro-detail of interaction as this is the most appropriate way to portray the emic perspective

  • (ii) avoid incorporating into the analysis any external theories of language, culture, psychology or society when trying to explain the interaction, as this would replace the emic perspective with an analyst’s perspective, unless it can be shown in the details of the interaction that the participants themselves are orienting to such theories

  • (iii) decline to invoke ‘context’ when accounting for the interaction.

Objectivity, reliability and related questions…?

  • B3. Models, that is, theoretical accounts of objectivity, reliability, generalisation and validity, appropriate for qualitative methods?

    • (a) archaeological

    • (b) social anthropological (the ethnography)

    • (c) how do conceptions of the [random] sample alter

    • (d) what is a hermeneutics of textual analysis and the role of interpretation

    • (e) building a case and the ‘reasonableness’ of exemplary sample cases

C: Similarities and differences between a qualitative and quantitative write-up

Dr Cath Sullivan, University of Central Lancashire

Conventional write-up sections

  • Abstract

  • Introduction

  • Method

  • Results

  • Discussion

D: Marking criteria for qualitative practical work

Dr Cath Sullivan, University of Central Lancashire

Suggested marking criteria include

  • Do the title and abstract convey the main aspects of the practical work in a clear manner?

  • Is there a good use of evidence?

  • Is the method section clear and comprehensive?

  • Are the aims clear and does the analysis that has been performed actually address them?

  • Are the implications (theoretical and practical) discussed appropriately?

  • Are the limitations of the method used discussed?

  • Are suggestions for future research made and do they relate sensibly to some aspect of the findings or the specific limitations of the study?

Suggested marking criteria include

  • Are the results summarised clearly in the discussion section?

  • Is it clear how these results relate to previous research and relevant theory?

  • Does the student present a thorough literature review that shows good knowledge of the area?

  • Has the student approached the material (literature and their own data) in a way that shows a capacity for independent thought and cogent argumentation?

  • Is the report written clearly with few spelling and grammatical errors?

  • Does the report, and its subsections, have an appropriate structure?

  • For textual analysis, is there evidence of the analysis of divergent or deviant cases?

Epistemological influences on evaluation

  • Reliability and validity?

  • The status of the data ?

  • How is generalisability conceptualised?

  • Reflexivity and subjectivity?

References: CA

  • Antaki, C. (2004). A very useful online introduction and tutorial on CA.

  • Peräkylä, A., & Vehviläinen, S. (2003). Conversation analysis and the professional stocks of interactional knowledge. Discourse & Society, 14(6), 727-750.

  • Sacks, H. (1984). Notes on methodology. In J. M. Atkinson & J. Heritage (Eds.), Structures of social action: Studies in conversation analysis. Cambridge: Cambridge University Press.

  • Schegloff, E. A. (1996). Confirming allusions: Toward an empirical account of action. American Journal of Sociology, 102(1), 161-216.

  • Seedhouse, P. (2005). Conversation analysis as research methodology. In K. Richards & P. Seedhouse (Eds.), Applying conversation analysis. Basingstoke: Palgrave, Macmillan.

  • Ten Have, P. (1999). Doing conversation analysis: A practical guide. London: Sage.

  • Wooffitt, R. (2001). Researching psychic practitioners: Conversation analysis. In M. Wetherell, S. Taylor & S. J. Yates (Eds.), Discourse as data. London: Sage.

References: Quality checklists

  • Henwood, K.L. & Pidgeon, N.F. (1992). Qualitative research and psychological theorizing. British Journal of Psychology, 83 (1), 97-112.

  • Elliott, R., Fischer, C.T. & Rennie, D.L. (1999). Evolving guidelines for publication of qualitative research studies in psychology and related fields. British Journal of Clinical Psychology, 38 (3), pp. 215-229. [And response by Reicher (2000)]

  • Madill, A., Jordan, A. & Shirley, C. (2000). Objectivity and reliability in qualitative analysis: Realist, contextualist and radical constructionist epistemologies. British Journal of Psychology, 91 (1), pp. 1-20. [Different criteria for different epistemological stances]

Reference: Reflexivity

  • Tindall, C. (1994). Issues of evaluation. In Banister, P., Burman, E., Parker, I., Taylor, M. & Tindall, C. Qualitative Methods in Psychology: A Research Guide (Chapter 9). Maidenhead: OUP.
