
Qualitative Methods For Research



Presentation Transcript


  1. Qualitative Methods For Research Dr Susan Gasson, College of Information Science & Technology, Drexel University. Email: sgasson@cis.drexel.edu

  2. Agenda • What is qualitative research? • Issues of rigor and differences from quantitative research • Methods for qualitative analysis • Data collection methods • Analysis methods • A Study of Knowledge Management in a Boundary-Spanning, Global IS Devt. Group • Rigor and validity issues • Exercise: coding qualitative data • Useful resources and references

  3. What is qualitative analysis? • Non-quantifiable (or non-quantified) data are analyzed using a variety of methods, to understand patterns in the data. • Whereas quantitative data are analyzed statistically, qualitative data are organized, categorized (coded) and then analyzed through inferential reasoning processes. • Organization of qualitative data involves identification of relevant data samples, e.g. • sections from tape-recorded interviews • time-stamped episodes from a video-recorded activity • field notes from observed behavior in the situation being studied.

  4. Example: Coding Observations • Categorize a description of the voting process in a specific country. • Focus is on (i) how the vote-counting process works, (ii) the reliability of the process (iii) the role of technology. • Code each new idea in the printout (may be a sentence or may be a paragraph) with • Category code (may have >1) • Attribute(s) of the category
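To make the mechanics concrete, here is a minimal sketch (Python; the transcript sentence, category names, and attributes are hypothetical, not taken from the slides) of one way to represent a coded segment: each unit of text carries one or more category codes, and each code carries attributes of that category.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Code:
    category: str                                          # e.g. "Process", "Reliability", "Technology"
    attributes: List[str] = field(default_factory=list)    # e.g. ["manual", "auditable"]

@dataclass
class Segment:
    text: str                                              # the sentence or paragraph being coded
    codes: List[Code] = field(default_factory=list)        # a segment may carry more than one category

# Hypothetical example: one sentence from a description of a voting process
segment = Segment(
    text="Ballots are counted by hand in front of party observers.",
    codes=[
        Code(category="Process", attributes=["manual", "visible", "auditable"]),
        Code(category="Reliability", attributes=["trustworthy"]),
    ],
)

for code in segment.codes:
    print(code.category, code.attributes)
```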

  5. Examples: Coding Voting Description • Focus on: (i) How the vote-counting process works, (ii) The reliability of the process (iii) The role of technology (can you make any observations from this data?).

  6. Example: Coding Voting Description

  7. Coding Scheme • Process (of vote-counting): manual vs. electronic; hidden vs. visible; auditable vs. no-paper-trail • Reliability (of the process): secure vs. insecure; trustworthy vs. untrustworthy; objective vs. partisan • Technology (role of): registering vote; counting votes; tallying totals
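A companion sketch, assuming the scheme above: recording each category together with its contrasting attribute pairs keeps the agreed vocabulary explicit, so the attribute codes a coder assigns can be checked against it. The dictionary layout and the valid_attribute helper are illustrative, not part of the original slides.

```python
# Hypothetical machine-readable version of the slide's coding scheme:
# each category maps to its contrasting attribute pairs or roles.
coding_scheme = {
    "Process (of vote-counting)": [
        ("manual", "electronic"),
        ("hidden", "visible"),
        ("auditable", "no-paper-trail"),
    ],
    "Reliability (of the process)": [
        ("secure", "insecure"),
        ("trustworthy", "untrustworthy"),
        ("objective", "partisan"),
    ],
    "Technology (role of)": [
        "registering vote",
        "counting votes",
        "tallying totals",
    ],
}

def valid_attribute(category: str, attribute: str) -> bool:
    """Check that an attribute assigned by a coder belongs to the agreed scheme."""
    allowed = set()
    for entry in coding_scheme[category]:
        allowed.update(entry if isinstance(entry, tuple) else (entry,))
    return attribute in allowed

print(valid_attribute("Process (of vote-counting)", "manual"))     # True
print(valid_attribute("Reliability (of the process)", "visible"))  # False: wrong category
```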

  8. Philosophical Questions • What are you measuring, in a scientific experiment? • Does it exist independently of your perception? • Is it universal? • Is it true? • What are you measuring, in an interview or observation study of people performing daily work? • Does it exist independently of your perception? • Is it universal? • Is it true? • If you have 5 different researchers performing the same study, will they reach the same conclusions?

  9. Research Paradigms in IS & Info. Science 1. Positivist Research • Positivists generally assume that reality is objectively given and can be described by measurable properties which are independent of the observer (researcher) and his or her instruments. • Positivist studies generally attempt to test theory, to increase the predictive understanding of phenomena (hypothesis testing). 2. Interpretive/Constructivist Research • Interpretive researchers start out with the assumption that “reality” is socially constructed. Phenomena can be understood only through the meanings that people assign to them, accessed via social constructions such as language, consciousness, & shared meanings. • Interpretive research does not predefine dependent and independent variables, but focuses on the full complexity of human sense making in context as the situation emerges. 3. Critical Research • Critical researchers assume that social reality is historically constituted and that people’s ability to change their social and economic circumstances is constrained by various forms of social, cultural and political domination. • Critical research focuses on the oppositions, conflicts and contradictions in organizations and society. It is emancipatory in intent: it seeks to eliminate causes of alienation and domination.

  10. The Research Life-Cycle In Theory Generation (diagram; labels: Tests/extends theory; Generates/explores theory)

  11. Positivist vs. Interpretivist Beliefs

  12. Constructivism: The Hermeneutic Circle • Hermeneutics is (literally) the interpretation of a text: its intent, its content, and its context. • The circle moves between the whole (the big picture) and the parts (analysis of minutiae or components). • Methodologically, the assemblage of an understanding of the “whole” through an analysis of its parts, e.g. WHOLE / PART: general/typical case / instance of a complicated case; learning process / instances of learning; decision process / instances of decision making. Gadamer, H-G. (1989) "Text and Interpretation," in Dialogue and Deconstruction: The Gadamer-Derrida Encounter, edited/translated by D. P. Michelfelder and R. E. Palmer, SUNY Press, Albany, NY, pp 21-51.

  13. Use Of Multiple Methods • Most often (but not always), the term “qualitative research” refers to qualitative content analysis, performed interpretively. • A tenet of interpretivism is that the researcher “interprets” the data. • So we can use multiple qualitative methods for both data collection and data analysis, e.g. • Data collection: observation, formal interviews, interactive (facilitated analysis) interviews and workshops, document analysis, investigative surveys, etc. • Data analysis: qualitative coding (using different sets of constructs, to examine different aspects of the data), inferential analysis (usually simple frequency co-occurrence), statistical analysis, discourse analysis, etc.
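As an illustration of the "simple frequency co-occurrence" style of inferential analysis mentioned above, the following sketch (hypothetical code names and segments) counts how often each code appears and how often pairs of codes are applied to the same data segment.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded segments: each segment is the set of codes applied to it.
coded_segments = [
    {"know-how", "standardized-procedures"},
    {"know-why", "maps"},
    {"know-how", "maps", "standardized-procedures"},
    {"who-knows-what", "maps"},
]

code_freq = Counter()      # how often each code appears across segments
co_occurrence = Counter()  # how often each pair of codes appears in the same segment

for codes in coded_segments:
    code_freq.update(codes)
    for pair in combinations(sorted(codes), 2):
        co_occurrence[pair] += 1

print(code_freq.most_common())
print(co_occurrence.most_common())
```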

  14. Use Of Mixed Methods • The use of mixed methods indicates the comparison of findings across multiple data collection techniques and analysis methods. • This approach • Provides multiple perspectives of the research problem • Guards against limiting the scope of the inquiry • Yields a stronger substantiation of the derived constructs • (Cavaye, 1995; Eisenhardt, 1989; Orlikowski, 1994; Wolfe, 1994). • Mixed methods may (but do not have to) combine qualitative and quantitative analysis.

  15. Qualitative Data Collection Vs. Qualitative Analysis (matrix crossing DATA type with ANALYSIS type) Source: Bernard, H.R. (1996) ‘Qualitative Data, Quantitative Analysis’, CAM, The Cultural Anthropology Methods Journal, Vol. 8, No. 1, available at http://www.analytictech.com/borgatti/qualqua.htm

  16. Contributions of Qualitative Research The contribution of qualitative research studies in IS can be: • The development of concepts • e.g. “automate vs. informate" (Zuboff, 1988) • The generation of theory • e.g. Orlikowski & Robey (1991): organizational consequences of IT. • The drawing of specific implications • e.g. Walsham & Waema (1994): the relationship between design and development and business strategy. • The contribution of rich insight • e.g. Suchman (1987): contrast of situated action with planned activity and its consequences for the design of organizational IT. Walsham, G. (1995) ‘Interpretive Case Studies In IS Research: Nature and Method’, European Journal of Information Systems, No. 4, pp 74-81

  17. Distributed Knowledge Coordination Across Virtual Organization Boundaries Dr Susan Gasson, Edwin M. Elrod, Drexel University

  18. Knowledge Management For Virtual Collaboration Organizational KM view Knowledge-as-process • Knowledge processes are embedded within • Best practices (tacit knowledge), • Contexts (localized knowledge) and • Genres of communication (legitimate knowledge). • Effective knowledge management depends on sharing understanding that is only meaningful in the context and community of practice within which it is applied. KM Systems View Knowledge-as-thing • Knowledge can be defined independently of human action. • Knowledge can be divorced from practice • Knowledge can be abstracted into rules or algorithms, independent of context • Knowledge can be defined objectively. • Effective KM depends on knowledge capture, codification & transfer across many different places and many different CoPs. How do we resolve this tension?

  19. Research Question How are different forms of knowledge managed and coordinated across the boundaries of a virtual, global organization?

  20. eCommerce Group Functional Boundaries (diagram of functional groups: Executive Management; Vendor Projects; Europe; Technical Operations; Client-Facing Applications; Financial & Client Performance Evaluation; Backend Applications)

  21. Corporate and Geographic Boundaries (diagram of corporate/geographic units: eServCorp EU Operations; eServCorp eCommerce; VendorCorp; eServCorp EU Customer Service; eServCorp Asia Pacific; eServCorp N. American Operations; ParentCorp; eServCorp Corporate)

  22. Field Observations • Researchers observe & transcribe telephone conferences and other (face-to-face) meetings; • Supplemented with monthly ad hoc interviews with management team. • Sample statistics through June 2006 • 338 conference calls/group meetings; • Average length: 0:30 • Shortest: 0:04 • Longest: 1:35 • 8 group interviews. • Over 1000 pages of transcription • Longitudinal, ethnographic, exploratory

  23. Thematic Analysis Of Meetings (Initial) • Thematic analysis: What are the most common themes? • Categories of behavior or phenomena, meaningful in context of the study. • Are there notable exceptions? • E.g. individuals who do not discuss specific themes or who say very different things about particular topics? • What concept-categories or event-categories can be identified? • What is the range of views expressed with regard to a topic? • Can you identify any sub-categories? • Variations on your themes, further distinctions/qualifications? • What language is used? • Are there common synonyms or metaphors that indicate a specific meaning or category of behavior? • What respondent characteristics are associated with particular views? • Do people with different expertise express different views? • What patterns emerge, across various samples, or over time?
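One way to pursue these questions computationally, sketched here with hypothetical speakers, expertise labels, and themes: tally theme frequencies by respondent characteristic, and list the speakers who never raise a given theme (the "notable exceptions").

```python
from collections import defaultdict

# Hypothetical coded meeting excerpts: (speaker, speaker_expertise, theme)
coded_excerpts = [
    ("Mr EVP", "management", "best-practice"),
    ("Ms CorpSys", "technical", "release-process"),
    ("Mr ClientSys", "technical", "best-practice"),
    ("Ms Europe", "management", "strategic-sourcing"),
    ("Mr EVP", "management", "strategic-sourcing"),
]

themes_by_expertise = defaultdict(lambda: defaultdict(int))  # theme counts per expertise group
speakers_by_theme = defaultdict(set)                         # who mentions each theme

for speaker, expertise, theme in coded_excerpts:
    themes_by_expertise[expertise][theme] += 1
    speakers_by_theme[theme].add(speaker)

# Do people with different expertise express different views?
for expertise, counts in themes_by_expertise.items():
    print(expertise, dict(counts))

# Notable exceptions: speakers who never discuss a given theme
all_speakers = {speaker for speaker, _, _ in coded_excerpts}
for theme, speakers in speakers_by_theme.items():
    print(theme, "not discussed by:", sorted(all_speakers - speakers))
```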

  24. Knowledge Sharing (Johnson et al., 2002) (Polanyi, 1958) (Zack, 2001) • Observed knowledge translation and transformational activities (Star, 1989) (Carlile, 2002).

  25. Know-How Make work practices explicit through discussion and debate. Standardized Procedures Ms CorpSys: Some system reports have problems. Mr VendorTech: This was fixed in acceptance, but it didn't move with the release. Mr EVP: How many times does this happen? About 50%. Why are we paying <the vendor> for the same mess up 50% of the time? Ms CorpSys: We go through a rollout plan after every test. Moving code over always catches us. Mr ClientSys: There should be some established best practice. Mr EVP: I'm sure there's a best practice 'cause it's been going on since the 1960s.

  26. Know-Why Establish boundaries of eCommerce group. Maps Mr ClientSys: It turns out that a vendor that the EU office has – is one that everyone else uses. Mr EVP: Yes and develops stuff for everyone else and shares the information. It depends whether we consider that a system for … constitutes a competitive advantage, Ms Europe: I think that outcome analysis and project sourcing has to become a strategic area. ● ● ●

  27. Who-Knows-What • Identify relevant stakeholders in other groups. • Maps • (continuum labels on slide: Informal, distributed, social context; Formal knowledge sharing) • Ms Europe: Mr Support and June visited the French vendor, so I have asked them to do a write-up for us, so that we understand what the issues are etc. and if there is an opportunity to take some of the stuff like the product site, like the project bank for Europe, since it’s already built. But we need to look at the how we host it, where we do it – so I have asked them to write it up for us. Mr EVP: OK, let them write it up. Then let’s talk about it – you, me and Mr ClientSys. …The reason I want to discuss this other stuff - you, me and Mr ClientSys - is that I want to make sure that whatever they put together, you have vetted. With a broader understanding of the global perspective than they might have. ...

  28. Concept Map: Early Themes From Analysis of Meetings (concept-map nodes) • Project Collaboration & Knowledge • Project Knowledge Organization • Distribution Problem • Diverse set of global groups collaborate according to focus • Who-knows-what more important than who-can-do-what • Informal, distributed social context of project • Too complex for one person to understand • Formal knowledge often local and undocumented • Project roles & responsibilities change frequently • Problem emerges thro’ negotiation • Knowledge located in people’s heads • Project goals are subjective: various groups & individuals define project in different ways • Project definition is ad hoc (memory-dependent) • Group memory of project changes • Definition of project changes frequently – little coordination or persistence of knowledge (group memory)

  29. Analytical Framework: Categorize Collaborations By Modes of Organizational Problem-Solving • Well-Structured Problems • Clear problem-structure defines change requirements • Unambiguous goals for change • Knowledge accessed via pattern recognition (problem-solvers in similar domains develop a repertoire of solutions). • Ill-Structured Problems • Uncertain problem-structure indicates multiple alternative solutions • Need to bound and structure problem to analyze requirements (complexity reduction) • Explore unfamiliar knowledge-domains through consultation with experts to resolve ambiguity re change-goals and scope. • Wicked Problems • Problem emerges: has no objective definition, boundary, or structure • Stakeholders see partial subsets → multiple goals for change • Problem, solutions, scope of inquiry, and relevant expertise are negotiated (equivocality reduction). • Explore emergent knowledge-domains thro’ iterative cycles of inquiry.

  30. Three Spans of Collaboration (i)  Local coordination of projects • Core e-Commerce group manage project: define goals, scope, timescales, deliverables, and rationale • Boundaries: functional, role, geographic. (ii) Conjoint agency • Core e-Commerce group control project: act as hub, incorporating knowledge/expertise from external groups • e-Commerce define goals, scope, and responsibilities • Collaboration with hardware or software vendors, other eServCorp business units, client project groups (iii) Distributed Collaboration • e-Commerce group part of a web of collaborating groups • Goals, scope, system definitions, business-process changes negotiated, implemented, and evaluated jointly • e-Commerce group subject to joint or external project-leadership by groups from eServCorp, ParentCo., associated companies, or vendors.

  31. Knowledge coordination strategy depends on problem coordination-distance (diagram axis: Problem-Coordination Distance)

  32. Relative Incidence of Problems

  33. Modes of Organizational Problem-Solving

  34. Conclusions and Contributions • Knowledge is coordinated by means of a web of: • Functional and domain-expert roles • Distributed knowledge resources • Imposed or negotiated procedures. • Knowledge coordination strategy depends on problem coordination-distance. This concept combines organizational span of coordination with problem-type. • Central role of a cohesive group identity: • Informs semi-autonomous decision making by group members • Provides conceptual patterns for action at group boundaries • Adapted collaboratively through distributed, improvisational sense making to deal with novel situations.

  35. Two Dimensions of KM Coordination

  36. KMS Implications • Knowledge Management Systems must expand beyond communicating management decisions to embrace distributed, emergent, collaborative decision formation: • Well-structured problems require rule-based KMS. • Ill-structured problems require adaptive KMS. • Wicked problems require evolutionary & dynamic KMS, supplemented by human contact. • KMS must be supplemented with face-to-face mechanisms that permit social networks to be formed and maintained. • KMS must be supplemented with face-to-face mechanisms that permit domain expertise to be acquired and translated across domains.

  37. Analyzing Qualitative Data Principles and Practice(!)

  38. Qualitative data coding • Data are transcribed into textual form (recommended) and/or analyzed in raw form (e.g. video/audio, with items of interest identified by time-stamp). • Data analysis (coding) can take two forms: • Data are classified according to a conceptual schema or a theoretical model, which leads to explanations dependent upon, or further development of, the conceptual model • Data are classified according to patterns that emerge from interpretation of the data. As themes and patterns emerge from the data, these are tested against further data samples to derive a substantive (grounded) theory.

  39. A Question Q: If two researchers are presented with the same data, will they derive the same results if they use the same methods, applied rigorously? Let’s find out! • Organize in groups of three(-ish) people. • Discuss themes arising from coded data (10 minutes) • Present findings: 5 minutes per group

  40. How to “Code” Data • RQ: What are differences in the ways that various types of IS professional or manager define the core problems & skills of IS design & development? • Read the transcript or data record through. • Ask yourself “what is it that is going on here?” • Make notes about “themes” that you see in the data; • Don’t attempt to be systematic/comprehensive at this point • Categorize (“code”) your observations • Relate category-codes to research question • Define attributes of categories (attribute codes) • Define categories and sub-categories (coding “families”) • Ask “so what?” • Relate categories and their attributes to contextual factors and/or type of subject • Draw conclusions about what the data tells you, in answer to the research question.
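A minimal sketch of the categorize/define/ask-"so what?" steps above. The research question is from the slide; the coding families, attributes, and observation record are hypothetical. Category-codes are organized into families with sub-categories, each observation is tagged with a contextual factor such as subject type, and the "so what?" step groups observations by that factor.

```python
from collections import Counter

# Hypothetical coding "families" for the RQ on how IS professionals define
# the core problems & skills of IS design & development: top-level categories
# with sub-categories and the attribute codes we expect to record.
coding_families = {
    "core-problems": {
        "requirements": ["ambiguity", "changing-scope"],
        "coordination": ["handoffs", "shared-understanding"],
    },
    "core-skills": {
        "technical": ["modeling", "programming"],
        "social": ["negotiation", "facilitation"],
    },
}

# One coded observation: its family/sub-category, attributes, and the
# contextual factor (type of subject) needed to relate codes back to the RQ.
observation = {
    "subject_type": "project manager",      # contextual factor
    "category": "core-problems",
    "sub_category": "coordination",
    "attributes": ["shared-understanding"],
    "note": "Defines design problems in terms of aligning stakeholders.",
}

# "So what?" step: group observations by subject type and category
# to see whether different types of professional define the problems differently.
observations = [observation]                 # in practice, many of these
by_subject = Counter((o["subject_type"], o["category"]) for o in observations)
print(by_subject)
```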

  41. Issues With Qualitative Research • How much data is enough? • How do you know that what you found is not what you were looking for? • Is it difficult to publish qualitative research studies? • Is qualitative research considered less acceptable than quantitative research? • Is this something that a PhD student should consider?

  42. Intercoder Reliability/Agreement • Intercoder reliability is a measure of agreement among coders in their coding of data • High reliability scores indicate that • Categories are well-defined (agreed) and can be replicated by others applying the same schema, OR • Multiple coders are applying a pre-defined set of categories consistently, when coding data samples. • Assess by comparing (co-coding) several data samples (e.g. 10) • Or analyze data from a pilot study to see what codes emerge across researchers before main study starts • Measures of intercoder agreement: • Coefficient of reliability (Holsti, 1969, p. 140) • Scott’s pi (Holsti, 1969, p. 140) • Cohen’s kappa (Krippendorff, 1980, p. 138) • Agreement coefficient (Krippendorff, 1980, p. 138) • Composite reliability (Holsti, 1969, p. 137) • Good website: http://astro.temple.edu/~lombard/reliability/
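To make the agreement measures concrete, here is a sketch that computes simple percent agreement and Cohen's kappa for two coders over ten hypothetical samples (the code labels and assignments are invented). Holsti's coefficient of reliability, 2M/(N1+N2), reduces to simple percent agreement when both coders code the same units; kappa additionally corrects for chance agreement.

```python
from collections import Counter

# Hypothetical codes assigned by two coders to the same 10 data samples.
coder_a = ["process", "reliability", "process", "technology", "process",
           "reliability", "technology", "process", "reliability", "process"]
coder_b = ["process", "reliability", "technology", "technology", "process",
           "process", "technology", "process", "reliability", "process"]

n = len(coder_a)

# Percent agreement: proportion of samples on which the coders agree.
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
p_o = agreements / n

# Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_e is the agreement
# expected by chance given each coder's marginal code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))
kappa = (p_o - p_e) / (1 - p_e)

print(f"percent agreement = {p_o:.2f}, Cohen's kappa = {kappa:.2f}")
```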

  43. Summary: Issues in Qualitative Research • Qualitative research methods are used differently by researchers working within various philosophical approaches and various qualitative traditions. • Data collection methods include action research, case studies, ethnography. • Data analysis methods include statistical sampling of coded data and the inductive generation of relationships between variables. • In the interpretive approach: • Rigor is achieved through comparison of findings across data samples and reflexivity. • Validity is communicated through trustworthiness and subject validation of interpretations, rather than statistical significance. • You can protect against allegations of subjective interpretation (lack of rigor) by testing for co-coder reliability.

  44. The “Qualitative – Quantitative Debate” • Qualitative: Constructivist/Interpretivist; Find answers to questions; Social science view; Explanatory; Goal: understand the subject’s perspective, in context; Investigation oriented; Emergent themes and issues; Researcher is part of situation being studied. • Quantitative: Realist/Positivist; Test hypotheses; Natural science view; Confirmatory; Goal: find probabilities and correlations; Verification oriented; Controlled variables; Researcher distanced from situation being studied. • BUT • Differences are not as simple as this – it is possible to perform qualitative research in a positivist way, or quantitative analysis of interpreted findings. • Positivist research is also subjective – but the subjectivity occurs earlier in the research “life-cycle”, in selection of theory to be tested and research instrument(s).

  45. References (Books and Articles on How-To “Do” Qualitative Research) Denzin, N.K. and Lincoln, Y.S. [Eds.] (2000) The Handbook of Qualitative Research, Sage Publications, Thousand Oaks, CA. Eisenhardt, K.M. (1989) "Building Theories From Case Study Research," Academy of Management Review (14:4), pp 532-550. Gasson, S. (2003) ‘Rigor in Grounded Theory Research’, in M. Whitman and A. Woszczynski (Eds.) Handbook for Information Systems Research, Idea Group, Hershey, PA. Gasson, S. (2009) ‘Employing A Grounded Theory Approach For MIS Research’, in Dwivedi et al. (Eds.), Handbook of Research on Contemporary Theoretical Models in Information Systems, Idea Group, Hershey, PA. Glaser, B.G. and Strauss, A.L. (1967) The Discovery of Grounded Theory, Aldine Publishing, New York. Guest, G., Bunce, A. and Johnson, L. (2006) ‘How Many Interviews Are Enough? An Experiment With Data Saturation And Variability’, Field Methods, 18(1), pp 59-82. Lincoln, Y.S. and Guba, E.G. (1985) Naturalistic Inquiry, Sage Publications, CA. Miles, M.B. and Huberman, A.M. (1994) Qualitative Data Analysis: An Expanded Sourcebook, 2nd edition, Sage Publications, Thousand Oaks, CA. Patton, M.Q. (2002) Qualitative Research and Evaluation Methods, 3rd edition, Sage Publications, Thousand Oaks, CA. Strauss, A.L. and Corbin, J. (1998) Basics of Qualitative Research: Grounded Theory Procedures And Techniques, 2nd edition, Sage Publications, Newbury Park, CA. Yin, R.K. (2002) Case Study Research, Design and Methods, 3rd edition, Sage Publications, Newbury Park, CA.

  46. More references (recommended examples) – References used in slides are given in notes to slides. Barley, S. (1990) ‘Images Of Imaging: Notes on Doing Longitudinal Field Work’, Organization Science, Vol. 1, No. 3, pp 220-247. Cavaye, A.L.M. (1995) ‘User Participation In System Development Revisited’, Information & Management (28:5), pp 311-323. Checkland, P. (1981) Systems Thinking, Systems Practice, John Wiley & Sons, Chichester. Newman, M. and Robey, D. (1992) ‘A Social Process Model of User-Analyst Relationships’, MIS Quarterly (16:2), pp 249-266. Orlikowski, W.J. and Robey, D. (1991) ‘Information Technology and the Structuring of Organizations’, Information Systems Research, Vol. 2, No. 2, pp 143-169. Schutz, A. (1962) Collected Papers Vol. I: The Problem of Social Reality, Martinus Nijhoff, The Hague. Suchman, L. (1987) Plans And Situated Action, Cambridge University Press, Cambridge. Tannen, D. (1993) ‘What’s In A Frame?’, in Framing in Discourse, D. Tannen (Ed.), Oxford University Press, Oxford, UK. Van Maanen, J. (1988) Tales of the Field, University of Chicago Press, Chicago, IL. Walsham, G. (1995) ‘Interpretive Case Studies In IS Research: Nature and Method’, European Journal of Information Systems, No. 4, pp 74-81. Wolfe, R.A. (1994) ‘Organizational Innovation: Review, Critique and Suggested Research Directions’, Journal of Management Studies (31:3), pp 405-431. Yin, R.K. (1994) Case Study Research, Design and Methods, 2nd edition, Sage Publications, Newbury Park, CA.

  47. Resources ISWORLD Qualitative Research website: http://www.qual.auckland.ac.nz/ CAQDAS Qualitative Research resources – lots of software! http://caqdas.soc.surrey.ac.uk/resources.htm University of Georgia – Qualitative Research Site: http://www.qualitativeresearch.uga.edu/QualPage/ Ethnographic & Qualitative Methods Course Resources Discourse Analysis (Deborah Tannen, 2004):http://www.lsadc.org/fields/index.php?aaa=discourse.htm Good discussion of inter-coder reliability in content analysis http://www.temple.edu/sct/mmc/reliability/ Some freeware for qualitative data analysis - • Audacity is an audio editor which will record sounds, play sounds, import, edit and export WAV, AIFF, Ogg Vorbis, and MP3 files • Express Scribe provides professional audio playback control software • Atlas/ti-- cut-down but usable demo of qualitative analysis software My web-page – interesting readings for PhD students: http://www.ischool.drexel.edu/faculty/sgasson/IS-readings.html
