
MIS 650 Data Collection


Presentation Transcript


  1. MIS 650: Data Collection

  2. Chapter 3: Methodology • Chapter Outline • 3.1 Methodological Issues (Usually Validity and Reliability, sometimes Ethics) • 3.2 Sampling Methods • 3.3 Data Collection Techniques • 3.4 Data Integrity Issues • 3.5 Analysis “Look-ahead”

  3. [Diagram: the research cycle] Idea: Theory, Model (what you say) → hypotheses (what the theory says) → Test Plan: Methodology / research methods → physical test of hypotheses using the methodology (what the world says) → data (what the data say) → conclusions about the Idea

  4. 3.1 Methodological Issues • State of Theory in your area (well developed, speculative) • Ability to generalize • Role of data in your research; is it empirical? • Formal or informal index of “goodness” of your methodology within a general critique

  5. State of Theory • Theory vs. Experience: Is theory well developed, or are we still experiencing rather than thinking about this area? • Role of Language: Are there well-defined terms and measures? • Proof vs. Communication: Role of paper • Qualitative vs. Quantitative Research: Do strong theories already exist?

  6. Role of Data • Data are instances of abstractions • These instances have relationships which test relationships among abstractions • The abstraction relationships are the theory • We use DATA (measurements) to demonstrate the theoretical relationships among the abstractions

  7. “Our theories are the scripts; the world, our stage; researchers, the stage managers; and data, the film of the players’ performances. Our goal is to create excitement, sell tickets, and satisfy the public.”

  8. Classes of Problems • Sampling Problems (Cases, Companies, Individuals, Times, Tasks) • Observer Errors (Creating the wrong stimuli) • Subject Errors (Getting wrong responses) • Recording Errors (Losing the data) • Ethical Problems (Not deserving the data)

  9. Where Students Often Fail • Lack of theory to guide method • Poor operationalization of concepts • Convenience samples • Measurement errors • Sloppy data collection • Too little data

  10. 3.2 Sampling • Discuss how the sample was obtained • What was used as the sampling frame? Why? • Were there any problems with representativeness? • Were there any potential ethical problems?

  11. Sampling Issues • Representativeness: usually assured by “random” sampling; not always an issue, or not an issue to the same degree • Procedure: Topic/Hypotheses → Universe → Sampling Frame → Research Sample → Actual Sample

  12. Representativeness Data points must be “unbiased.” This means that qualities of the source of the data should not (apparently) affect the content of the data. Generally this means that every potential data source has the same probability of being in the research sample.

  13. Representativeness, Cont’d The question is then, “Do the sources of data in the research sample represent all those data points not present?” If YES, then conclusions drawn from the data can be generalized to the whole universe. If NO, then such conclusions will be deemed to apply only to the research sample.

  14. Representativeness, Cont’d Representativeness works in two ways: 1. Generalizability: Do the data represent the universe? 2. Confidence: How well do the data do that representation?

  15. Representativeness, Cont’d Confidence: This becomes an issue because of random variation rather than bias. Random variation is only an accumulation of unknown biases. Systematic bias pushes qualities of the data source in particular directions, thus increasing the possibility of a wrong conclusion. Random variation pushes qualities of the data source in many random directions, thus lowering confidence in conclusions. [Diagram: Systematic Bias vs. Random Variation]
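To make the bias-versus-variation distinction concrete, here is a minimal simulation sketch in Python (the universe of 10,000 scores, the sample size of 30 and the biased frame are invented for illustration, not taken from the chapter): a biased frame shifts sample means systematically away from the true mean, while simple random samples merely scatter around it.

```python
import random
import statistics

random.seed(650)

# Hypothetical universe of 10,000 "data sources" with a known true mean.
universe = [random.gauss(50, 10) for _ in range(10_000)]
true_mean = statistics.mean(universe)

def unbiased_sample_means(n, trials=200):
    """Simple random samples: every source has the same chance of selection."""
    return [statistics.mean(random.sample(universe, n)) for _ in range(trials)]

def biased_sample_means(n, trials=200, cutoff=45):
    """Samples drawn only from sources scoring above a cutoff: a biased frame."""
    biased_frame = [x for x in universe if x > cutoff]
    return [statistics.mean(random.sample(biased_frame, n)) for _ in range(trials)]

for label, means in (("unbiased", unbiased_sample_means(30)),
                     ("biased  ", biased_sample_means(30))):
    print(f"{label} error of mean = {statistics.mean(means) - true_mean:+6.2f}, "
          f"spread = {statistics.stdev(means):5.2f}")
```

The spread of the unbiased sample means is what confidence statements quantify; no amount of extra sampling from the biased frame removes its systematic error.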

  16. Procedure-1 Topic/Hypotheses → Universe The topic applies to a particular part of the world, and your hypotheses can only be tested in a particular “world.” The universe is what your ideas are eventually going to “apply to.”

  17. Procedure-2 Universe → Sampling Frame The sampling frame is a systematic way to get to data sources in your universe. Examples include phone directories, databases, printed lists, and physical “inventory.” All real sampling frames are inaccurate, out of date and incomplete; these problems must be addressed and discussed.

  18. Procedure-3 Sampling Frame → Research Sample The research sample is the actual list of your data sources. For generalization, the research sample should be “representative.” It should be drawn “randomly” if possible, or sometimes in a stratified manner; taking every nth item is common, as is using a random number table. Not every item selected is “real”!
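As a sketch of how a research sample might actually be drawn from a sampling frame, the Python below implements the two draws just mentioned: every nth item (systematic) and a simple random draw. The placeholder directory of 90 entries and the interval of 3 are illustrative assumptions (they echo the “Ex-sample” slide below), not requirements.

```python
import random

# Hypothetical sampling frame: a firm phone directory of 90 placeholder entries.
sampling_frame = [f"employee_{i:03d}" for i in range(1, 91)]

def systematic_sample(frame, step):
    """Take every `step`-th entry, starting at a random offset within the first interval."""
    start = random.randrange(step)
    return frame[start::step]

def simple_random_sample(frame, size):
    """Draw `size` entries so that every entry has the same probability of selection."""
    return random.sample(frame, size)

research_sample = systematic_sample(sampling_frame, 3)                 # every 3rd entry
alternative = simple_random_sample(sampling_frame, len(research_sample))
print(len(research_sample), research_sample[:3])
print(len(alternative), alternative[:3])
```

Either way, the list produced here is only the research sample; the actual sample will shrink once unavailable, unreachable or out-of-date entries drop out, as the next slide notes.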

  19. Procedure-4 Research Sample → Actual Sample The actual sample is smaller than the research sample: sources may not be available; scheduling is hard; there are interruptions, lost data, accidents, etc.; and the sampling frame may be inaccurate or out of date.

  20. Sampling Issues • Level of Aggregation Issues: Organization, Group, Individual, Task • Sampling Entity Issues: Site, Individual, Task, Time, Measurements • Sample Size Issues: Parameterisation, Inference, Description

  21. Sample Structure • Universe (all possible things) • Sampling Frame (systematic division into allowable / not allowable) • Sample • Situation

  22. Ex-sample • Universe [Users] • Sampling Frame [Firm phone directory] • Sample [Every 3rd] • Situation

  23. Problems in Sampling • Convenience sampling -- unrepresentative • Lack of a sampling frame -- can’t sample • Too small a sample size -- low confidence • Too large a sample size -- wasted effort • Sampling the wrong thing -- useless • Non-representative sampling -- cannot generalise
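To put rough numbers on “too small a sample size -- low confidence,” here is a small worked sketch using the standard margin-of-error formula for a proportion, z * sqrt(p(1-p)/n); the 95% z-value of 1.96 and the worst-case p = 0.5 are conventional assumptions, not figures from the chapter.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate half-width of a 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (25, 100, 400, 1600):
    print(f"n = {n:4d}  ->  margin of error ~ +/-{margin_of_error(n):.3f}")
```

Each quadrupling of n only halves the margin, which is also why an oversized sample is mostly wasted effort.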

  24. 3.3 Data Collection Techniques • What were the possible choices for data collection technique? • Why did you choose the method you did? • Describe the method in detail • Was there a role for observers, coders, interpretation? • Show how you handled problems with the technique you selected.

  25. General Data Collection Methods
      Dimension                          Survey     Expt.    Obsv’n    Case
      Real-time vs. retrospective        Ret        RT       RT        RT/Ret
      Projective vs. subjective          Pro/Sub    Sub      N/A       Pro/Sub
      Researcher- vs. subject-driven     Res        Res      Subj      Subj/Pro
      Empirical vs. non-empirical        Emp        Emp      Emp       Emp
  • Real-time vs. retrospective: observed now, or the subject recalls from the past • Projective vs. subjective: others’ experience vs. the subject’s own • Researcher-driven vs. subject-driven: the researcher creates the stimulus, or the subject does • Empirical vs. non-empirical • The most common methods are case studies, surveys and experiments

  26. Data Collection Model [Diagram: the chain from theory to recorded response] Observer: 1. Theory → 2. Stimulus formulation → 3. Stimulus / Question → 4. → Subject: 5. Perceived Stimulus → 6. Knowledge → 7. Ideas → 8. Response formulation → 9. Response / Answer → 10. → Interpreter/Coder: 11. Perceived Response → 12. Recorded Response
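One way to keep the twelve steps and three roles of this model straight is to write the diagram down as a small data structure. The Python sketch below is only my rendering of the figure: the assignment of steps to Observer, Subject and Interpreter/Coder follows the layout of the slide, and steps 4 and 10, which carry no label in the diagram, are marked as hand-offs between roles rather than given invented names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Step:
    number: int
    label: str
    role: str  # which party in the diagram owns this step

# The chain from theory to recorded response, as read off the slide.
DATA_COLLECTION_MODEL = [
    Step(1, "Theory", "Observer"),
    Step(2, "Stimulus formulation", "Observer"),
    Step(3, "Stimulus / Question", "Observer"),
    Step(4, "(unlabeled hand-off)", "Observer -> Subject"),
    Step(5, "Perceived Stimulus", "Subject"),
    Step(6, "Knowledge", "Subject"),
    Step(7, "Ideas", "Subject"),
    Step(8, "Response formulation", "Subject"),
    Step(9, "Response / Answer", "Subject"),
    Step(10, "(unlabeled hand-off)", "Subject -> Interpreter/Coder"),
    Step(11, "Perceived Response", "Interpreter/Coder"),
    Step(12, "Recorded Response", "Interpreter/Coder"),
]

for step in DATA_COLLECTION_MODEL:
    print(f"{step.number:2d}. {step.label:25s} [{step.role}]")
```

Slides 27-33 then trace one exchange through this chain and show how it can derail at the Subject and Interpreter/Coder stages.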

  27. Data Collection Model [Same diagram] Step 1, Theory (actually H1): Prior experience with one application influences perception of innovation.

  28. Data Collection Model [Same diagram] Steps 2-3, Stimulus formulation → Stimulus / Question: “Which of the following applications have you used in the past 12 months?”

  29. Data Collection Model [Same diagram] Steps 4-5, Perceived Stimulus: “Do you know how to do your job?”

  30. Data Collection Model [Same diagram] Steps 6-7, Knowledge and Ideas: <Hmmm, maybe I look like I don’t know what I’m doing here… better deny!>

  31. Data Collection Model [Same diagram] Steps 8-9, Response formulation → Response / Answer: “Nope, haven’t used any of them”

  32. Data Collection Model [Same diagram] Steps 10-11, Perceived Response: <Hmm, he must be an idiot not to have used these applications>

  33. Data Collection Model [Same diagram] Step 12, Recorded Response: “Don’t Know”

  34. Observer Errors • Mistakes that observers commit, usually not observing the right phenomenon or masking subjects’ behaviour [Diagram: Observer Behaviour vs. Subject Behaviour]

  35. Observer Errors • Intrusion, leading questions • Setting up the situation to give a predetermined answer, interfering with subjects’ ability to select an answer by supplying it, assuming an answer, not respecting silence

  36. Observer Errors Intrusion, leading questions • Expectation management problems • Creating a situation in which the subject tries to “guess” the correct answer or tries to “please” the researcher by giving socially mandated or desirable responses

  37. Observer Errors Intrusion, leading questions Expectation management problems • Consultant effect • Interfering with “normal” behavior by changing the situation to favor socially-facilitated responses or by focusing attention on the behavior under study

  38. Observer Errors Intrusion, leading questions Expectation management problems Consultant effect • Hawthorne effect • A consultant-related effect in which behavior is enhanced because attention has been drawn to it.

  39. Subject Errors Many things can influence the subject in his or her responses. Here are some of the sources: • Memory effects • Protocol Intrusion effects • Subject Context and Limitation effects • Researcher-Subject Interaction effects • Subject Cognition effects • Instrument-Subject Interactions

  40. Subject Errors Generally, these errors are most noticeable and problematic when subjects are used in a retrospective manner. However, any task requiring cognition or performance of any type is subject to most of these problems. [Diagram: Researcher, Context/Protocol, Instrument, Subject, Cognition, Memory, Events, Response; the “subject” is the source of variance we desire]

  41. Memory Effects Memory for events changes over time and under the influence of other events • Recency • Primacy • Von Restorff • “I don’t remember” • “I used to know” • Clustering [Chart: recall/recognition vs. time since the event remembered]

  42. Protocol Intrusion Effects Responses are conditioned not only by what the respondent might know, think or feel, but also by the presence of words or concepts in the stimulus or stimulus situation • Sequence • Positive halo • Negative halo • “Demand” characteristics

  43. Subject Context and Limitation Effects How the subject feels about you, your questions, everything, determines the responses and how the responses are presented. • Stupidity • Ignorance • Ill will towards you, the organization or “system”, research, any group you are imagined to be part of or represent • Resistance

  44. Researcher-Subject Interaction Effects Your presence (or absence) may affect what the respondent does, and hence how the respondent replies. • Social facilitation

  45. Subject Cognition Effects The subject is not just a machine that reacts. He or she engages in games, strategizes, and tries to understand the situation while working as a response “machine.” • Intrusion effects (halo (+/-), sequence) • Experimenter expectancy • Evaluation apprehension • Gamesmanship • Face games, one-upmanship • The problem of the in-group (technicians, managers)

  46. Instrument-Subject Interactions The instrument may prompt, provoke or prevent response because of its design • Poor scales for response • Too many responses, fatigue • Aesthetic reactions

  47. Recording Errors • Failure to listen • Categorization errors • General carelessness • Privacy problems • Too little room on the medium • Over-reliance on tape or technology • Poor scales

  48. Interpretation Errors • Misunderstanding • Poor conceptualization of constructs • Poor scales

  49. 3.4 Data Integrity Issues • How data will be recorded • Potential problems with recording • How data will be maintained • Potential problems with maintenance • How data will be stored, accessed • Potential problems with storage, access • Are data confidential?
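As one concrete illustration of the recording, storage and confidentiality questions above, here is a hypothetical Python sketch that appends each response to a CSV file under a pseudonymized subject ID, so the analysis file never holds raw identities. The file name, salt and field names are my own placeholders, not anything prescribed by the chapter.

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

DATA_FILE = Path("responses.csv")        # hypothetical storage location
SALT = "replace-with-a-project-secret"   # keeps subject IDs from being trivially re-derived

def pseudonymize(subject_id: str) -> str:
    """Hash the raw identifier so stored records are not directly identifying."""
    return hashlib.sha256((SALT + subject_id).encode("utf-8")).hexdigest()[:12]

def record_response(subject_id: str, question: str, answer: str) -> None:
    """Append one response with a UTC timestamp; appending limits silent overwrites."""
    new_file = not DATA_FILE.exists()
    with DATA_FILE.open("a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp_utc", "subject", "question", "answer"])
        writer.writerow([datetime.now(timezone.utc).isoformat(),
                         pseudonymize(subject_id), question, answer])

record_response("jane.doe@example.com", "Applications used in the past 12 months?", "None")
```

Append-only writing and the timestamp column also leave a simple audit trail for the maintenance and access problems listed above.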
