Data Collection May 22, 2013 Adv. Research Methods
Approaches to Data Gathering • Two major approaches • Primary data • The researcher collects the required information first hand • Provides first-hand information • Secondary data • The information is already available and need only be extracted • Sources used are called secondary sources • Examples • Census data • Hospital records • Data from articles, journals, books, magazines, etc.
Collecting Data Using Primary Sources • Several methods • Choice depends on: • Purpose of the study • Resources available • Skills of the researcher • Methods • Observation • Interviews • Questionnaires
Observation • Watching and listening to an interaction or phenomenon as it takes place • Most appropriate: • when you are more interested in the behavior than in the perceptions of individuals, OR • when subjects are so involved in the interaction that they are unable to provide objective information about it
Types of Observation • Participant observation • When the researcher participates in the activities of the group being observed • Non-participant observation • When the researcher is not involved in the activities of the group • The researcher remains a passive observer
Problems with the Observation Method • Distortion of data due to: • Change in the group’s behavior when members become aware that they are being observed • Observer bias • Variation in the interpretations drawn from an observation from one observer to another • Incomplete observation and/or incomplete recording of observations
Situations in which observations can be made • Natural • Observing a group in its natural operation rather than intervening in its activities • Controlled • Introducing a stimulus to the group for it to react to and observing the reaction
Recording Observations • The method of recording depends upon the purpose of the observation, the complexity of the interaction and the type of population • Narrative recording • Recording using scales – prone to the error of central tendency, the elevation effect and the halo effect • Categorical recording • Recording on electronic devices, such as video tape
Interviews • An interview is a verbal interchange, often face to face, in which an interviewer tries to elicit information, beliefs or opinions from another person (Burns 1997) • Any person-to-person interaction, face to face or otherwise, between two or more individuals with a specific purpose in mind is called an interview • A commonly used method • The researcher has the flexibility to select the format, content, wording, order of questions, etc.
Classification of Interviews • Interviews are classified into different categories according to the degree of flexibility in the process of asking questions • Structured • Unstructured • See Figure 9.3
Unstructured Interviews • Complete freedom in content and structure • Freedom in terms of: • Order/sequence in which questions are asked • Wording • The way you explain questions to respondents • You may formulate and raise issues on the spur of the moment • Predominantly used in qualitative research (although used in quantitative research as well)
Structured Interviews • Predetermined set of questions • No change in the wording or sequence of questions at the time of the interview • Strictly follow the ‘interview schedule’ • An interview schedule is a written list of questions, open-ended or closed, prepared for use by an interviewer • Note that an interview schedule is a research tool/instrument, whereas interviewing is a method of data collection • This method provides uniform information, which ensures the comparability of data • Less interviewing skill is required than for unstructured interviewing
Questionnaire • A written list of questions • Respondents read the questions, interpret them and record their answers • Difference between a questionnaire and an interview schedule? • Interview schedule: the interviewer asks the questions, explains them if necessary, and records the respondents’ replies on the schedule • Questionnaire: replies are recorded by the respondents themselves
Important to Note! • Since there is no one to explain the questions to the respondents, it is important that the questions are clear and easy to understand • The layout of the questionnaire should be: • Easy to read • Pleasant to the eye • The sequence of questions should be easy to follow • A questionnaire should be developed in an interactive style
Ways of Administering a Questionnaire • Mailed questionnaire • The most common approach • Requires addresses • Low response rate • Collective administration • Captive audience (people assembled in one place), such as students in a class or people attending a function, seminar, etc. • Ensures a high response rate • You may have personal contact with the study population • Saves money on postage • Administration in a public place • At a public place such as a shopping center, bus stop, health center, etc.
Choosing Between Interviews and Questionnaires • Nature of the investigation – a questionnaire may suit issues that respondents are reluctant to discuss face to face • Geographical distribution of the study population • Type of study population
Advantages of Questionnaire • Less expensive • Offers greater anonymity
Disadvantages of Questionnaires • Application is limited – only literate respondents can participate • Response rate is low • Self-selecting bias – not everyone returns the questionnaire • Opportunity to clarify issues is lacking • Spontaneous responses are not allowed for – a questionnaire gives respondents time to reflect before answering • The response to a question may be influenced by responses to other questions – respondents may read all the questions before answering • It is possible for respondents to consult others • A response cannot be supplemented with other information
Advantages of Interviews • More appropriate for complex situations • Useful for collecting in-depth information – probing is possible • Information can be supplemented – e.g. by observation of non-verbal reactions • Questions can be explained • Wider application – can be used with almost any type of population: literate, illiterate, handicapped, etc.
Disadvantages of Interviews • Time consuming and expensive – especially for a geographically scattered population • Quality of data: • depends upon the quality of the interaction between interviewer and interviewee • depends upon the quality of the interviewer – experience, skills, commitment, etc. • may vary when many interviewers are used • Researcher’s bias in the framing and interpretation of questions
Forms of Question • Open-ended • Possible responses are not given • For seeking opinions, attitudes and perceptions • Advantages & disadvantages • In-depth information • Greater variety of information • Analysis is difficult • Some respondents may not be able to express themselves • Closed • The possible answers are set out in the questionnaire/schedule • Useful for eliciting factual information • Advantages & disadvantages • Lacks depth and variety of information • Investigator bias – the researcher may list only the responses of his/her choice • Responses are easy to analyze
Examples • Closed-ended • Please indicate your age • Under 15 • 15 – 19 years • 20 – 24 years • Open-ended • What is your current age? ___________________ years • More examples in Figs 9.6 & 9.7
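The point that closed questions are easy to analyze can be made concrete: because responses are pre-coded, analysis reduces to counting. A minimal sketch, tabulating hypothetical answers to the age question above (all response data below are invented for illustration):

```python
# Tabulating closed-ended responses: analysis reduces to frequency counts.
# The answers below are fabricated for demonstration only.
from collections import Counter

AGE_CATEGORIES = ["Under 15", "15-19 years", "20-24 years"]

answers = [
    "15-19 years", "20-24 years", "15-19 years",
    "Under 15", "20-24 years", "15-19 years",
]

counts = Counter(answers)
for category in AGE_CATEGORIES:
    n = counts[category]
    print(f"{category}: {n} ({100 * n / len(answers):.0f}%)")
```

An open-ended version of the same question ("What is your current age?") would instead require coding each free-text answer into categories before any counting could begin, which is why open responses are harder to analyze.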
Formulating Effective Questions • Wording and tone are important • Always use simple, everyday language – no technical jargon • Do not use ambiguous questions – ones that carry more than one meaning • Are you satisfied with the canteen? • Do not ask double-barrelled questions • Does your department have a special recruitment policy for women and minorities? • Do not ask leading questions • Smoking is bad, isn’t it? • Do not ask questions that are based on presumptions • How many cigarettes do you use in a day?
Constructing a Research Instrument in Quantitative Research • The most important step, since the quality and validity of the output depend solely upon it • Ensure the validity of your instrument by making sure that your questions relate to the objectives of your study • Steps: • Go back and list all the specific objectives, research questions and hypotheses • For each objective, research question and hypothesis, list the associated questions that you want to answer through your study • List the information required to answer these questions • Formulate questions to obtain this information
Asking Personal and Sensitive Questions • Direct vs. indirect manner
Order of Questions • Does the order of questions matter? Why? • Random order or a logical sequence?
Prerequisites for Data Collection • Motivation to share the required information • Clear understanding of the questions • Possession of the required information – respondents must have the information sought
Pre-testing a Research Instrument • Testing of the research instrument before it is actually used • Should be carried out under actual field conditions on a group similar to the study population • Purpose: • to identify problems • not for data collection
Collecting Data using Secondary Sources • Some secondary sources are: • Government or semi-government publications • Census Bureau of Pakistan, PMD, GSP, health department, etc. • Earlier research • Personal records • Mass media
Problems with Using Data from Secondary Sources • Validity and reliability • Varies from source to source • Personal bias • A specific problem with personal records • Availability of data • It is common for beginners to assume that the data will be available • Format • Data may be available in different categories and classes than required • Others
Validity and Reliability of a Research Instrument June 05, 2013 Adv. Research Methods Department of RS and GISc, Institute of Space Technology, Karachi
Concept of Validity • Why necessary? • To establish the quality of your results • By establishing the appropriateness, quality and accuracy of the procedures you adopted for finding answers to your research questions • At what stage? • Inaccuracies may be introduced at any stage, and therefore the concept of validity can be applied to the research process as a whole or to any of its steps
Example: Validity Check • A study is designed to ascertain the health needs of a community • In the interview schedule, most of the questions relate to the attitude of the study population towards the health services being provided to them • What is your aim? • What are you actually finding out through the interview schedule? • Is this instrument measuring what it is designed to measure?
Validity of Measurement Instrument • In terms of measurement procedures, validity is the ability of an instrument to measure what it is designed to measure • Are we measuring what we think we are measuring? (Kerlinger 1973)
Two Perspectives • Is the research investigation providing answers to the research questions for which it was undertaken? • If so, is it providing these answers using appropriate methods and procedures?
Types of Validity in Quantitative Research • Content validity • Concurrent and predictive validity • Construct validity
Content validity • Content validity is the extent to which a measuring instrument provides adequate coverage of the topic under study • Based upon the logical link between the questions and the objectives of the study • Advantages and disadvantages • Simple • Subjective – a definite conclusion is hard to draw • No numerical way to express it • Different people may have different opinions
Concurrent and predictive validity • Concurrent validity: judged by how well an instrument compares with a second assessment done concurrently • Predictive validity: judged by the degree to which an instrument can forecast an outcome
Construct validity • Based upon statistical procedures • Determined by ascertaining the contribution of each construct to the total variance observed in a phenomenon • Most complex • Convergent • Demonstrated by strong relationship between the scores obtained from two different methods of measuring the same construct • Divergent • Demonstrated by weak relationship between the scores obtained from two non-overlapping constructs
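The convergent/divergent distinction above can be illustrated numerically: convergent validity expects a strong correlation between two methods measuring the same construct, while divergent validity expects a weak correlation between non-overlapping constructs. A minimal sketch, with all scores and variable names fabricated purely for demonstration:

```python
# Convergent vs. divergent validity via Pearson correlation.
# All scores below are made up for illustration; they are not real data.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Two different instruments intended to measure the SAME construct (anxiety):
anxiety_questionnaire = [12, 18, 9, 22, 15, 20, 7, 16]
anxiety_interview     = [11, 19, 10, 21, 14, 18, 8, 15]

# A non-overlapping construct (mathematical ability) for the same respondents:
math_score = [55, 60, 48, 52, 65, 50, 58, 47]

print(pearson(anxiety_questionnaire, anxiety_interview))  # strong -> convergent
print(pearson(anxiety_questionnaire, math_score))         # weak -> divergent
```

With these fabricated scores the first correlation is close to 1 (the two measures of anxiety agree) and the second is close to 0, which is the pattern construct validation looks for.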
Concept of Reliability • The degree of accuracy or precision in the measurements made by a research instrument • A scale or test is reliable to the extent that repeated measurements made by it under constant conditions give the same result
Factors affecting reliability • Wording of questions • Physical setting • Respondent’s mood • Interviewer’s mood • Nature of interaction • Regression effect of an instrument
Reliability Tests • External consistency procedures • Test/retest • Parallel forms of the same test • Internal consistency procedures • The split-half technique
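The split-half technique above can be sketched numerically: the items of a test are divided into two halves (e.g. odd- versus even-numbered items), the two half-scores are correlated, and the Spearman-Brown formula r_full = 2r / (1 + r) estimates the reliability of the full-length instrument. A minimal illustration with fabricated item responses (all data are invented):

```python
# Split-half reliability with the Spearman-Brown correction.
# The item responses below are fabricated for illustration only.

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Each row: one respondent's answers to six Likert-type items (scored 1-5).
responses = [
    [4, 5, 4, 4, 5, 4],
    [2, 1, 2, 2, 1, 2],
    [3, 3, 4, 3, 3, 4],
    [5, 5, 5, 4, 5, 5],
    [1, 2, 1, 2, 2, 1],
]

# Split each respondent's items into two halves: odd- vs even-numbered items.
odd_half  = [sum(r[0::2]) for r in responses]
even_half = [sum(r[1::2]) for r in responses]

# Correlate the half-scores, then correct for the halved test length.
r_half = pearson(odd_half, even_half)
r_full = 2 * r_half / (1 + r_half)  # Spearman-Brown prophecy formula
print(round(r_half, 3), round(r_full, 3))
```

The correction is needed because halving a test reduces its reliability; the formula projects what the correlation would be if each half were as long as the whole instrument. A test/retest check would use the same `pearson` function on the scores from two administrations of the same instrument.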
Ethics in Conducting Research • It is important to ensure that research is: • not affected by the self-interest of any party • not carried out in a way that harms any party
Ethical issues • Ethical Behavior: “in accordance with principles of conduct that are considered correct, especially those of a given profession or group” (Collins Dictionary) • Code of conduct: set of rules/guidelines outlining the responsibilities of or proper practices for an individual or organization.
Stakeholders in Research • Research participants • those with direct or indirect involvement • those affected by the research • those from whom information is collected • etc. • The researcher • The funding agency
Issues to Consider Concerning Research Participants • Collecting information • Obtain respondents’ informed consent • Wasting respondents’ time is unethical – this is the case when you cannot justify the relevance of the research you are conducting • Providing incentives • It is ethical to provide incentives to respondents for sharing information (a small gift given after data collection, not before) • Seeking sensitive information • Tell respondents clearly the type of information sought and give them sufficient time to decide whether they want to share it • Possibility of causing harm to participants • Harm: any discomfort, anxiety, harassment, invasion of privacy, or demeaning or dehumanizing procedures • The extent of harm (if not avoidable) should not be greater than that ordinarily encountered in daily life • Maintaining confidentiality • Sharing information about a respondent with others for purposes other than research is unethical • Information provided by respondents should be kept anonymous
Issues to Consider Relating to the Researcher • Avoiding bias • Bias is a deliberate attempt either to hide what you have found in your study or to highlight something disproportionately to its true existence • Using inappropriate research methodology • It is unethical to deliberately use a method or procedure you know to be inappropriate in order to prove or disprove something • e.g. selecting a highly biased sample • Using an invalid instrument • Drawing wrong conclusions • Incorrect reporting • Reporting findings in a way that changes them to serve your own or someone else’s interest is unethical • Inappropriate use of information • Using information in a way that directly or indirectly affects respondents adversely is unethical • Tell respondents of the potential uses of the information (including the possibility of its being used against some of them) and let them decide whether they want to participate
Issues Relating to the Sponsoring Organization • Restrictions imposed by the funding/sponsoring organization • There may be direct or indirect controls exercised by the sponsoring agency • It may select the methodology • Prohibit the publication of ‘what was found’ • Impose other restrictions that stand in the way of obtaining and disseminating accurate information • Both the imposition and the acceptance of these controls/restrictions are unethical – they may amount to tailoring research findings to the sponsoring agency’s vested interest • Misuse of information • It is unethical to use research to justify management decisions when the research findings do not support them