
Quantitative Research Method (Research Design)


Presentation Transcript


  1. Quantitative Research Method (Research Design) By Temtim Assefa, October 2013

  2. Research Design • It is the overall design of the research project • It is often presented as the research proposal • It involves deciding on all aspects of the research process, including • Philosophical assumptions • Research method • Data collection techniques • Data analysis • Publication outlet, if possible • It determines whether your proposal is accepted or rejected

  3. Overview of Quantitative Research • Uses the deductive method of knowledge acquisition • Intends to falsify an existing theory • Tries to generate generalizable knowledge • Accepts the objectivity of knowledge • Uses standard measurement instruments • The researcher is independent in the process of knowledge construction • Quantifies phenomena in terms of numbers • Is concerned with prediction and control

  4. Types of Design • Research methods used in quantitative research include • Survey / field study • Field experiment • Laboratory experiment • Simulation • Case study

  5. Survey Design • Webster’s Dictionary defines a survey as “a general study or inspection” • A survey is a means of "gathering information about the characteristics, actions, or opinions of a large group of people, referred to as a population" [Pinsonneault and Kraemer] • Epistemologically, surveys provide one way of obtaining and validating knowledge

  6. Survey Research • A survey conducted for research is distinctly different from a general survey • The focus here is on surveys conducted to advance scientific knowledge, which we refer to as survey research

  7. Distinctiveness of survey research • The purpose of the survey is to produce quantitative descriptions of some aspects of the studied population; this requires standardized information from and/or about the subjects being studied • Relationships between variables • Projecting findings descriptively to a predefined population • The main way of collecting information is by asking people structured and predefined questions; their answers constitute the data to be analyzed

  8. Distinctiveness … • Information is generally collected about a fraction of the study population (a sample), but it is collected in such a way as to be able to generalize the findings to the population, such as service or manufacturing organizations, line or staff work groups, MIS departments, or various users of information systems.

  9. Techniques • Survey research combines three techniques • Collection of answers using standardized questionnaires • Random sampling from a known population • Statistical analysis of a quantified representation of the survey answers

  10. Purposes of Survey • Exploratory • Little is known about a population • Further information is desired about research variables • Prelude to a costlier, larger research endeavor • Descriptive • Focus on who, what, when, how • Existence of opinions and attitudes • Explanatory • Cause and effect, focus on why • Reasons for the existence of facts and opinions are of interest

  11. An explanatory design requires a conceptual framework • Example: a conceptual framework of user involvement in system development, relating the constructs Participation, Conflict, Conflict resolution, and Influence [diagram]

  12. Survey stages • Clarify the research problem • Develop a conceptual framework if the design is explanatory • Decide what type of survey design is appropriate • Decide what type of sample design is most appropriate • Decide on the size of the sample • Plan sampling procedures • Decide what form of data collection to use • Design the questions and test your instruments • Undertake the fieldwork and data collection • Process the data • Analyze the data and interpret your results • Disseminate your findings

  13. Clarify the research problem • Move from a general idea to focused topics • The degree to which the study meets its aims will be determined by the relevance and completeness of the research questions addressing the research problem

  14. Empirical to theoretical mapping

  15. Decide type of survey • Cross-sectional or one-off • Longitudinal or time series ( collection of data at two or more points in time) • Before and after studies • Panel studies • Combination of Cross-sectional and longitudinal

  16. Sample design • Decide which type of sample design is most appropriate • Probability samples • Simple random samples • Cluster samples • Non-Probability samples • Quota samples • Volunteer samples • Convenience samples • Snowball samples • Purposive sampling
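
The sample designs above differ mainly in how cases are drawn. The following is a minimal Python sketch, with an invented sampling frame and made-up cluster labels, of two of the probability designs listed on the slide: a simple random sample and a cluster sample.

```python
# Sketch only: the sampling frame and clusters are hypothetical.
import random

frame = [f"employee_{i}" for i in range(1, 501)]           # known population
clusters = {"dept_A": frame[:100], "dept_B": frame[100:250],
            "dept_C": frame[250:500]}

# Simple random sample: every member has an equal chance of selection
simple_sample = random.sample(frame, 50)

# Cluster sample: randomly pick whole clusters, then survey everyone in them
chosen_clusters = random.sample(list(clusters), 2)
cluster_sample = [person for c in chosen_clusters for person in clusters[c]]

print(len(simple_sample), chosen_clusters, len(cluster_sample))
```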

  17. Errors • Sampling errors • The difference between the results you would get from a full census and the results you get from a sample (see the sketch below) • Non-sampling errors • Systematic • Random
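
Sampling error, as defined on the slide, is simply the gap between what a full census would give and what a particular sample gives. A minimal sketch, using invented population values, of measuring that gap:

```python
# Sketch only: the "census" data are randomly generated for illustration.
import random
import statistics

population = [random.gauss(40, 10) for _ in range(10_000)]  # full census
census_mean = statistics.mean(population)

sample = random.sample(population, 100)                     # one sample
sample_mean = statistics.mean(sample)

sampling_error = sample_mean - census_mean                  # the gap
print(round(census_mean, 2), round(sample_mean, 2), round(sampling_error, 2))
```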

  18. Sample Size • Inevitably a compromise between methodological imperatives and practical constraints • Determined by the acceptable level of sampling error • What about non-response? • Formula [Calder Judith, 1998]: n = (z-score for the desired confidence level)² × (standard deviation)² / (desired level of precision)² (see the sketch below)
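
Assuming the "desired confidence level" in Calder's formula is expressed as a z-score (1.96 for 95% confidence), a minimal Python sketch of the calculation, with made-up values for the standard deviation and precision:

```python
# Sketch only: z, sd and precision values are illustrative assumptions.
import math

z = 1.96          # z-score for the desired confidence level (95%)
sd = 1.2          # estimated standard deviation of the variable of interest
precision = 0.25  # desired level of precision (acceptable margin of error)

n = (z ** 2) * (sd ** 2) / (precision ** 2)
print(math.ceil(n))   # round up to the next whole respondent
```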

  19. Decide form of data collection • Questionnaires • Interviews • Observation

  20. Data Collection • Some steps • Organize in advance • Agree access with your respondents • Collect the data • Monitor progress • Chase up non-responses • Attend to the quality of the fieldwork and data collection process • Report honestly and reflect on the fieldwork in research reports [Calder Judith]

  21. Finally – Interpret Data • Analyze the data and interpret your results • Does the data make sense? • Beware of spurious patterns • Present and disseminate your findings

  22. Benefits • Theoretical propositions can be tested in an objective fashion • Enables generalizations • Easy to use (although this may be an illusion) • Established and accepted research method across multiple fields • Aided by statistical tools • Immensely helped by advances in computing • Strong in external validity if sampling errors are minimized

  23. Weaknesses • Difficulty in obtaining a truly random sample • Low response rates; business and IT staff are inundated with surveys • Weak linkages between units of analysis and respondents • Over-reliance on cross-sectional surveys where longitudinal surveys are really needed • Single-method designs where multiple methods are needed • Possible inappropriateness of self-reports: behavioral observation and self-report can give different results

  24. Field Experiment

  25. Field Experiment • A field experiment is an experimental research method performed outside the laboratory, in natural settings • It follows all the steps of the scientific process • Selection and determination of a problem • Selection of participants and measuring instruments • Selection of a research plan • Execution of the plan • Analysis of data • Formation of conclusions

  26. Features of FE • It has three distinguishing features • The research takes place in a natural setting • The experimenter manipulates one or more independent variables while exerting control over confounding variables • The effect of the manipulations on one or more dependent variables is systematically observed

  27. FE Design • There are three types of field experiment designs

  28. Controlling Nuisance Variables in a Field Experiment • Solomon four-group design in a true experiment (R = random assignment, O = observation/measurement, X = treatment): Experimental group 1: R O X O; Control group 1: R O O; Experimental group 2: R X O; Control group 2: R O • Statistically removing the effect of suspected confounding variables (see the sketch below)
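
One common way of statistically removing the effect of a suspected confounding variable is to include it as a covariate in a regression model; the slide does not prescribe a specific technique, so the sketch below, with hypothetical variable names and made-up data, only illustrates that general idea using statsmodels.

```python
# Sketch only: variable names and data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "system_use": [5.1, 6.3, 4.2, 4.9, 6.0, 4.5],
    "treatment":  [1, 1, 0, 0, 1, 0],      # new interface vs old interface
    "experience": [3, 5, 4, 2, 6, 3],      # suspected confounding variable
})

# OLS with the confounder as a covariate adjusts the treatment effect for it
model = smf.ols("system_use ~ treatment + experience", data=df).fit()
print(model.params["treatment"])           # treatment effect, adjusted
```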

  29. Example – Interface Design and Computer Use • System use is low among employees, and one suspected reason is poor interface design. You divide the employees into two groups (Groups 1 and 2), redesign the interface, and let Group 1 work on the new system (treatment group) while Group 2 works on the old interface (control group). Six months or so after the interface redesign, measure computer use. If computer use has increased for Group 1, you can conclude that interface design and computer use are associated.

  30. Data collection • The following methods are used to collect data • Questionnaire • Structured interview • Computer log files

  31. Data Analysis • Statistical data analysis methods are used • For example • A t-test is used to compare the means of the treatment and control groups to see whether they differ significantly • Is the Group 1 mean different from the Group 2 mean in a way that generalizes to the entire population? • ANOVA is used when there are more than two groups • Correlation is used to examine association • Does an increase in one variable follow the same pattern in the other variable? (see the sketch below)
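
A minimal sketch of the three analyses named on this slide (t-test, ANOVA, correlation) using SciPy; the group names and numbers are invented purely for illustration.

```python
# Sketch only: all data values are made up.
from scipy import stats

group1 = [5.1, 6.3, 5.8, 7.0, 6.1]   # e.g. weekly system use, treatment group
group2 = [4.2, 4.9, 5.0, 4.4, 5.3]   # e.g. weekly system use, control group
group3 = [3.9, 4.1, 4.6, 4.0, 4.4]   # a hypothetical third group

# t-test: do the treatment and control means differ significantly?
t_stat, p_value = stats.ttest_ind(group1, group2)

# ANOVA: the same question for more than two groups
f_stat, p_anova = stats.f_oneway(group1, group2, group3)

# Correlation: does an increase in one variable follow the other?
hours_training = [1, 2, 3, 4, 5]
system_use = [4.0, 4.8, 5.5, 6.1, 6.9]
r, p_corr = stats.pearsonr(hours_training, system_use)

print(t_stat, p_value, f_stat, p_anova, r, p_corr)
```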

  32. Advantages of FE • Conducted in natural settings and therefore has higher experimental validity • Helps to clearly identify antecedents of observed effects in causal relationships • Used for the development of theory as well as for the solution of applied problems • The logic of FE can be applied in the analysis of many naturally occurring changes • Helps in testing broad hypotheses dealing with complex social processes in lifelike situations

  33. Disadvantages • The methods used to control confounding variables in FE are not always sufficient • Manipulation of variables can cause legal and ethical problems • Difficult to control a dynamically changing environment during the course of the experiment • Difficult to precisely measure dependent variables in field settings • Expensive to conduct compared with a laboratory experiment • Needs a highly skilled person to design and conduct

  34. Role of FE in Computer Science • It has many applications in IS • To develop theory • To test hypotheses • To evaluate IS tools and techniques

  35. Practical Examples • Fukada et al. (n.d.) used a field experiment to study the effect of a Road Facility Management Support System on asset management of public infrastructure. • Mayur et al. (2000) conducted a study to compare instructor-based and computer-based training methods and suggest the better approach for companies. • Chen et al. (2007) conducted a field experiment to identify the effects of different types of social information on contributions to online communities.

  36. Instrument Design

  37. Some Notes on Questionnaire Design • Questionnaires make use of checklists and rating scales • Behaviors and attitudes are complex and cannot be easily evaluated and quantified • A checklist is a list of behaviors, characteristics, or other entities that a researcher is investigating • Either the researcher or the participants simply check items from the list • Example: What are the features of user-friendly software? • Graphical interface • Clear navigation direction • Immediate feedback • Other (specify) ____________

  38. Questionnaire … • A rating scale is more useful when a behavior, attitude, or other phenomenon of interest needs to be evaluated on a continuum • It is designed with scales such as • “inadequate” to “excellent” • “never” to “always” • “strongly disapprove” to “strongly approve”

  39. Guidelines for Questionnaires • Keep it short • Ask: what will I do with this information? Is it absolutely essential for solving part of the research problem? • Use simple, clear, unambiguous language • Check for unwarranted assumptions implicit in your questions • Example: “How many cigarettes do you smoke each day?” • It is good to add choices: >25 ___ 16–25 ___ 5–15 ___ <5 ___ None ___ • Word your questions in ways that do not give clues about preferred or more desirable responses • Example: “What strategies have you used to try to quit smoking?” assumes the respondent has tried to quit and may lead them to list strategies they did not actually use

  40. Guidelines … • Check for consistency so that questions do not lead respondents to give contradictory answers • Determine in advance how you will code the responses (see the sketch below) • Keep the respondent's task simple • Provide clear instructions • Make the questionnaire attractive and professional looking • Conduct a pilot test: give it to half a dozen friends to see whether they have difficulty understanding any items • Scrutinize the almost-final product carefully to make sure it addresses your needs
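
Determining in advance how you will code the responses usually means fixing the mapping from answer labels to numbers before the data arrive. A minimal sketch, using the rating-scale labels from slide 38 and made-up answers:

```python
# Sketch only: the answer list is invented for illustration.
codes = {"strongly disapprove": 1, "disapprove": 2, "neutral": 3,
         "approve": 4, "strongly approve": 5}

answers = ["approve", "neutral", "strongly approve", "disapprove"]
coded = [codes[a] for a in answers]
print(coded)   # [4, 3, 5, 2]
```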

  41. Review Questions • Explain the different quantitative research design methods • What are the strengths and weaknesses of each method? • What are the different survey research methods? • What are the main data analysis methods? • What problems can you address with quantitative research methods?

  42. Review Questions • What are the procedures in quantitative research? • How do you use quantitative research in computer science?
