
Presenters

Richard W. Riley College of Education and Leadership. Research Methodology for the Project Study, EdD Residency Program. Presenters: Dr. Wade Smith, Wade.smith@waldenu.edu, 352-895-9900; Dr. Paul Englesberg, paul.englesberg@waldenu.edu, 360-380-2238 PST


Presentation Transcript


  1. Richard W. Riley College of Education and Leadership Research Methodology for the Project Study EdD Residency Program

  2. Presenters • Dr. Wade Smith, Wade.smith@waldenu.edu • 352-895-9900 • Dr. Paul Englesberg, paul.englesberg@waldenu.edu • 360-380-2238 PST • Dr. Wallace Southerland, wallace.southerland@waldenu.edu • 919-815-5323 • Dr. Martha Richardson, martha.richardson@waldenu.edu

  3. Collaborative Space for Notes • http://edd-res-method.wikispaces.com/ • Ask to join • Upload notes for colleagues' use

  4. Purpose of this Session • Select and apply the appropriate method to a given problem statement and research questions. • Align methodology with the problem statement, research questions, data collection, and analysis. • Recognize methodology alignment in a peer-reviewed article. • Practice summarizing primary research.

  5. Methodology for a Project Study (EdD) • STUDY TYPE • PROJECT STUDY – Scholarly paper and project • STUDY METHODOLOGY • Quantitative • Qualitative • Mixed

  6. Quantitative Research • Concise statement of purpose and question • Specification of measurable constructs (variables) • Question poses a relationship or comparison • Results are presented numerically. • Narrative is objective in tone, with all elements congruent and key elements stated in exactly consistent terms • NOT creative or emotive rhetorical writing

  7. Research Questions and Purpose The research question is the most important element of the research endeavor. More time and care should be invested in determining the right question in the correct form than in any other part of the process. Once a proper question is articulated, the rest of the process falls into place [paraphrase] (Creswell, personal communication, 1995).

  8. DVs and IVs for Quant. Problem Statement • Study must include a conjectured relationship between at least 2 variables: • Independent Variable (IV): The variable that is being manipulated or tested • Dependent Variable (DV): The variable that varies based on manipulation of the IV • Example: SIGNIFICANT differences in reading comprehension scores (DV) by level of parental involvement (IV)

  9. Dependent and Independent Variables • According to Creswell (2003), independent variables are generally defined as consisting of the two or more treatment conditions to which participants are exposed. These variables “cause, influence, or affect outcomes” (Creswell, 2003, p. 94). • Dependent variables are observed for changes in order to assess the effect of the treatment. They “are the outcomes or results of the influence of the independent variables” (Creswell, 2003, p. 94).

  10. Dependent and Independent Variables For example: An independent variable could be parental involvement in the reading activities of English language learners, while the dependent variable is identified as reading comprehension performance as measured by the reading comprehension portion of the Criterion-Referenced Competency Test (CRCT).

  11. I. Quantitative purpose and questions • Specific purpose, research questions, hypotheses, and/or research objectives are concise and clearly defined. • Must include measurable elements.

  12. Quantitative Research Questions • There are three basic types of questions • Descriptive: When a study is designed primarily to describe what is going on or what exists • Relational: When a study is designed to look at the relationships between two or more variables • Comparative: When a study is designed to compare differences between groups or conditions in terms of measurable outcomes; sometimes called causal comparative

  13. Hypotheses • Statements of relationships between variables you want to test—only when using statistical tests • Expectations based on what the literature indicates about the relationship between the variables • Stated in terms of a null hypothesis (no SIGNIFICANT difference or no SIGNIFICANT change) and research hypothesis (statement of conjectured SIGNIFICANT difference)

  14. Hypotheses as Operational Definitions • Hypotheses are, in a sense, a statement of operational definitions • An operational definition matches a concept, such as intelligence, with a measurement tool, such as the Wechsler Adult Intelligence Scale (WAIS) • Your hypotheses MUST operationalize your concepts; this makes your hypotheses TESTABLE.

  15. Hypothesis Example • You are interested in cognition and ADD • Null (H01): There is no [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD students who go through the normal daily routine. • Research (H1): There is a [statistically significant] difference in time on task [as measured by the TaskTest] between ADD children who are given 5 minutes of physical exercise every half hour and ADD students who go through the normal daily routine.
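A null like H01 above is typically evaluated with an independent-samples t test. The sketch below computes Welch's t statistic (which does not assume equal group variances) on invented time-on-task scores; the data, group sizes, and the "TaskTest" minutes scale are all hypothetical, chosen only to illustrate the mechanics.

```python
from math import sqrt
from statistics import mean, variance

def welch_t(group_a, group_b):
    """Welch's t statistic for two independent samples
    (no equal-variance assumption)."""
    n_a, n_b = len(group_a), len(group_b)
    var_a, var_b = variance(group_a), variance(group_b)  # sample variances
    se = sqrt(var_a / n_a + var_b / n_b)                 # SE of the mean difference
    return (mean(group_a) - mean(group_b)) / se

# Hypothetical time-on-task scores (minutes) for the two conditions
exercise = [18, 22, 20, 24, 21, 19]   # 5 minutes of exercise every half hour
routine  = [15, 17, 16, 18, 14, 16]   # normal daily routine

t = welch_t(exercise, routine)
```

The resulting t would then be compared against a critical value (or converted to a p value) to decide whether to reject the null of no significant difference.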

  16. Hypotheses • Statistical hypotheses should be straightforward statements of what is expected to happen in regard to the independent and dependent variables. • NOTE: Descriptive questions do not suggest hypotheses.

  17. Hypotheses, cont’d. • The NULL is a statement of NO SIGNIFICANT effect, NO SIGNIFICANT relationship, or NO SIGNIFICANT difference, depending on the research question and design. The statement should be clear, understandable, and not unnecessarily complex or obtuse.

  18. Hypotheses, cont’d. Null hypotheses are in the following form: The independent variable has NO SIGNIFICANT effect on the dependent variable. Or There is NO SIGNIFICANT relationship between the independent and dependent variable. Or There is no SIGNIFICANT difference between the treatment group and the placebo group in terms of the dependent variable.

  19. Hypotheses, cont’d. • For complex designs, multiple hypotheses are sometimes required. For example, when the design incorporates multiple independent variables (factorial designs), there might be two or more sets of hypotheses.

  20. Research Question Example Example: Research questions: • Does gender of student affect math performance? • Does pedagogy X affect math performance? • Is there an interaction between gender and pedagogy X in terms of math performance?

  21. Hypotheses Examples • Hypotheses:Null 1: Gender has no SIGNIFICANT effect on math performance.Null 2: Pedagogy X has no SIGNIFICANT effect on math performance.Null 3: There is no SIGNIFICANT interaction between gender and pedagogy X in terms of math performance.

  22. Measured Variables • Note that quantitative analysis requires numeric measurement of variables. So, focus groups, interviews, and open-ended observations are NOT typically part of a quantitative study. Quantitative inquiry focuses on variables that can be measured by reliable and valid instruments, such as tests, numerically represented attitude scales, or other tools that yield numerical results.

  23. I. Purpose of the Study • Simple paragraph that describes the intent of your study. It should flow directly from the problem statement. Two to three sentences are sufficient. • It should be logical and explicit. • Verify that it relates directly to the problem statement and research questions.

  24. Purpose of the Study Example • The purpose of this correlational study is to examine the relationship between the level of parental involvement and reading comprehension performance on the XYZ assessment among elementary English language learners. • (Matches earlier DV and IV example)

  25. Quantitative Research – Design and Methods • All elements of the quantitative study MUST be exactly aligned. • The problem, purpose, questions, hypotheses, design, and methods MUST be congruent and directly aligned. • The same terms must be used every time an element is discussed, described, or mentioned.

  26. Quantitative Research – Design and Methods, cont’d. • The design and methods narrative includes: • The larger population of interest, to which the results will be generalizable • The location and context of the study (how do the location and context relate to the research purpose and questions?) • Instruments or means of measuring variables (Only variables mentioned in the purpose and question are measured.)

  27. Quantitative Research – Design and Methods, cont’d. • The subjects who will provide the measurements on the variables of interest (Why are these the best subjects for the purpose and questions?) • The sampling strategy (for quantitative inquiry, a random/representative or equivalent sample is required)

  28. Quantitative Research – Design and Methods, cont’d. • Data analysis—how will the data collected be analyzed to answer the research question?

  29. Main Research Designs • Experimental • Random assignment—comparative design • Quasi-Experimental—pre-selected groups (not random; convenience) • Causal comparative design • Within-group designs (pretest/posttest or matched-pairs comparisons); a very weak, unconvincing design on its own • Between-group designs (comparisons between groups) • Non-Experimental • Descriptive • Correlational/secondary data analysis

  30. Selecting a Research Design • What are the underlying constructs (variables)? • What is the intended population? • What is the intended interpretation? • Sometimes you need more than one type of design • You can/should use an existing design (Creswell, pp.168–171)

  31. A Few Examples of Quantitative Studies • Treatment/intervention outcomes (e.g., in schools/classrooms) • Activity analysis (# of behaviors/episode) • Policy analysis (Breakdown by criteria – content analysis) • Program Evaluation • Needs Assessment • Surveying an organization to understand the impact of management practices on employees • Secondary data analysis • Developing reliable and valid scales to assess attitudes

  32. Methodology —Setting and Sample • Setting and Sample • Population from which sample will be drawn • Describe and defend sampling method • Describe and defend sample size • Use sample size generators for random selection designs: • See notes • Eligibility criteria for participants • Characteristics of selected sample

  33. Population and Sample Example • What is the population of interest? (e.g., 4th-grade boys) • Your sample is drawn from a population. Note: You can only generalize to the population from which you sampled. Describe how you will select your participants. • Recruitment strategies • What is your role as a researcher? • Describe the demographics of your sample • Gender • Age • Independent variables

  34. Types of Samples • WEAK: Convenience sample—a subset of the population that happens to be available for study (may or may not be representative of the population) • Random sample—each member of the population has an equal chance of being selected • Or an equivalent representative sample • Website for further information: http://www.socialresearchmethods.net/tutorial/Mugo/tutorial.htm
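The distinction above is mechanical as well as conceptual: a simple random sample is a draw in which every member of the sampling frame has an equal chance of selection. A minimal sketch in Python, using an invented frame of student IDs (the frame, size, and seed are illustrative assumptions):

```python
import random

# Hypothetical sampling frame: IDs for the population of interest
population = [f"student_{i:03d}" for i in range(1, 201)]  # 200 students

random.seed(42)  # fixed seed so the draw is reproducible for this illustration
sample = random.sample(population, k=30)  # simple random sample, no replacement

# Every member of the frame had an equal chance of selection --
# unlike a convenience sample drawn from whoever happens to be at hand.
```

In practice the hard part is building a complete frame; a random draw from an incomplete frame is still only representative of the frame, not the population.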

  35. Sample Size • Rule of thumb: In order to estimate the number needed for your sample, use 15–20 participants per variable. The larger the sample, the better! • Statistical power analysis basically answers the question: “How large must my sample be to ensure a reasonable likelihood of detecting a difference if it really exists in the population?” • You will need to report statistical power in your study (http://statpages.org/).
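The power question above can be answered approximately in closed form. The sketch below uses the standard normal-approximation formula for a two-sided, two-sample t test, n per group = 2((z₁₋α/₂ + z_power)/d)²; it slightly underestimates the exact t-based answer (about 64 per group for d = 0.5), so treat it as a planning estimate and confirm with a power calculator such as those at statpages.org.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided two-sample t test,
    via the normal approximation: n = 2 * ((z_{1-alpha/2} + z_power) / d)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = z.inv_cdf(power)           # quantile corresponding to desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

n = n_per_group(0.5)  # medium effect size (Cohen's d = 0.5)
```

Note how sample size grows rapidly as the expected effect shrinks, which is why the conjectured effect size must be justified from the literature.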

  36. Methodology —Instrumentation • Instrumentation and Materials—describe data collection tools • Name of instrument • Concepts measured by instrument • How scores are calculated and what they mean • Processes for assessing the reliability and validity of the instrument (e.g., Cronbach’s alpha) See notes • How participants will complete the instrument • Detailed description of each variable

  37. Instrumentation and Materials • Describe the survey or other tools you will use in your investigation • A brief description of the instrument, with references to previous studies that have used it • Identify the underlying constructs. Constructs should derive from the theoretical foundation. • Include copies of your measure in an Appendix.

  38. Test Validity • A score from a test or other measurement instrument must represent what it is intended to represent. • The researcher must provide some evidence and support for validity.

  39. Test Reliability • Tests/measurement scores must be reliable. • Reliable means that the score will be similar across different administrations, or that the score is based on an internally consistent set of items/tasks. • Reliability is a technical property that is assessed statistically.
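The internal-consistency sense of reliability is usually reported as Cronbach's alpha, mentioned on the instrumentation slide. A minimal sketch of the computation, on invented Likert-scale responses (the respondents, items, and scores are hypothetical):

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Cronbach's alpha for internal consistency.
    `rows` holds one list of item scores per respondent."""
    k = len(rows[0])                  # number of items
    items = list(zip(*rows))          # one column of scores per item
    item_var = sum(pvariance(col) for col in items)        # sum of item variances
    total_var = pvariance([sum(r) for r in rows])          # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical responses: 4 respondents x 3 Likert items
scores = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
alpha = cronbach_alpha(scores)
```

Values near 1 indicate that the items move together; conventional rules of thumb treat alpha of roughly .70 or above as acceptable for research instruments.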

  40. Methodology —Data Analysis • An explanation of all descriptive and/or inferential analyses • Null hypotheses as they relate to research questions • Specific explanations of variables • Best presented in a sequence that matches the research questions

  41. Statistical Analysis • Based on frequency in category, average of measurement, variability of measurement • Answers questions such as: Is the frequency of occurrence what we expect? Is the frequency the same across groups? • Are the average results different between or among groups? • Is there a relationship between or among measured variables?

  42. Characteristics of a Quantitative Method • Rooted in testable and confirmable theories • Looking for relationships • Statistical tests are used to analyze the data: • t tests • Analysis of variance (ANOVA), analysis of covariance (ANCOVA) • Chi square analysis • Correlation • Linear/Logistic regression See notes

  43. Types of Tests • Independent-Samples t test (compare scores of two independent groups) • Compare achievement of two groups • Compare employees from two companies on morale • Paired-Samples t tests (compare two groups of scores that are matched) • Compare the pretest and posttest scores provided by participants of an intervention (pre-post design) • ANOVA (comparing two or more levels of an independent variable) • Can be between groups (independent groups) or repeated-measures (matched scores)
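The paired-samples case in the list above reduces to a one-sample t test on the difference scores. A sketch on invented pretest/posttest data (the scores and sample size are hypothetical, chosen only to show the computation):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic: mean difference divided by
    the standard error of the differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

# Hypothetical pretest/posttest scores for 4 participants in an intervention
pre  = [10, 12, 9, 11]
post = [12, 15, 10, 13]
t = paired_t(pre, post)
```

Because each participant serves as their own control, the paired design removes between-person variability from the error term, which is why matched designs can detect effects with smaller samples.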

  44. Types of Tests • Chi-square (examine statistical relationship between categorical variables) • Association—relationship between two categorical variables • Goodness of fit—is the distribution in your sample the same as in the population, or the same as in another study? • Correlation (relationship between 2 variables) • Pearson r (parametric) or Spearman (non-parametric) • Regression (examine the effect of multiple independent variables on one outcome variable) See notes • How do various differentiation strategies predict achievement?
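The chi-square association test and Pearson r from the list above are both short computations. The sketch below implements each on invented data; the 2x2 table (pass/fail by program) and the paired involvement/comprehension scores are hypothetical illustrations, not results from any study.

```python
from math import sqrt

def chi_square(table):
    """Chi-square statistic for a contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n  # counts if independent
            chi2 += (observed - expected) ** 2 / expected
    return chi2

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

# Hypothetical 2x2 table: pass/fail counts for two programs
chi2 = chi_square([[10, 20], [30, 40]])

# Hypothetical paired scores: parental-involvement hours vs. comprehension score
r = pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 6])
```

Note that chi-square operates on category counts while Pearson r requires interval-level measurements on both variables, which is exactly the measured-variables requirement from slide 22.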

  45. Which test should I use? • Tutorial that helps you decide: http://www.wadsworth.com/psychology_d/templates/student_resources/workshops/stat_workshp/chose_stat/chose_stat_01.html • This site has four different interactive webpages that help you decide the correct analytical procedure: http://statpages.org/#WhichAnalysis

  46. Tool for Aligning Methods and Questions

  47. GOAL = Overall Alignment in your Study • Problem Statement • Nature of the Study/Guiding Question • Purpose of the Study • Research Design • Setting and Sample • Data Collection • Data Analysis

  48. References • Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Mahwah, NJ: Lawrence Erlbaum. • Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage Publications. • Gravetter, F. J., & Wallnau, L. B. (2004). Statistics for the behavioral sciences (6th ed.). Belmont, CA: Thomson-Wadsworth. • Hallahan, M., & Rosenthal, R. (1996). Statistical power: Concepts, procedures, and applications. Behaviour Research and Therapy, 34, 489–499. • Lipsey, M. W., & Wilson, D. B. (1993). The efficacy of psychological, educational, and behavioral treatment: Confirmation from meta-analysis. American Psychologist, 48(12), 1181–1209. • Murphy, K. R., & Myors, B. (1998). Statistical power analysis: A simple and general model for traditional and modern hypothesis tests. Hillsdale, NJ: Erlbaum. • Patten, M. L. (2007). Understanding research methods: An overview of the essentials (6th ed.). Los Angeles: Pyrczak Publishing. • Rossi, J. (1990). Statistical power of psychological research: What have we gained in 20 years? Journal of Consulting and Clinical Psychology, 58(5), 646–656. • Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Needham Heights, MA: Allyn & Bacon.

  49. Qualitative Research/Naturalistic Inquiry: Project Study • Natural setting & culture/context-bound • Researcher as instrument (interpersonal skills essential) • Use of researcher’s tacit/implicit knowledge • Qualitative methods (narrative observation & in-depth interview) • Purposive sampling (based on research question) • Inductive data analysis (emic vs. etic) • Grounded theory (emerges from the data to represent “reality”)

  50. Qualitative Research/Naturalistic Inquiry (cont’d.) • Negotiated outcomes (interpret with participants) • In-depth contextualized reporting (thick description) • Idiographic (vs. nomothetic) representation • Tentative (conditional) application • Focus-determined but emerging boundaries • Criteria for trustworthiness (confidence in findings)
