
Data Collection



Presentation Transcript


  1. Data Collection Research Process and Design Spring 2006 Class #5

  2. Today’s objectives • To discuss prospectuses • To answer any questions about material covered thus far • To explore various quantitative data collection techniques • To learn about test validity and reliability • To discuss how you plan to collect data for your group project Research Process and Design (Umbach)

  3. Methods of data collection • Cognitive tests • Achievement • Aptitude • Alternative assessment • Non-cognitive tests • Surveys • Observations • Unobtrusive measures

  4. Cognitive Tests • Norm-referenced: scores are interpreted relative to the scores of others taking the test (i.e., a norming group) • Johnny performed better than 95% of the other students • NRT scores include percentiles, stanines, CEEB scores (i.e., ETS scores), etc. • Criterion-referenced: scores are interpreted relative to what the student knows • Sally knows how to add, subtract, and multiply, but she does not know how to divide • Typically interpreted relative to some performance standard (e.g., pass or fail, competent or incompetent, etc.)
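The contrast between the two score interpretations can be sketched in a few lines of Python. All numbers here are hypothetical, not drawn from any real instrument:

```python
# Illustrative only: made-up norming-group scores and cutoff
norm_group = [52, 61, 47, 70, 65, 58, 73, 49, 55, 68]
student_score = 70

# Norm-referenced interpretation: percentile rank relative to the norming group
percentile = 100 * sum(s < student_score for s in norm_group) / len(norm_group)

# Criterion-referenced interpretation: compare against a fixed performance standard
cutoff = 60  # hypothetical passing standard
outcome = "pass" if student_score >= cutoff else "fail"
```

Here the same raw score of 70 yields two different statements: the student outscored 80% of the norming group (norm-referenced), and the student passed (criterion-referenced).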

  5. Non-Cognitive Assessments • Personality, Attitude, Value, and Interest Inventories • Concerns • Reliability often low • Social desirability • Self-report

  6. Surveys • A survey is a systematic method for gathering information from a sample of entities for the purposes of constructing quantitative descriptors of the attributes of the larger population of which the entities are members. • A survey is information: • Gathered by asking people questions • Collected by having interviewers ask questions, or by having people read or hear questions and record their own answers • Collected from only a subset of the population rather than from all members

  7. Modes • Face-to-face interviews • Telephone interviews • Mail surveys • Web surveys

  8. Types of Items • Open-ended • Respondents create their responses • Closed • Respondents choose their response from alternatives given to them

  9. Response formats • Likert scale • Ranking • Semantic differential • Numerical • Binary • Checklist

  10. Determine the format for measurement • Thurstone scaling • http://www.socialresearchmethods.net/kb/scalthur.htm • Guttman scaling • http://www.socialresearchmethods.net/kb/scalgutt.htm • Scales with equally weighted items

  11. Cognitive Processes in Answering Questions (Tourangeau, Rips, & Rasinski, 2000) • Comprehension of the question • Retrieval of information • Judgment and estimation • Reporting an answer

  12. Problems in Answering Questions • Failure to encode information sought • Misinterpretation of the questions • Forgetting and other memory problems • Flawed judgment or estimation strategies • Problems in formatting an answer • Deliberate misreporting • Failure to follow instructions

  13. Tips for Writing Questions – Keep it Simple • Speak the common language • Avoid double-barreled questions • Be aware of problems associated with recall – add memory cues or bounded recall • Try to use closed questions • Be specific • Avoid check-all-that-apply (use yes/no or agree/disagree)

  14. Evaluating Survey Questions • Expert reviews • Focus groups • Cognitive interviews • Pretests

  15. Other measures • Observations: Recordings of naturally occurring behavior seen or heard by the observer • In quantitative research the observer remains detached and objective • Unobtrusive measures: Measures uninfluenced by an awareness of the subjects that they are participating • Archival data

  16. An example: National Survey of Student Engagement (NSSE) • Modes of collection: Paper, Web, Mixed • Institutions provide a population file that includes 1st-year students and seniors, from which NSSE draws a sample • NSSE is responsible for contacting students and following up • Personalized pre-notification letter from the school president on school letterhead • At least 1 electronic invitation and 2 follow-up contacts • Response rates are in the low 40s. Full-time students, females, and whites are more likely to respond.
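The sample-drawing step above can be sketched as a simple random sample in Python. NSSE's actual procedure is more involved; the population file, field names, and seed here are all made up for illustration:

```python
import random

# Hypothetical population file: (student_id, class_level) records,
# standing in for the file an institution would submit
population = [(i, "first-year" if i % 2 else "senior") for i in range(1000)]

rng = random.Random(2006)             # fixed seed so the draw is reproducible
sample = rng.sample(population, 100)  # simple random sample without replacement
```

Sampling without replacement guarantees no student is invited twice; fixing the seed makes the draw auditable.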

  17. Validity • Validity • The extent to which inferences made from the results are appropriate, meaningful, and useful • Evidence based on • Test content • Internal structure • Relations to other variables

  18. Evidence Based on Test Content • Extent to which the test covers the content it is supposed to cover • Types • Face validity: a cursory examination of the content of a test • Content validity: a systematic examination of the content of a test and the level of cognition at which it is tested; in other words, are you measuring what you planned to measure • Estimated by creating a table of specifications and mapping items into it • Rely on experts

  19. Evidence Based on Internal Structure • Extent to which different parts of an instrument and the items related to these parts are related in a prescribed manner (i.e., construct validity)

  20. Evidence Based on Internal Structure • Example • A “Survey of School Attitudes” scale measures 4 dimensions: curriculum, teachers and administrators, physical environment, and social relationships among students. • Items written to each dimension should be unique to that dimension and strongly related to one another. • Each dimension should represent a unique aspect of the construct of school attitudes and therefore not be related highly to the other dimensions. • Estimated statistically by inter-item correlations and factor analyses of dimensions

  21. Evidence Based on Relations to Other Variables • Extent to which scores from a measure relate in a predictable manner to scores on a criterion measure • Convergent and discriminant evidence • Criterion-related evidence

  22. Convergent and Discriminant Evidence • Convergent: Scores on the predictor measure relate positively to those on the criterion measure • e.g., ACT scores are related to freshman GPAs, GRE scores are related to graduate school GPAs • Discriminant: Scores on the predictor measure relate negatively to those on the criterion measure • e.g., frequency of counseling sessions and disruptive behavior, lesson effectiveness and off-task behaviors • Estimated with correlation coefficients

  23. Criterion-Related Evidence • Predictive validity • Scores on a predictor measure are collected first, while scores on the criterion measure are collected at some point in the future • e.g., ACT scores are collected during the student’s senior year in high school, while freshman GPA is collected after completing the first year of college • Concurrent validity • Scores on the predictor and criterion measures are collected at the same time • e.g., a teacher’s perception of her effectiveness is collected at the same time as an observer’s ratings of that teacher’s effectiveness are made • Estimated with correlation coefficients
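The correlation coefficient behind these validity estimates is easy to compute directly. A minimal sketch of the Pearson product-moment correlation, using made-up ACT scores and first-year GPAs:

```python
from math import sqrt

def pearson(x, y):
    # Pearson product-moment correlation between two aligned score lists
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical predictor (ACT) and criterion (freshman GPA) scores
act = [21, 25, 28, 30, 33]
gpa = [2.4, 2.9, 3.1, 3.3, 3.8]
r = pearson(act, gpa)  # close to +1: strong convergent/predictive evidence
```

A coefficient near +1 in this toy data would be read as predictive evidence; near 0 or negative, as evidence against the predictor.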

  24. Effect of Validity on Research • Researcher must establish validity for the measures being used • Validity evidence is a matter of degree, not presence or absence

  25. Reliability • Extent to which the results are consistent, that is, they are similar over different forms of the same instrument or occasions of data collection • Conceptual formula: Obtained Score = True Score + Error
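Under the classical test theory formula above, with true scores and errors assumed uncorrelated, reliability is the share of obtained-score variance attributable to true scores. A toy numeric sketch with made-up variance components:

```python
# Hypothetical variance components; obtained-score variance is their sum
# because true scores and errors are assumed uncorrelated
true_var, error_var = 80.0, 20.0
observed_var = true_var + error_var
reliability = true_var / observed_var  # proportion of obtained variance that is "true"
```

With these numbers, 80% of the score variance reflects real differences among test takers and 20% is measurement error.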

  26. Types of Reliability Estimates • Stability over time • Taking the same test on two occasions (i.e., test-retest) • Stability over item samples (also known as equivalence) • Taking two different forms of the same test at the same time (i.e., parallel forms) • Stability over item samples and stability over time • Parallel forms—one of which is taken at one time and the other at a later time

  27. Types of Reliability Estimates • Internal consistency—artificially splitting a single test into two halves • Split-half: literally any combination of halves of the test • e.g., odd-even, first half-second half, etc. • Kuder-Richardson formulae: statistical formulae estimating the average of all combinations of split-halves • KR 20: uses test and item statistics in a complicated formula • KR 21: uses test statistics only but underestimates the KR 20 • Cronbach alpha: similar to the KR 20 but applicable to non-dichotomous responses • e.g., Likert scales, Semantic Differential scales, etc.
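Cronbach's alpha is straightforward to compute from item-level data: alpha = k/(k-1) × (1 − Σ item variances / variance of total scores). A minimal sketch with made-up Likert responses (3 items, 4 respondents):

```python
from statistics import pvariance

def cronbach_alpha(items):
    # items: one list of responses per item, respondents aligned by position
    k = len(items)
    totals = [sum(resp) for resp in zip(*items)]   # each respondent's total score
    item_var = sum(pvariance(it) for it in items)  # sum of item variances
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Hypothetical Likert responses: rows are items, columns are respondents
responses = [
    [1, 2, 3, 4],  # item 1
    [2, 3, 4, 5],  # item 2
    [1, 3, 3, 5],  # item 3
]
alpha = cronbach_alpha(responses)  # close to 1: items rise and fall together
```

With dichotomous (0/1) items this same computation reduces to the KR 20.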

  28. Types of Reliability Estimates • Stability over scorers • Extent to which two or more people agree about what was observed or rated (i.e., inter-rater reliability) • Standard error of measurement • A way to assess the difference between obtained score and true score
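The standard error of measurement follows directly from a test's score standard deviation and a reliability estimate: SEM = SD × √(1 − reliability). A quick sketch with hypothetical values:

```python
from math import sqrt

# Hypothetical score SD and reliability estimate for a test
sd, reliability = 10.0, 0.91
sem = sd * sqrt(1 - reliability)  # ≈ 3.0 score points here
```

A band of roughly ±1 SEM around an obtained score gives a rough interval within which the true score is likely to fall.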

  29. Interpretation of Reliability Coefficients • All reliability coefficients range from 0 to 1, with higher coefficients indicating greater reliability • Cronbach’s Alpha • Relatively low reliabilities are tolerable in early phases of research • Higher reliabilities are required when the measure is used to determine group differences (>.7) (Nunnally, 1978) • Very high reliabilities are needed for making important decisions about individuals (>.9) (Pedhazur, p. 109) • Ultimately it depends on how much error the researcher is willing to have
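The two cited rules of thumb can be encoded as a small helper for checking an instrument against its intended use (the purpose labels are my own shorthand, and the cutoffs are only the conventions quoted above):

```python
def reliability_adequate(alpha, purpose):
    # Conventional cutoffs: >.7 for group differences (Nunnally, 1978),
    # >.9 for important decisions about individuals (Pedhazur)
    cutoffs = {"group_differences": 0.7, "individual_decisions": 0.9}
    return alpha > cutoffs[purpose]
```

For example, an alpha of .85 would be judged adequate for comparing groups but not for high-stakes decisions about a single person.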

  30. Factors Positively Affecting the Reliability Coefficient • Heterogeneity of the group taking the test • Greater number of items • Greater variability in scores • Moderate levels of item difficulty • Subjects with characteristics similar to those of the norming group • Items that discriminate effectively between high and low achievers

  31. Effect of Reliability on Research • A researcher must establish reliability for the measures being used • Reliability is a necessary, but not sufficient, condition for validity

  32. Group Research Proposal • What technique will you use to collect data? Be specific. • How will you assess validity and reliability?

  33. For next week… • Experimental research design • Readings: • *Gall, Borg, & Gall – Ch. 12 • *Steele & Aronson (1995) • Reminder: Article review is due.
