CaDEA Workshop 3 Input


Presentation Transcript


  1. CaDEA Workshop 3 Input Brad Cousins University of Ottawa October 2010

  2. Overview • Evaluation design options • Data quality assurance • validity/credibility • reliability/dependability • Instrument development and validation • Data collection strategies

  3. Design Choices • Comparison groups? • Yes, no, hybrid • Black box, grey box, glass box • Data collected over time? • Yes, no, hybrid • Mixed methods • Quantitative, qualitative; simultaneous or sequential

  4. Evaluation design alternatives • One-shot, post-test only: X O1 • Comparative, post-test only: X O1 / O2 • Randomized control trial: R X O1 / R O2 • (X = intervention, O = observation, R = random assignment; "/" separates one group's row from another's)

  5. Evaluation design alternatives • Time series design: O1 O2 O3 O4 X O5 O6 O7 O8 • Pre-post comparative group design: O1 X O3 / O2 O4 • Delayed treatment group design: O1 X O3 O5 / O2 O4 X O6
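
To make these row diagrams concrete, here is a minimal Python sketch, not part of the original slides: it simulates the pre-post comparative group design (O1 X O3 / O2 O4) with invented group sizes and score values, and estimates the program effect as the difference in mean change between the treatment and comparison groups.

```python
# Illustrative sketch only: all numbers below are invented for the example.
import random
import statistics

random.seed(42)

def simulate_scores(n, pre_mean, gain):
    """Return synthetic (pre, post) score lists for one group of n people."""
    pre = [random.gauss(pre_mean, 10) for _ in range(n)]
    post = [score + gain + random.gauss(0, 5) for score in pre]
    return pre, post

# Treatment group receives the intervention (X); comparison group does not.
treat_pre, treat_post = simulate_scores(n=50, pre_mean=60, gain=8)  # assumed +8 effect
comp_pre, comp_post = simulate_scores(n=50, pre_mean=60, gain=2)    # background change only

# Effect estimate: (treatment group change) minus (comparison group change)
treat_change = statistics.mean(treat_post) - statistics.mean(treat_pre)
comp_change = statistics.mean(comp_post) - statistics.mean(comp_pre)
print(f"Estimated program effect: {treat_change - comp_change:.1f} points")
```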

  6. Major Concepts: VALIDITY/CREDIBILITY • Key points • Degrees on a continuum • Describes the results or inferences, NOT the instrument • Depends on the instrument and the process • Involves evidence and judgment • Internal validity/credibility • Attribution: how confident can we be that the observed effects are attributable to the intervention?

  7. Threats to internal validity • Actual but non-program-related changes in participants • Maturation • History • Apparent changes dependent on who was observed • Selection • Attrition • Regression • Changes related to methods of obtaining observations • Testing • Instrumentation

  8. Instrument Development • General Principles • Build on existing instruments and resources • Ensure validity: face, content, construct • Ensure reliability (eliminate ambiguity) • Consider task demands • Obtrusive vs. unobtrusive measures • Use of conceptual framework as guide • Demographic information solicited at end • Pilot test
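
The reliability check on slide 8 is often quantified with an internal-consistency statistic such as Cronbach's alpha. The sketch below is a minimal illustration, not from the workshop materials; the item responses are invented.

```python
# Minimal sketch: Cronbach's alpha for a small set of questionnaire items.
# alpha = (k / (k - 1)) * (1 - sum(item variances) / variance of total scores)
import statistics

# Rows = respondents, columns = items (e.g., 4-point Likert codes); invented data.
responses = [
    [4, 3, 4, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
    [4, 4, 3, 4],
]

k = len(responses[0])                          # number of items
items = list(zip(*responses))                  # transpose: one tuple per item
item_vars = sum(statistics.variance(item) for item in items)
total_var = statistics.variance([sum(row) for row in responses])
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")       # higher values suggest internal consistency
```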

  9. Questionnaire Development • Scales: nominal, ordinal, interval • Selected response • Multiple choice (tests) • Fixed option: • Check all that apply • Check ONE option only • Likert-type rating scales • Frequency (observation): N R S F A (never / rarely / sometimes / frequently / always) • Agreement (opinion): SD D A SA (strongly disagree / disagree / agree / strongly agree)
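
As a small illustration of how such scales become analyzable data, the sketch below codes the agreement options (SD D A SA) as the numbers 1-4. The 1-4 coding is a common convention assumed here, not something the slides specify.

```python
# Illustrative only: numeric coding of the agreement scale from slide 9.
AGREEMENT = {"SD": 1, "D": 2, "A": 3, "SA": 4}

raw_answers = ["SA", "A", "A", "SD", "D", "SA"]   # invented responses
coded = [AGREEMENT[answer] for answer in raw_answers]
mean_agreement = sum(coded) / len(coded)
print(f"Coded responses: {coded}, mean = {mean_agreement:.2f}")
```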

  10. Questionnaire Development • Selected response (cont.) • Rank-ordered preferences (avoid) • Paired comparison • Constructed response • Open-ended comments • Structured • Unstructured • If ‘other’ (specify)

  11. Questionnaire Development • Data collection formats • Hardcopy – data-entry format • Hardcopy – scannable format • Internet format • Over-specify instructions • Judicious use of bold/italics and font variation • Response options on the right-hand side • Stapling: booklet > upper left > left margin • Judicious determination of length (8 pages max)

  12. Interview / Focus Group Instrument Development • Review of purpose / expectations • Spacing of questions to permit response recording • Questions vs prompts • Use of quantification

  13. Data Collection Ethics • Ethics review board procedures/protocols • Letters of informed consent • Purpose • How/why selected • Demands / Right to refusal • Confidential vs. anonymous • Contact information • Issues and tensions

  14. Data collection • Interview tips • Small talk – set the tone • Audio recording – obtain permission • Develop shorthand or symbolic field-note skills • Permit some wandering but keep on track • Minimize redundancy

  15. Sampling • Quantitative for representation • proportionate to population • random • Qualitative to maximize variation • Purposive sampling: based on prior knowledge of case(s)
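
A minimal sketch of slide 15's "proportionate to population" idea: draw a simple random sample from each stratum in proportion to its size. The strata, sizes, and 10% sampling fraction are invented for the example; purposive (qualitative) selection is judgment-based and not shown.

```python
# Illustrative sketch: proportionate stratified random sampling.
import random

random.seed(7)

# Hypothetical population divided into strata of unequal size.
population = {
    "urban": [f"urban_{i}" for i in range(300)],
    "rural": [f"rural_{i}" for i in range(100)],
}
sampling_fraction = 0.10                           # assumed 10% sample

sample = []
for stratum, members in population.items():
    n = round(len(members) * sampling_fraction)    # proportionate allocation
    sample.extend(random.sample(members, n))       # simple random draw per stratum

print(f"Sample of {len(sample)}: {sample[:5]} ...")
```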

  16. Useful References • Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons. • Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory Into Practice, 39(3), 124-130. • Fraenkel, J. R., & Wallen, N. E. (2003). How to design and evaluate research in education. New York: McGraw-Hill. • McMillan, J. H. (2004). Educational research (4th ed.). Toronto: Pearson/Allyn and Bacon, pp. 172-174. • Shultz, K. S., & Whitney, D. J. (2005). Measurement theory in action: Case studies and exercises. Thousand Oaks, CA: SAGE Publications.
