
Proposal for Midterm Review and Research Design

This presentation reviews evaluation and research concepts: the purpose and format of a research proposal (problem statement, objectives, background/literature review, methods, and attachments), research design choices, measurement, sampling, and survey methods. It discusses the population and sampling methods, measurement instruments, and budget and timeline. Sample study objectives include estimating program benefits and costs, determining program effects on the target population, and assessing community recreation needs.



Presentation Transcript


  1. Midterm Review • Evaluation & Research Concepts • Proposals & Research Design • Measurement • Sampling • Survey Methods

  2. Purposes of Proposal • Communicate with Client • Demonstrate your grasp of problem • Plan the study in advance, so others can evaluate the study approach • will it work? • have you overlooked something? • will results be useful to client? • Can we afford it?

  3. Proposal Format
  1. Problem Statement - define the program to be evaluated/problem to be studied, and the users & uses of the results. Justify the importance of the problem/study.
  2. Objectives - concise listing. In evaluation studies, the objectives usually focus on the key elements of the program to be evaluated & the evaluation criteria. These are the study objectives, NOT the program objectives.
  3. Background/Literature Review - place for more extensive history/structure of the program. Focus on aspects most relevant to the proposed evaluation. Discuss previous studies or relevant methods.
  4. Methods - details on procedures for achieving the objectives: data gathering and analysis, population, sampling, measures, etc. Who will do what to whom, when, where, how and why?
  5. Attachments - budget, timeline, measurement instruments, etc.
  NOTE: Most “programs” must be narrowed to specific components to be evaluated. Think of a “program of studies” rather than a single evaluation study. The proposal should define this specific study & how it fits into a broader program of studies.

  4. Sample Objectives 1. Estimate benefits and costs of program 2. Estimate economic impacts of program on local community (social, environmental, fiscal). 3. Determine effects of program on target population. 4. Describe users and non-users of program 5. Assess community recreation needs, preferences 6. Determine market/financial feasibility of program 7. Evaluate adequacy or performance of program

  5. Research Process: Define Problem & Research Objectives • How? Overall Method (Survey, Experiment, Case Study, Secondary Data) • What? Concepts, Variables, Measures • Who? Population, Sampling → Data Gathering → Analysis → Application

  6. Methods Choices • Overall Approach/Design • Qualitative or Quantitative • Primary or secondary data • Survey, experiment, case study, etc. • Who to study - population, sample • individuals, market segments, populations • What to study - concepts, measures • behavior, knowledge, attitudes • Cost vs Benefit of Study

  7. Major Design Types • Surveys • Experiments • Observation • Secondary Data • Qualitative Approaches • Focus Group • Case Study

  8. Research Designs/Data Collection Approaches

  9. General Guidelines on when to use different approaches 1. Describing a population - surveys 2. Describing users/visitors - on-site survey 3. Describing non-users, potential users or general population - household survey 4. Describing observable characteristics of visitors - on-site observation 5. Measuring impacts, cause-effect relationships - experiments

  10. Guidelines (cont) 6. Anytime suitable secondary data exists - secondary data 7. Short, simple household studies - phone 8. Captive audience or very interested population - self-administered survey 9. Testing new ideas - experimentation or focus groups 10. In-depth study - in-depth personal interviews, focus groups, case studies

  11. Primary or Secondary Data • Secondary data are data that were collected for some purpose other than your study, e.g., government records, internal documents, previous surveys • Choice between primary/secondary data depends on: • Costs (time, money, personnel) • Relevance, accuracy, adequacy of data

  12. Qualitative vs Quantitative Approaches • Qualitative: Focus Group, In-Depth Interview, Case Study, Participant Observation, Secondary Data Analysis • Quantitative: Surveys, Experiments, Structured Observation, Secondary Data Analysis

  13. Survey vs Experiment • Survey - measure things as they are; a snapshot of the population at one point in time; generally refers to questionnaires (telephone, self-administered, personal interview) • Experiment - manipulate at least one variable (treatment) to evaluate response, to study cause-effect relationships (field and lab experiments)

  14. Definition & Measurement • “Measurement is the beginning of science, … until you can measure something, your knowledge is meager and unsatisfactory” (Lord Kelvin) • Nominal/Conceptual Definition - defines a concept in terms of other concepts; links concepts without tying them to the real world • Operational Definition - equates definition with measurement; specifies the procedures/operations used to generate the concept.

  15. Levels of Measurement

  16. Validity vs Reliability

  17. Questionnaire Design 1. Preliminary Info - information needed, who the subjects are, method of communication 2. Question Content 3. Question Wording 4. Response Format 5. Question Sequencing/Layout

  18. What Info? • Demographic, socioeconomic, physical • Cognitive - knowledge & beliefs • Affective - attitudes, feelings, preferences • Behavioral - actions

  19. Sampling • Always define study population first • Use element/unit/extent/time for complete definition • element - who is interviewed • sampling unit - basic unit containing elements • extent - limit population (often spatially) • time - fix population in time

  20. Types of Sampling Approaches • Probability vs. non-probability • Judgment, simple random, systematic • Stratified or cluster (area sample) • Time sampling
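
A minimal sketch (not from the slides) of how three of these probability sampling designs could be drawn in Python; the frame of 1,000 visitors, the two strata, and the sample size of 50 are all illustrative assumptions:

```python
import random

population = [f"visitor_{i}" for i in range(1000)]  # hypothetical sampling frame
n = 50                                               # hypothetical sample size

# Simple random sample: every element has an equal chance of selection.
srs = random.sample(population, n)

# Systematic sample: random start, then every k-th element of the frame.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified sample: divide the frame into strata (hypothetical weekday vs
# weekend visitors) and sample from each in proportion to its size.
strata = {"weekday": population[:700], "weekend": population[700:]}
stratified = []
for name, members in strata.items():
    share = round(n * len(members) / len(population))
    stratified.extend(random.sample(members, share))

print(len(srs), len(systematic), len(stratified))  # 50 50 50
```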

  21. Sample Size • Based on four factors: • Cost/budget • Accuracy desired • Variance in population on variable of interest • Subgroup analysis planned • Formula: n = Z²σ² / e² • n = sample size • Z = value for the confidence level (95% → 1.96) • σ = standard deviation of variable in population • e = sampling error
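
A minimal sketch of the sample-size formula from this slide, n = Z²σ²/e², in Python; the σ = 10 and e = 2 values below are illustrative assumptions, not from the slides:

```python
import math

def sample_size(z: float, sigma: float, e: float) -> int:
    """Required n given the confidence-level value z, the population
    standard deviation sigma, and the tolerable sampling error e."""
    return math.ceil((z ** 2) * (sigma ** 2) / (e ** 2))  # round up to be safe

# Example: 95% confidence (Z = 1.96), assumed sigma = 10, error of +/- 2 units.
print(sample_size(1.96, 10, 2))  # -> 97
```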

  22. Sampling Errors for a Binomial (95% Confidence Interval), by Percent Distribution in Population

  23. Computing a 95% confidence interval • N = 100, sample mean = 46%, use p = 50/50 • sampling error from table = 10% • 95% CI is 46% ± 10% = (36%, 56%) • N = 1,000, sample mean = 22% • sampling error from table = 2.5% • 95% CI is 22% ± 2.5% = (19.5%, 24.5%)
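
A minimal sketch reproducing the two worked examples above, assuming the table values come from the usual binomial sampling-error formula 1.96·√(p(1−p)/n) at 95% confidence:

```python
import math

def sampling_error(p: float, n: int, z: float = 1.96) -> float:
    """95% sampling error (half-width) for a binomial proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# N = 100, sample mean = 46%, conservative p = 0.5:
e1 = sampling_error(0.5, 100)        # ~0.098, i.e. about 10%
print(0.46 - e1, 0.46 + e1)          # roughly (0.36, 0.56)

# N = 1,000, sample mean = 22%:
e2 = sampling_error(0.22, 1000)      # ~0.026, i.e. about 2.5%
print(0.22 - e2, 0.22 + e2)          # roughly (0.19, 0.25)
```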

  24. STEPS IN A SURVEY • 1. Define problem and study objectives • 2. Identify information needs & study population(s) • 3. Determine basic design/approach • - cross sectional vs longitudinal • - on-site vs household vs other • - self-admin. vs personal interview vs phone • - structured or unstructured questions • 4. Questionnaire design • 5. Choose sample (frame, size, sampling design) • 6. Estimate time, costs, manpower needs, etc.

  25. Survey Implementation • 7. Proposal & “Human subjects” review • 8. Line up necessary resources • 9. Pre-test instruments and field procedures • 10. Data gathering and follow-up procedures • 11. Coding, cleaning and data processing • 12. Analysis: preliminary, then final. • 13. Communication and presentation of results.
