
FHWA Civil Rights: Data Collection and Program Evaluation



Presentation Transcript


  1. FHWA Civil Rights: Data Collection and Program Evaluation …why should I care?

  2. Data Collection and Program Evaluation (Analysis) …can be confusing

  3. Objectives • Identify the importance of data collection and program evaluation. • Explain the basic structure of an evaluation plan or strategy. • Design a specific program evaluation plan to • Identify specific data that must be collected, • Assess the implementation and effectiveness of various civil rights programs, and • Ensure State DOT management is aware of issues, needs and potential problems associated with various civil rights programs.

  4. Why should I care? • Reliable information about the impact of FHWA programs (program evaluation) is essential to ensuring non-discrimination. • The relationship between FHWA programs and varying social factors has wide-ranging implications for any community, municipality, county, or state.

  5. Why is it important? • Data collection and program evaluation (analysis) can: • Help identify and prioritize problem areas, • Initiate and evaluate the effectiveness of policies and programs to ensure non-discrimination, • Assess the relationship between specific programs and beneficiaries to develop non-discrimination strategies.

  6. Importance (continued) • Draw agency attention and resources to resolve issues and problems. • Provide justification for certain programs or illustrate a need for new programs. • Help justify continued or additional funding. • Communicate the importance of the goals, solutions, strategies, or programs.

  7. “Program evaluation is like a bunch of feathers because it lifts you up to get a bird’s eye view so you can see the whole picture” (unknown).

  8. Current FHWA Data Collection Each State DOT must “develop procedures for the collection of statistical data (race, color, religion, sex, and national origin) of participants in, and beneficiaries of State highway programs, i.e., relocatees, impacted citizens and affected communities.” (23 CFR 200.9(b)(4)) • How do you collect statistical data? • What do you do with it? • Project-based or Program-based? • Benefits vs. Burdens analysis? • “Letter of the law” or “Spirit of the law”?

  9. Steps to look at the whole picture • Step 1: Focus the Evaluation • Step 2: Identify Barriers to Evaluation • Step 3: Methods of Data Collection • Step 4: Organizing, Analyzing, Interpreting, Summarizing and Reporting Results • Step 5: Develop the Evaluation Plan

  10. Step 1: Focus the Evaluation • What will be evaluated? • What is the purpose for this evaluation? • Who will be affected by or involved in the evaluation? • What are the critical questions the evaluation must address?

  11. Step 1: Focus Example (Right-of-Way) • What will be evaluated? • Real estate appraisals (project-based or program-based) • What is the purpose of the evaluation? • To ensure real estate appraisals are equitable without regard to race, color, or national origin. • Who will be affected by or involved in the evaluation? • Certified appraisers, review appraisers, State DOT appraisers, acquisition agents, relocation agents, property management, State DOT leadership, and FHWA Division/Headquarters.

  12. Step 1: Focus Example (Right-of-Way) • What are the critical questions the evaluation must address? • Was the resulting appraisal amount fair and equitable? • Are the land unit values consistent throughout the project area? • Were relocatees required to move to a similar socio-demographic region? • Were relocatees provided relocation advisory services to find replacement housing (e.g., transportation to/from available housing)?

  13. Step 2: Identify Barriers • Factors that may influence the evaluation • Implications for the evaluation • Strategies to overcome barriers. “Evaluation is like an acorn because it must overcome adverse conditions to grow into a tree” (unknown).

  14. Step 2: Identify Barriers • Organizational Politics: • Is there support for the evaluation? • Are there opponents to the evaluation? • Project Leadership: • Who has control over the program? • How does the evaluation fit their goals? • Professional Influences: • Are professional groups interested? • Are they supportive of an evaluation?

  15. Step 2: Identify Barriers • History: • What has been the tradition or history of self-evaluation of programs? • Have evaluations been conducted before? • Organizational Setting: • Does the program fit into a larger organizational network? Where or how? • What kind of information could jeopardize the program?

  16. Step 2: Identify Barriers • Economics: • Is fiscal support for the program and evaluation secure? • Interpersonal Patterns: • How much interagency or interpersonal conflict is likely to occur as a result? • Could the evaluation be controversial to internal or external audiences?

  17. Step 2: Identify Barriers • Legal Guidelines: • Will legal restrictions limit collection of desired information/data? • Will the project be affected by pending legislation? • Resources: • Are resources (human, financial, time, etc.) available to support the evaluation?

  18. Step 3: Data Collection Methods • What type of data should be collected? • What methods should be used to collect data? • How much information should you collect? • Questionnaires • Validity and Reliability

  19. Step 3: Types of Data • Descriptive information can include: • Characteristics of the program. • Demographic information on program participants, beneficiaries, etc. • Actual benefits paid or realized by beneficiaries. • Specific results of a program or service.

  20. Step 3: Types of Data • Judgmental information can include: • Opinions from consultants. • Beneficiaries’ beliefs and values. • Agency personnel’s interpretation of laws. • Stakeholders’ perceived priorities. • Interpretations of policy, procedures and guidelines.

  21. Step 3: Data Collection Methods • Opinion Surveys: an assessment of how a person or group feels about a particular issue. • Questionnaire: a group of questions that people respond to verbally or in writing. • Case Studies: experiences and characteristics of selected persons involved with a program. • Individual Interviews: an individual’s responses, opinions, or views.

  22. Step 3: Data Collection Methods • Group Interviews: small groups’ responses, opinions, and views. • Records: information from records, files, or receipts. • Advisory, Advocate Teams: ideas and viewpoints of selected persons.

  23. Step 3: Choosing a Method • Availability: information may already be available to you that can help answer some questions. Review information in prior records, reports and summaries. • Interruption Potential: the more disruptive an evaluation is to the routine of the program, the more likely that it will be unreliable. • Protocol Needs: you may need to obtain appropriate permission to collect information.

  24. Step 3: Choosing a Method • Reactivity: you do not want “how” you ask something to alter the response you get. • Bias: to be prejudiced in opinion or judgment. • Reliability: will people interpret your questions the same way each time? • Validity: will the collection methods produce information that actually measures what you intend to measure?

  25. Step 3: How Much Data? • Sampling: measuring a portion of subjects in order to learn something about the entire population without having to measure the whole group. • Random: each individual in the group has an equal chance of being chosen for the sample. • Purposive: a sample deliberately chosen to represent specific viewpoints.
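As a small illustration of the random-sampling idea above, a simple random sample can be drawn with Python's standard library (the parcel names here are hypothetical):

```python
import random

def random_sample(population, k, seed=None):
    """Draw a simple random sample without replacement:
    every member has an equal chance of selection."""
    rng = random.Random(seed)
    return rng.sample(population, k)

# Hypothetical list of relocatee parcels in a project area.
relocatees = [f"parcel-{i}" for i in range(1, 201)]
sample = random_sample(relocatees, k=20, seed=42)
print(len(sample))  # 20 sampled parcels
```

A purposive sample, by contrast, would be hand-picked to cover specific viewpoints rather than drawn at random.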

  26. Step 3: When Choosing a Method, Consider… • Should you use a sample of a population or a census (an entire population, such as all people living in a project area)? • Should you use a random or purposive sample? • How large a sample size do you need? • Is your sample likely to be biased?
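For the "how large a sample" question, a common rule-of-thumb formula (not from the slides; shown here as a sketch) estimates the sample size needed to measure a proportion to a given margin of error:

```python
import math

def sample_size(confidence_z=1.96, margin=0.05, p=0.5):
    """Approximate sample size for estimating a proportion:
    n = z^2 * p * (1 - p) / e^2 (infinite-population version).
    p = 0.5 is the most conservative assumption."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin**2)

n = sample_size()  # 95% confidence, +/-5% margin
print(n)  # 385
```

A tighter margin raises the requirement quickly: at ±3%, `sample_size(margin=0.03)` gives 1068 respondents.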

  27. Step 3: Questionnaires • Questionnaires are a widely used method of collecting information. They can be a cost effective way to reach a large number of people or a geographically diverse group.

  28. Step 3: Questionnaires • Questionnaires should include the following key elements: • Cover letter: include the purpose of the study, why and how participants were selected, etc. • Introduction: a short recap of the information included in the cover letter. • Instructions: clear and concise.

  29. Step 3: Questionnaires • Questionnaires should include the following key elements: • Grouping questions: group questions with similar topics together in a logical flow. • Demographic questions: age, gender, ethnicity, etc. • Thank you: thank the respondent for completing the questionnaire. • Disposition: what to do with the questionnaire when complete.

  30. Step 3: Validity • “Does the questionnaire measure what you want it to measure?” A questionnaire that is valid for a specific situation or audience may not be valid in a different situation or for a different audience. • Have a panel of experts (professionals familiar with the program) review the questionnaire and provide feedback. • Test the questionnaire on a similar population.

  31. Step 3: Reliability • Does the questionnaire consistently give the same results with the same group of people under the same condition? • Can be shown when using a pilot sample with a questionnaire administered repeatedly over time. • Are answers repeated over time? • Revise as needed.

  32. Step 4: Organizing, Analyzing, Interpreting, Summarizing and Reporting • Organize Evaluation Data • Analyze • Quantitative Data Analysis • Qualitative Data Analysis • Interpret (summarize) • Report Results

  33. Step 4: Organize Data • Develop a system to organize your data (spreadsheet). • As data are received, check to be sure it is complete. • Set up protocol for who does/does not have access to the data. • Track all data. Ensure data are not lost or overlooked as analysis and summarizing begin.
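The completeness check described above can be automated as data arrive. This is a minimal sketch; the field names are illustrative, not an FHWA schema:

```python
# Hypothetical required fields for each participant record.
REQUIRED_FIELDS = ("participant_id", "race", "sex", "national_origin", "benefit_paid")

def check_complete(record):
    """Return the list of required fields that are missing or blank."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

records = [
    {"participant_id": "R-001", "race": "White", "sex": "F",
     "national_origin": "US", "benefit_paid": 12500},
    {"participant_id": "R-002", "race": "", "sex": "M",
     "national_origin": "US", "benefit_paid": 9800},
]
for r in records:
    missing = check_complete(r)
    if missing:
        print(r["participant_id"], "missing:", missing)
```

Flagging incomplete records at intake is cheaper than discovering gaps once analysis has begun.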

  34. Step 4: Quantitative Data • Quantitative Data Analysis: information that contains numerical data. • N: The total number of participants in the sample. • Mean: The average score of the sample. • Median: The middle value of the sorted scores. • Mode: The response given the most. • Standard Deviation: The range around the mean within which roughly 68% of responses fall (for approximately normal data). • Frequency: How often a particular response was given.

  35. Step 4: Standard Deviation

  36. Step 4: Quantitative Analysis Example • On a scale of 1 to 5, where 1=poor and 5=excellent, how would you rate the overall quality of the workshop? • Answers (10 people): 4,5,2,4,3,4,3,3,5,4 • Mean: 3.7 (the total of 37 divided by 10) • Median: 4 (the middle of the sorted scores; with 10 scores, the average of the 5th and 6th values) • Mode: 4 (the score given the most) • Standard Deviation: 0.95 (approximately 68% of the answers fall between 2.75 and 4.65). • Frequency: 1=0, 2=1, 3=3, 4=4, 5=2.
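The figures in this example can be reproduced with Python's standard `statistics` module:

```python
import statistics
from collections import Counter

scores = [4, 5, 2, 4, 3, 4, 3, 3, 5, 4]

n = len(scores)                      # N = 10
mean = statistics.mean(scores)       # 3.7
median = statistics.median(scores)   # middle of the sorted scores
mode = statistics.mode(scores)       # most frequent score
stdev = statistics.stdev(scores)     # sample standard deviation
frequency = Counter(scores)          # tally of each response

print(n, mean, median, mode, round(stdev, 2))
# 10 3.7 4.0 4 0.95
```

Spreadsheet functions (AVERAGE, MEDIAN, MODE, STDEV, COUNTIF) give the same results for small evaluation datasets.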

  37. Step 4: Qualitative Data • Qualitative data analysis is the process of systematically searching and arranging the interview transcripts, field notes, and other materials to increase your own understanding and enable you to present what you have discovered. (Bogdan & Biklen, 1992)

  38. Step 4: Qualitative Data • Data resulting from interviews, focus groups, open-ended questions, or case studies. • Develop common elements, themes, topics, sentiments, etc. from the qualitative data.
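A very rough first pass at surfacing candidate themes can be automated by tallying keywords across open-ended responses. This sketch uses hypothetical responses and theme keywords; real qualitative coding is iterative and human-led, and this only suggests themes for a coder to refine:

```python
from collections import Counter

# Hypothetical open-ended responses from a public meeting.
responses = [
    "The noise during construction is my main concern",
    "Concerned about access to the bus stop during construction",
    "Noise and dust near the school",
    "Will sidewalk access be maintained?",
]

# Candidate themes and the keywords that signal them (assumed, not from the slides).
themes = {"noise": ["noise"], "access": ["access", "sidewalk"], "dust": ["dust"]}

counts = Counter()
for text in responses:
    lower = text.lower()
    for theme, keywords in themes.items():
        if any(k in lower for k in keywords):
            counts[theme] += 1  # count each response once per theme

print(counts.most_common())
```

The tallies then feed the interpretation step: a theme mentioned by many respondents is a candidate answer to the critical questions from Step 1.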

  39. Step 4: Interpret • Identify trends, commonalities and testimony that will help answer the critical evaluation questions that were generated in Step 1. • To be useful, the data must be interpreted so that the stakeholders will understand the results and know how to use them.

  40. Step 4: Interpret Example • At the end of a public meeting to discuss a future construction project, persons were asked to rate their satisfaction with key components of the project on a scale of 1 to 5. • 1=not satisfied and 5=very satisfied.

  41. Step 4: Interpret Example • A total of 75 people responded. • 40 Hispanic/Latino • 34 White (non-Hispanic) • 29 African-American • 15 Disabled

  42. Step 4: Interpret Example What does it all mean?

  43. Step 4: Reporting • Serves as a basis for further program development and improvement. • Provides support for continuing or expanding the program. • Serves as a basis for public relations and promoting future programs. • Identifies implications of the results, how they can be used, and recommendations for the overall program.

  44. Step 5: Evaluation Plan • Who should lead the evaluation? • How should tasks and responsibilities be formalized, organized and scheduled? • What problems are expected?

  45. Step 5: Who Should Lead? • One person or a team? • Leader’s responsibilities may include: • Designing the evaluation • Constructing the instrument • Collecting data • Analyzing data • Writing/delivering reports • Managing and interacting with personnel

  46. Step 5: Organizing Tasks • When a project team, evaluation leader, and the stakeholders meet and ask “What must be done, when and by whom?” the evaluation management plan begins to take shape. These tasks should be put into some logical organizational structure.

  47. Step 5: Organizing Tasks • A management plan should include: • Evaluation design and a general plan specifying what must be done. • When each activity needs to be conducted and completed. • Responsible parties for each activity. • Available resources. • Ability to revise or refine the plan as time passes. • Contingency plans for problems that may arise.

  48. Step 5: Expected Problems • Murphy’s Law • Monitor the evaluation’s progress • The earlier you detect a problem, the better able you will be to resolve it and continue. • Develop a contingency plan.

  49. Conclusion • Program evaluation, including data collection and analysis, can provide an overall focus to civil rights programs. • Requires a paradigm shift from daily implementation to program efficiency. • An effective evaluation plan will help identify the varying types of data that need to be collected in the programs.

  50. Thank you • Questions? • Contact information: Darren Kaihlanen FHWA Oklahoma Division darren.kaihlanen@dot.gov (405) 254-3312 (voice) (405) 254-3302 (fax)
