FHWA Civil Rights: Data Collection and Program Evaluation

…why should I care?



Objectives

  • Identify the importance of data collection and program evaluation.

  • Explain the basic structure of an evaluation plan or strategy.

  • Design a specific program evaluation plan to

    • Identify specific data that must be collected,

    • Assess the implementation and effectiveness of various civil rights programs, and

    • Ensure State DOT management is aware of issues, needs and potential problems associated with various civil rights programs.


Why should I care?

  • Reliable information about the impact of FHWA programs (program evaluation) is essential to ensuring non-discrimination.

  • The relationship between FHWA programs and varying social factors has wide-ranging implications for any community, municipality, county, or state.


Why is it important?

  • Data collection and program evaluation (analysis) can:

    • Help identify and prioritize problem areas,

    • Initiate and evaluate the effectiveness of policies and programs to ensure non-discrimination,

    • Assess the relationship between specific programs and beneficiaries to develop non-discrimination strategies.


Importance (continued)

  • Draw agency attention and resources to resolve issues and problems.

  • Provide justification for certain programs or illustrate a need for new programs.

  • Help justify continued or additional funding.

  • Communicate the importance of the goals, solutions, strategies, or programs.



“Program evaluation is like a bunch of feathers because it lifts you up to get a bird’s eye view so you can see the whole picture” (unknown).


Current FHWA Data Collection

Each State DOT must “develop procedures for the collection of statistical data (race, color, religion, sex, and national origin) of participants in, and beneficiaries of State highway programs, i.e., relocatees, impacted citizens and affected communities.” (23 CFR 200.9(b)(4))

  • How do you collect statistical data?

  • What do you do with it?

  • Project-based or Program-based?

  • Benefits vs. Burdens analysis?

  • “Letter of the law” or “Spirit of the law”?


Steps to Look at the Whole Picture

  • Step 1: Focus the Evaluation

  • Step 2: Identify Barriers to Evaluation

  • Step 3: Methods of Data Collection

  • Step 4: Organizing, Analyzing, Interpreting, Summarizing and Reporting Results

  • Step 5: Develop the Evaluation Plan


Step 1: Focus the Evaluation

  • What will be evaluated?

  • What is the purpose for this evaluation?

  • Who will be affected by or involved in the evaluation?

  • What are the critical questions the evaluation must address?


Step 1: Focus Example (Right-of-Way)

  • What will be evaluated?

    • Real estate appraisals (project-based or program-based)

  • What is the purpose of the evaluation?

    • To ensure real estate appraisals are equitable without regard to race, color, or national origin.

  • Who will be affected by or involved in the evaluation?

    • Certified Appraisers

    • Review Appraisers

    • State DOT Appraisers

    • Acquisition Agents

    • Relocation Agents

    • Property Management

    • State DOT Leadership

    • FHWA Division/Headquarters


Step 1: Focus Example (Right-of-Way)

  • What are the critical questions the evaluation must address?

    • Was the resulting appraisal amount fair and equitable?

    • Are the land unit values consistent throughout the project area?

    • Were relocatees required to move to a similar socio-demographic region?

    • Were relocatees provided relocation advisory services to find replacement housing (e.g., transportation to/from available housing)?


Step 2: Identify Barriers

  • Factors that may influence the evaluation

  • Implications for the evaluation

  • Strategies to overcome barriers

    “Evaluation is like an acorn because it must overcome adverse conditions to grow into a tree” (unknown).


Step 2: Identify Barriers

  • Organizational Politics:

    • Is there support for the evaluation?

    • Are there opponents to the evaluation?

  • Project Leadership:

    • Who has control over the program?

    • How does the evaluation fit their goals?

  • Professional Influences:

    • Are professional groups interested?

    • Are they supportive of an evaluation?


Step 2: Identify Barriers

  • History:

    • What has been the tradition or history of self-evaluation of programs?

    • Have evaluations been conducted before?

  • Organizational Setting:

    • Does the program fit into a larger organizational network? Where or how?

    • What kind of information could jeopardize the program?


Step 2: Identify Barriers

  • Economics:

    • Is fiscal support for the program and evaluation secure?

  • Interpersonal Patterns:

    • How much interagency or interpersonal conflict is likely to occur as a result of the evaluation?

    • Could the evaluation be controversial to internal or external audiences?


Step 2: Identify Barriers

  • Legal Guidelines:

    • Will legal restrictions limit collection of desired information/data?

    • Will the project be affected by pending legislation?

  • Resources:

    • Are resources (human, financial, time, etc.) available to support the evaluation?


Step 3: Data Collection Methods

  • What type of data should be collected?

  • What methods should be used to collect data?

  • How much information should you collect?

  • Questionnaires

  • Validity and Reliability


Step 3: Types of Data

  • Descriptive information can include:

    • Characteristics of the program.

    • Demographic information on program participants, beneficiaries, etc.

    • Actual benefits paid or realized by beneficiaries.

    • Specific results of a program or service.


Step 3: Types of Data

  • Judgmental information can include:

    • Opinions from consultants.

    • Beneficiaries’ beliefs and values.

    • Agency personnel’s interpretation of laws.

    • Stakeholders’ perceived priorities.

    • Interpretations of policy, procedures and guidelines.


Step 3: Data Collection Methods

  • Opinion Surveys: an assessment of how a person or group feels about a particular issue.

  • Questionnaire: a group of questions that people respond to verbally or in writing.

  • Case Studies: experiences and characteristics of selected persons involved with a program.

  • Individual interviews: individual’s responses, opinions, or views.


Step 3: Data Collection Methods

  • Group Interviews: small groups’ responses, opinions, and views.

  • Records: information from records, files, or receipts.

  • Advisory, Advocate Teams: ideas and viewpoints of selected persons.


Step 3: Choosing a Method

  • Availability: information may already be available to you that can help answer some questions. Review information in prior records, reports and summaries.

  • Interruption Potential: the more disruptive an evaluation is to the routine of the program, the more likely its results will be unreliable.

  • Protocol Needs: you may need to obtain appropriate permission to collect information.


Step 3: Choosing a Method

  • Reactivity: you do not want “how” you ask something to alter the response you get.

  • Bias: to be prejudiced in opinion or judgment.

  • Reliability: will people interpret your questions the same way each time?

  • Validity: will the collection methods produce information that actually measures what you intend to measure?


Step 3: How Much Data?

  • Sampling: measuring a portion of the subjects in order to learn something about the entire population without having to measure the whole group (see the sketch after this list).

    • Random: each individual in the group has an equal chance of being chosen for the sample.

    • Purposive: a sample that will represent specific viewpoints.
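
Here is a minimal sketch, in Python, of the difference between the two approaches. The household list, group labels, and sample size are invented for illustration and are not part of the FHWA material.

```python
# Minimal sketch (hypothetical data): random vs. purposive sampling
# from an invented list of project-area households.
import random

# Hypothetical population: each record has an ID and a self-reported group.
population = [
    {"id": i, "group": random.choice(["A", "B", "C"])} for i in range(500)
]

# Random sample: every household has an equal chance of selection.
random_sample = random.sample(population, k=50)

# Purposive sample: deliberately select households from a specific group
# whose viewpoint the evaluation needs to capture (group "B" is arbitrary here).
purposive_sample = [h for h in population if h["group"] == "B"][:50]

print(len(random_sample), len(purposive_sample))
```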


Step 3: When Choosing a Method, Consider…

  • Should you use a sample of a population or a census (an entire population, such as all people living in a project area)?

  • Should you use a random or purposive sample?

  • How large a sample size do you need?

  • Is your sample likely to be biased?


Step 3: Questionnaires

  • Questionnaires are a widely used method of collecting information. They can be a cost effective way to reach a large number of people or a geographically diverse group.


Step 3: Questionnaires

  • Questionnaires should include the following key elements:

    • Cover letter: include the purpose of the study, why and how participants were selected, etc.

    • Introduction: a short recap of the information included in the cover letter.

    • Instructions: clear and concise.


Step 3: Questionnaires

  • Questionnaires should include the following key elements:

    • Grouping questions: group questions with similar topics together in a logical flow.

    • Demographic questions: age, gender, ethnicity, etc.

    • Thank you: thank the respondent for completing the questionnaire.

    • Disposition: what to do with the questionnaire when complete.


Step 3: Validity

  • “Does the questionnaire measure what you want it to measure?” A questionnaire that is valid for a specific situation or audience may not be valid in a different situation or for a different audience.

  • Have a panel of experts (professionals familiar with the program) review the questionnaire and provide feedback.

  • Test the questionnaire on a similar population.


Step 3: Reliability

  • Does the questionnaire consistently give the same results with the same group of people under the same condition?

  • Reliability can be demonstrated by administering the questionnaire to a pilot sample repeatedly over time.

  • Are answers repeated over time?

  • Revise as needed.


Step 4: Organizing, Analyzing, Interpreting, Summarizing and Reporting

  • Organize Evaluation Data

  • Analyze

  • Quantitative Data Analysis

  • Qualitative Data Analysis

  • Interpret (summarize)

  • Report Results


Step 4: Organize Data

  • Develop a system to organize your data (e.g., a spreadsheet).

  • As data are received, check to be sure they are complete (see the sketch below).

  • Set up protocol for who does/does not have access to the data.

  • Track all data. Ensure data are not lost or overlooked as analysis and summarizing begin.
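
A minimal sketch of such a system, assuming a hypothetical CSV export named responses.csv with made-up column names:

```python
# Minimal sketch (hypothetical file and column names): load questionnaire
# responses from a CSV export and flag incomplete records before analysis.
import csv

REQUIRED_FIELDS = ["respondent_id", "rating", "group"]  # assumed columns

complete_rows, incomplete_rows = [], []
with open("responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        # A record is complete only if every required field has a value.
        if all((row.get(field) or "").strip() for field in REQUIRED_FIELDS):
            complete_rows.append(row)
        else:
            incomplete_rows.append(row)

print(f"{len(complete_rows)} complete, {len(incomplete_rows)} need follow-up")
```

Incomplete records can then be routed back for follow-up rather than silently dropped from the analysis.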


Step 4: Quantitative Data

  • Quantitative Data Analysis: analysis of information that contains numerical data, summarized with statistics such as:

    • N: The total number of participants in the sample.

    • Mean: The average score of the sample.

    • Median: The middle score when the responses are sorted from lowest to highest.

    • Mode: The response given the most.

    • Standard Deviation: A measure of spread around the mean; for roughly normal data, about 68% of responses fall within one standard deviation of the mean.

    • Frequency: How often a particular response was given.



Step 4: Quantitative Analysis Example

  • On a scale of 1 to 5, where 1=poor and 5=excellent, how would you rate the overall quality of the workshop?

    • Answers (10 people): 4,5,2,4,3,4,3,3,5,4

    • Mean: 3.7 (the total divided by 10)

    • Median: 4 (the middle value of the sorted responses: 2, 3, 3, 3, 4, 4, 4, 4, 5, 5)

    • Mode: 4 (the score repeated the most)

    • Standard Deviation: 0.95 (about 68% of responses are expected to fall within one standard deviation of the mean, between 2.75 and 4.65).

    • Frequency: 1=0, 2=1, 3=3, 4=4, 5=2. (A short script that reproduces these statistics follows.)
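
The figures above can be reproduced with Python's standard library; this sketch is only a convenience check, not part of the original workshop example.

```python
# Reproduce the workshop-rating statistics with the standard library.
import statistics
from collections import Counter

responses = [4, 5, 2, 4, 3, 4, 3, 3, 5, 4]  # the ten ratings from the slide

n = len(responses)                      # N = 10
mean = statistics.mean(responses)       # 3.7
median = statistics.median(responses)   # 4.0 (middle of the sorted list)
mode = statistics.mode(responses)       # 4
std_dev = statistics.stdev(responses)   # ~0.95 (sample standard deviation)
frequency = Counter(responses)          # counts per rating; a rating of 1 simply never appears

print(n, mean, median, mode, round(std_dev, 2), dict(sorted(frequency.items())))
```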


Step 4: Qualitative Data

  • Qualitative data analysis is the process of systematically searching and arranging the interview transcripts, field notes, and other materials to increase your own understanding and enable you to present what you have discovered (Bogdan and Biklen, 1992).


Step 4: Qualitative Data

  • Data resulting from interviews, focus groups, open-ended questions, or case studies.

  • Develop common elements, themes, topics, sentiments, etc. from the qualitative data (a simple theme-tally sketch follows).
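
As a very rough starting point, a keyword tally can suggest candidate themes. The themes, keywords, and comments below are hypothetical, and genuine qualitative coding requires reading and categorizing each response rather than matching keywords.

```python
# Minimal sketch (hypothetical themes and comments): a first-pass tally of
# analyst-defined theme keywords across open-ended responses.
from collections import Counter

themes = {
    "access": ["sidewalk", "bus", "transit", "access"],
    "noise": ["noise", "loud"],
    "property": ["appraisal", "property", "relocation"],
}

comments = [
    "The new route cut off our bus access.",
    "Construction noise was constant.",
    "My property appraisal felt low compared to neighbors.",
]

theme_counts = Counter()
for comment in comments:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            theme_counts[theme] += 1

print(theme_counts)  # e.g. Counter({'access': 1, 'noise': 1, 'property': 1})
```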


Step 4: Interpret

  • Identify trends, commonalities and testimony that will help answer the critical evaluation questions that were generated in Step 1.

  • To be useful, the data must be interpreted so that the stakeholders will understand the results and know how to use them.


Step 4: Interpret Example

  • At the end of a public meeting to discuss a future construction project, persons were asked to rate their satisfaction with key components of the project on a scale of 1 to 5.

  • 1=not satisfied and 5=very satisfied.


Step 4: Interpret Example

  • A total of 75 people responded.

    • 40 Hispanic/Latino

    • 34 White (non-Hispanic)

    • 29 African-American

    • 15 Disabled


Step 4: Interpret Example

What does it all mean?
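
One way to begin answering that question is to compare average satisfaction across the groups that responded. The sketch below uses invented ratings purely to show the mechanics; only the group labels come from the slides, and a respondent may belong to more than one group.

```python
# Minimal sketch (ratings are invented): compare mean satisfaction by group.
from statistics import mean

# Each hypothetical record: (group label, satisfaction rating 1-5).
responses = [
    ("Hispanic/Latino", 3), ("Hispanic/Latino", 2), ("Hispanic/Latino", 4),
    ("White (non-Hispanic)", 4), ("White (non-Hispanic)", 5),
    ("African-American", 3), ("African-American", 2),
    ("Disabled", 2), ("Disabled", 3),
]

# Group the ratings, then report the size and mean for each group.
by_group = {}
for group, rating in responses:
    by_group.setdefault(group, []).append(rating)

for group, ratings in sorted(by_group.items()):
    print(f"{group}: n={len(ratings)}, mean satisfaction={mean(ratings):.2f}")
```

Large gaps between group means would point to project components that deserve a closer, non-discrimination-focused look.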


Step 4: Reporting

  • Serves as a basis for further program development and improvement.

  • Provides support for continuing or expanding the program.

  • Serves as a basis for public relations and promoting future programs.

  • Identifies implications of the results, how they can be used, and recommendations for the overall program.


Step 5: Evaluation Plan

  • Who should lead the evaluation?

  • How should tasks and responsibilities be formalized, organized and scheduled?

  • What problems are expected?


Step 5: Who Should Lead?

  • One person or a team?

  • Leader’s responsibilities may include:

    • Designing the evaluation

    • Constructing the instrument

    • Collecting data

    • Analyzing data

    • Writing/delivering reports

    • Managing and interacting with personnel


Step 5: Organizing Tasks

  • When a project team, evaluation leader, and the stakeholders meet and ask “What must be done, when and by whom?” the evaluation management plan begins to take shape. These tasks should be put into some logical organizational structure.


Step 5: Organizing Tasks

  • A management plan should include:

    • Evaluation design and a general plan specifying what must be done.

    • When each activity needs to be conducted and completed.

    • Responsible parties for each activity.

    • Available resources.

    • Ability to revise or refine the plan as time passes.

    • Contingency plans for problems that may arise.


Step 5: Expected Problems

  • Murphy’s Law: expect that something will go wrong.

  • Monitor the evaluation’s progress

    • The earlier you detect a problem, the better able you will be to resolve it and continue.

  • Develop a contingency plan.


Conclusion

  • Program evaluation, including data collection and analysis, can provide an overall focus for civil rights programs.

  • Requires a paradigm shift from daily implementation to program efficiency.

  • An effective evaluation plan will help identify the varying types of data that need to be collected in the programs.


Thank you

  • Questions?

  • Contact information:

    Darren Kaihlanen

    FHWA Oklahoma Division

    [email protected]

    (405) 254-3312 (voice)

    (405) 254-3302 (fax)

