
Improving Data Collection and Preparing for Cross-Site Evaluation


Presentation Transcript


    1. Improving Data Collection and Preparing for Cross-Site Evaluation. Presented by Olivia Silber Ashley, Dr.P.H., and Linda Bailey-Stone, B.S. Presented to the Office of Adolescent Pregnancy Programs Prevention Grantee Conference, September 25-27, 2006, Pittsburgh, Pennsylvania


    3. Overview
       - Core evaluation instruments
       - Cross-site evaluation
       - Draft standardized data collection procedures

    4. Background on Core Evaluation Instruments
       - Office of Management and Budget (OMB) recently examined the AFL program using its Program Assessment Rating Tool (PART)
       - Identified program strengths:
         - Program purpose
         - Design
         - Management
       - Identified areas for improvement:
         - Strategic planning
         - Program results/accountability
       - In response, OPA:
         - Developed baseline and follow-up core evaluation instruments
         - Developed performance measures to track demonstration project effectiveness

    5. Staff and Client Advisory Committee
       Anne Badgley, Leisa Bishop, Doreen Brown, Carl Christopher, Cheri Christopher, Audra Cummings, Christina Diaz, Amy Lewin, David MacPhee, Janet Mapp, Ruben Martinez, Mary Lou McCloud, Charnese McPherson, Alice Skenandore, Jared Stangenberg, Cherie Wooden

    6. Capacity Assessment Methods
       - Review of grant applications, annual reports, and other information from the 28 most recently funded programs
       - Qualitative assessment involving program directors, evaluators, and staff in:
         - 14 Title XX Prevention programs
         - 14 Title XX Care programs
       - Telephone interviews
       - Site visit
       - Observations of data collection activities
       - Document review
       - Conducted between January 26, 2006, and March 16, 2006
       - 31 interviews involving 73 interviewees across 28 programs
       - 100% response rate

    7. Selected Title XX Prevention and Care Programs
       - Baptist Children’s Home Ministries
       - Boston Medical Center
       - Emory University
       - Freedom Foundation of New Jersey, Inc.
       - Heritage Community Services
       - Ingham County Health Department
       - James Madison University
       - Kings Community Action
       - National Organization of Concerned Black Men
       - Our Lady of Lourdes
       - Red Cliff Band of Chippewas
       - St. Vincent Mercy Medical Center
       - Switchboard of Miami, Inc.
       - Youth Opportunities Unlimited
       - Children’s Home Society of Washington
       - Children’s Hospital
       - Choctaw Nation of Oklahoma
       - Congreso de Latinos Unidos
       - Hidalgo Medical Services
       - Illinois Department of Human Services
       - Metro Atlanta Youth for Christ
       - Roca, Inc.
       - Rosalie Manor Community & Family Services
       - San Mateo County Health Services Agency
       - Truman Medical Services
       - University of Utah
       - Youth and Family Alliance/Lifeworks
       - YWCA of Rochester and Monroe

    8. Capacity Assessment Research Questions
       - What is the data collection capacity of AFL Prevention and Care demonstration projects?
       - How and to what extent have AFL projects used the core evaluation instruments?
       - What problems have AFL projects encountered with the instruments?
       - What data collection systems and evaluation designs are appropriate for the AFL program?
       - What are the potential barriers to projects’ participating in electronic data collection and/or a cross-site evaluation?

    9. Difficulties with Core Evaluation Instruments among Prevention Programs
       - Reading level
       - Guidelines and instructions
       - Too long
       - Formatting
       - Do not understand theoretical underpinnings
       - Need information on validity and reliability
       - Do not match program curriculum
       - Do not capture all data wanted
       - Parents would be upset
       - Insensitive to diverse family structures
       - No behavioral items
       - Racial groups too large
       - Adolescents object to ethnicity question

    10. Expert Work Group
        Elaine Borawski, Claire Brindis, Meredith Kelsey, Doug Kirby, Lisa Lieberman, Dennis McBride, Jeff Tanner, Lynne Tingle, Amy Tsui, Gina Wingood

    11. Draft Revision of Core Evaluation Instruments
        - Confidentiality statement
        - 5th grade reading level
        - Instructions for adolescent respondents
        - Re-ordering of questions
        - Improved formatting
        - Sensitivity to diverse family structures
        - Consistency in response options
        - Improved fidelity to original source items
        - Eliminated three items
        - Improved race question
        - Reverse coding

    12. Future Activities
        - Pilot test
        - Create crosswalk from original instrument items to revised items
        - Add behavioral items
        - Translate instruments and consent/assent forms into Spanish
        - Develop database structure
        - Seek OMB clearance:
          - Behavioral items
          - Individual-level data collection
        - Provide technical assistance and training

    13. Purpose of Cross-Site Evaluation
        - Improve OPA’s PART rating
        - Provide evaluation data about the AFL program as a whole
        - Inform resource allocation decisions
        - Determine the activities and impacts of AFL demonstration project efforts
        - Inform policy decisions about the program:
          - Support
          - Expansion
          - Improvement


    15. Capacity Assessment Findings
        - Paper and pencil surveys
        - Classroom-based group-administered surveys
        - Reading questions to adolescents
        - Providing individual help
        - Data collection staff training
        - Program staff collecting data
        - Data collection on the first and last day of the intervention
        - Corresponds to school year
        - Respondent ID numbers, names, dates of birth, and/or initials
        - Sealed envelopes
        - Follow-up for non-responders
        - No scanning equipment
        - Open to training and documentation about standardized data collection procedures
        - No major barriers to meta-analysis

    16. Draft Evaluation Design
        - Two analytic strategies used for the meta-analysis (illustrated in the sketch after this slide):
          - Treating each project as a unit of analysis, with the effect sizes of the projects as the focus
          - Including all adolescents within projects in the project-level study together as a unit of analysis, with program exposure as a predictor variable on performance measures
        - Inclusion/exclusion criteria
        - Provide assistance with tracking non-responders
        - Address missing data:
          - Multiple imputation
          - Maximum likelihood modeling
        - Consider program characteristics
        - Mediation and moderation analysis
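The presentation names these two analytic strategies but not the estimators behind them. The sketch below is a minimal, hypothetical illustration in Python: strategy 1 as a DerSimonian-Laird random-effects combination of project-level effect sizes, and strategy 2 as a mixed-effects model on pooled adolescent-level records with program exposure as a predictor and project as the grouping factor. All data values, variable names (project_effects, exposure, outcome, project_id), and model choices are illustrative assumptions, not details from the AFL evaluation design.

```python
# Illustrative sketch only; estimators and variable names are assumptions,
# not specified in the presentation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# --- Strategy 1: each project is the unit of analysis (effect sizes) ---
def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooled effect and its standard error."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w_fixed = 1.0 / variances                        # inverse-variance (fixed-effect) weights
    mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
    q = np.sum(w_fixed * (effects - mu_fixed) ** 2)  # Cochran's Q heterogeneity statistic
    dof = len(effects) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - dof) / c)                   # between-project variance estimate
    w_random = 1.0 / (variances + tau2)
    mu_random = np.sum(w_random * effects) / np.sum(w_random)
    se_random = np.sqrt(1.0 / np.sum(w_random))
    return mu_random, se_random, tau2

# Hypothetical standardized effect sizes and their variances, one per project
project_effects = [0.12, 0.30, 0.05, 0.22, 0.18]
project_variances = [0.02, 0.03, 0.015, 0.025, 0.02]
pooled, se, tau2 = random_effects_meta(project_effects, project_variances)
print(f"Pooled effect = {pooled:.3f} (SE {se:.3f}), between-project variance = {tau2:.3f}")

# --- Strategy 2: all adolescents pooled, program exposure as a predictor ---
# Hypothetical adolescent-level records: outcome score, hours of exposure, project ID
rng = np.random.default_rng(0)
adolescents = pd.DataFrame({
    "project_id": rng.integers(1, 6, size=500),
    "exposure": rng.uniform(0, 40, size=500),
})
adolescents["outcome"] = 50 + 0.2 * adolescents["exposure"] + rng.normal(0, 5, size=500)

# Random intercept for project captures clustering of adolescents within projects
model = smf.mixedlm("outcome ~ exposure", adolescents, groups=adolescents["project_id"])
result = model.fit()
print(result.summary())
```

In this hedged reading, the random-effects pooling answers "what is the program effect across projects," while the mixed model uses all adolescents at once and treats exposure as a dose-style predictor while accounting for within-project clustering.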

    17. Draft Timeline

    18. Draft Standardized Data Collection Procedures
        - Need for standardization
        - Recruiting adolescents
        - Informed consent/assent forms
        - Confidentiality guidelines
        - Classroom survey administration
        - Adverse event/distress protocol
        - Data storage and shipping

    19. Why Standardize?
        - Grantees voiced a need
        - Collect quality data uniformly
        - Allow for generalization of findings across sites
        - Comply with Federal regulations

    20. Principles Guiding Human Subjects Research

    21. Shared Responsibility between RTI and AFL
        - All RTI research involving human subjects is governed by the Code of Federal Regulations, 45 CFR 46
        - RTI bears full responsibility for ensuring that human subjects research is conducted in accordance with the Federal regulations
        - RTI’s Institutional Review Board (IRB) must review and approve all research involving human subjects
        - Both RTI project staff and AFL project staff are responsible for:
          - Protecting the rights and welfare of human subjects
          - Complying with Federal regulations

    22. Recruiting Adolescents
        - From AFL projects funded in October 2004 or later
        - Emancipated minors
        - Recruitment script
        - Lead letter

    23. Informed Consent/Assent
        - Draft revised consent/assent forms
        - Adolescent assent form
        - Include ID number
        - Verbal explanation for adolescents aged 17 or younger
        - Exception for child neglect
        - Expected number of study participants
        - Toll-free RTI project director and IRB numbers

    24. Draft Confidentiality Guidelines
        - Improve perceptions of confidentiality among adolescents:
          - Increase disclosure
          - Avoid social desirability bias
        - ID numbers with no identifying information
        - Sealed envelope
        - Staff confidentiality agreement

    25. Survey Administration
        - Read questions aloud if necessary
        - Avoid interpreting questions or providing help beyond reading questions aloud
        - A staff person knowledgeable about the instrument and study should be available to answer questions about the study if needed
        - Use sealed envelope
        - After completion, check with adolescents to see whether they have questions or want to discuss feelings or issues
        - Ensure time
        - Provide privacy

    26. Adverse Event/Distress Protocol
        - Adverse events:
          - Confidentiality breach
          - Possible formal action against:
            - Client
            - Grantee
            - OPA
            - RTI
        - Distress
        - Incident reporting
        - Referrals

    27. Discomfort Versus Distress
        - Discomfort:
          - Skips questions
          - Says they do not want to answer a question
          - Says that the information is too personal to disclose
        - Distress:
          - Becomes tearful
          - Reports feeling badly or very sad
          - Shows signs of being nervous or anxious (for example, very nervous speech)

    28. Identifying Distress
        - Emotional reaction:
          - Crying
          - Anger
        - Statements about extreme worry or anxiousness:
          - Concern about unwanted sexual activity
          - Upset about family situation
        - Statements indicating:
          - Hopelessness
          - Sadness
          - Depression

    29. Serious Adverse Events
        - Extreme distress:
          - Extreme emotional reaction
          - Statements indicating concern to the point that the respondent is consumed with worry or anxiety
          - Statements indicating extreme hopelessness, sadness, or depression
        - Suspected child abuse or neglect:
          - Of respondent
          - Of another child

    30. Data Storage and Shipping
        - Store signed consent/assent forms separately from completed instruments
        - Ship signed consent/assent forms separately from completed instruments:
          - Separate packages
          - Different days
          - Federal Express versus mail
        - Notify RTI:
          - When shipment sent
          - Tracking number
        - If shipments do not arrive at RTI as scheduled, RTI will immediately initiate tracing through Federal Express
        - RTI will monitor, provide feedback, and provide re-training if needed

    31. Next Steps
        - RTI IRB approval
        - OPA review
        - Staff and client committee review
        - Pilot test standardized data collection procedures
        - Debrief with pilot sites to receive feedback
        - Incorporate comments, revise, improve
        - Provide training and technical assistance
        - RTI and AFL staff possibly conduct initial data collection for the cross-site evaluation together
