
Ain’t No Mountain High Enough


Presentation Transcript


  1. Ain’t No Mountain High Enough: Climbing the Peaks of Program Excellence • Facilitators: Christina Borbely, Kerrilyn Scott-Nakai • Produced and conducted by the Center for Applied Research Solutions, Inc. for the California Department of Alcohol and Drug Programs • SDFSC Workshop-by-Request, March 22, 2006, Ventura County • Authored by Christina J. Borbely, Ph.D., Safe and Drug Free Schools and Communities Technical Assistance Project

  2. Trail Map • Why Are We Doing This? • Value • Opportunities • Opportunity for Recognition • Advanced Program Essentials • Program Essentials • Key Considerations • Advancing Programs Through Evaluation • Methodology: Design & Instrumentation • Data Plan & Analysis • Reporting

  3. Why Are We Doing This? • The Value of Advancing Programs • Opportunities for Advancing Programs

  4. Value • Replicating innovative strategies • Filling in gaps • Integrating the latest science and/or practice • Making a contribution through dissemination • Participating in the science-service dialog • Advancing the field • Providing an effective program to others

  5. Opportunities • Expansion • Demonstrate need/value of new or additional funding • Bolster capacity to sustain programming • Recognition • Validation from field • Potential for supplemental support/resources • Publications

  6. Opportunity for Recognition Validation from the Prevention Field: • Service to Science • NREPP • Exemplary Programs

  7. National Registry of Effective Prevention Programs (NREPP) NREPP is coordinated by the Center for Substance Abuse Prevention (CSAP) under the federal Substance Abuse and Mental Health Services Administration (SAMHSA). NREPP is “a system designed to support informed decision making and to disseminate timely and reliable information about interventions that prevent and/or treat mental and substance use disorders.” http://modelprograms.samhsa.gov/template.cfm?page=nreppover

  8. Original NREPP Designations • A program is considered “Model” if the NREPP review team has designated it an effective program and the agency agrees to participate in CSAP’s dissemination efforts. Model programs also provide training and technical assistance to practitioners who wish to adopt the program, in order to ensure that it is implemented with fidelity. • A program is considered “Effective” if it is science-based and produces consistently positive patterns of results. Only programs positively affecting the majority of intended recipients or targets are considered effective. • A program is considered “Promising” if it provides useful and scientifically defensible information about what works in prevention but has yet to gather sufficient scientific support to meet the standards set for Effective/Model programs. Promising programs are sources of guidance for prevention practitioners, although they may not be as prepared as Model programs for large-scale dissemination.

  9. Evidence-Based Programs • Conceptually Sound and Internally Consistent • Program Activities Related to Conceptualization • Reasonably Well Implemented & Evaluated • Effective Programs • Consistently Positive Outcomes • Strongly Implemented & Evaluated • Promising Programs • Some Positive Outcomes • Model Programs • Available for Dissemination • Technical Assistance Available from Program Developers

  10. NEW NREPP: Eligibility Criteria • Open submission; review is based on the alignment of the intervention with NREPP priorities. • SAMHSA's three Centers (the Center for Mental Health Services, the Center for Substance Abuse Prevention, and the Center for Substance Abuse Treatment) will establish priorities for the types of interventions to be reviewed and highlighted on NREPP. • Priorities will be established and provided to the public annually through notices on the NREPP Web site. • These priorities are based on dialogues with treatment and prevention stakeholders as well as with SAMHSA's Federal partners.

  11. NEW NREPP: Review Criteria “the sole requirement for potential inclusion in the NREPP review process is for an intervention to have demonstrated one or more significant behavioral change outcomes.”

  12. NEW NREPP: Review Process • A trained Ph.D.-level evaluation specialist works with applicants to assure that adequate materials have been submitted before initiating an NREPP review. • The evaluation specialist serves as collaborator in the application process and liaison to the reviewers. • A scientific review of the intervention is conducted by two independent Ph.D.-level reviewers. • Completed review summaries, including descriptive components, reviewer ratings, and explanations are provided to the applicant for approval before they are posted on the NREPP Web site.

  13. NEW NREPP: Application Process Application materials include a concise summary of the intervention (its name, a description of its main components, the population(s) targeted, and the behavioral outcomes targeted) plus one or more of the following types of documents: • formal evaluation reports • published and unpublished research articles • narrative sections of grant applications • training materials • implementation or procedural manuals

  14. The Exemplary Program Awards • The Exemplary Program Award is designated by CSAP • The Exemplary Awards program recognizes prevention programs in two tracks: Promising Programs—those that have positive initial results but have yet to verify outcomes scientifically, and Model Programs—those that are implemented under scientifically rigorous conditions and demonstrate consistently positive results. • The Exemplary Awards recognize prevention programs that are innovative and effective and that successfully respond to the needs of their target populations, both as Promising Programs and Model Programs.

  15. Exemplary Program Award: Review Process • A multifaceted procedure is used to identify and select Promising Programs to receive an Exemplary Substance Abuse Prevention Program Award annually. All nominated programs submit to a three-level review process. • First, state agency personnel and national organizations submit their formal nominations. • Applications are then reviewed by experts in the field of substance abuse prevention and former Exemplary Substance Abuse Prevention Program Award winners. • Finally, the National Review Committee reviews and scores the top applications according to eight criteria and recommends those that merit an Exemplary Substance Abuse Prevention Program Award. Final selections are made jointly by NASADAD, CADCA, and SAMHSA/CSAP.

  16. Exemplary Program Award: Application Process • Applications for the Innovative Programs may be obtained from State Alcohol and Drug Agencies, the NASADAD/NPN Web page (www.nasadad.org), and the NASADAD/NPN office. • Applicants must submit their application to their national nominating organization (see application appendix) for sign-off. Applicants should then return the original signed, completed application (including cover sheet) and three copies to the NASADAD/NPN central office in Washington, D.C. For more information about the application process, call or write: NASADAD/NPN, 808 17th Street, NW, Suite 410, Washington, DC 20006, Attention: Exemplary Programs. Web page: www.nasadad.org E-mail: amoghul@nasadad.org Phone: (202) 293-0090, Fax: (202) 293-1250

  17. Exemplary Program Award: Eight Review Criteria • Philosophy • Background and need (program planning) • Goals and objectives • Population(s) to be served • Activities and strategies • Community coordination • Evaluation • Program management

  18. Service to Science • Service to Science is a national initiative supported by SAMHSA/CSAP to enhance the evaluation capacity of innovative programs and practices that address critical substance abuse prevention or mental health needs. http://captus.samhsa.gov/northeast/special_projects/service_to_science/main.cfm

  19. Service to Science Academy • Designed to enhance the capacity of community-based prevention strategies, programs, or practices that demonstrate effectiveness. • Each Academy is customized to support the needs of the groups/organizations and programs accepted to attend. • Emphasis on the development of a strong evaluation and/or research design. • Participants receive training and technical assistance that helps them move along the evidence-based continuum.

  20. Service to Science Academy: Eligibility Criteria 1. Primarily focused on ATOD prevention, but may also address the prevention of violence, HIV/AIDS, STDs, etc. Expected outcomes or areas of focus include, but are not limited to, efforts to decrease high-risk behaviors by children or adults; eliminate use of illicit drugs; reduce underage use of alcohol, tobacco, and other drugs; and decrease DUI/DWI rates. 2. Nominated for recognition by a State Alcohol and Drug Agency, by the Community Anti-Drug Coalitions of America (CADCA), or by other national organizations or their affiliates. 3. Able to document and demonstrate success in the form of quantifiable outcome data. 4. In operation for a minimum of two (2) years.

  21. Service to Science Academy: Review Criteria • Philosophy • Needs Assessment • Population Served • Goals & Objectives • Activities & Strategies • Evaluation • Program Management • Community Coordination

  22. Service to Science Academy: Application Process • The application to attend a Service to Science Academy is a modified National Association of State Alcohol and Drug Abuse Directors (NASADAD) application for Innovative/Exemplary Programs. • Applications are reviewed by a panel that makes recommendations for acceptance to the Academy.

  23. Application Criteria as Program Practice Live It! • SDFSC Santa Cruz County: Service to Science Academy Santa Cruz County submitted an application and was awarded a program slot in the current cohort of the Service to Science Academy. The Santa Cruz team will receive a series of trainings and technical assistance to help them move their program toward recognition as a model or promising program. • SDFSC Butte County: NPN Exemplary Program Award Butte County submitted three of its prevention programs for review: Friday Night Live Mentoring, Friday Night Live, and Youth Nexus. Two of these programs are being recognized nationally by the National Prevention Network, with only six programs receiving this national recognition. • Andrea Taylor, Ph.D.: NREPP Model Program Status Andrea Taylor evolved a local program, Across Ages, an intergenerational mentoring program that promotes positive youth development and helps prevent school failure, substance abuse, and teen pregnancy, into an NREPP Model Program that is implemented nationwide. The process spanned 1991-1998.

  24. Advanced Program Essentials Put Your Finger On It… • Logic Model • Core Components • Documented Need and Value • Defining Population • Defining Need for Service within the Community

  25. Logic Model “A logic model is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.” (W.K. Kellogg Foundation, Logic Model Development Guide, 2004)

  26. Value of a Logic Model A Picture Is Worth a Thousand Words • Builds understanding about what the program is, what it’s expected to do, and what measures of success will be used. • Provides a research-based theory behind your strategies. • Promotes communication and a common understanding among staff and funders.

  27. Core Program Components What are the “active ingredients” in the formula for program success? • In theory, core components must be implemented precisely as intended in order to achieve demonstrated outcomes. • Core components cannot be adapted.

  28. Define Core Components Core components might be: • program structure (e.g. the sequence of sessions or context of delivery), • program content (e.g. specific concepts or skill sets), or • method of delivery (e.g. “homework” assignments, classroom infusion, or youth-led group activities).

  29. Define Population Institute of Medicine (IOM) Classifications • Universal preventive interventions are activities targeted to the general public or a whole population group that has not been identified on the basis of individual risk. • Selective preventive interventions are activities targeted to individuals or a subgroup of the population whose risk of developing a disorder is significantly higher than average. • Indicated preventive interventions are activities targeted to individuals in high-risk environments, identified as having minimal but detectable signs or symptoms foreshadowing disorder or having biological markers indicating predisposition for disorder but not yet meeting diagnostic levels.

  30. Defining Need for Service • Integrating key stakeholders in process • Bonus points for youth • Representative of community • Strategic Prevention Framework • Needs/Resource Assessment • Strategic Planning • Evidence-based Implementation

  31. Key Considerations Advancing Programming • What’s the yardstick? • How do I measure up? • Where do I want to go from here?

  32. Considerations: Participation • Recruitment • Are we meeting target #s consistently? • Are we using strategic recruitment methods? • Retention: • Do we have sufficient completion rates? • Have we defined a program graduate/drop-out? • What do we do to encourage retention?

  33. Considerations: Fidelity • Fidelity • To what degree are we consistently implementing core components? Is this sufficient? • What system do we use to reflect on areas of challenge? How does that inform our process? • What method do we use to monitor implementation across sites? Are we vigilant enough? Does feedback get incorporated?

  34. Considerations: Innovation • Degree to which program is novel, cutting edge, innovative. • How is this different than what’s already available? • What aspects of the program are unique? • Grounded but Innovative: program alignment with already-proven models of service • What proven methods are incorporated in what we do? • Did we take an evidence-based strategy to the “next level” or use it in a novel way?

  35. Considerations: Population • How culturally appropriate are services to identified population? • Program content • Program materials (e.g. translation) • Staff (training and protocol) • Tested across ethnic/cultural groups • Link to evidence-based strategies demonstrated with specific populations

  36. Considerations: Marketing • Have materials/curriculum been “packaged”? • Sequencing • Branding • Training protocol tested and established/documented

  37. Considerations: Replication • Protocol • Program curriculum • Training process • Evaluation • Packaged program materials • Curriculum • Evaluation • Strategic replication • Varied populations • Varied context

  38. Advancing Programs through Evaluation • Rigor • Methodology • Data Plan & Analysis • Reporting

  39. Increasing Evaluation Rigor Across the Board: From Methodology/Design and Instrumentation to Analysis and Reporting

  40. Tips for Optimal Evaluation Rigor • Use an external evaluator to lend credibility • Especially valuable for publishing findings • Conduct evaluation of replication sites • Evidence of impact in varied settings and populations • Evaluate program effect and sustainability of effect • Pre/post demonstrates immediate effects • Follow-up (longitudinal) data show whether those effects are sustained

  41. Advancing Methodology • Process & Outcome • Evaluation Design • Tips for Optimal Design

  42. Role of Process and Outcome Methods • Process: Allows for continuous learning about how the program is working as it is implemented. Focuses on clearly describing and assessing program design and implementation. Makes it possible to answer questions concerning “why” and “how” programs operate the way they do and what can be done to improve them. • Outcome: Focuses on producing clear evidence concerning the degree of program impact on program participants. Assesses the immediate or direct effects of program activities (as compared to long-term impact).

  43. Level of Rigor: Outcome Evaluation Design

  44. NREPP Source of Evidence Hierarchy

  45. Tip for Optimal Design: Matched Data Making a Match • Requires tracking of individuals • Allows for analysis of individual-level impact, not just aggregate level • Can control for “dosage” or other factors
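
For illustration only (not part of the original workshop materials): a minimal Python sketch of pairing pre and post survey records by participant ID and testing individual-level change. The file name and the participant_id, wave, and score columns are hypothetical.

import pandas as pd
from scipy import stats

# One row per participant per wave; columns: participant_id, wave, score (hypothetical layout).
surveys = pd.read_csv("surveys.csv")
pre = surveys[surveys["wave"] == "pre"].set_index("participant_id")["score"]
post = surveys[surveys["wave"] == "post"].set_index("participant_id")["score"]

# Keep only participants with both a pre and a post record (the "match").
matched = pd.concat([pre, post], axis=1, keys=["pre", "post"]).dropna()

# Individual-level change scores, not just aggregate group means.
matched["change"] = matched["post"] - matched["pre"]
t, p = stats.ttest_rel(matched["post"], matched["pre"])
print(f"matched n = {len(matched)}, mean change = {matched['change'].mean():.2f}, t = {t:.2f}, p = {p:.3f}")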

  46. Tip for Optimal Design: Longitudinal Data Looking at the long run… The majority of programs use a pre/post assessment schedule. • Follow-up points are recommended, timed to the length of the program • Consider a follow-up point at 1, 3, 6, 9, or 12 months after completion • Programs with continuous enrollment (vs. cohorts of youth) need strong tracking systems and a continuous evaluation schedule (e.g., every 3 or 6 months)
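
For illustration only: a minimal sketch of the kind of tracking system mentioned above, generating follow-up due dates per participant in a continuous-enrollment program. The roster values and the 3/6/12-month schedule are assumptions, not a prescribed protocol.

import pandas as pd

# Hypothetical roster: one row per participant with a program completion date.
roster = pd.DataFrame({
    "participant_id": [101, 102, 103],
    "completed_on": pd.to_datetime(["2006-01-15", "2006-02-01", "2006-03-10"]),
})

# Build one follow-up row per participant per assessment point.
waves = []
for months in (3, 6, 12):
    waves.append(roster.assign(
        followup=f"{months}-month",
        due_date=roster["completed_on"] + pd.DateOffset(months=months),
    ))

schedule = pd.concat(waves).sort_values(["participant_id", "due_date"])
print(schedule[["participant_id", "followup", "due_date"]].to_string(index=False))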

  47. Tip for Optimal Design: Comparison Groups Shall I compare thee to a summer’s day… Comparison groups can sometimes be fairly easy to develop • School data • Low-dosage service groups can sometimes be utilized; make the distinction between program drop-out and evaluation drop-out • Use standardized measures and compare program groups to school, district, and state results
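
For illustration only: a minimal sketch comparing change scores for the program group against a low-dosage comparison group. The file name, column names, and group labels are hypothetical; Welch's t-test is used because it does not assume equal variances across groups.

import pandas as pd
from scipy import stats

# Hypothetical file, one row per participant: columns group ("program"/"comparison") and change.
data = pd.read_csv("change_scores.csv")
program = data.loc[data["group"] == "program", "change"]
comparison = data.loc[data["group"] == "comparison", "change"]

# Welch's t-test on mean change across the two groups.
t, p = stats.ttest_ind(program, comparison, equal_var=False)
print(f"program mean = {program.mean():.2f}, comparison mean = {comparison.mean():.2f}, t = {t:.2f}, p = {p:.3f}")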

  48. Tip for Optimal Design: Control Groups Control freak! • Control groups require resources and may deter participants due to randomization. • The trick is in the approach and the ability to provide services at a later date.

  49. Advancing Instrumentation • Standardized vs. Locally Developed • Tips for Optimal Instrumentation

  50. Survey Options: Standardized vs. Locally Developed • Standardized instruments. Pros: already developed, with lots of choices; psychometrics established; allows for comparison of results at the national, state, and district levels; scoring and analysis sometimes available. Cons: cost; may not be specific to your population; may not capture novel aspects of the program. • Locally developed instruments. Pros: can tap into novel program aspects/impact; can be tailored to the population; no cost. Cons: reliability/validity unknown; doesn’t allow for comparison.
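
For illustration only: since a locally developed survey has no established psychometrics, one quick check is to estimate internal consistency (Cronbach's alpha) from piloted item responses. This sketch assumes a hypothetical CSV with one column per scale item and one row per respondent.

import pandas as pd

# Hypothetical pilot data: rows = respondents, columns = items on a single scale.
items = pd.read_csv("local_survey_items.csv").dropna()

k = items.shape[1]                              # number of items
item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of item variances
total_var = items.sum(axis=1).var(ddof=1)       # variance of the total scale score

# Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total-score variance)
alpha = (k / (k - 1)) * (1 - item_var_sum / total_var)
print(f"{k} items, n = {len(items)}, Cronbach's alpha = {alpha:.2f}")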
