
Partnerships for Improving Community Health Web Conference Series



    2. Partnerships: Opportunities to Align Resources and Interests Two key goals: alignment of resources, and alignment of individual interests with collective ones.

    4. Trust ≠ Dependency Trust is a shared belief that you can depend on each other to achieve a common purpose. However, trusting someone is not the same as being dependent upon that person; there must always be a choice to participate in the relationship. Trusting is an inherently risky behavior, and research has shown that a key indicator that trust will be sustained in a relationship is the perception that all parties are behaving consistently with their character and are therefore seen as predictable. Finally, everyone brings to the relationship an initial tendency to be trusting or suspicious, and that predisposition has little or nothing to do with the individual they are currently working with (Lynne Hinnant, PhD, FSU College of Information, 2009).

    5. Collaboration Using a Team Approach Partnerships are like teams: a group of individuals organized to work together to accomplish a common goal (or goals). Team roles and responsibilities include the partnership leader, team facilitator, timekeeper, scribe, and team members; detail on the partnership leader and team facilitator follows on the next slides. The leader coordinates meeting logistics, manages the team to ensure milestones are achieved, coaches individuals, works with the team facilitator, and keeps key partners apprised of progress. The facilitator focuses on accomplishing desired outcomes, encourages full participation, builds relationships, guides the use of problem-solving techniques, provides a safe environment for risk taking, and models appropriate behavior.

    6. Determine the purpose, define desired outcomes, develop an agenda, set ground rules, utilize a parking lot, ensure participation, and evaluate the meeting. During the meeting: review ground rules, keep the meeting focused on desired outcomes, facilitate and encourage discussion, ensure ideas and information are documented (by the scribe), and evaluate the meeting.

    8. Rotate the leadership role Identify the barriers (perceived and real) Clearly articulate leader duties, responsibilities (create job description) Create a shared leadership role Consider merging with another partnership Re-assess meeting time, venue Match skills, expertise with roles

    9. Today’s Objectives Participants will have a better understanding of the types and functions of partnership assessments and evaluations, become more familiar with evaluation and assessment tools and resources, and be able to make informed choices on applying partnership assessment and evaluation techniques.

    10. Evaluation is The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve effectiveness, and/or inform decisions about future programming. (Adapted from M.Q. Patton, Utilization-Focused Evaluation, 1997)

    11. Partnership Evaluation Basics Evaluation purpose: provide ongoing, systematic information that strengthens the partnership, and provide impact and outcome data to assess the extent of change among participants or systems. The information generated may determine whether the organization and its activities are sustained. A partnership evaluation plan should be part of planning/building the partnership.

    12. Role of Partners in Evaluation Commitment to process, impact, outcome evaluation and partnership assessment Contribute (or link to) evaluation expertise and resources Data sharing Participate in problem solving, overcoming barriers Communicate, use outcomes

    13. Types of Evaluation for Partnerships Process evaluation, impact evaluation, outcome evaluation, and periodic partnership assessment. Today’s presentation is not meant to be a comprehensive tutorial on evaluation; there are many excellent comprehensive resources, and among your partners you may have evaluation expertise: call on them to lead and guide it. In the following slides, we’ll briefly describe each of the three evaluation types and then spend a little more time on partnership assessment. CDC’s Framework for Program Evaluation in Public Health (http://www.cdc.gov/eval/resources.htm) and the University of Kansas’s Community Tool Box: A Framework for Program Evaluation (http://ctb.ku.edu/en/) are good guides. The Evaluation Framework for Community Health Programs, based on those documents, is also an easy-to-use resource (more about accessing these later). CDC’s Evaluation Fundamentals for Evaluating Partnerships (Atlanta, GA: US Department of Health and Human Services, 2008) is another resource (http://www.cdc.gov/dhdsp/state_program/evaluation_guides/pdfs/Partnership_Guide.pdf).

    14. Partnership Evaluation Guidance A six-step process (CDC Evaluation Framework): 1. Who is the evaluation for? Identify stakeholders: who has an interest in (or cares about) what will be learned and what will be done with the information. 2. What program/activity is being evaluated? Describe the partnership’s members, activities, products, and outcomes; a logic model may help describe program parameters. 3. What methods will be used to conduct the evaluation? Focusing the evaluation design means determining the use and users. What is the purpose of the evaluation? To test program effectiveness? To make a case for programmatic changes? To assess the impact on a specific population? To justify financial support? 4. Gathering and analyzing evidence addresses the “raw material” for a good evaluation. Data (or information) must be believable, trustworthy, and relevant; having credible evidence strengthens evaluation results and the recommendations that follow. In this step, you also consider how to present results in a meaningful way. 5. Why justify conclusions? Sometimes the evidence doesn’t speak for itself. This step helps ensure that claims made about the partnership’s programs and activities are based on evidence and can be substantiated. Interpretation enters in this step, and interpretations will be strengthened through active stakeholder participation, drawing on the perspectives they bring to the process. 6. How can we ensure use of what we learned? First, make sure partners and stakeholders are aware of the findings. Consider how the findings can be used to make program-related decisions. Deliberate efforts must be made to promote use of results, and it is equally important to prevent misuse of findings. Lastly, four standards ensure the quality of the evaluation: utility – is the evaluation useful? (answers relevant questions); feasibility – is it practical and viable? (makes sense, is cost effective); propriety – is it ethical? (conducted with regard for the rights and interests of those involved); accuracy – is it correct? (can its purposes and steps be documented?).

    15. Process Evaluation Also called formative or short-term evaluation. Answers questions about implementation and fidelity to the plan: How many interventions were delivered? Who attended? Who didn’t? How were implementers trained? Were the activities delivered as intended? Was the target population reached? What were participants’ reactions? Process evaluation focuses on implementation and usually provides the first information on a program or activity, or detects potential problems. These data can be used to make program adjustments early in the process (hence the name formative evaluation). Sources of this information may include periodic surveys, direct observations, audits and counts of services, contract reviews, client/participant input, and pre- and post-test results.

    16. Impact Evaluation Also called intermediate outcome evaluation. Assesses intermediate effects (1-5 years) and measures progress toward accomplishment of an objective or goal: changes in contributing or risk factors, lifestyle changes, behavioral changes, and environmental changes. May have to use indirect measurements. Impact evaluation assesses the intermediate effects the program or activity has on the target population, looking for changes in contributing and/or risk factors such as changes in lifestyle, behavior, and environment.

    17. Outcome Evaluation An assessment of the effect of a strategy, program, or activity on the target population. Relates directly to the achievement goal. Objects of interest may be changes in health status, quality of life, or policy/law, and morbidity and mortality measures. The ability to detect changes may depend on the precision of measurement, the specificity of standards, the size of the effect, and which measures are used. Outcome evaluation answers the question “so what?” (L.W. Green and M.W. Kreuter (2004), Health Program Planning: An Educational and Environmental Approach, 4th edition, New York: McGraw-Hill).

    18. A Few Evaluation Resources


    20. Focus on Partnership Assessment Goal: collect ongoing, systematic, honest feedback for performance improvement and improved outcomes, and provide a framework for discussing the collaborative process. Logistics of assessment: conduct at least annually (the group should be in existence for at least 6 months); include input from all members (not just the board or leadership team); have a minimum of 5 active members; aim for input from at least 65% of the membership; use an electronic or paper instrument; and consider the data collection process when setting up the assessment.
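The logistics criteria above (at least 6 months in existence, at least 5 active members, input from at least 65% of membership) can be checked mechanically before fielding an assessment. A minimal sketch; the thresholds come from the slide, while the function name and signature are hypothetical:

```python
def assessment_ready(active_members, responses, months_in_existence):
    """Check the slide's logistics criteria for a partnership assessment.

    Criteria (from the slide): group in existence >= 6 months,
    at least 5 active members, responses from >= 65% of membership.
    """
    if months_in_existence < 6 or active_members < 5:
        return False
    # Response-rate target: at least 65% of the membership provided input
    return responses / active_members >= 0.65

# A partnership of 12 members, 9 responses, active for 14 months
print(assessment_ready(12, 9, 14))  # True: 9/12 = 75%, above the 65% target
```

A spreadsheet works just as well; the point is that the slide's thresholds are concrete enough to automate.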

    21. Partnership Assessment Elements Partnership composition: diversity and representativeness; numbers (is more better?). How the collaboration works: leadership, efficiency, administration and management, resources, decision-making.

    22. Partnership Assessment Elements Member perspective: relationships and synergy (use of complementary knowledge, skills, abilities, and resources to accomplish more as a group than individually); trust (includes measures of reliability, shared belief in the mission, frankness, and truthfulness); perceived benefits, drawbacks, and satisfaction; and connectivity. Connectivity is defined as the measured interactions between partners. Use mapping/graphing or social network analysis methodology to evaluate the strength of connections based on six dimensions: membership (organizational characteristics), interactions (patterns), communication (types, levels, frequency), organizational value to the partnership (power, involvement, resources), trust (reciprocity, shared belief in mission, frankness), and reciprocity (mutual exchange of resources).
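Two of the simplest social-network measures behind a connectivity assessment are density (how many of the possible ties between partners actually exist) and reciprocity (how many ties are mutual, the slide's "mutual exchange of resources"). A minimal stdlib-only sketch; the partner names and function are illustrative, not from the presentation:

```python
def connectivity_metrics(partners, interactions):
    """Compute two basic network measures of partnership connectivity.

    `interactions` is a set of directed (reporter, target) pairs:
    ("A", "B") means partner A reported interacting with partner B.
    Density     = observed ties / possible directed ties.
    Reciprocity = fraction of ties that are returned in the other direction.
    """
    possible = len(partners) * (len(partners) - 1)  # directed pairs, no self-ties
    ties = {(a, b) for a, b in interactions if a != b}
    density = len(ties) / possible if possible else 0.0
    mutual = sum(1 for a, b in ties if (b, a) in ties)
    reciprocity = mutual / len(ties) if ties else 0.0
    return density, reciprocity

# Hypothetical four-partner network
partners = {"HealthDept", "Hospital", "School", "Nonprofit"}
interactions = {("HealthDept", "Hospital"), ("Hospital", "HealthDept"),
                ("HealthDept", "School"), ("Nonprofit", "HealthDept")}
density, reciprocity = connectivity_metrics(partners, interactions)
print(round(density, 2), round(reciprocity, 2))  # 0.33 0.5
```

A full assessment would survey members on all six dimensions; these two numbers only sketch the interaction-pattern and reciprocity dimensions.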

    23. Example from CDC Prevention Research Centers’ Partnership Trust Tool Survey Instrument

    28. Resources for Partnership Assessments

    29. Seven Indicators of Partnership Success Institutional partners that link to a goal (the number of partners is not a measure of success); evolving structure and partners (no change may indicate trouble); leadership in key positions (the most senior person is not necessarily the best leader); inclusive decision-making; teamwork. From Dr. Hector Garza, president of the National Council for Community and Education Partnerships, in the Harvard Family Research Project’s Evaluation Exchange.

    30. Seven Indicators of Partnership Success (cont.) Appropriate governance structure (style and structure suited to the mission and goals); mutually beneficial interactions (relationships and trust are key); decision-making based on data (for strategic decisions, wise investments, and measuring impact); use of memoranda of agreement, if appropriate.

    31. Promising Practices Use innovative methods Strategic network management Focus on partnership capacity measures For example: trust, leadership, resources, sense of community Provide orientation, training on evaluation Provide partners a thorough understanding of benefit, process Begin where you are Avoid overwhelming (e.g., choose one aspect to start)

    32. Challenges of Evaluating Partnerships Evaluation may not have been included in the plan. Inadequate resources: priority (would rather spend on programs) and lack of commitment (are we there yet?). Uniqueness of partnerships: as compared to what? to whom? Design and methods: establishing and measuring outcomes, baseline data; controlling for extraneous variables and secular trends. Political will, perceived risk.

    33. Resources http://www.cdc.gov/dhdsp/state_program/evaluation_guides/evaluating_partnerships.htm CDC Division for Heart Disease and Stroke Prevention, “Fundamentals of Evaluating Partnerships: Evaluation Guide” (2008) http://www.cdc.gov/prc/about-prc-program/partnership-trust-tools.htm CDC Prevention Research Center’s Partnership Trust Tool http://www.cacsh.org/ Center for Advancement of Collaborative Strategies in Health, Partnership Self-Assessment Tool http://www.joe.org/joe/1999april/tt1.php Journal of Extension article: “Assessing Your Collaboration: A Self-Evaluation Tool” by L.M. Borden and D.F. Perkins (1999)
