
Building and Maintaining Consumer Organizations: Using Evaluation to Support Consumer Delivered Services
Cindy Willey, Washington Institute for Mental Illness Training and Research, ckw4@u.washington.edu
Eric J. Bruns, University of Washington


Presentation Transcript


1. Building and Maintaining Consumer Organizations: Using Evaluation to Support Consumer Delivered Services
Cindy Willey, Washington Institute for Mental Illness Training and Research, ckw4@u.washington.edu
Eric J. Bruns, University of Washington, Division of Public Behavioral Health and Justice Policy, ebruns@u.washington.edu

2. Description of the session
• Introductions
  • Why did we attend this session?
• Setting the stage
  • A brief overview of research and evaluation
  • Why is it relevant for consumers and families?
• Consumer-led evaluations
  • History and examples
• Brainstorming and group discussion
  • Evaluating your program in Washington State
• Getting ideas and feedback
  • The Washington State Consumer and Family Evaluation Mini-Grant Program

3. Consumers, family members, and research
• Research has led to many breakthroughs in treatment and rehabilitation
• Such breakthroughs are only possible through teamwork
• So why is research sometimes not seen as relevant?
  • Research as a mystery
  • Research as something consumers don't participate in
  • Research as something not relevant to everyday life

4. Research questions from consumers
• What factors help people to recover? Or, what factors hinder recovery?
• What kinds of job supports can help people with psychiatric disabilities succeed?
• What factors make treatment most useful and meaningful?
• How does spirituality affect recovery?
• What is the relationship between mental health and physical health?
• How do budget cuts affect the mental health of consumer/survivors?
• What is the impact of involuntary commitment on someone's mental health?
• What are the benefits or risks of taking a specific medication?

5. Program evaluation
• A growing emphasis on outcomes
  • (Outcomes = the documented result or impact of a program, intervention, or process)
• A positive development for consumers?
  • All programs will have to show how services helped to improve the lives of their clients (i.e., not just how many people they served or how many units of service were provided)
• A negative development for consumers?
  • For consumer-run programs, more responsibility for conducting research or evaluation that shows whether these programs are beneficial to their participants

6. How to make research and evaluation more relevant
• More useful products: "real-world" information focused on daily living (employment, transportation, housing)
• Research reports in simpler language and in varied, user-friendly formats
• Better understanding by researchers of cultural differences and needs
• Recruit, train, and hire more researchers with disabilities and from minority backgrounds
• Provide more public education
• Make consumers full participants in planning and doing research
• Build greater trust between researchers and consumers
(From the National Center for the Dissemination of Disability Research and the Research Utilization Support and Help (RUSH) project)

7. Evaluation Methodology, Design and Analysis: An Overview

8. Process and outcome evaluation
• Process evaluation (a.k.a. implementation assessment, formative evaluation, intervention fidelity)
  • Is the program being delivered as intended?
  • Is the program being delivered to whom it was intended?
• Outcome evaluation (a.k.a. impact assessment, summative evaluation)
  • What is the effect of the program on the program recipients?
  • Is the program beneficial? For whom is the program beneficial? More for some than others?
(A simple process-evaluation sketch follows below.)
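To make the process/outcome distinction concrete, here is a minimal Python sketch of one form of process evaluation: scoring a yes/no fidelity checklist. The item names and the 80% threshold are hypothetical illustrations, not items from any established fidelity scale.

```python
# Hypothetical fidelity checklist for a peer-support program; items and
# threshold are illustrative only, not from an established fidelity scale.
CHECKLIST = {
    "peer_staff_lead_groups": True,
    "drop_in_hours_held_as_scheduled": True,
    "member_advisory_board_met_this_month": False,
    "outreach_materials_distributed": True,
}

def fidelity_score(checklist):
    """Return the share of process criteria met this reporting period."""
    return sum(checklist.values()) / len(checklist)

score = fidelity_score(CHECKLIST)
print(f"Fidelity: {score:.0%}")  # 75%
print("Delivered as intended" if score >= 0.80 else "Review implementation")
```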

9. Process evaluation examples
• Consumer satisfaction
• Consumer-oriented values
• Service delivery and practice
• Availability
• General assessment of service quality
• Treatment quality / treatment fidelity
• System development
• Service coordination and collaboration
• Assessment of community capacity

10. Why process evaluation? Testing assumptions in an evaluation of respite care
• Information about respite availability and respite's benefits will reach the target audience
• When parents and providers hear of the program, they will refer/be referred to the program
• Respite workers will be well trained to deal with problem behaviors
• Respite workers will be able to offer activities during respite hours that are attractive and beneficial to the child and the parent
• Parents will be able to use allotted respite care time in a way that reduces stress
• Respite will be of adequate intensity and duration to be of benefit to children and parents
• Parents and children will perceive the respite care program to be helpful
• Respite provides a "value-added" to other services and supports

11. Outcomes evaluation: Experimental evaluation design
• Random assignment into intervention and control groups
• Group comparability
• Minimizes threats to internal validity
  • Internal validity: are the outcomes associated with the intervention? Can we say that the intervention causes the outcome?
• Can only be used to evaluate programs that do not target entire populations
(A random-assignment sketch follows below.)
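As an illustration of the random-assignment step, here is a minimal Python sketch. The participant IDs and the simple half-and-half split are hypothetical; real trials often use blocked or stratified randomization to keep the groups comparable on key characteristics.

```python
import random

def randomize(participants, seed=42):
    """Randomly assign participants to intervention and control groups.

    A fixed seed makes the assignment reproducible and auditable.
    """
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

# Hypothetical consenting participants, identified only by study ID
groups = randomize([f"P{i:03d}" for i in range(1, 21)])
print(len(groups["intervention"]), len(groups["control"]))  # 10 10
```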

12. Experimental designs: The realities
• Offer the best assessment of cause and effect
• Offer the best linkage between program and outcomes
• The "gold standard" in determining "evidence-based treatments"
• May not be ethical
• May not be legal
• May not be feasible

13. Examples of quasi-experimental designs (when the program does NOT target an entire population)
• Post-test only comparison group designs
  • Two groups (intervention and comparison) observed following the intervention
  • No random assignment to groups
  • No pre-testing
• Pre-test post-test comparison group designs
  • The most frequently used design in social science evaluation research: pre- and post-intervention measurements for both groups (intervention and comparison), with no random assignment to groups (a sketch of one common analysis follows below)
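The slides do not prescribe an analysis, but one common way to analyze a pre-test post-test comparison group design is a difference-in-differences estimate: the intervention group's change minus the comparison group's change. A sketch with invented group means:

```python
def diff_in_diff(interv_pre, interv_post, comp_pre, comp_post):
    """Difference-in-differences: the intervention group's pre-to-post
    change minus the comparison group's change on the same measure."""
    return (interv_post - interv_pre) - (comp_post - comp_pre)

# Hypothetical group means on a wellness scale (higher is better)
effect = diff_in_diff(interv_pre=42.0, interv_post=55.0,
                      comp_pre=43.0, comp_post=47.0)
print(f"Estimated program effect: {effect:+.1f} points")  # +9.0 points
```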

14. Examples of quasi-experimental designs (can be used when the program targets an entire population)
• Repeated measures designs
  • Can be used with just the intervention group or with two groups (intervention and comparison); several measurements over time (e.g., baseline, post-intervention, one year later, etc.)
• Time series designs
  • Expanded repeated measures: many observations of either just the intervention group or two groups (intervention and comparison)
  • The intervention may be introduced, removed, and reintroduced
  • At least 50 observation points are required
(A single-group time-series sketch follows below.)
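As a sketch of the repeated-measures idea, the following compares the average outcome before and after the intervention point in a single-group series. The monthly counts are invented, and a fuller time-series analysis (e.g., segmented regression) would also model trend and autocorrelation.

```python
def level_change(series, intervention_index):
    """Average outcome after the intervention point minus the average
    before it, for a single-group repeated-measures series."""
    before = series[:intervention_index]
    after = series[intervention_index:]
    return sum(after) / len(after) - sum(before) / len(before)

# Hypothetical monthly crisis-contact counts; intervention begins at month 12
monthly = [30, 28, 31, 29, 30, 32, 29, 31, 30, 28, 29, 31,   # baseline year
           24, 22, 23, 21, 20, 22, 19, 21, 20, 18, 19, 20]   # program year
print(f"Change in monthly average: {level_change(monthly, 12):+.1f}")
```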

15. Data collection methods
• Self-report: individuals report/describe their own behavior, state of mind, feelings, etc., OR some other informant provides information about the individual of interest (e.g., a parent reports on child behavior)
• Observation: the researcher observes and records (in some way) the behavior of interest
• Archival data / existing records / MIS (management information systems)

16. Evaluation principles to consider
• Theory-based
  • What are the expected outcomes? Why do we think we will get them?
• Utilization-focused
  • What are the priorities for our evaluation resources?
• Participatory
  • How do we make sure the evaluation is informed and used by all the relevant stakeholders?

17. Theory-based: The Clubhouse model
[Clubhouse logic-model diagram not reproduced in this transcript]
This example courtesy of Dori Hutchinson, USPRA Research Committee

18. Utilization-focused: Asking questions about "What to measure?"
Types of questions: evaluation use
• What decisions, if any, are the evaluation's findings expected to influence?
• When will the decisions be made? By whom?
• When, then, must the evaluation findings be presented to be timely and influential?

19. Utilization-focused: Asking questions about "What to measure?"
Types of questions: what data are needed?
• What data and findings are needed to support decision-making?
• Are the data available through existing means?
• …Or are new data collection approaches needed?

20. Participatory evaluation
• Broadly identify stakeholders and include more non-evaluators than evaluators in the evaluation (e.g., consumers, families, youth, program directors, community groups, etc.)
  • A "stakeholder" = anyone who has a stake in the program that is being evaluated!
  • The group of stakeholders of a program and its evaluation may not be static
• Regularly update stakeholder groups on the data collection process; learn how it can be improved, refined, enhanced
• Convene the stakeholder group to design a data dissemination plan
• Disseminate results regularly to and with stakeholders

21. HISTORY AND EXAMPLES OF CONSUMER-LED EVALUATION
Cindy Willey, Washington Institute

  22. In 1979 Prager and Tanaka reported to the Ohio Department of Mental Health on the results of involving mental health consumers in evaluation. They concluded: “Representing the consumer’s perspective on the meaning of mental illness and the correlates of ‘getting better,’ the process of client involvement in evaluation design and implementation is not only realistic and feasible; it is, we feel, a professional necessity whose time is overdue.”

23. THE WELL-BEING PROJECT: A LANDMARK STUDY (1989)
• Funded by the California Department of Mental Health Office of Prevention
• First consumer research project: conducted by the California Network of Mental Health Clients
• Jean Campbell, Principal Investigator
• Ron Schraiber, Co-Investigator

24. Discovering the consumer perspective: Asking new questions
• Negative outcomes identified
• Self-management skills revealed
• Importance of personhood established
• Effects of prejudice and discrimination quantified
• Incongruity of values, perspective, and identified needs between consumers, family members, and mental health professionals found

25. Findings
• 40% of mental health clients surveyed felt that "all" or "most of the time" people treated them differently when they found out they had received mental health services:
  • like they are violent (16%)
  • like a child (21%)
  • like they don't know what is in their own best interest (31%)
  • like they are incapable of caring for children (20%)
  • like they are incapable of holding a job (33%)

  26. Findings • More than half of the clients surveyed indicated that they “always” or “most of the time” recognize signs or symptoms of psychiatric problems and that they can take care of these problems before they become severe.

27. Findings
• 48% indicated that they have avoided treatment due to fear of involuntary commitment.
• 30% reported that they had "little" or "no" control over the kind of mental health services they receive.

28. HOW CONSUMERS BECAME PART OF THE MHSIP COMMUNITY
• MHSIP = Mental Health Statistics Improvement Program
• The first consideration of adding consumers and family members came in the early 1990s
• At the local and state levels, consumers were beginning to be involved in policy and evaluation

29. FIRST STEPS
• STATE PLANNING COUNCILS INCLUDE CONSUMERS
• CONSUMER REPRESENTATIVES JOIN MENTAL HEALTH BOARDS
• ANN LODER PRESENTS PLENARY "HOPE WITH A CAPITAL H" AT NATIONAL MENTAL HEALTH STATISTICS CONFERENCE (1991)
• JEANNE DUMONT BECOMES FIRST CONSUMER MEMBER OF MHSIP AD HOC ADVISORY GROUP (1992)

30. MOVING FORWARD
• OFFICES OF CONSUMER AFFAIRS ARE ESTABLISHED ACROSS THE NATION
• THE CONSUMER/SURVIVOR MENTAL HEALTH RESEARCH AND POLICY WORKGROUP IS FORMED (1991)
• THE MHSIP MENTAL HEALTH CONSUMER-ORIENTED REPORT CARD (1994-PRESENT): CONSUMERS AND MENTAL HEALTH PROFESSIONALS COLLABORATE TO PRODUCE AN OUTCOME TOOL GROUNDED IN CONSUMER VALUES

31. CONSUMER OPERATED SERVICES PROGRAM (COSP) MULTI-SITE RESEARCH INITIATIVE (http://www.cstprogram.org/cosp/)
• Consumer and non-consumer interviewers were employed from 1998-2002 for this SAMHSA project.
• The study took place in 7 states: Connecticut, Florida, Tennessee, Missouri, Illinois, Maine, and Pennsylvania.
• Among the 6 of 7 study sites that provided information about the composition of their interviewing staff, about two-thirds of the project interviewers were consumers.

32. CURRENT INFORMATION ON CONSUMER-LED EVALUATION
• MHSIP: http://www.mhsip.org/index.asp
• National Empowerment Center: http://www.power2u.org/
• SAMHSA: http://www.samhsa.gov/
• CONTAC: http://www.contac.org/
• Directory of Consumer-Driven Services: http://www.cdsdirectory.org/
• Self-Help Clearinghouse: http://www.mhselfhelp.org/index.htm
• Center for Psychiatric Rehabilitation, Boston U.: http://www.bu.edu/cpr/index.shtml

33. New York Association of Psychiatric Rehabilitation Services (NYAPRS) Peer Bridger Project
Peer Bridger Project outcomes: a participatory evaluation of the project prepared by Cheryl MacNeil, Ph.D., identified and examined several areas where the project benefited those involved:
• The development of uniquely different relationships than one typically experiences in a psychiatric hospital
• The temporary relief from a state of social isolation and physical enclosure often experienced by those hospitalized
• The ability to share with each other the experiential wisdom and survival skills necessary for the process of recovery

34. Rehospitalization rates: "The most substantial finding is that the follow-up rehospitalization rate of Matches while enrolled in the Peer Bridger Project was significantly less than the baseline hospitalization rate (i.e., the 2-year period prior to enrollment). That is, during the 2-year baseline period, the Matches were hospitalized an average of 60% of the time; while enrolled in the program, however, they were rehospitalized only 19% of the time. That's an improvement of 41%!" (National Health Data Systems, December 1998)
For further information about the Peer Bridger Project, please fill out our online request form or contact Tanya Stevens, Peer Bridger Project Director, by phone at 518-436-0008 x18 or by e-mail at TanyaS@nyaprs.org.
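A note on the arithmetic: the quoted "41%" is the absolute drop in percentage points (60% minus 19%); the relative reduction is larger. A quick check using only the figures quoted above:

```python
baseline, enrolled = 0.60, 0.19              # hospitalization rates quoted above

absolute_drop = baseline - enrolled          # 0.41: the "41%" in the quote
relative_drop = absolute_drop / baseline     # proportionate reduction

print(f"Absolute drop: {100 * absolute_drop:.0f} percentage points")
print(f"Relative reduction: {100 * relative_drop:.0f}%")   # about 68%
```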

35. EDITORIAL
"I believe the real mental health transformation will come when our practice is guided by people first values. Values fundamental to our culture, such as self determination, freedom, opportunity and choice, must be fundamental to our practice."
"When I look at the field of mental health services research, I see a field in which significant research typically follows significant changes in practice."

36. EDITORIAL CONTINUED
"From my perspective, what currently is contributing most to delaying the creation of recovery-oriented, people first practices is not the absence of critical science underlying our practices, but the absence of the critical values underlying our practices. And if our practices were grounded in these people first values, subsequent research could study how to best incorporate these values into our program and system policies, procedures, and documentation. Science can certainly help close the gap between values and practice."
William A. Anthony, Psychiatric Rehabilitation Journal, Summer 2005

37. Group discussion: Evaluation questions
• What does your organization have to offer toward improved health and wellness of clients or the community?
• What would you like to evaluate with respect to those impacts?
• What else could your organization learn in order to improve its effectiveness?
• What evaluation questions would relate to the questions above?

38. Group discussion: Evaluation methods
• How might you gather data on those evaluation questions?
• What would you need in order to gather, analyze, report, and use the data?
• What kind of support would you need?
• Examples: a list of existing instruments, how to conduct a focus group, how to create a measure, how to analyze data, how to write a report

39. Washington State Consumer and Family Evaluation Grant Program
• How best to provide support to consumer organizations and consumer evaluators?
• Ideas:
  • Conferences and educational opportunities
  • An RFP and grant process for consumer and family organization evaluation and consumer-led evaluations
  • Consultation and technical assistance to support consumer- and family-led evaluation projects
