
Evaluating the Effectiveness of Citizens Review Panels

Blake L. Jones, MSW, LCSW, Program Coordinator, Kentucky Citizens Review Panels. bljone00@uky.edu, www.uky.edu/socialwork/trc


Presentation Transcript


  1. Evaluating the Effectiveness of Citizens Review Panels. Blake L. Jones, MSW, LCSW, Program Coordinator, Kentucky Citizens Review Panels. bljone00@uky.edu, www.uky.edu/socialwork/trc

  2. “Never doubt that a small group of thoughtful, committed citizens can change the world; indeed it is the only thing that ever has.” ~ Margaret Mead

  3. “Service is the rent we pay for living. It is the very purpose of life and not something you do in your spare time.” ~ Marian Wright Edelman

  4. “This whole citizens review thing is nothing but a public relations ploy” ~ Anonymous CRP member

  5. “We’re not really helping kids. All we’re doing is just generating another report that CPS won’t use!” ~ Survey Respondent

  6. Objectives • National Update • Dissertation proposal • What are YOUR thoughts on measuring CRP effectiveness?

  7. How are states using CRPs? • Great Variability > Numbers (Alaska: 5 volunteers; New Mexico: several hundred) > Staff Support > Tasks • Recruitment vs. Appointment

  8. Examples of Models • Created new panels (KY, Tenn.) > contract with universities, other governmental agencies • Using existing panels (e.g., child fatality review boards, regional or county QA teams, Governor’s task force teams). This appears common. • Hybrid (create new panels, but coordinate with a larger group of existing panels) • Some states (Maryland, for example) have a long history of “citizen review panels.”

  9. What are Panels Doing? • Working on legislative issues (e.g., dual-track response) • Examining community collaboration with CPS (through surveys, focus groups) • Evaluating the impact of state budget cuts on social services • Case reviews (e.g., looking for family involvement in case planning) • Employee satisfaction • Mandated reporters • Administrative tasks (by-laws) • CPS’ involvement with immigrant populations • CPS’ relationship with schools, law enforcement, mental health providers • Policy and procedure (e.g., exit interviews)

  10. CRPs used in the CFSR process • Minnesota • Alaska (informal) • Kentucky • Wyoming • South Carolina • California (citizens reading PIP) • Maryland (maybe?) • Oregon • Others?

  11. Is it Working? • Mixed results • Citizens have difficulty defining their role and staying on track (easier with foster care review boards) • Panels seem to produce changes at the local level (e.g., examining local policy and procedure) • Difficulty in recruiting diverse membership • Budget considerations (some recommendations not financially feasible) • Citizens want more feedback

  12. Comments • “There is very little feedback from CPS so these panels rightly feel like their work is unappreciated.” • “ . . . Some team members are concerned that (their report) becomes so much paper in some big building” • “Some members do not have the basic skills to research a problem, develop a plan, begin working on it and track its progress” • “It makes no sense to me that an ‘objective’ body that is supposed to be evaluating a government agency would be housed within that agency”

  13. Evaluation of Citizen Boards • Mostly focused on “outcomes” (e.g., Litzelfelner and CASA research) • Most researchers have found positive results from citizen boards • Perceptions of outside reviewers generally positive, though some distrust reviewers (Leashore)

  14. “Unauthentic” vs. “Authentic” Participation (King, Feltey & Susel, 1998) • Unauthentic: conflictual; input sought “after the fact”; reactive; citizen treated as a necessary evil; mistrust • Authentic: collaborative; input sought before decisions are made; proactive; citizen treated as a partner; trust

  15. Difficulties in doing traditional program evaluations on CRPs • Wide variability in groups • Hard to measure impact of CRPs on child welfare systems (need longitudinal study) • What are “outcomes”? (reports?) • Shifting child welfare priorities (federal and state)

  16. Dissertation Study

  17. States Involved: Wisconsin, Florida, Alaska, New York, Alabama, Ohio, Tennessee, Maryland, Georgia, New Hampshire, New Mexico, South Carolina, Arkansas, North Carolina, Wyoming, West Virginia, Nevada, Minnesota

  18. CRP Members and CPS Staff Surveyed • Looking for: > Timely access to info > Training > Chairperson > Frequent contact with liaison > Ability to impact policy decisions > Feedback from CPS

  19. Arnstein’s Ladder • Where does YOUR citizens review panel fall on the “ladder”? • Hypothesis: there will be a significant difference between where CRP members and CPS staff place their panels

  20. Delegated Power: Citizens on your panel have the needed power to actively change the child protective system • Partnership: The child protection system allows citizens to share in decision making, but retains all the power to change things • Placation: The system just “tells you what they want you to hear” about child protection in your state • Consultation: The child welfare system consults with citizens but does not give them any power • Informing: The child welfare administrators in your state engage in “one way” communication with citizens (i.e., by providing superficial answers or discouraging questions) • Manipulation: The child welfare system uses the citizens review panels to push its own agenda

  21. Other Variables • Previous leadership/volunteer experience • Gender, ethnicity (“representative” of community?) • Content of report/response to report

  22. Limitations • Studying perceptions only • Point-in-time study (need a longitudinal study to assess organizational change) • Generalizable nationwide?

  23. What are YOUR Thoughts?
