
Evaluation for the People



  1. Evaluation for the People Research Methods for Community Change

  2. Reassessing the Pyramidal Structure of Evaluation Research [Diagram: a pyramid running from Funders and Interlocking Elites at the top, through Programs, Service Agencies, Bureaucrats, and Projects, down to Community Members at the base; adapted from Stoecker p. 65] • The typical structure of evaluation is quite pyramidal, with interlocking directorates of funding organizations, government institutions, and academic evaluators imposed on lower levels of service agencies, bureaucrats, and community members • The top imposes programs on the bottom, which implements projects to fulfill the program mandates

  3. Programs and Projects • Programs are comprehensive social change systems that are generally broad in scope, with long-term goals • LBJ’s Great Society Program • Projects are delimited implementations of specified program goals, usually bounded in time and space by statute • The Texas Workforce Commission or, more specifically, the Upper Rio Grande Workforce Development Board

  4. Evaluation on the “Pyramid Scheme” • Evaluation is rarely done of entire programs; it is commonly found at the level of projects • Elites don’t mind evaluating others… so long as they are not evaluated in turn • As a practical matter, even socially aware researchers must comply with this unstated basic structure of evaluation research

  5. Navigating the Political Landscape – Anticipate or Be Tackled • Keep the following basic questions about political opportunity structures in mind when proposing to engage in policy evaluation and policy formation as an SMO member • How is formal power distributed? • Power brokers • Linkage figures • How is informal power distributed? • Power brokers • Linkage figures • What are the relations between formal and informal power structures? • Which is more powerful? • Which is your SMO better connected to? • What are the controversial “sore” issues? • Degree of polarization • Elites • Public • Where does the project fit into the community’s political cleavage structure? • Will it build bridges or dig ditches?

  6. Taking Research From the Activist’s Perspective • To many activists, the groundwork that they do to prepare for a successful action campaign is not “academic” research • This view is slightly erroneous

  7. Old Research Paradigm • Classical academic research is supposed to be value-neutral • The researcher is to have no “horse that he/she is backing” • This is to avoid slanting questions, design, and methods to get “desired answers”

  8. Post Modern/Realistic Research Paradigm • No one is value free. The best we can hope for is an open accounting of the researcher’s biases • This means that there is no reason that a frank and open activist cannot do quality research • There is really no such thing as “in too deep,” unless the researcher decides not to reveal how deep they are in

  9. Intensive and Extensive Approaches • Intensive – case study/qualitative approaches concentrate in detail on one or a few examples of a phenomenon • Extensive – survey/quantitative approaches try to discern general patterns of behaviors or events

  10. Intensive • The goal is really to get deep enough into the situation to derive causal predictions • Through intense scrutiny, we try to eliminate alternative explanations for an event • With enough scrutiny, we also become quite familiar with possible history and maturation processes that extensive analysis tends to gloss over • The problem with this approach is that findings tend to be idiosyncratic (unique to the case)

  11. Extensive • The goal here is generalizability based on causal hypotheses gained from initial intensive studies • By dealing with repeated observations over space and time, we can make a case for the observed connection being a causal relationship or law of human behavior • The problem with such studies is that they rarely investigate multiple lines of causality extensively • One line is preferred, and any others are by default either used as controls or not investigated at all
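
A minimal sketch of what such an extensive check can look like in practice. The survey data, the variable names, and the participation-to-employment question are all hypothetical; the point is only that repeated observations let us test whether one causal line holds in general.

# Hypothetical "extensive" check: many cases, one causal line
# (program participation -> employment six months later).
import pandas as pd
from scipy.stats import chi2_contingency

# Stand-in respondent records; a real survey would have hundreds of rows.
survey = pd.DataFrame({
    "participated": ["yes", "yes", "no", "no", "yes", "no", "yes", "no"],
    "employed_6mo": ["yes", "yes", "no", "yes", "yes", "no", "no", "no"],
})

# Cross-tabulate and test whether the association is likely to generalize.
table = pd.crosstab(survey["participated"], survey["employed_6mo"])
chi2, p, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
# Other causal lines (age, prior work history, ...) are either entered as
# controls or, as the slide warns, never investigated at all.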

  12. Project-Based Research Cycle • The project-based research cycle that Stoecker (2003) uses is akin to the logic described in Levin’s research cycle (see the CRIJ 3300 PowerPoints)

  13. The Goals are Deceptively Simple • Engage community members in evaluation • Diagnose community needs and strengths • Define potential solutions that a community would find acceptable and consonant with community norms • Use the community evaluators to help gain acceptance of the implementation of the selected solution(s) • Evaluate achievement of objectives according to both sponsors and community clients

  14. Finding One’s Place in the Cycle • Diagnosis • New services are in demand • New problem exists, cause unknown • General need to be “in touch” with community clients • Strategic planning • Prescription • Finding best practices (common solutions) for our problem • Seeking to decide if best practice(s) will apply to our community • Seeking to efficiently mobilize SMO and service provider extant resources • Implementation • Moving aspect of SMO to the fore in aiding community • Using POS (political opportunity structure, see Tarrow 1994) to get policy enacted • Public • Private (corporate) • Evaluation • Seeking to find if our project had an impact • Seeking to determine if a change in strategy or tactics is needed to meet SMO goals and community needs Adapted from Stoecker 2005, p.76.

  15. Who are These Community Persons? • Staff and volunteers of local social movement organizations and movements • The academic institutions and government institutions that house evaluation units • Interested college students that either research the community out of a personal connection or are committed to evaluation projects as part of their degree • Academic researchers who are engaged in community service or contract-research with sponsors of community affecting projects • Funding organizations • Service providers that do not house evaluation units • General community members who come in for a wide variety of personal reasons

  16. Who are These Community Persons? (cont’d) [Diagram: the general community, which has both assets and needs, sits at the center and is linked on either side to SMOs, service providers, and government or academic institutions; adapted from p. 46 of Stoecker (2005)]

  17. Staff and Volunteers of SMOs1 • These folks are the foot soldiers of project based evaluation – the civil society version of street level bureaucrats • They have the history and maturation knowledge essential for good research design and implementation 1 SMO – Social Movement Organization, the general class of organizations that community organizations belong to.

  18. College Students and Faculty • These are the “specialists” or “guns for hire” • They have the methodological expertise to merge with the case expertise of SMO members to complete the designs and implementation plans • Keep in mind, the hero of “Have Gun Will Travel” was not a callous fellow. He was actually an idealist. Just because you are a methodological expert does not mean that you are aloof

  19. General Community Persons • These folks add perspective • SMO members tend to be a bit overcommitted at times • Academics may not know enough facts on the ground • Thus incorporating additional views can be a real bonus, if done with a view to improving the study design

  20. Steps for Doing Project-Based Research • Choosing questions • Designing the methods • Data collection • Data analysis • Reporting findings

  21. Choosing Questions • The selection of what to ask is always a negotiated process between stakeholders • The difference with project-based research is the relative emphasis one puts on the stakeholders • Here community stakeholders take moral (if not actual) priority over financial and institutional stakeholders • Theoretically the people matter most; only after their needs are addressed do considerations of finances and legal mandates come into play

  22. Designing the Methods • Design can be top-down or bottom-up • Top-down – prefabricated designs are placed in many contexts to ensure comparability and reliability • Bottom-up – using community input, contextually relevant designs are employed to ensure validity and consensual implementation of studies

  23. Data Collection • A personal favorite… • I like to use students to do phone surveys and intercept surveys. The students generally belong to the El Paso “community” • This increases their ability to converse in a linguistically appropriate manner with respondents • I would be abysmal at doing such interviews myself, and hiring folks from even a prestigious firm like Princeton Research Associates would not make the situation much better

  24. Data Analysis • Community contact brings nuance to the analysis • Surveyors from outside the Southwest US rarely understand the ire they raise by not considering Latino/Hispanic a race • History and maturation processes are fundamental to time series analysis • The totally bungled testimony by ASARCO’s expert witness on the health impacts of reopening the plant foundered on the out-of-towner’s ignoring the impacts the plant would have on Cd. Juárez and the certainty that Juárenses would make it an issue at an El Paso hearing

  25. Reporting Findings • While the word of an impartial judge is the epitome of legal jurisprudence, it does tend to lack the persuasive impact one might desire • Thus the classical academic discourse of findings lacks what Israelis call “Zazooah,” or “Ooomph” • Having a trusted community member express findings that run counter to their known positions on an issue is both highly credible and attention-getting • The challenge is getting a reputable community member to express such counter-ideological findings • The best way to do this is to have credible community members from all sides involved in the entire evaluation process so that they buy into the end products

  26. So How Do We Bring the Community in? • Try to be useful • Use a multimethod evaluation style • Listen as well as talk, learn as well as teach

  27. Useful • Not every evaluation need be stripped down to only the quintessential nuts and bolts of measurement and data analysis. In fact, to do so is generally counterproductive • As in all genuine forms of politics, logrolling is an essential practice • Trade services – as part of the evaluation, provide some functions the community needs • Help organize community archives • Help generate donor lists

  28. Multimethod Evaluation • Despite my desires to the contrary, not all of us are statisticians at heart… nertz! • Thankfully, nor are all of us folklore specialists like my sister-in-law • We need both extensive/quantitative and intensive/qualitative methods

  29. Multimethod Evaluation (cont’d) • As my favorite fable goes, the ability of the three blind men to describe an elephant/pachyderm depended on two things • Touching multiple sites of the “big tusks” (elephant) • Coordinating findings about the “wrinkly skinned” (pachy-derm) critter • So do our evaluations

  30. Don’t Be Your Professor or a College Freshman • Dr. Levin is a notorious talker and a lousy listener • College freshmen are notorious for not actively engaging in classroom discussions • Genuine collaboration requires all parties to maximize both listening and talking

  31. What to Think About When Collaborating with Community SMOs • SMO skills • SMO financial and personnel capacity • Compatibility of evaluation with SMO goals

  32. SMO Skills • By default, SMOs ought to be able to contribute • Expertise in contacting community members • Street credibility • In addition, SMOs may be able to contribute • Archives • Funder/donor databases

  33. SMO Financial and Personnel Capacity • Clerical staff • Surveyors • Data entry specialists • Historians • Morale boosters – never underestimate the impact of a warm relationship on research productivity

  34. Compatibility of Evaluation With SMO Goals • Try to limit the input on evaluation goals to consensus goals that SMOs from several sides can agree on • This requires you to be slightly vague • This does not mean you lie! • This means keeping the goals to what is to be evaluated. Refrain from goals about what is to be done with the findings

  35. Flipping the Coin to the Other Side – SMO Questions • Can the academic/institutional evaluator listen to the SMO when needed? • Can the academic/institutional evaluator keep deadlines? • Can the academic/institutional evaluator speak to the SMO in its own language? • What credentials does the academic/institutional evaluator have?

  36. Can the Evaluator Listen? • Despite Dr. Levin’s notoriety, the Planned Parenthood Center of El Paso decided to take the plunge. • Since that decision, Dr. Levin has learned a new vocabulary of three letter words • If the formal evaluator can’t listen, the SMO should drop the relationship • Otherwise they risk having “words put in their mouth”

  37. Can the Evaluator Keep Deadlines? • Between retiring from federal service and starting at IPED, I have had nothing but deep schooling in the inability of academics to keep schedules • It is the norm for academics to • Come late • Turn in reports late • Turn in budgets late • Never write contracts out fully • Solution 1 – work only with academic research units • They are built to work with federal and state budgeting and grant application schedules • Solution 2 – work with private research units that share interests with your SMO • They too generally work on business schedules

  38. What Credentials Does the Evaluator Have? • Credentials are a useful way to separate the wheat from the chaff, but they have one serious deficiency – young talents/institutions have limited credentials • Get as much input as possible from the evaluator’s colleagues on the evaluator’s abilities

  39. Tackling the Four Aspects of Project-Based Research • Diagnosis, Prescription, Implementation, and Evaluation are all worthy of discussion on their own. Thus the remainder of this lecture, like the Stoecker text, will address them separately

  40. Diagnosis • Forming a core group • Needs assessment • Asset mapping • Melding needs and assets

  41. Forming a Core Group • All social policy projects, like most complex games, require teams. Project success hinges on building a group that can • Survive mild degrees of internal conflict • Muster enough breadth of external contacts to build allies and exploit the POS • Either intentionally link in with the existing issue-arena institutional structures or intentionally decide to “work around” the existing issue-arena institutions

  42. Needs Assessment • As discussed last week, needs assessment is a complex form of evaluation • In this respect, I differ with Stoecker, who views only outcome evaluation/impact assessments and process evaluations as evaluation

  43. Extensive Needs Assessment • Extensive needs assessment • This is essentially what we reviewed last week • Surveys • Census data • SWOT analysis
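
A minimal sketch of the extensive side of a needs assessment, assuming a hypothetical CSV of tract-level census extracts (the file name and column names are invented for illustration): a few indicators are combined into a rough need score and the tracts are ranked.

# Hypothetical extensive needs assessment from tract-level census extracts.
# The file name, column names, and equal weighting are illustrative only.
import pandas as pd

tracts = pd.read_csv("tract_extract.csv")  # imagined extract of census data

# Combine a few indicators into a rough need score; stakeholders may well
# want to weight them differently.
indicators = ["pct_poverty", "pct_unemployed", "pct_no_insurance"]
tracts["need_score"] = tracts[indicators].mean(axis=1)

# Rank tracts so the core group can see where needs cluster.
print(tracts.sort_values("need_score", ascending=False)
            .head(10)[["tract_id", "need_score"]])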

  44. SWOT Analysis • A survey of Strengths, Weaknesses, Opportunities, and Threats • This is a focus group approach • Bring in core stakeholders for a few 1–2 hour discussion sessions to • Discussion 1 – Generate a list of SMO or community successes and failures within a relevant time frame • Discussion 2 – Generate a list of SMO or community strengths and weaknesses within a relevant time frame • Discussion 3 – Maximize strengths and avoid weaknesses by utilizing the POS of the issue arena
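
One way to keep track of what the three discussions produce is simply to collate the notes into the four SWOT categories; the sketch below does that with entirely made-up items standing in for what stakeholders actually raise.

# Collate focus-group notes from the SWOT discussions into four lists.
# Every item below is a placeholder, not a finding from any real session.
from collections import defaultdict

session_notes = [
    ("strength", "bilingual volunteer base"),
    ("weakness", "no grant writer on staff"),
    ("opportunity", "new county health levy"),
    ("threat", "funder consolidating programs"),
    ("strength", "bilingual volunteer base"),   # raised in two sessions
]

swot = defaultdict(list)
for category, item in session_notes:
    if item not in swot[category]:          # keep each item once
        swot[category].append(item)

for category in ("strength", "weakness", "opportunity", "threat"):
    print(category.upper(), "->", "; ".join(swot[category]) or "none recorded")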

  45. Intensive Needs Assessment • Beyond the quantitative assessment of community needs from surveys or census data exist the views of informed stakeholders • The basic SWOT method does not address the perceived depth of needs, polarization on goals, etc. • Intensive assessments add greater emphasis in SWOT on building lists of ranked priorities for needs • Weaknesses and threats are prioritized by which need to be addressed most urgently
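
One way to turn several stakeholders' ranked lists into a single priority order is a simple Borda-style count, sketched below; the stakeholders, the needs, and the scoring rule are illustrative choices, not something Stoecker prescribes.

# Aggregate each stakeholder's ranked list of needs into one priority order.
# Stakeholders, needs, and the Borda-style scoring are illustrative only.
rankings = {
    "clinic director":  ["transportation", "child care", "job training"],
    "parent volunteer": ["child care", "transportation", "job training"],
    "county liaison":   ["job training", "transportation", "child care"],
}

scores = {}
for ranked_needs in rankings.values():
    n = len(ranked_needs)
    for position, need in enumerate(ranked_needs):
        scores[need] = scores.get(need, 0) + (n - position)  # top rank scores highest

for need, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{need}: {score}")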

  46. Asset Mapping • Some of us find a glass of water half empty (your Prof.), others find the same glass to be half full (his pal Marika). It takes both perspectives to realize what real potential the glass holds • Asset mapping is all about seeking to know what resources the general community, relevant SMOs, and service providers have as opposed to what they need • The goal behind asset mapping is to sensitize communities to their own potential to solve community issues without government largesse • Appropriate tools include surveys and census data

  47. Melding Needs and Assets • In truth, the best approach to a diagnosis of a body politic is the same as for a body – you need to know both what is healthy and what hurts • In the course of plotting out what hurts, you learn what might be healthy • In the course of plotting out what is healthy, you learn what you can rely on to deal with what hurts • It is much more sensible to approach a community diagnosis with a mind to finding the gap between internal needs and assets. This avoids the overstatement of community and SMO helplessness that needs assessment often encourages
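
A minimal sketch of that gap-finding step: each diagnosed need is checked against the asset map, so the report shows what the community can already cover and where the genuine gaps are. The needs and assets listed are invented placeholders.

# Match diagnosed needs against mapped assets to find the genuine gaps.
# The needs, assets, and matching keys are invented placeholders.
needs = {"transportation", "child care", "job training", "legal aid"}
assets = {
    "transportation": ["church van pool", "transit passes from United Way"],
    "child care": ["co-op run by parish volunteers"],
}

covered = {need: assets[need] for need in needs if need in assets}
gaps = sorted(needs - set(assets))

print("Needs the community can already address:")
for need, resources in covered.items():
    print(f"  {need}: {', '.join(resources)}")
print("Remaining gaps to take to funders or policymakers:", ", ".join(gaps))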

  48. Prescription • Service versus policy • Finding alternatives • Evaluating alternatives • Choosing an alternative – C-B/C-E Analysis
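
Before turning to the individual steps, a toy illustration of the last one, the C-B/C-E (cost-benefit/cost-effectiveness) comparison: each alternative's cost is divided by the outcome units it is projected to produce. The alternatives and figures are invented for illustration.

# Toy cost-effectiveness comparison of prescription alternatives.
# The alternatives, costs, and projected outcomes are invented.
alternatives = {
    "mobile clinic":      {"cost": 120_000, "clients_served": 800},
    "clinic partnership": {"cost":  90_000, "clients_served": 500},
    "voucher program":    {"cost":  60_000, "clients_served": 250},
}

for name, alt in alternatives.items():
    cost_per_client = alt["cost"] / alt["clients_served"]
    print(f"{name}: ${cost_per_client:,.0f} per client served")
# A cost-benefit analysis would instead put a dollar value on the outcomes
# and compare net benefits; community stakeholders may weigh both differently.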

  49. Service Versus Policy • Service – inward-focused on the community; concrete plans; narrow application to a specific goal and context; can have policy changes in it; think of this as an experiment in Kuhn’s (1970) sense • Policy – outward-focused on setting the agenda for a best-practices solution to a problem; abstract rules that service prescriptions will fill in; wide application; think of this as a paradigm in Kuhn’s (1970) sense

  50. Finding Alternatives • To come up with a vision that is broad, long-range, and has substantive meaning • Dig deep into the existing literature • Use academic books, online academic journals, and research websites • Work with trade publications • In many service professions the federal government sponsors trade publications. So do unions. Find the union websites and you will find the publications • Ask friends in the field for their experiences • We all know someone who “knows someone.” Screw up your courage and ask around • Brainstorm with stakeholders • This is another focus group approach • Beware of groupthink!
