
Interpretation of data in Reporting on results under SWHISA context


Presentation Transcript


  1. Interpretation of data in Reporting on results under SWHISA context Selamyihun

  2. Performance Measurement • Performance measurement is the regular generation, collection, analysis and reporting of a range of data related to the operation and the impact of a project or organization • A performance measure is composed of a number and a unit of measure: the number gives us a magnitude (how much) and the unit gives the number a meaning (what) • Performance measures are always tied to a goal or an objective (the target); otherwise there is no logical basis for choosing what to measure, what decisions to make, or what action to take (see the sketch below)
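As a concrete illustration of the number-plus-unit-plus-target idea, here is a minimal sketch in Python; the class and field names are hypothetical and not part of any SWHISA system.

```python
from dataclasses import dataclass

# Illustrative only: a performance measure couples a magnitude (the number),
# a unit that gives that number meaning, and the goal/target it is tied to.

@dataclass
class PerformanceMeasure:
    name: str        # what is being measured
    value: float     # the number: magnitude ("how much")
    unit: str        # the unit of measure: meaning ("what")
    target: float    # the goal or objective the measure is tied to

    def percent_of_target(self) -> float:
        """Progress toward the target, expressed as a percentage."""
        return 100.0 * self.value / self.target


plans = PerformanceMeasure(
    name="Participatory watershed management plans completed",
    value=6, unit="plans", target=6)
print(f"{plans.name}: {plans.percent_of_target():.0f}% of target reached")
```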

  3. Purpose • Identifying and tracking progress against project goals • Identifying opportunities for improvement • Comparing performance against both internal and external standards • Formulating the direction of strategic activities • Knowing where the strengths and weaknesses of a project lie

  4. What is Performance? Concepts • Given the multiple aims and multiple potential users of performance evidence, controversy can arise from the outset over how to define ‘performance’ • Much of the literature implies that performance is an objective phenomenon, consisting of a set of attributes of a program and its measurable impacts • In practice, interpretations and measures of performance arise as much out of an interactive process among individuals and institutions as they do out of theories of programs, data generation and analysis

  5. What is Performance? (2) Concepts • The performance captured by a particular set of measures will always be partial and contextual, reflecting the fact that the measures have been selected, analyzed and interpreted through the lenses of the organizations and individuals involved in the process • Given their inherently subjective nature, all measures should remain open to debate and possible replacement • Balancing confidence in and commitment to a particular set of measures against ongoing debate and revision of those measures is an important part of the art of performance measurement

  6. Performance Measurement System Key features • It has clearly defined purposes • It employs a limited, cost-effective set of measures • It gives most attention to impacts or outcomes, not just to descriptions of activities and volumes of outputs • It uses measures which are valid, clear, consistent, comparable, and controllable, in the sense that they measure matters over which the project has control

  7. Performance Measurement System (2) Key features • Evidence from the system must be presented in a balanced, comprehensive, understandable and credible fashion • It produces information which is relevant, meaningful, balanced and valued by the leaders and funders of the organization • It is linked to other key activities, such as planning and budgeting, and guides decision-making on a regular basis • It represents an aspirational statement rather than a description of what is

  8. PERFORMANCE MEASUREMENT PROCESS (flowchart): Start → Identify process → Identify critical activity to be measured → Establish performance goal(s) → Establish performance measure(s) → Identify responsible party(ies) → Collect data → Analyze/report actual performance → Compare actual performance to goal(s) → Are corrective actions necessary? (yes/no) → results feed a database and the cycle repeats (a schematic version is sketched below)
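Read as pseudocode, the flow above is a simple reporting loop. The sketch below is only a schematic rendering of that cycle, assuming placeholder functions and invented numbers; none of the names correspond to actual SWHISA tools.

```python
# Schematic rendering of the performance measurement cycle above.
# Every callable passed in is a placeholder for a real project activity.

def run_measurement_cycle(critical_activity, goal, responsible_party,
                          collect_data, analyze, take_corrective_action,
                          periods):
    """Collect, analyze, compare against the goal, and correct each period."""
    database = []  # accumulated performance records
    for period in periods:
        raw = collect_data(critical_activity, period)   # collect data
        actual = analyze(raw)                           # analyze/report actual performance
        database.append({"period": period, "actual": actual, "goal": goal})
        if actual < goal:                               # compare actual performance to goal(s)
            take_corrective_action(critical_activity, responsible_party, period)
    return database


# Minimal usage with stand-in functions and invented numbers:
history = run_measurement_cycle(
    critical_activity="SWC construction",
    goal=100,  # e.g. 100 participating farmers per period
    responsible_party="woreda office",
    collect_data=lambda activity, period: {"farmers": 60 + 15 * period},
    analyze=lambda raw: raw["farmers"],
    take_corrective_action=lambda activity, party, period: print(
        f"Period {period}: below goal, {party} to follow up on {activity}"),
    periods=range(1, 4))
print(history)
```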

  9. Performance Reporting • Focus on the few critical aspects of performance • Look forward as well as back • Explain key capacity considerations • Explain other factors critical to performance • Provide comparative information • Present credible information, fairly interpreted • Disclose the basis for reporting

  10. Performance Reporting (2) • Tell the ‘performance story’ and do not become too mesmerized by the numbers themselves • The performance story assigns meaning to the changing realities of program operations • Qualitative information is as important as quantitative information in telling the ‘performance story’ • Measures should be restricted to factors under ‘reasonable control’ of the project, and there should be an opportunity to identify uncontrollable factors.

  11. Result Reporting: SWHISA Context • Reports may take many forms; at this stage, however, the report is intended to be a status transfer of information to the decision maker responsible for the process • Like informational reports, these reports collect and present data, but their emphasis is placed on analyzing, drawing conclusions, and proposing recommendations • The report will therefore likely consist of sets of tables or charts that track the performance measures, supplemented with basic conclusions

  12. Result Reporting: SWHISA Context (2) • Ideally, report formats should be designed to document sources, demonstrate cause and effect, promote comparison, recognize the multivariate nature of problems and indicate alternative explanations • Support the findings with evidence: facts, statistics, expert opinion, survey data, and other proof • Reporting calls for self-criticism and objectivity (avoid exaggerating or manipulating the data to prove a point)

  13. Generating Information • Start by defining the question. Then, rather than diving into the details of data collection, consider how we might communicate the answer to the question and what types of analysis we will need to perform • This helps us define our data needs and clarifies which characteristics of the data are most important • With this understanding as a foundation, we can deal more coherently with the where, who, how, and what-else issues of data collection

  14. Generating Information (diagram) • Formulate precisely the question we are trying to answer • Collect the data and facts relating to that question • Analyze the data to determine the factual answer to the question • Present the data in a way that clearly communicates the answer to the question • (Diagram labels: Information Needs, Questions, Data, Analysis, Communication.) Information generation begins and ends with questions.

  15. Data Analysis • Before drawing conclusions, verify the data collection process (a few example checks are sketched below) • Do the data collected still appear to answer the questions originally asked? • Is there any evidence of bias in the collection process? • Is the number of observations collected the number specified? If not, why? • Do you have enough data to draw meaningful conclusions?
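A minimal sketch of such verification checks follows; the record field name and the minimum-sample threshold are assumptions made for the example, not SWHISA requirements.

```python
# Illustrative pre-analysis checks mirroring the questions above.
# Field names and thresholds are assumptions, not project standards.

def verify_collection(records, specified_n, minimum_n=30):
    """Return warnings to resolve before drawing any conclusions."""
    warnings = []
    if len(records) != specified_n:
        warnings.append(f"Collected {len(records)} observations but "
                        f"{specified_n} were specified - explain why.")
    if len(records) < minimum_n:
        warnings.append("Too few observations to draw meaningful conclusions.")
    missing = [r for r in records if r.get("farmers_participating") is None]
    if missing:
        warnings.append(f"{len(missing)} records have missing values - "
                        "check for bias in the collection process.")
    return warnings


sample = [{"farmers_participating": 12}, {"farmers_participating": None}]
for warning in verify_collection(sample, specified_n=6):
    print(warning)
```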

  16. Data Analysis (2) • Group the data in a form that makes it easier to draw conclusions • Grouping or summarizing may take several forms: tabulation, graphs, or statistical comparisons (a minimal tabulation is sketched below)
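For instance, a simple tabulation might group raw participation records by watershed; the sketch below uses invented records and field names purely for illustration.

```python
from collections import defaultdict

# A minimal tabulation: grouping raw records by watershed so that totals
# are easier to compare. Records and field names are invented examples.

records = [
    {"watershed": "A", "farmers_participating": 120},
    {"watershed": "A", "farmers_participating": 95},
    {"watershed": "B", "farmers_participating": 60},
    {"watershed": "C", "farmers_participating": 140},
]

totals = defaultdict(int)
for record in records:
    totals[record["watershed"]] += record["farmers_participating"]

for watershed, total in sorted(totals.items()):
    print(f"Watershed {watershed}: {total} farmers participating")
```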

  17. Example: 1500 Watershed management strengthened • Indicator i) Existence of participatory watershed management plans in target watersheds • RESULT: All six watershed management plans are complete • ANALYSIS: 100% of the target reached. Participatory watershed management plans are in place in all six target watersheds, and the process for plan development was fully participatory • It is important to note that SWHISA's contribution towards the development of these plans was one of accompaniment with the respective woreda experts. These experts took a participatory approach to developing the plans, working alongside the members of the Community Watershed Management Committees.

  18. Example: 1500 Watershed management strengthened • Indicator ii) # of farmers adopting biological and physical conservation measures • Table 2: Farmers who participated in soil and water conservation measures (physical and biological) at different watersheds

  19. Example: 1500 Watershed management strengthened • ANALYSIS: The number of farmers involved in construction on a voluntary basis reached 66% in the five woredas (except Wurba) • Awareness-raising programs, provided through experience sharing and training to the beneficiaries of the watershed development, are expected to increase the number of farmers involved in the construction • Recommendations for improvement: the woreda offices should give attention to these model watersheds to sustain the development activities. The major challenge in watershed development is ensuring the community's sense of ownership of the development endeavors; building the capacity of the community is critical to empowering it and developing that sense of ownership.

  20. Example: 1500 Watershed management strengthened • Presence of a watershed development plan (completed for all watersheds) • Capacity considerations: • Villagers play the major role in diagnosing their situation, planning development activities and implementing their action plans (skills and knowledge acquired) • They create new, or strengthen existing, village organizations to take charge of mobilizing the labor force and internal and external resources, ensuring transparency in the use of these resources, overseeing implementation of the action plan, monitoring progress, keeping records and evaluating outcomes (improved sense of ownership) • Learning to change reality by using existing resources in a more efficient way can contribute to breaking the vicious cycle of dependence; activities will be more sustainable if people first learn how to better manage the limited resources they have

  21. Example: 1500 Watershed management strengthened • Assessment of intermediate results should include both qualitative indicators (strengthening of village organizations, improvement of service delivery in villages) and quantitative indicators (hectares planted, kilometers of PSWC structures maintained, number of people who contributed labor) • Provide comparative information: impact on intermediate indicators such as the strengthening of village organizations and the degree to which village action plans are implemented • BSCM success stories in connection with village organization and new norms emerging during watershed development planning (if any)

  22. Example: 1500 Watershed management strengthened • Multivariate effect: compare the situation in villages that have been through the exercise with those that have not • This involves collecting baseline data from villages that are planning to implement the WMP but have not yet done so, and from villages that are not planning to adopt the approach for the next few years • Explain other factors critical to performance: experience-sharing tours (for both the farming community and experts) and training on various aspects of watershed management

  23. Example 2: 4500 ARARI's human resource practices, skills, tools and feedback mechanisms improved • Indicator: # of ARARI personnel trained in irrigation research functions • Table 2. Number of researchers trained (2006-2010)

  24. Example 2: 4500 ARARI's human resource practices, skills, tools and feedback mechanisms improved • ANALYSIS: 50% of the training target was achieved. The target of 20 participants per training was reached or exceeded in each of the trainings in 2006 and 2009; the trainings in 2007, 2008 and 2010 fell short by 8 and 3 (16-18 participants), respectively • The shortfall was due to the specific nature of the trainings in these years and the corresponding number of irrigation researchers to whom these topics were relevant
