
CSI PROFESSIONALS BRIEFING 2014






Presentation Transcript


  1. CSI PROFESSIONALS BRIEFING 2014 Reana Rossouw Next Generation Consultants

  2. Session Two: Impact Investment Index

  3. Our Context • Started in 2009 – global benchmarking – 40 models currently being applied internationally • Conclusion: • Africa needs its own solution – applicable to both the funding and development sectors • Practitioners need capacity, support and knowledge • Practitioners don’t need complex methodologies, approaches requiring specialist skills, licensed software, specific hardware or expensive solutions • The industry needs a transparent, comparable and flexible solution that contextualises and takes into consideration the complexities, relationships and fundamental development principles of our specific development context

  4. Our Objective • The purpose of Impact Investment Index is to create a shared performance measurement system to be utilized by all organisations in the social investment and community development sector • The current system lacks coordination, leading to added expense, limited learning, and inadequate ability to assess shared value and collective impact

  5. Why the pressure to measure? • The debate on impact and return on investment is playing out in three arenas: • In private foundations and corporate CSI divisions • Aiming to be more strategic about their philanthropy, grant making and social/community investments • In nonprofit organisations in response to pressures from corporates, foundations and government • To be more accountable for the resources received and program outcomes expected • Among international development organisations such as bilateral government agencies and intermediary organisations • Seeking to improve development effectiveness and lessen dependency on grant/development aid

  6. Why now? • Funders are increasingly asking for demonstrable results – to understand the difference they make, directly and indirectly • This trend is accelerating, as the development sector, and in particular the funding sector, is increasingly looking to pay by results – to learn from what they, and those they fund, do • It is not just donors that care about impact. In an age of competition, transparency and recessionary economies, there is growing competition for resources

  7. Three primary applications of Impact Assessments • Prospective • Looking forward to determine whether or not the projected costs and benefits indicate a favorable investment • Ongoing • Testing assumptions and projections along the way in order to aid course correction • Retrospective • Looking back to determine whether or not it was a favorable investment given the costs incurred, and therefore to inform future decisions

  8. Variety of purposes – our school of thought • One can and should use impact data to make funding allocation decisions across program areas • You can compare programs within a sector such as health, but you cannot compare health vs. arts vs. education vs. sport • One can and should use impact data to make funding decisions within program areas • It is not about building a unifying measurement across domains, but about building a conceptual framework for understanding the biggest impact per Rand invested. So it is not about comparing health to education to sport or the arts, but about determining which program yields the highest return for the most effective use of resources
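The within-program-area comparison described above can be sketched as a simple ranking by return per Rand. This is an illustrative assumption, not the Index's actual scoring method: the programme names, impact figures and the plain impact-per-Rand ratio below are all hypothetical.

```python
# Hypothetical sketch: ranking programmes within ONE programme area
# (health, in this example) by impact generated per Rand invested.
# All names and figures are invented for illustration.

def return_per_rand(impact_value: float, rand_invested: float) -> float:
    """Impact units generated per Rand of investment."""
    return impact_value / rand_invested

health_programmes = {
    "Clinic upgrade":   {"impact": 1200.0, "invested": 400_000.0},
    "Mobile screening": {"impact": 900.0,  "invested": 150_000.0},
    "Nurse training":   {"impact": 600.0,  "invested": 120_000.0},
}

# Rank programmes in the same focus area, highest return per Rand first.
ranked = sorted(
    health_programmes.items(),
    key=lambda item: return_per_rand(item[1]["impact"], item[1]["invested"]),
    reverse=True,
)

for name, figures in ranked:
    ratio = return_per_rand(figures["impact"], figures["invested"])
    print(f"{name}: {ratio:.4f} impact units per Rand")
```

The point of the sketch is the constraint in the slide: the ranking is only meaningful because all three programmes sit in the same focus area; the same ratio would not support a health-vs-arts comparison.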

  9. The conundrum • Funders and non-profits often use the words “evaluation” and “impact” loosely, stretching these terms to include any type of report on the use of funds or the results they achieve • Practitioners should distinguish between measuring performance (monitoring inputs, activities, and outputs), measuring outcomes (near-term results), and evaluating impact (long-term changes that can be attributed to the investor’s activities), as well as return on investment (benefits accrued to the investor) as a result of funds and other resources invested

  10. What is impact? • “Measure of the tangible and intangible effects (consequences) of one thing's or entity's action or influence upon another. An impact evaluation/assessment assesses changes in the well-being of individuals, households, communities or firms that can be attributed to a particular project, program or policy. The central impact evaluation question is what would have happened to those receiving the intervention if they had not in fact received the program.” • For us this means: The broad or longer-term effects of a project or organisation’s work (also referred to as the difference it makes). This can include effects on people who are direct users/beneficiaries of a project or organisation’s work, effects on those who are not direct users/indirect beneficiaries, or effects on a wider field/aspect such as government policy, processes, systems, infrastructure or support systems

  11. What did we want to achieve? • To provide evidence • To demonstrate performance • To prove accountability • To show program/investment effectiveness • To demonstrate value • To empower and capacitate communities and funders • Ultimately – to alleviate, eliminate and eradicate poverty

  12. What we have achieved • Assessed R1 billion worth of investment, considering all types of input resources – from money to hours, products and services, infrastructure, skills and volunteerism • Across both socio-economic and enterprise development sectors • Including 15 focus areas • 400 programmes – from flagship to donations, once-off to 3-5 year programs, sponsorships and cause-related marketing, social and business enterprises • Across 15 dimensions of impact and 25 dimensions of return • Developed a library of more than 500 indicators

  13. But more than that • Impact across the value chain: • Outcomes of partnerships, relationships and applied resources (to be more sustainable and effective) • Outcomes of the initiative (policy change, organisations working more effectively together, reducing/alleviating/eradicating poverty) • Outcomes at the community level (are we moving the needle against the strategic objectives/mandate indicators?) • Returns for the donor/funder (are we getting value, recognition, a licence to operate, etc.?)

  14. Guiding Principles of our work • Impact means impact • The goal is to understand what changes as a result of the investment from funders in communities as beneficiaries and recipients of interventions • The impact is shared • The goal is to understand who is impacted along the impact value chain – including funders, intermediaries and beneficiaries • Impact includes and involves all stakeholders • Analysis must be comprehensive. Instead of cherry-picking something that’s working and leaving out what is not, the analysis should include all aspects of impact and all those impacted • Results must be transparent • Companies should report to their investors, and investors should aggregate and report results. What is left out should be stated. Assumptions and sources should be stated. It should be possible for a third party to replicate the analysis based on its documentation and get the same result. • Context matters • It is harder to create a stable job in a rural area than in a city. Qualitative and quantitative context should be provided to inform the impact, as well as an understanding of how much of the problem may exist or remain.

  15. Challenges remain • There are two main challenges to measuring impact • One: how to do it well • Two: how to do it cheaply • Getting both to happen can be done in either of two ways: • Force it, by making it a requirement • Or attract it, by creating an environment in which the parties whose efforts are necessary to make it happen want it

  16. The breakthrough – a Comparative Performance System • The Outcome: The Impact Investment Index

  17. Our process for impact assessment

  18. Impact Value Chain - Our Focus

  19. The Impact Investment Index - III

  20. Impact Assessment – HOW?

  21. Impact Assessment – WHAT?

  22. Dimensions of Impact – FOR WHOM? • How do we calculate? • We count each and every stakeholder group • We count each and every impact • We distinguish between community and business impact • Get to a figure: X:Y
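The counting steps above can be sketched as a simple tally: record every impact against its stakeholder group, split the records into community versus business impact, and express the totals as the X:Y figure. The record shape, the sample data and the unweighted count are assumptions for illustration; the deck does not disclose the Index's own weighting.

```python
# Illustrative sketch of the X:Y tally: count every recorded impact,
# distinguishing community impact from business impact.
# The data and the (stakeholder, impact, category) record shape are
# hypothetical.
from collections import Counter

records = [
    ("beneficiaries",  "access to clean water",  "community"),
    ("beneficiaries",  "new jobs created",       "community"),
    ("intermediaries", "improved M&E capacity",  "community"),
    ("funder",         "licence to operate",     "business"),
    ("funder",         "reputation gain",        "business"),
    ("beneficiaries",  "skills development",     "community"),
]

# Count each and every stakeholder group that appears in the records.
by_stakeholder = Counter(group for group, _, _ in records)

# Count each and every impact, split into community vs business.
by_category = Counter(category for _, _, category in records)
community, business = by_category["community"], by_category["business"]

print(f"Stakeholder groups counted: {dict(by_stakeholder)}")
print(f"Impact figure (community : business) = {community}:{business}")
```

On this sample data the figure comes out as 4:2 – four community impacts against two business returns – which is the kind of X:Y ratio the slide refers to.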

  23. Calculating Impact – The Process

  24. Proving Impact and Return THE RESULTS

  25. Some examples of impact (1)

  26. Some examples of impact (2)

  27. Some examples of impact (3)

  28. How much was spent – Where?

  29. Spent per: Year, focus area, region

  30. Testing Perceptions

  31. Testing Needs and Recommendations

  32. Testing Expectations

  33. Comparing and Benchmarking • Across Industry • Across Focus Areas • Across Perceptions • Including local and global benchmarks • Vision, Mission, Strategy and Policy, guidelines, frameworks • Strategic intent, objectives, mandate • Focus Areas • Grant making criteria • Beneficiaries • Intermediaries • Operational alignment and integration • Governance and compliance • Reporting structure • Budget and resources • Monitoring and evaluation • Employee Volunteerism • Marketing, communication and awareness • Sponsorship, donations and cause related marketing • Partnerships • Recognition, Awards, perceptions • Future focus, GAP and SWOT analysis

  34. Strategic and Operational Review – Theory of change to measure impact

  35. Co-define and co-develop indicators to measure impact

  36. Co-Development of impact assessment guidelines

  37. Transparency in assessment

  38. Scorecard per project per focus area per rating

  39. Community Impact

  40. Business Return

  41. Ratings and Rankings

  42. Impact Across Portfolios

  43. Collective impact and return

  44. Impact and return score cards

  45. Ranking and Rating per impact

  46. Some examples of indicators • Community Impact: • Increased quality of life – 45 dimensions • Increased social cohesion • Increased access to services, infrastructure, support, income, jobs – 45 dimensions • Skills development – 30 dimensions • Infrastructure – transport, hospitals, schools, electricity, water, housing – 60 dimensions • Education – infrastructure, in general, specific – early childhood, primary, secondary, tertiary • Economic, environmental, social • Health, education, housing, safety & security, enterprise development, sport, agriculture, social justice, social participation, technology • Impact on individuals, communities, government, companies, funding partners • Quantitative, qualitative • Volunteerism – dimensions • Business Return: • Mitigate climate change risk • Reduce environmental accidents and remediation • Reduce water use and management; waste management and effluents • Reduce and assist with energy management, GHG emissions and air pollution • Reduce biodiversity impacts • Increase communications and engagement • Increase customer satisfaction • Increase access to tenders • Increased access to new markets • Increased reputation • Increased sales • Increased staff motivation • Increased BEE scorecard • Strengthened value chain • Maintained/obtained social licence to operate • Increased product innovation and services development

  47. The impact of the impact assessment

  48. What our clients say – investors • Improved financial analysis and reporting, as well as management capacity and operational efficiency • Improved operational efficiency and information management processes and systems • Improved board governance and oversight, understanding, knowledge and capacity • Improved M&E; reporting; communication; stakeholder relationships and partnerships • Improved operational processes – from application requirements and due diligence processes to reporting requirements, responsiveness, and knowledge and expertise of staff • Improved access to further investment

  49. What our clients say – intermediaries and beneficiaries • Intermediaries • We feel comfortable with the transparency of the process • The process has added value to our own work – especially M&E and reporting practices • The processes have increased our effectiveness and own performance; increased our learning and knowledge; built internal capacity; and increased our credibility • We believe we were independently assured by someone who could verify our claims – it validated our own beliefs • Beneficiaries • We had an opportunity to talk without being judged – we could be honest • We learnt to document our own work and the contribution we made • We feel we are being trusted, being heard, and someone asks our opinion • We had an opportunity to share and learn

  50. What we have learnt (1) • One size of evaluation does not fit all • Funders should take extra time in their planning to learn which evaluation techniques will work with indigenous populations and specific communities • Trying to define and measure empirical changes is difficult and “a slippery process” • Understanding, defining, qualifying and quantifying long-term social change is an incremental effort and ongoing process • Although evaluations are significant for their organisational development, requiring intermediaries to perform them is a challenge • A suggested solution is to consider evaluation results as one – but not the only – source of information, and to couple them with knowledge, experience, strategy and context to obtain a fuller picture
