CSI PROFESSIONALS BRIEFING 2014 - PowerPoint PPT Presentation
CSI PROFESSIONALS BRIEFING 2014. Reana Rossouw, Next Generation Consultants. Session Two: Impact Investment Index.

I am the owner, or an agent authorized to act on behalf of the owner, of the copyrighted work described.
Download Presentation


An Image/Link below is provided (as is) to download presentation

Download Policy: Content on the Website is provided to you AS IS for your information and personal use and may not be sold / licensed / shared on other websites without getting consent from its author.While downloading, if for some reason you are not able to download a presentation, the publisher may have deleted the file from their server.

- - - - - - - - - - - - - - - - - - - - - - - - - - E N D - - - - - - - - - - - - - - - - - - - - - - - - - -
Presentation Transcript
CSI Professionals Briefing 2014



Next Generation Consultants

Our Context
  • Started in 2009 – global benchmarking – 40 models currently being applied internationally
  • Conclusion:
    • Africa needs its own solution – applicable to both the funding and development sectors
    • Practitioners need capacity, support and knowledge
    • Practitioners don’t need complex methodologies or approaches that require specialist skills, licensed software, specific hardware or expensive solutions
    • The industry needs a transparent, comparable and flexible solution that contextualises and takes into consideration the complexities, relationships and fundamental development principles of our specific development context
Our Objective
  • The purpose of the Impact Investment Index is to create a shared performance measurement system to be utilised by all organisations in the social investment and community development sector
  • The current system lacks coordination, leading to added expense, limited learning, and an inadequate ability to assess shared value and collective impact
Why the pressure to measure?
  • The debate on impact and return on investment is playing out in three arenas:
    • In private foundations and corporate CSI divisions
      • Aiming to be more strategic about their philanthropy, grant making and social/community investments
    • In nonprofit organisations in response to pressures from corporates, foundations and government
      • To be more accountable for the resources received and program outcomes expected
    • Among international development organisations such as bilateral government agencies and intermediary organisations
      • Seeking to improve development effectiveness and lessen dependency on grant/development aid
Why now?
  • Funders are increasingly asking for demonstrable results – to understand the difference they make, directly and indirectly
  • This trend is accelerating, as the development and in particular the funding sector is increasingly looking to pay by results – to learn from what they, and those they fund, do
  • It is not just donors that care about impact. In an age of competition, transparency and recessionary economies, there is growing competition for resources
Three primary applications of Impact Assessments
  • Prospective
    • Looking forward to determine whether or not the projected costs and benefits indicate a favorable investment
  • Ongoing
    • Testing assumptions and projections along the way in order to aid course correction
  • Retrospective
    • Looking back to determine whether or not it was a favorable investment given the costs incurred, and therefore to inform future decisions
Variety of purposes – our school of thought
  • One cannot and should not use impact data to make funding allocation decisions across program areas
      • You can compare programs within a sector such as health, but you cannot compare health vs. arts vs. education vs. sport
  • One can and should use impact data to make funding decisions within program areas
      • It is not about building a unifying measurement across domains, but about building a conceptual framework for understanding the biggest impact per Rand value unit. So it is not about comparing health to education to sport or the arts, but about determining which program yields the highest return for the most effective use of resources
The conundrum
  • Funders and non-profits often use the words “evaluation” and “impact” loosely, stretching these terms to include any type of report on the use of funds or the results they achieve
  • Practitioners should distinguish between measuring performance (monitoring inputs, activities and outputs), measuring outcomes (near-term results), evaluating impact (long-term changes that can be attributed to the investor’s activities), and assessing return on investment (benefits accrued to the investor as a result of funds and other resources invested)
What is impact?
  • “Measure of the tangible and intangible effects (consequences) of one thing's or entity's action or influence upon another. An impact evaluation/assessment assesses changes in the well-being of individuals, households, communities or firms that can be attributed to a particular project, program or policy. The central impact evaluation question is what would have happened to those receiving the intervention if they had not in fact received the program.”
  • For us this means: the broad or longer-term effects of a project or organisation’s work (also referred to as the difference it makes). This can include effects on people who are direct users/beneficiaries of a project or organisation’s work, effects on those who are not direct users/indirect beneficiaries, or effects on a wider field/aspect such as government policy, processes, systems, infrastructure or support systems
What did we want to achieve?
  • To provide evidence
  • To demonstrate performance
  • To prove accountability
  • To show program/ investment effectiveness
  • To demonstrate value
  • To empower and capacitate communities and funders
  • Ultimately - to alleviate, eliminate and eradicate poverty
What we have achieved
  • Assessed R1 billion worth of investment, considering all types of input resources – from money and hours to products and services, infrastructure, skills and volunteerism
  • Across both socio-economic and enterprise development sectors
  • Including 15 focus areas
  • 400 programmes – from flagship projects to donations, once-off to 3-5 year programs, sponsorships and cause-related marketing, social and business enterprises
  • Across 15 dimensions of impact and 25 dimensions of return
  • Developed a library of more than 500 indicators
But more than that
  • Impact across the value chain:
    • Outcomes of partnerships, relationships and applied resources (to be more sustainable and effective)
    • Outcomes of the initiative (policy change, organisations working more effectively together, reducing/alleviating/eradicating poverty)
    • Outcomes at the community level (are we moving the needle against the strategic objectives/mandates indicators?)
    • Returns for the donor/funder (are we getting value, recognition, licence to operate, etc.)
Guiding Principles of our work
  • Impact means impact
    • The goal is to understand what changes, as a result of funders’ investments, for communities as beneficiaries and recipients of interventions
  • The impact is shared
    • The goal is to understand who is impacted along the impact value chain – including funders, intermediaries and beneficiaries
  • Impact includes and involves all stakeholders
    • Analysis must be comprehensive. Instead of cherry-picking something that’s working and leaving out what is not, the analysis should include all aspects of impact and everyone impacted
  • Results must be transparent
    • Companies should report to their investors, and investors should aggregate and report results. What is left out should be stated, as should assumptions and sources. It should be possible for a third party to replicate the analysis from its documentation and get the same result.
  • Context matters
    • It is harder to create a stable job in a rural area than in a city. The qualitative and quantitative context should be provided to inform the impact as well as an understanding of how much of the problem may exist or remain.
Challenges remain
  • There are two main challenges to measuring impact
    • One: how to do it well
    • Two: how to do it cheaply
  • Getting both to happen can be done in either of two ways:
    • Force it, by making it a requirement
    • Or attract it, by creating an environment in which the parties whose efforts are necessary to make it happen want it
The breakthrough – a Comparative Performance System

The Outcome: The Impact Investment Index

Dimensions of Impact – FOR WHOM?
  • How do we calculate?
  • We count each and every stakeholder group
  • We count each and every impact
  • We distinguish between community and business impact
  • Get to a figure: X:Y
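The counting steps above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only – the record structure, field names and sample data are assumptions for the sketch, not part of the Impact Investment Index itself:

```python
# Hypothetical sketch of the counting approach described on this slide:
# tally every impact per stakeholder group, distinguish community impacts
# from business impacts, and express the result as a ratio X:Y.
# The dictionaries below are illustrative sample data, not Index records.
from collections import Counter

impacts = [
    {"stakeholder": "beneficiaries", "type": "community"},
    {"stakeholder": "beneficiaries", "type": "community"},
    {"stakeholder": "intermediaries", "type": "community"},
    {"stakeholder": "funder", "type": "business"},
]

# Count each and every stakeholder group (and each and every impact)
by_group = Counter(i["stakeholder"] for i in impacts)

# Distinguish between community and business impact
community = sum(1 for i in impacts if i["type"] == "community")
business = sum(1 for i in impacts if i["type"] == "business")

# Get to a figure: X:Y (community impacts : business impacts)
ratio = f"{community}:{business}"
print(by_group)  # impacts counted per stakeholder group
print(ratio)     # "3:1" for this sample data
```

The ratio here is purely a relative count; the actual Index weighs dimensions of impact and return as described on the following slides.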
Comparing and Benchmarking
  • Across Industry
  • Across Focus Areas
  • Across Perceptions
  • Including local and global benchmarks
    • Vision, Mission, Strategy and Policy, guidelines, frameworks
    • Strategic intent, objectives, mandate
    • Focus Areas
    • Grant making criteria
    • Beneficiaries
    • Intermediaries
    • Operational alignment and integration
    • Governance and compliance
    • Reporting structure
    • Budget and resources
    • Monitoring and evaluation
    • Employee Volunteerism
    • Marketing, communication and awareness
    • Sponsorship, donations and cause related marketing
    • Partnerships
    • Recognition, Awards, perceptions
    • Future focus, GAP and SWOT analysis
Some examples of indicators

Community Impact

  • Increased quality of life – 45 dimensions
  • Increased social cohesion
  • Increased access to services, infrastructure, support, income, jobs – 45 dimensions
  • Skills development – 30 dimensions
  • Infrastructure – transport, hospitals, schools, electricity, water, housing – 60 dimensions
  • Education – infrastructure, in general and specific: early childhood, primary, secondary, tertiary
  • Economic, environmental and social dimensions
  • Health, education, housing, safety & security, enterprise development, sport, agriculture, social justice, social participation, technology
  • Impact on individuals, communities, government, companies, funding partners
  • Quantitative and qualitative
  • Volunteerism – dimensions

Business Return

  • Mitigate climate change risk
  • Reduce environmental accidents and remediation
  • Reduce water use and management; waste management and effluents
  • Reduce and assist with energy management
  • GHG emissions and air pollution
  • Reduce biodiversity impacts
  • Increase communications and engagement
  • Increase customer satisfaction
  • Increase access to tenders
  • Increased access to new markets
  • Increased reputation
  • Increased sales
  • Increased staff motivation
  • Improved BEE scorecard
  • Strengthened value chain
  • Maintained/obtained social licence to operate
  • Increased product innovation and services development

What our clients say - investors
  • Improved financial analysis and reporting, as well as management capacity and operational efficiency
  • Improved operational efficiency and information management processes and systems
  • Improved board governance and oversight, understanding, knowledge and capacity
  • Improved M&E; reporting; communication; stakeholder relationships and partnerships
  • Improved operational processes from application requirements; due diligence processes; reporting requirements; responsiveness; knowledge and expertise of staff
  • Improved access to further investment
What our clients say – intermediaries and beneficiaries
  • Intermediaries
    • We feel comfortable with the transparency of the process
    • The process has added value to our own work – especially M&E and reporting practices
    • The processes have increased our effectiveness and own performance; increased our learning and knowledge; built internal capacity; and increased our credibility
    • We believe we were assured independently by someone who can verify our claims – it validated our own beliefs
  • Beneficiaries
    • We had an opportunity to talk without being judged – we could be honest
    • We learnt to document our own work and the contribution we made
    • We feel we are being trusted, being heard and someone asks our opinion
    • We had an opportunity to share and learn
What we have learnt (1)
  • One size of evaluation does not fit all
    • Funders should take extra time in their planning to learn which evaluation techniques will work with indigenous populations and specific communities.
  • Trying to define and measure empirical changes is difficult and “a slippery process”
    • Understanding, defining, qualifying and quantifying long-term social change is an incremental effort and on-going process.
  • Although evaluations are significant for their organisational development, requiring intermediaries to perform them is a challenge.
    • A suggested solution is to consider evaluation results as one – but not the only – source of information and to couple it with knowledge, experience, strategy, and context to obtain a fuller picture.
What we have learnt (2)
  • Impact assessments describe three relationships:
    • between evaluation and the funder’s approach;
    • between evaluation and the funder’s strategy;
    • and between the intermediary evaluation and the beneficiary evaluation.
  • Grantmakers may have to change their funding approach to better accommodate intermediaries’ capacity to conduct evaluations, or provide funding to help them develop it
  • Funders need to be clear about their overall goals and how individual investments fit within that model.
  • Intermediary and funder evaluations should be linked:
    • Both partners face the same issues of inadequate resources, coordination and expertise for conducting evaluations.
What we have learnt (3)
  • ANY resources CAN be measured – books, wheelchairs, buildings, time – cash and non-cash
  • The same project can deliver varied results for different funders
  • How and on what you spend the money (inside the program) has a direct influence on the impact and return
  • The strategy and focus areas have to clearly define the return and impact required – upfront
  • Sustainability has to be clearly defined for exit and completion
  • Indicators have to be developed, agreed, and documented as part of the contractual phase
  • Internal monitoring and external evaluation processes have to be established and adhered to – Impact Assessment does not replace evaluation and monitoring
What we have learnt (4)
  • You have to consider the impact of the impact assessment on so many levels
    • As a result of our work we can now categorically state that most programs:
      • Have only short-term impact
      • Those with medium-term impact are not necessarily sustainable
      • Long-term impact is mostly social rather than economic – and it is economic impact that really contributes to poverty alleviation and eradication!
  • It is possible to determine impact and return
    • The real value lies in independent, verifiable, assurance of social investment expenditure, program results, outcomes and impact
What I have learnt (5)
  • There is so much return for business – we just need to prove it
What I now know…
  • We all have impact – but it is not necessarily measurable and sustainable impact
    • Do we want economic or social impact?
    • By implication, social impact helps people right now – but may not help them in the future, which renders the project/our intervention UNSUSTAINABLE
    • Sometimes it is our own (CSI practitioners’) fault that we don’t have higher impact, as we decide what, who and how to fund/not to fund
    • The most sustainable projects/programs with the highest impact have social, socio-economic and ECONOMIC impacts, e.g. the number of jobs created
    • Sometimes there is negative impact, e.g. dependencies are created
    • Mostly there is only short-term impact – which makes our interventions UNSUSTAINABLE
Going Forward and Doing it better
  • Impact assessment can help funders, intermediaries and the beneficiaries they support to:
    • Plan how their work will make a difference, and determine how much of a difference they are making
    • Understand what does or does not work and why and detect unintended consequences
    • Build a (scientific) evidence base to share with others, thus influencing and informing debate and increasing the sector’s body of knowledge
    • Challenge yourself and others by looking critically at your/their work in order to improve, to replicate good work, or to innovate and develop new processes, products and services
    • Inspire and motivate staff, trustees and other stakeholders – including volunteers, beneficiaries, policy makers, other practitioners, funders and investors – to build relationships with others, communicate added value and raise the profile of their work
In closing
  • Involve stakeholders
    • Establish the scope and identify the key stakeholders impacted
  • Understand what changes
    • Map the outcomes – identify the indicators to measure impact
  • Value things that matter
    • Look for evidence, don’t forget baseline and impact studies
  • Only include what is material
    • Focus on agreed outcomes first then incidental impact
  • Do not over claim
    • Calculate the impact based on evidence
  • Be transparent
    • Report the outcome – the good and BAD news
  • Verify the result
    • Share the learning
Questions and Discussion
  • How to start
  • Where to start
  • How to do it
  • When to do it
  • Just do it
  • Reana Rossouw
  • Next Generation Consultants - Specialists in Development
  • E-mail: rrossouw@nextgeneration.co.za
  • Web: www.nextgeneration.co.za