
6th Annual Research Report Launch Event Johannesburg Stock Exchange 25 September 2014





Presentation Transcript


  1. 6th Annual Research Report Launch Event Johannesburg Stock Exchange 25 September 2014 Michael H. Rea Integrated Reporting & Assurance Services michael@iras.co.za / 082 788 3966

  2. Who is Michael H Rea? • Integrated Reporting & Assurance Services…or ‘IRAS’ • Team of 3 full-timers, interns and a network of ‘associated practitioners’ • Backed by 15 years’ experience in sustainability reporting and assurance in 17 countries • Providers of integrated report authorship, assurance, training and advisory services • Current roster of clients = 8; past clients = 35, most of whom are multiple-year repeats • South Africa’s leading assurance provider…in terms of ‘the value proposition’

  3. Today’s game plan… • This is to be an ‘informal discussion’ designed to share what IRAS has learned as – • A provider of Independent Third Party Assurance (ITPA) over the Environmental, Social and Governance (ESG) data contained within Integrated Annual Reports (or stand-alone Sustainability Reports) • The only company that has reviewed the sustainability/ESG reporting of every JSE-listed company for the past six years (starting with our 2009 research report) • However, we also expect input from you…representing the following: • JSE-listed companies • Consultancies/’Other Reporting Practitioners’ • Media • Other interested and affected parties • The following slides are designed to be a conversation guide…but questions and/or arguments are encouraged…and can be raised at any time!

  4. Why produce an IAR? • Global stakeholder expectations continue to evolve with respect to corporate disclosures • The near-global collapse of the financial sector in 2008 – on the back of the sub-prime lending scandals of the likes of Fannie Mae and Freddie Mac – has decreased the level of trust in existing governance controls • The Deepwater Horizon oil spill in the Gulf of Mexico in 2010 (an oil rig drilling for BP) • Other global disasters include… • 17 major mine tailings dam collapses in the past 10 years, in Canada, the US, Hungary, Peru, etc. • Cancellation of Barrick Gold’s Pascua-Lama gold mine in Chile…at a cost of $5.4 billion over 10 years due to community protests • Suspension of Newmont Mining’s Conga copper project in Peru due to water contamination in mountain lakes • Growing shareholder activism…seeking increasing transparency on Environmental, Social and Governance (ESG) issues

  5. Why produce an IAR? • The current global participation in SRI accounts for $30 Trillion in assets under professional management…or 20% of the global capital markets, according to the 2012 Global Sustainable Investment Review produced by the Global Sustainable Investment Alliance • Data from the US Sustainable Investment Forum (SIF): • Alternative investment vehicles (private equity, venture capital, hedge funds, property funds) account for $132B identified in 301 investment vehicles • Alternative investment funds investing in Environmental, Social & Governance (ESG) strategies have experienced as much as 250% growth in assets since 2010 • CIIs (community investing institutions) account for $61.4B across 1 043 institutions • Pooled funds for institutional investors and high-net-worth individuals account for $234.2B within 45 products • Institutional investors dominate the sector, with $2.7 Trillion in assets involved in ESG incorporation

  6. Why produce an IAR? • The trend drivers for this growth include: • Client demands and values – whereby people want to know they are investing “responsibly” • Consumer demand and campaigns • Emergence of specialized stock exchanges with requirements for sustainability data disclosure…including the “Sovereign Wealth Funds” • Firms that have not historically identified themselves as SRI are adopting SRI strategies in decision-making; no ‘typical’ type of firm anymore; a paradigm shift • Governance criteria incorporation as a leading ESG issue…including the JSE’s SRI Index, the UN Principles for Responsible Investment (UNPRI), the Code for Responsible Investment in South Africa (CRISA), and Regulation 28 of the Pension Fund Act • Increased investment tied to impact – or ‘mission’ – where investors are tying investment to environmental and/or social challenge reduction • The need for comparable standards of reporting = a fundamental shift in corporate reporting structure, including the SASB (Sustainability Accounting Standards Board) and the shift to ‘integrated’ reporting in South Africa, the US and elsewhere.

  7. Why produce an IAR? • Environmental factors are among the most frequently incorporated criteria among money managers (551 funds with $240B in assets). • Increased prominence of environmental issues (mainly climate change and carbon emissions) is a driver in the 23% increase in institutional asset owners in the U.S. who consider ESG.

  8. OUR RESEARCH

  9. Research Scope & Objectives

  10. Scope, Objectives & Approach • Purpose is to measure the ‘transparency’ of JSE-listed companies…with respect to the ESG (or ‘Sustainability’) disclosure within public documentation, such as: • Integrated Annual Reports • Stand-alone Sustainability Reports • Online supplemental ESG information • Unlike ALL other research in SA, the IRAS research report covers ALL of the JSE-listed companies – excluding those yet to report…those that have de-listed since 01 January…and/or those that have been deemed ‘un-S’African’ (e.g., where the primary domicilium is outside SA and/or where reporting is completed outside SA) • In 2014, the final research population was 311 companies (down from 331 in 2013…but we now exclude NGOs and SMMEs from our population sample).

  11. Scope, Objectives & Approach • Our analysis includes 122 ESG indicators, including: • 7 Standard Disclosures…such as whether or not reports are assured and/or whether additional reports are generated (e.g., CDP submission) • 12 scored Labour indicators • 12 scored Economic indicators • 10 scored Corporate Social Investment (CSI)/Socioeconomic Development (SED) indicators • 10 scored Environmental indicators • 11 scored Health & Safety indicators • 12 scored Governance indicators • 48 non-scored calculated ratios…used to provide truly comparable measures of performance (e.g., Carbon Emissions per Person Hour Worked)

  12. Scope, Objectives & Approach • Scoring is based on a 2-1-0 system…where: • ‘OK’ = 2 points = A response where data was easily located and ‘made sense’ • ‘OI’ = 1 point = ‘Opportunity for Improvement’ = A response where data was found, but was difficult to locate, required some level of interrogation and/or interpretation or significant effort to find, or was obviously incorrect • ‘NC’ = 0 points = ‘Not Covered’ = Where no data for the indicator could be found • ‘SDTI Compliance Score’ = (Sum of all indicator-specific scores) / (74 × 2)
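The 2-1-0 scoring and the compliance formula above can be sketched in code. This is a minimal illustration (Python); the indicator names and ratings are invented, and the assumption that unrated indicators simply score 0 (‘NC’) is ours, not stated in the slides.

```python
# Sketch of the SDTI 2-1-0 scoring described above.
# Indicator names and ratings below are hypothetical examples.
SCORE_MAP = {"OK": 2, "OI": 1, "NC": 0}

def sdti_score(ratings, total_indicators=74):
    """Return the SDTI Compliance Score as a percentage.

    ratings: dict mapping indicator name -> 'OK' | 'OI' | 'NC'.
    Indicators absent from the dict contribute 0 points, matching 'NC'.
    """
    points = sum(SCORE_MAP[r] for r in ratings.values())
    return 100.0 * points / (total_indicators * 2)

ratings = {
    "Number of Employees": "OK",  # easily located, made sense
    "Unionisation": "OI",         # found, but hard to locate
    "Employee Training": "NC",    # not covered
}
print(round(sdti_score(ratings), 2))  # 3 of 148 possible points -> 2.03
```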

  13. Scope, Objectives & Approach • Our base research is conducted by a team of interns…mostly from Canada…with the following Quality Control procedures in place: • As-You-Go Review…where each indicator response is compared by the reviewer against our database of historical data for the company under review, and against sector averages, to identify any possible anomalies that ought to be double-checked • Internal Peer Review…where each SDTI Gap Analysis is reviewed by one of the other researchers to ensure that no obvious errors have occurred (e.g., incomplete indicator responses) • Internal Research Manager Review…where each SDTI Gap Analysis is reviewed by Jordan, to ensure that there are no obvious errors in the data extracted from company reports (e.g., incorrect units of measure) • External Company Review…where each SDTI Gap Analysis is sent to the reporting entity for their own review and/or confirmation that IRAS has not missed data and/or captured incorrect data. • NOTE: Only 50 of the 311 companies under review provided feedback on our analysis. • Some of the best feedback came from Russell & Associates – for the 2nd straight year – on behalf of their clients.

  14. Scope, Objectives & Approach • Once the base research is complete – following the closure of our feedback period – all data is collated into a comprehensive spreadsheet and analysed for trends and anomalies. • All significant anomalies are interrogated internally and/or externally to at least attempt to avoid uncomfortable errors in what IRAS reports in our annual research publication. • The research report – available only in soft copy until the week of the 29th of September – is then compiled to provide a comprehensive review of our findings relative to each of the 311 companies…for each of the 122 SDTI indicators. • In almost all cases – unlike in prior years – data is presented according to the 23 JSE-specific sector designations…so as to avoid anyone continuing to accuse IRAS of comparing banks to mining companies.

  15. Research Scope & Objectives

  16. Let’s begin with an award… • With a score of 82.43%...they are not only the Top Performer Overall…but the ‘Most Improved Overall’…increasing their SDTI Score from 22.03% last year. • When I asked if IRAS had somehow screwed up in our evaluation of their reporting in 2013, their response was… • “No. You were right. What happened was that when we received your report in the mail, we saw how poorly we were ranked (231st out of 331 companies) and we were upset. We then looked at how you scored us, and we realised that most of the data we didn’t include in our previous report was information that we actually had. The problem was that we weren’t disclosing it. So this year we used your assessment as a starting point and made sure that we included all of the data that we previously – and inexplicably – excluded. We also reviewed the rest of the data points and have begun to put the systems and controls in place to be able to report on everything else.”

  17. Let’s begin with an award… • THE WINNER IS…

  19. The GRI’s downward spiral… • The GRI’s own database of GRI-based reports shows that uptake of the Guidelines has plateaued – if not started to drop off – possibly as a result of their demonstrated need to over-complicate what ought to be a simple process…and/or an inability to demonstrate an effective value proposition for GRI-based reporting. • 2 336 reports in 2011…2 584 in 2012…2 581 in 2013 • Inclusion of “GRI-referenced” and “Non-GRI” reports suggests ‘desperation setting in’.

  20. The GRI’s downward spiral… • South Africa has slipped to 5th place in the Top 10 GRI-based reporting countries (for 2013 reports), from 3rd in 2012 (134) and 2011 (128). • IRAS continues to criticise the G4 version of the Guidelines because they will reduce the overall comparability of reports through reporters’ ability to select from the list of “Material Aspects” and the underlying indicators. • We predict a continued slide in GRI uptake.

  21. The GRI’s downward spiral… • Problems with the GRI Guidelines: • While the Standard Disclosures continue to be a useful set of indicators – helping companies ensure that key information about the company is included in reports – 46 Material Aspects and 150 indicators are simply ‘too many’ for most companies to worry about in their reports, and thus most will opt for the ‘Core’ – rather than ‘Comprehensive’ – application level. • The ability to select from the list of Material Aspects (for the Core application level) will effectively eliminate any possibility of comparability between companies…even within sectors – such as Metals & Mining – where reporting has matured with the GRI Guidelines over the past 15 years. • The lack of clear guidance on what ought to be included in a response to an indicator – as well as the limited requirement for quantitative data disclosure – allows companies to produce GRI-based reports that are little more than tick-box success stories predicated on an ability to write hollow assertions that sound good. • This – and the JSE’s failure to assess companies beyond a similar set of qualitative indicators – is one of the primary reasons why IRAS scores companies based on “Data Transparency”!

  22. Our Research Findings • Average sector-specific SDTI scores range from 28.72% to 54.81% • Three highest: 54.81% Banking & Financial Services • 53.19% Energy & Natural Resources • 52.20% Government & Parastatals • Three lowest: 33.54% Financial Services • 31.38% Media & Communications • 28.72% Household & Leisure Goods • * Note the difference between the ‘Banking & Financial Services’ and ‘Financial Services’ sectors…where the customer-centric business appears to be much more au fait with the need for greater ESG transparency.

  23. Distribution of SDTI Scores • Unlike in 2013 – when the highest score was 75.13% – 5 companies scored above 80% this year.

  24. Distribution of SDTI Scores • Economic and Governance indicators score best, while Health & Safety, CSI and Environmental indicators score worst.

  25. Distribution of SDTI Scores • The JSE Top 60 and Top 100 – by market cap – score fairly equally (52.86% and 53.11%, respectively), which is significantly higher than the overall JSE average of 42.0% (up from 33.7% in 2013) • Companies obtaining ITPA score significantly higher (63.1%) than those not seeking assurance (53.6%...the average for those companies scoring above the median of 41.9%)…suggesting that companies that seek assurance tend to pay much more attention to comparable quantitative data within their reports…most possibly because they are ‘mature reporters’ that have moved up to seeking assurance.

  26. Research Scope & Objectives

  27. Interesting Findings – Labour

  28. Interesting Findings – Labour • Very few companies provide Person Hours Worked (PHW) data…demonstrating a lack of understanding of the need to ‘normalise’ data into efficiency measures. • Question: What’s the easiest way to reduce your Total Electricity Consumption? Answer: Shut the lights off and go home! • Question: Assuming you want to stay in business, how should you measure electricity consumption improvement? Answer: Reduce the volume of electricity consumed per PHW! • Question: Why ‘per PHW’? Answer: Because it is highly unlikely that any two companies will produce exactly the same item – unless they’re based in Russian cooperatives – and therefore the only truly effective denominator for comparable efficiency is PHW. Per ‘units of production’ or ‘m2 of space’ is only useful for internal time-series comparability, and therefore should not be used for external reporting. • NOTE: It is ‘reasonably assumed’ that ALL outputs can be traced back to the productive efficiency of a workforce, where PHW is a primary measurement indicator.
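The ‘normalise to PHW’ argument above can be illustrated with a small sketch (Python). All company figures here are invented: the point is only that a company that is larger in absolute terms can still be the more efficient one per Person Hour Worked.

```python
# Hypothetical illustration of normalising consumption data to
# Person Hours Worked (PHW), as argued above. All figures invented.
def per_phw(total, phw):
    """Normalise an absolute total (e.g., kWh of electricity) to a per-PHW rate."""
    return total / phw

# Company A consumes more electricity in absolute terms...
a_kwh, a_phw = 12_000_000, 4_000_000   # 3.0 kWh per PHW
b_kwh, b_phw = 5_000_000, 1_000_000    # 5.0 kWh per PHW

# ...but is the more efficient of the two once normalised to PHW.
print(per_phw(a_kwh, a_phw) < per_phw(b_kwh, b_phw))  # True
```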

  29. Interesting Findings – Labour • 42 companies don’t even provide their ‘Number of Employees’…down from 61 last year. • 10 companies reported ‘Zero Employment’…mostly retail and/or income funds • Only 155 companies provided adequate contractor employment data…up from 115 last year. • 82.3% of all employees are deemed ‘Permanent’ • Only 194 companies (62.4%) provided adequate HDSA employment data…up from 54.1% last year. • Only 196 companies (63.0%) provided adequate Female employment data…up from 53.5% last year. • Only 128 companies (41.2%) provided adequate Unionisation data…up from 32.0% last year • Only 126 companies (40.5%) provided adequate Unionisation data…up massively from only 8.2% last year • Only 124 companies (39.9%) provided Employee Training data…up from only 36.6% last year…demonstrating almost no improvement in training data disclosure

  30. Interesting Findings – Labour • 15 companies reported training more people than they actually employ…suggesting a problem with data comparability and/or the assumption that ‘sense’ is ‘common’ when it comes to reporting. • Spur reported training 7 220 employees…even though they only employ 279 people…which equates to them having trained 2 587.8% of their workforce. • Clearly, they were talking about the number of franchisee employees trained (or franchisee training interventions)…which is very different from ‘Employees Trained’ • One of the most common reporting errors with respect to training is the confusion of “the number of persons trained” with “the number of training interventions”. • If/when an employee is as thick as two planks, their need for repeat training ought not result in the company counting them twice…or as many as 23 times (in one case identified through an assurance engagement).
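The persons-trained vs training-interventions distinction above boils down to counting unique people rather than events. A minimal sketch (Python; the training register below is invented):

```python
# Invented training register: one row per training intervention.
interventions = [
    {"employee_id": 101, "course": "Safety induction"},
    {"employee_id": 101, "course": "First aid"},
    {"employee_id": 102, "course": "Safety induction"},
    {"employee_id": 101, "course": "Safety refresher"},
]

# Counting rows gives training interventions; counting unique
# employee IDs gives persons trained -- the figure reports should use.
training_interventions = len(interventions)
persons_trained = len({row["employee_id"] for row in interventions})

print(training_interventions, persons_trained)  # 4 2
```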

  31. Interesting Findings – Labour • Only 75 companies reported the number of days lost due to strike action…with 47 of those companies reporting ‘Zero Lost Days’. • One of the most common reporting errors with respect to lost days is the confusion of “the number of calendar days lost” with “the person days lost”…the latter being the more effective measure of ‘production loss risk due to labour unrest’. • Where strikes occur, the correct way to report would be to indicate the number of persons affected (i.e., striking workers + employees unable to perform their duties due to strike action) multiplied by the number of calendar days affected. • Of the 701 642 days lost due to strikes, 76.4% of all lost days were within the Metals & Mining sector…which does not include Lonmin and/or Anglo Platinum’s losses in the 5-month platinum strike…as they are still to report on their affected periods. • Question: Why is it that while Anglo Platinum, Lonmin and Implats suffered for 5 months, NO days were lost at Royal Bafokeng Platinum…just down the road from the other three companies’ operations?
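The person-days calculation recommended above (persons affected × calendar days affected) is simple to state in code. The worker counts and strike length below are invented for illustration:

```python
# Person days lost = (striking workers + workers idled by the strike)
# multiplied by calendar days affected, per the approach described above.
# Figures below are invented.
def person_days_lost(striking_workers, idled_workers, calendar_days):
    return (striking_workers + idled_workers) * calendar_days

# e.g., 1 200 strikers plus 300 employees unable to work, for 10 days:
print(person_days_lost(1200, 300, 10))  # 15000
```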

  32. Labour Graphs (an example)

  33. Interesting Findings – Economic

  34. Interesting Findings – Economic • Average Economic Data Transparency Score decreased from 72.7% last year to 68.3% this year…remaining second highest across the six sections…2nd only to Governance (78.3%) • Anglo American plc and Sasol reported the highest Total Revenue – R283 and R181 billion, respectively – but neither was in the Top 20 in terms of Revenue per Employee, suggesting that both suffer from workforce inefficiencies • Setting aside Rand Merchant Bank – due to its nature as a holding company that essentially double-reports the performance of its subsidiary companies – Kumba Iron Ore was the only company within the Top 20 for both Net Profit After Tax (NPAT) and NPAT per Employee…suggesting the presence of an efficient workforce and/or higher levels of cost-effective mechanisation in an industry plagued by workforce inefficiencies • Our calculation of Income Disparity in 2013 was – by far – the most contentious of all issues raised in our 2013 research report…and, as a result, led to three new disparity ratios: • Income Disparity inclusive of Gains on Shares (i.e., last year’s ratio) • Income Disparity exclusive of Gains on Shares (New!) • Income Disparity inclusive of remuneration paid to Prescribed Officers…including Gains on Shares (New!) • Income Disparity inclusive of remuneration paid to Prescribed Officers…excluding Gains on Shares (New!)
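The disparity ratios listed above differ only in which remuneration lines enter the numerator. The slides do not spell out the formula, so the sketch below (Python) assumes a common form – top-earner remuneration divided by average employee remuneration – and all figures are invented; it shows only how including or excluding Gains on Shares moves the ratio.

```python
# Hypothetical Income Disparity Ratio (IDR) sketch. The exact IRAS
# formula is not given in the slides; top-earner pay vs average
# employee pay is assumed here. All figures are invented.
def income_disparity(exec_pay, gains_on_shares, total_wage_bill,
                     employees, include_gains=True):
    top_earner = exec_pay + (gains_on_shares if include_gains else 0)
    avg_employee_pay = total_wage_bill / employees
    return top_earner / avg_employee_pay

# R20m package plus R10m gains on shares, against a R500m wage bill
# spread over 5 000 employees (average of R100 000 each):
print(income_disparity(20e6, 10e6, 500e6, 5_000, include_gains=True))   # 300.0
print(income_disparity(20e6, 10e6, 500e6, 5_000, include_gains=False))  # 200.0
```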

  35. Income Disparity Ratios

  36. Income Disparity Ratios

  37. Income Disparity Ratios

  38. Income Disparity - Question Which company would you expect to have a higher Income Disparity Ratio… Shoprite or Woolworths? Why?

  39. Income Disparity Ratios

  40. Income Disparity Ratios – Problem • Some of these nine companies have no reason not to report properly, as their IDR was below their sector average. • However, the likes of the following may be a concern, as their 2013 IDRs were above their sector averages (company IDR vs sector average): • Tongaat Hulett: 58.2 vs 53.2 • Ecsponent (JDH): 33.2 vs 14.9 • Netcare: 25.5 vs 23.6

  41. R&D Spend

  42. Employee Wages to Dividends

  43. Economic Graphs (an example)

  44. Interesting Findings – CSI/SED

  45. Interesting Findings – CSI/SED

  46. Interesting Findings – CSI/SED

  47. CSI/SED Spend - Question Why do you believe the Metals & Mining sector reports CSI/SED Spend of more than R3.9 Billion…relative to a total JSE CSI/SED Spend of R8.5 Billion (or roughly 46.4% of all CSI/SED Spend)? NOTE: Average NPAT for the Metals & Mining sector is 14.7%...whereas the average across the JSE is 25.8%...and Metals & Mining is fifth, behind Real Estate (108.1%), Financial Services (38.4%), Banking & Financial Services (28.6%) and Software & Computers (21.7%).

  48. Interesting Findings – CSI/SED

  49. Interesting Findings – CSI/SED • Over 25% of all CSI/SED Spend is not allocated to one or more of the Developmental Priority areas…at least not in reports • Education and Infrastructure Development are gaining the lion’s share of all allocated funding
