Presentation Transcript

  1. A Typical Model Audit/Review Approach. Grenville Croll, EuSpRIG - European Spreadsheet Risks Interest Group, UK. October 2002

  2. Model Audit/Review Process (flowchart): Familiarisation → Low Level Review → High Level Review → Preparation of queries list, issued to client → Queries responded to by client → Model requires alteration? If yes: model amended by client, comparisons run between model versions; if no: Sensitivities → Report issued to client

  3. SES-Risk
  • Software for Estimating Spreadsheet Risk
  • Identifies used cells, data cells and logic cells
  • Calculates formula length
  • Calculates the ratio of original to repeated cells
  • Determines the number of cell precedents
  • Uses clustering to infer spatial layout
  • Neural network engine
  • Outputs:
    - Risk score (0.0 - 1.0)
    - Cell error rate
    - Estimated review time
    - Probability of bottom-line error
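
SES-Risk itself is proprietary and its neural-network scoring is not public, so the sketch below is only an illustration of how the simpler input features named on this slide (used/data/logic cell counts, formula length, original-to-repeated ratio) might be extracted. It assumes the openpyxl Python library, which post-dates this 2002 deck, and assumes the model stays within modest sheet bounds.

```python
# Illustrative only: SES-Risk is proprietary; this extracts the simpler
# slide-listed features, not the neural-network risk score.
from openpyxl import load_workbook
from openpyxl.formula.translate import Translator

def spreadsheet_features(path):
    wb = load_workbook(path, data_only=False)   # keep formula text
    used = data_cells = logic_cells = 0
    lengths = []
    shapes = set()   # formulas normalised so copies collapse together
    for ws in wb.worksheets:
        for row in ws.iter_rows():
            for cell in row:
                if cell.value is None:
                    continue
                used += 1
                if cell.data_type == "f" and isinstance(cell.value, str):
                    logic_cells += 1
                    lengths.append(len(cell.value))
                    # Translate every formula to one remote anchor cell,
                    # down and to the right so relative references cannot
                    # go negative (assumes the model stays left of column
                    # ZZ and above row 10000): copies become identical.
                    shapes.add(Translator(cell.value, origin=cell.coordinate)
                               .translate_formula("ZZ10000"))
                else:
                    data_cells += 1
    return {
        "used_cells": used,
        "data_cells": data_cells,
        "logic_cells": logic_cells,
        "mean_formula_length": sum(lengths) / len(lengths) if lengths else 0.0,
        "original_to_repeated": len(shapes) / logic_cells if logic_cells else 0.0,
    }
```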

  4. Model Review framework: 1 Model Maps, 2 Code Review, 3 Range Names, 4 Queries, 5 High Level Review, 6 Documentation, 7 Comparisons, 8 Sensitivities, 9 Final Review, 10 Report

  5. Step 1: Model Maps

  6. Model Review framework (framework diagram repeated; see slide 4)

  7. 1.1 Model Maps
  • Why do we use maps?
    - To ensure each original formula is checked
    - To help document the review
    - So that the review can be divided up between different people
  • How we run maps
    - Use commercially available software
    - Maps all sheets in the workbook
  • Map key
    - L = Label
    - N = Number
    - F = Original formula
    - ^, <, + = copied from above, left, above & left
    - M = Long complex formula
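
A minimal sketch of the mapping idea, assuming openpyxl (the deck itself used commercial mapping software). A formula counts as a copy of its neighbour if translating the neighbour's formula into the current cell reproduces it exactly; the "+" (above & left) case is omitted for brevity, and the long-formula threshold is an arbitrary assumption, not from the deck.

```python
# Prints one map symbol per cell, following the slide's key:
# L label, N number, F original formula, ^ / < copied from above / left,
# M long complex formula, space for an empty cell.
from openpyxl import load_workbook
from openpyxl.formula.translate import Translator

LONG_FORMULA = 80   # assumed threshold for "M"; the deck gives none

def cell_symbol(ws, cell):
    v = cell.value
    if v is None:
        return " "
    if cell.data_type != "f" or not isinstance(v, str):
        return "N" if cell.data_type == "n" else "L"
    above = ws.cell(row=cell.row - 1, column=cell.column) if cell.row > 1 else None
    left = ws.cell(row=cell.row, column=cell.column - 1) if cell.column > 1 else None
    for nbr, sym in ((above, "^"), (left, "<")):
        if nbr is not None and nbr.data_type == "f" and isinstance(nbr.value, str):
            # A copy: translating the neighbour's formula into this
            # cell reproduces this cell's formula exactly.
            translated = Translator(
                nbr.value, origin=nbr.coordinate
            ).translate_formula(cell.coordinate)
            if translated == v:
                return sym
    return "M" if len(v) > LONG_FORMULA else "F"

def print_map(ws):
    for row in ws.iter_rows():
        print("".join(cell_symbol(ws, c) for c in row))

wb = load_workbook("model.xlsx", data_only=False)
for ws in wb.worksheets:
    print(f"--- {ws.title} ---")
    print_map(ws)
```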

  8. 1.2 Spreadsheet Map

  9. 1.3 Model Maps - checking formulae
  • What we look for when checking formulae:
    - All references to other cells or ranges are correct
    - All arguments of functions are correct (e.g. IF, INDEX, MATCH, OFFSET, VLOOKUP, HLOOKUP, SUMIF)
    - ROUND functions give appropriate accuracy
    - Review #REF! and other error values to establish the cause
    - Check any inconsistent formulae identified from the maps
    - Embedded constants (an automatable check is sketched below)
    - Reperform array & data table calculations
    - Consistent units, e.g. currency, weights, measures, dimensions
    - The calculation produces something sensible
    - The result has the correct sign (i.e. + or -, CR or DR)
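
Two of these checks lend themselves to simple automation: spotting embedded constants and surfacing cached error values. The sketch below, again assuming openpyxl, is a rough heuristic rather than the auditors' actual tooling; the regex will also flag legitimate numeric arguments such as the 2 in ROUND(x, 2), so its output is a queries candidate list, not a verdict.

```python
# Flags formulas containing embedded numeric constants, and formulas
# whose last cached result was an Excel error value.
import re
from openpyxl import load_workbook

CONST = re.compile(r"(?<![A-Za-z0-9_$.:])\d+(?:\.\d+)?")   # crude heuristic

def review_formulas(path):
    formulas = load_workbook(path, data_only=False)   # formula text
    values = load_workbook(path, data_only=True)      # last cached results
    for ws in formulas.worksheets:
        vs = values[ws.title]
        for row in ws.iter_rows():
            for cell in row:
                if cell.data_type != "f" or not isinstance(cell.value, str):
                    continue
                if CONST.search(cell.value):
                    print(f"{ws.title}!{cell.coordinate}: constant embedded in {cell.value}")
                result = vs[cell.coordinate].value
                if isinstance(result, str) and result.startswith("#"):
                    print(f"{ws.title}!{cell.coordinate}: evaluates to {result}")
```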

  10. 1.4 Model Maps
  • How we check the formulae
    - Tick off each checked original formula (F) on the map
    - Mark queried formulae with a Q
  • Painstaking but essential
  • Independent peer review: an academically verified method of detecting & correcting errors

  11. Step 2: Code Review

  12. Model Review framework (framework diagram repeated; see slide 4)

  13. 2. Code Review
  • Ways we review code (VBA & macros)
    - Print out and look through
    - Step through
    - Run complete procedures
    - Hit buttons
  • Things we look for
    - Does the code do what it is intended to do?
    - Is there anything that would make it break? E.g. a macro that contains absolute cell addresses (see the sketch below)
  • How we document the code review
    - Print out and annotate, along with queries
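
The absolute-address check above is easy to automate once the macros have been exported as text. A hedged sketch: it assumes the VBA modules were exported to .bas files (e.g. from the VBA editor), and the regex is a rough net that will need tuning per engagement.

```python
# Scans exported VBA modules for hard-coded cell addresses, which break
# silently when rows or columns are inserted into the workbook.
import re
from pathlib import Path

ABSOLUTE = re.compile(r'Range\("[^"]*"\)|\[[A-Z]{1,3}\d+\]|\$[A-Z]{1,3}\$?\d+')

def scan_vba(folder):
    for module in Path(folder).glob("*.bas"):
        for lineno, line in enumerate(module.read_text().splitlines(), 1):
            for match in ABSOLUTE.finditer(line):
                print(f"{module.name}:{lineno}: {match.group(0)}")

scan_vba("exported_macros")   # folder name is illustrative
```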

  14. Step 3: Range Names

  15. Model Review framework (framework diagram repeated; see slide 4)

  16. 3. Range Names
  • Why do we need a range name review?
    - Names in formulae
    - Names in macros
    - Redundant names
  • How do we do it?
    - A software tool reports all range names
    - Check that each range is complete and appropriate
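
A minimal version of that range-name report, assuming openpyxl 3.1 or later (where wb.defined_names behaves like a dict; the deck's own tool is not named). It flags names whose definition contains #REF!, a common sign of a broken or redundant name; completeness and appropriateness still need a human eye.

```python
# Lists every defined name with its refers-to string, flagging broken ones.
from openpyxl import load_workbook

def report_names(path):
    wb = load_workbook(path)
    for name, defn in wb.defined_names.items():
        flag = "  <- broken reference" if "#REF!" in defn.value else ""
        print(f"{name}: {defn.value}{flag}")
```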

  17. Step 4: Queries

  18. Model Review framework (framework diagram repeated; see slide 4)

  19. Queries
  • Query document
    - Each query is numbered uniquely
    - Records each query & the client's response
    - A reference resource, useful if there are queries from different reviewers/clients
  • Six different query types: AE, PE, C, N, I, BP
  • Before the query document is sent to the client, it is:
    - Reviewed for duplicates
    - Edited so it is clear and concise

  20. Queries - basic types of query
  • Actual Error (AE): all actual errors, including Excel errors, e.g. #N/A or #DIV/0!
  • Potential Error (PE): if the value of a precedent changed, would the result be wrong?
  • Clarification (C): a request for clarification of a point
  • Numbers (N): constants in formulae, requiring confirmation and/or explanation
  • Information (I): items of interest to the modeller that do not affect output
  • Best Practice (BP): consider time constraints, materiality, etc.
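
The deck doesn't prescribe an implementation of the query document, so the record structure below is purely illustrative; the field names are assumptions, but the six type codes, the unique numbering and the carry-forward of unresolved queries come straight from slides 19-21.

```python
# Hypothetical record structure for the query document.
from dataclasses import dataclass
from enum import Enum

class QueryType(Enum):
    AE = "Actual Error"
    PE = "Potential Error"
    C = "Clarification"
    N = "Numbers"
    I = "Information"
    BP = "Best Practice"

@dataclass
class Query:
    number: int            # unique across the whole review
    qtype: QueryType
    cell: str              # e.g. "Model!C42"
    text: str
    response: str = ""
    resolved: bool = False  # unresolved queries carry forward to the next list

queries = [Query(1, QueryType.N, "Assumptions!B7",
                 "Constant 1.175 embedded in VAT formula - please confirm")]
```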

  21. Responses
  • All responses reviewed and documented
  • Check that changes have been made in the new version
  • Mark each query as resolved or unresolved
  • Carry unresolved queries forward to the next query list
  • Check that no queries have been missed

  22. Step 5: High Level Review

  23. Model Review framework (framework diagram repeated; see slide 4)

  24. High Level Review
  • Why do we do it?
    - To check big-picture items that may be missed by the low level review
    - Consistency with documentation
    - Accounting issues
  • What does it include?
    - Circular references
    - Interest calculations
    - Tax calculations
    - Balance Sheet / P&L / Cashflow integrity
    - Payment of dividends
  • What can it include?
    - Input on the commercial nature of the model

  25. High Level Review
  • Some high level checks:
    - The Balance Sheet balances
    - Retained earnings flow from P&L to Balance Sheet
    - Capex causes cash outflow
    - Fixed assets don't depreciate below zero
    - An increase in credit sales impacts trade debtors
    - Is the effective tax rate reasonable?
    - Is interest calculated on the prior period's ending balance?
    - Do revenues & costs reflect production?
    - Is debt amortised correctly?
    - Are foreign currency losses handled appropriately?
    - Are the financial statements in the correct format?
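
Some of these checks can be expressed as assertions over the model's cached values. The sketch below (openpyxl again) covers just the first and fourth items; the sheet name and cell locations are hypothetical, since every model lays its statements out differently, so a reviewer would adapt the references per engagement.

```python
# Two high-level sanity checks over cached (last-calculated) values.
from openpyxl import load_workbook

def high_level_checks(path):
    wb = load_workbook(path, data_only=True)   # cached results only
    bs = wb["Balance Sheet"]                   # hypothetical layout
    for col in range(2, 12):                   # one column per period
        assets = bs.cell(row=10, column=col).value or 0
        liabs_equity = bs.cell(row=25, column=col).value or 0
        assert abs(assets - liabs_equity) < 0.01, f"BS out of balance, col {col}"
        nbv = bs.cell(row=8, column=col).value or 0
        assert nbv >= 0, f"Fixed assets depreciated below zero, col {col}"
```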

  26. Step 6: Documentation

  27. Model Review framework (framework diagram repeated; see slide 4)

  28. Project Documentation
  • What do we do?
    - Check the project documentation against the model
    - Review term sheets, agreements, contracts and model specifications
    - Review the work of associated industry experts
    - Interface with other external teams
    - Add items to the query list as they are identified
  • Our documentation
    - Mark up maps, as for the formulae checks
    - Note the data source from which each number has been obtained
    - For each source, note where in the model the data is located
  • Changes in project documentation cause re-review

  29. Step 7: Comparisons

  30. Model Review framework (framework diagram repeated; see slide 4)

  31. Step 8: Sensitivities

  32. Model Review framework (framework diagram repeated; see slide 4)

  33. Sensitivities
  • A model review cannot deal with every possible input: there are billions of data permutations & combinations
  • Comfort is required on certain sensitivities, all compared against the Base Case:
    - Delayed start
    - High/low revenue
    - High/low cost of finance
    - High/low capex & opex
    - Otherwise as specified by the client
  • Each sensitivity is methodically reviewed & documented
  • Changes in the base case model cause a re-run of the sensitivities (see the sketch below)
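
Re-running sensitivities needs live recalculation, which openpyxl alone cannot do. A hedged sketch of automating the base-case comparison, assuming the xlwings library driving a local Excel installation; the scenario values and the input/output cell addresses are hypothetical stand-ins for whatever the client specifies.

```python
# Drives Excel through xlwings: set inputs, recalculate, read an output.
import xlwings as xw

SCENARIOS = {
    "base case":    {"Inputs!B4": 100.0},
    "low revenue":  {"Inputs!B4": 80.0},
    "high revenue": {"Inputs!B4": 120.0},
}

app = xw.App(visible=False)
try:
    wb = app.books.open("model.xlsx")
    for name, overrides in SCENARIOS.items():
        for addr, value in overrides.items():
            sheet, cell = addr.split("!")
            wb.sheets[sheet].range(cell).value = value
        app.calculate()                           # force a full recalc
        npv = wb.sheets["Outputs"].range("C10").value
        print(f"{name}: NPV = {npv}")
finally:
    app.quit()
```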

  34. Step 9: Final Review

  35. Model Review framework (framework diagram repeated; see slide 4)

  36. Final Review
  • Clear any outstanding queries
  • Final high level sense check
  • Final documentation & sensitivity check
  • Client made clearly aware of any outstanding queries
    - Resolved if possible
    - Resolutions documented
    - Unresolved queries appear in the final report

  37. Step 10: Report

  38. Model Review framework (framework diagram repeated; see slide 4)

  39. Report
  • Issued by standard letter
  • Lists the exact versions of models & files: each model is identified by filename, date, time & size in bytes
  • Reports the performance of the agreed-upon procedures
  • Reports any unresolved queries
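
The file identification is trivial to compute; a small sketch using only the Python standard library. The content hash is a modern addition the 2002 deck does not mention, included here as an extra assumption to make versions unambiguous.

```python
# Produces the report letter's file line: name, modification date/time,
# size in bytes, plus an (assumed, not in the deck) content hash.
import datetime
import hashlib
import os

def fingerprint(path):
    st = os.stat(path)
    ts = datetime.datetime.fromtimestamp(st.st_mtime)
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()[:12]
    return (f"{os.path.basename(path)}  {ts:%Y-%m-%d %H:%M}  "
            f"{st.st_size} bytes  sha256:{digest}")

print(fingerprint("model.xlsx"))
```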

  40. Financial Modelling by Grenville Croll
  • Model Review
  • Model Build: financial, marketing, PFI, production
  • Training: Building Better Business Spreadsheets
  • Technologies: MS Excel, Monte Carlo (Crystal Ball / @Risk)

  41. Grenville Croll - contact details www.grenvillecroll.com