
  1. PATHFINDER Mission: Maximise consequential outcomes for New Zealanders

  2. Outcomes for a Vital Nation
  • Focus on outcomes; keep output platform
  • Clearly specify & measure outcomes, to inform strategy, plans, purchase, ops
  • Continuously improve interventions, etc
  • Business models for enhancing outcomes (focus: core activities & new initiatives)
  Increased benefit to our communities

  3. Current Initiatives

  4. Overall Schedule
  • 2 years – like each agency to achieve the 3-4 outcome mgmt goals it sets (more later)
  • Early milestones (indicative):
    - Oct ’01: identify / confirm outcome measures
    - Dec ’01: identify international best practice for managing for improved outcomes in your area
    - Dec ’01: proposed outcome measurement, analysis and internal reporting frameworks out for discussion
  • You set the pace via goals / resources

  5. Sponsors’ Group
  • Support & guidance for Working Group
  • Peer review of learning points, etc
  • Link to Senior Mgmt teams, CE, Minister
  Working Group
  • Identify / document best practice
  • Review agencies’ mgmt systems / models
  • Advice on design & application
  Both: Growing capability to improve outcomes

  6. Participation / Lead Representative

  7. PATHFINDER Mission: Maximise consequential outcomes for New Zealanders

  8. Principles of Pathfinder
  • Bottom-up, organic process
  • Coalition of the willing
  • Eight Govt agencies, each with diverse businesses & business processes
  • Hot house for sharing / building capacity
  • DC / Vote Teams remain point of contact
  • SSC & TSY advise; collate lessons learnt

  9. Ownership from Agencies, backed by Collective Action
  • Want ownership / steerage from your CE, senior management and your organisation
  • Benefits your agency (& community served)
  • Want you to own process & mgmt models
  • Other agencies offer ‘wise counsel’
  • Positive culture: sharing, helping, learning

  10. Working Group Functions?
  • Monthly meeting (2-3 topics, & material for SG)
  • Focus teams +/- workshops
  • Establish mentoring / bilateral relationships
  • Brokerage (SSC / TSY / others putting you in contact)
  • E-mail discussion (hot topics needing urgent advice)
  • Web site (password access to outcome mgmt info)
  • Collate WG papers into a ‘living manual’

  11. Resource Commitment
  • Gains will require high-calibre staffing
  • Low marginal cost (<0.2 FTE?) if committed to outcome-driven mgmt
  • For newer entrants, marginal costs are driven by the objectives you set (and orientating business to achieve them)
  • Central agencies cover admin support (for meetings / process – not an offer to do your photocopying)

  12. Agency Team Profile?
  • Highly competent analyst-managers
  • Highly motivated, results-driven
  • Good communicators, working within and with strong internal networks
  • Knowledgeable about your business (incl. purpose, goals and operational realities)
  • Continuity of staffing critical

  13. Unique Sector, Unique Outcomes, Unique Opportunities
  • Primary outcomes differ …
  • What can be measured will differ …
  • What applications will work varies by sector …
  • No one approach will work for all agencies …
  • For some functions, no approach may work well
  Be clear about what you want / need, but be prepared to innovate & make compromises

  14. PATHFINDER Mission: Maximise consequential outcomes for New Zealanders

  15. Managing for Outcomes
  • Scope (measures, maximisation, strategy)
  • Decision making for better outcomes (evaluation is just a tool)
  • Measuring outcomes (measure types, groups)
  • Linking measures
  • Outcome hierarchies
  • Defining your work agenda (‘straw man’)
  • Next steps

  16. • Business and strategic planning activities informed and focused
  • ‘Blind’ managers see hurdles
  • Feedback +/- motivation for staff
  • Active prioritisation within teams

  17. 3 Families of Outcome Measure?

  18. Outcome Management
  • Outcomes measured to inform decisions (start by defining the decisions to be informed)
  • Rhetorical vs. quantified & measurable
  • Defining measurement groups is critical
  • Focus on critical measures / groups (overdesign will result in poor prioritisation)
  • Check utility of systems & measures against needs of different stakeholders
  Increasing benefit to our communities

  19. Short-Term Wins – Long-Term Gains
  • Define & measure ‘mission critical’ outcomes (using ‘state’ or ‘situation’ outcome indicators)
  • Map causal logic linking outcomes to outputs
  • Assess impact of interventions / targeting
  • Assess cost-effectiveness (vs. outcomes achieved)
  • Manage to maximise outcomes (e.g. core / pilots)
  • Benchmark with outcomes (business units / nations)
  • Focus strategic / annual plans on improving outcomes
  • Redesign planning & operations to maximise outcomes (incl. feedback & continuous improvement)
  See information pack for details

  20. INTERVENTION FRAMEWORK (Systems Analysis & Design)
  Use outcomes of the assessment system to enhance the system, and set intervention thresholds to maximise intervention outcomes (go beyond just measuring targeting errors)

  21. When is Targeting Useful?
  Goal: maximise intervention effect from (targeting + intervention) effort
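The threshold-setting idea in slides 20-21 can be sketched numerically. This is a hypothetical illustration (the risk scores, per-person cost and benefit scaling are invented, not from the deck): if an intervention's benefit scales with each person's baseline risk while assessment and delivery carry a fixed per-person cost, the threshold can be chosen to maximise net intervention effect rather than just to minimise targeting errors.

```python
# Hypothetical sketch: choose an intervention threshold that maximises
# net benefit (total effect minus targeting + intervention cost).
# All numbers are illustrative, not from the presentation.

def net_benefit(risks, threshold, benefit_per_risk, cost_per_person):
    """Total benefit minus cost when intervening on everyone at/above threshold."""
    treated = [r for r in risks if r >= threshold]
    benefit = sum(benefit_per_risk * r for r in treated)
    cost = cost_per_person * len(treated)
    return benefit - cost

def best_threshold(risks, benefit_per_risk, cost_per_person):
    """Pick the candidate threshold giving the highest net benefit."""
    candidates = sorted(set(risks))
    return max(candidates,
               key=lambda t: net_benefit(risks, t, benefit_per_risk, cost_per_person))

risks = [0.05, 0.10, 0.20, 0.40, 0.60, 0.80]   # baseline risk scores
t = best_threshold(risks, benefit_per_risk=100, cost_per_person=25)
# With these numbers the break-even risk is 0.25, so the chosen threshold is 0.4
```

Under these assumptions, treating everyone would waste effort on low-risk cases, while too high a threshold forgoes worthwhile interventions; the maximum sits in between.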

  22. Sector / Department Level Outcome Frameworks (Disaggregation generally required by demographic / service groups)

  23. Cross-Service Outcome Framework
  Pool all proposals with a common outcome, and allocate new-initiatives funding using cost-effectiveness or cost-benefit principles?
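The pooling idea on slide 23 can be illustrated with a minimal sketch, assuming each proposal reports a cost and an expected gain in a common outcome unit (the proposal names, costs and outcome figures below are invented). A simple greedy rule funds the most cost-effective proposals that still fit the budget; note this is a heuristic, not an exact solution to the underlying knapsack problem.

```python
# Illustrative sketch: allocate a fixed new-initiatives budget across proposals
# sharing a common outcome, ranked by cost-effectiveness (outcome per dollar).
# All proposals and numbers are hypothetical.

def allocate(proposals, budget):
    """proposals: list of (name, cost, expected_outcome_units).
    Greedily fund the most cost-effective proposals that fit the budget."""
    ranked = sorted(proposals, key=lambda p: p[2] / p[1], reverse=True)
    funded, remaining = [], budget
    for name, cost, outcome in ranked:
        if cost <= remaining:
            funded.append(name)
            remaining -= cost
    return funded

proposals = [("A", 100, 50), ("B", 200, 120), ("C", 150, 60)]
print(allocate(proposals, 300))   # → ['B', 'A']
```

A cost-benefit variant would instead convert outcomes to dollar values and fund every proposal whose benefit exceeds its cost, budget permitting.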

  24. Next Steps
  • Identify learning fora on topics where you have special skills & experience to share, or have needs (build collective knowledge base)
  • Identify your development objectives (3-4 ‘wins’?) & indicative timeline
  • SG (you?) to comment on ToR out of session
  • Early Sept: second Working Group meeting
  • Indicative work plan to discuss at Sept SG
  Comment on ToR ASAP please; identify your work agenda

  25. PATHFINDER Mission: Maximise consequential outcomes for New Zealanders

  26. (End of slides to be presented)

  27. Outcome Performance Measurement & Alignment to Govt Goals
  [Treasury ‘Value Wheel’ diagram, 19 April 2001: an eight-segment cycle linking inputs, production & processes, outputs and outcomes through economy, efficiency, effectiveness and value for money – covering direction-setting, input monitoring & control, ex-ante intervention analysis, ex-post policy evaluation, output measurement & delivery (current and future = capability), governance & accountability arrangements, and output pricing & funding arrangements]

  28. Outcome Measurement Traps
  • Outcome definition specific to particular service(s)
  • Selective outcome measures
  • Fuzzy outcome definitions
  • Measuring outcomes for few / wrong groups
  • Slow feedback (decision cycle times critical)
  • Measurement error
  • Poor accountability / attribution / disclosure
  • Outcomes delivered via other agents (e.g. policy; science?)
  • Generating capability only (e.g. Defence)

  29. Measurement Tips (1 of 3)
  • Measure(s) too specific: if feasible, identify at least one unifying measure spanning service group(s), preferably in terms linked to risks & costs, e.g. days unemployed (average or avoided), Quality-Adjusted Life Years, lives lost / saved, criminal re-offending risks / costs avoided, future costs to the Crown / society avoided
  • Selective measures: define range & hierarchy of potential outcome measures and groups against strategic goals and plans; review & plagiarise outcome definitions used by overseas agencies
  • Fuzzy definitions: specify clear units of measure; ensure measures & measurement groups cover critical interests & expectations of the agency / sector (see ‘Poor accountability’)
  • Slow feedback: reduce selection interval if sample size permits; reduce follow-up period for initial measures; use intermediate outcomes

  30. Measurement Tips (2 of 3)
  • Few / wrong groups: report overall outcomes for major service categories; must also identify & report outcomes for key demographic, treatment, geographic, business and/or risk sub-groups (see strategic plans for sector / agency). For intervention outcomes, ensure intervention and comparison groups differ ‘only’ in the intervention(s) being evaluated, or use specialist statistical tools (e.g. odds ratio); individual intervention groups can also be partitioned by demographic variables, risk, service provider, etc. for more resolution
  • Measurement error: match outcomes to best data; enhance input and grouping data quality; determine statistical variation, and allow for it in interpreting outcome information. For intervention outcomes, randomly select or match the comparison group. For risk tools, validate the precision of risk measurements against a NZ longitudinal data set other than that used to create the risk tool
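The odds ratio mentioned above for comparing intervention and comparison groups can be sketched as follows. The group counts are hypothetical; the confidence interval uses the standard log-based (Woolf) approximation as one way to "allow for statistical variation" when interpreting the result.

```python
import math

# Illustrative odds-ratio calculation for an intervention vs comparison group.
# Counts are hypothetical. OR > 1 means the outcome is more likely in the
# intervention group; the interval conveys statistical variation (Woolf method).

def odds_ratio(a, b, c, d):
    """a, b: intervention group with / without the outcome;
    c, d: comparison group with / without the outcome.
    Returns (odds ratio, 95% CI lower bound, 95% CI upper bound)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)        # std. error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio(30, 70, 15, 85)         # OR ≈ 2.43
```

If the interval excludes 1, the difference between groups is unlikely to be measurement noise alone; an interval straddling 1 argues for caution in attributing the outcome to the intervention.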

  31. Measurement Tips (3 of 3)
  • Poor accountability: ensure groups used are in the primary care of the accountable agency, and that the outcome is specific to that agency; record external influences as inputs to statistical tests. For intervention outcomes, a reference group must be created (e.g. randomly assigned, paired or wait-list controls, alternate treatment(s)); allow for interactions between interventions via evaluation protocols
  • Outcomes delivered via others: accountability is limited unless the agency has control or substantive influence over other agencies; otherwise excuses abound. For sector policy agencies, limited accountability could be expected for policies contributing across the sector’s hierarchy of outcomes
  • Capability outcome: define a clear performance expectation; model likely performance against expectation, or assess during unplanned contingencies +/- field exercises +/- deployments

  32. Costs & Risks of Targeting
  Without recognising it, we are targeting all the time … the question is how far to push … discrimination
  • Information distortion (‘cheat strategies’, error)
  • Incentive distortion (e.g. reduced motivation)
  • Negative effects & stigma (for those targeted)
  • Administrative costs & invasive losses
  • Political sustainability (incl. quality reduction)
  After Amartya Sen, in Public Spending & the Poor (World Bank, 1995)

  33. Targeting Efficiency (Error)
  Classification errors (Types 1 & 2)
  “… greatest source of error is with false-negative diagnoses, which are detected only rarely by review …”
  — A. A. Renshaw, Am. J. Clin. Pathol. 115 (3): 338-341, 2001
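The Type 1 / Type 2 distinction on this slide can be made concrete with a small sketch (the counts below are invented). In targeting terms, a false positive (Type 1) means intervening on someone who did not need it; a false negative (Type 2) means missing someone who did, which is the error class the Renshaw quote flags as hardest to detect.

```python
# Minimal sketch of targeting (classification) error rates, with made-up counts.
# Type 1 error = false positive (targeting someone who did not need help);
# Type 2 error = false negative (missing someone who did).

def error_rates(tp, fp, fn, tn):
    """Return (false-positive rate, false-negative rate, sensitivity)
    from confusion-matrix counts."""
    fpr = fp / (fp + tn)          # Type 1 rate among those not needing help
    fnr = fn / (fn + tp)          # Type 2 rate among those needing help
    sensitivity = tp / (tp + fn)  # share of genuine cases the targeting catches
    return fpr, fnr, sensitivity

fpr, fnr, sens = error_rates(tp=40, fp=20, fn=10, tn=130)
```

Because false negatives never enter the programme, they rarely show up in routine review, so estimating the Type 2 rate usually requires following up the untargeted group explicitly.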

  34. HM Treasury (UK)
  [Value-for-money diagram: resources / input → output → outcome chain – more money → more nurses → more treatments → better health – with economy, efficiency and effectiveness linking the stages, value for money spanning the whole chain, and ‘other influences’ also affecting the outcome]

  35. PATHFINDER Mission: Maximise consequential outcomes for the public from Government activity