The Evolution of Evaluation Practices in the Canadian Federal Government 1980 - 2005


Presentation Transcript


  1. The Evolution of Evaluation Practices in the Canadian Federal Government, 1980 - 2005
  Presented at CES / AEA Conference, Toronto, Ontario, Canada, October 29, 2005
  Presented by: George Teather
  Tel: (613) 824-2423
  Fax: (613) 824-2583
  Email: gteather@sympatico.ca

  2. PRESENTATION OUTLINE
  • General overview of the evolution of Canadian federal government evaluation practices
  • Description of the use of evaluation in program management
  • Review of challenges
  • International comparisons

  3. Review of Evaluation Practices
  • Long-standing practice and experience with evaluation; the formal requirement for evaluation began by 1980
  • Evaluation groups are located in Departments, with central agency oversight, guidelines and some workshops
  • Private sector consultants with evaluation and performance measurement expertise conduct many of the studies under the oversight of Departmental staff
  • Evaluation has incorporated key characteristics:
    • Use of a logic model to describe the program
    • Generic issue groups:
      • Relevance
      • Objectives achievement
      • Alternatives (design and delivery)
    • Multiple lines of evidence to increase validity and credibility
  • In the mid-1990s, periodic, in-depth evaluation and ongoing performance measurement became more closely linked

  4. Integration of Evaluation and Performance Measurement with Management Practices
  • The Results-based Management and Accountability Framework (RMAF), introduced in 2001, provides an integrated approach to annual performance measurement and periodic evaluation:
    • Logic model and description of the program
    • Performance measurement strategy: indicators, sources of information
    • Evaluation issues and methodological approach
    • Accountability, reporting requirements and schedule
  • Departmental management practices incorporate RMAF concepts and information:
    • Key results commitments describe objectives
    • The annual Departmental Report on Plans and Priorities identifies annual plans and intended results
    • The annual Departmental Performance Report provides information on annual outcomes and achievements
  • More recently, the Program Activity Architecture has been introduced, which identifies key results linked to Departmental objectives and indicators to demonstrate the level of achievement and performance
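  To make the four RMAF components listed above concrete, here is a minimal sketch of how they might be recorded as structured data; the field names and the example program are hypothetical illustrations, not a prescribed Treasury Board format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Indicator:
    name: str          # what is measured
    data_source: str   # where the information comes from
    frequency: str     # how often it is collected

@dataclass
class RMAF:
    # 1. Logic model / description of the program
    program: str
    activities: List[str]
    outputs: List[str]
    outcomes: List[str]
    # 2. Performance measurement strategy: indicators and sources of information
    indicators: List[Indicator] = field(default_factory=list)
    # 3. Evaluation issues and methodological approach
    evaluation_issues: List[str] = field(default_factory=list)
    # 4. Accountability, reporting requirements and schedule
    reporting: List[str] = field(default_factory=list)

# Hypothetical contribution program, for illustration only
rmaf = RMAF(
    program="Innovation Contribution Program",
    activities=["Fund industry-university R&D projects"],
    outputs=["Funded projects", "Technical reports"],
    outcomes=["New knowledge", "Improved firm capability"],
    indicators=[
        Indicator("Number of projects funded", "program database", "annual"),
        Indicator("Client satisfaction", "participant survey", "annual"),
    ],
    evaluation_issues=["Relevance", "Objectives achievement", "Alternatives (design and delivery)"],
    reporting=["Annual Departmental Performance Report", "Evaluation before funding renewal"],
)
```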

  5. How is Evaluation Used? What is Its Impact on Policy Development and Implementation?
  • Evaluation is embedded in management practices:
    • An RMAF is required for all new and renewing contribution programs and recommended for others
    • Annual performance measurement and reporting is required
    • Evaluation studies are usually required before renewal of funding
  • However, evaluation is only one of many inputs to decision making:
    • Political imperatives
    • Policy groups, stakeholder influence
  • Location of evaluation in Departments leads to:
    • Use to describe program benefits and defend programs from central agencies
    • Use to adjust program design and delivery in some cases; almost never a call for cancellation of a program
  • Because existing programs are being evaluated, evaluation has a greater role in policy implementation than in policy development
  • The logic model has been useful in program design and development as well as in evaluation and performance measurement

  6. What are the Most Compelling Evaluation Challenges?
  • The main challenges are closely linked:
    • Engaging senior management’s attention and interest in understanding and utilizing evaluation results
    • Educating policy analysts and central agencies as to what can be expected from evaluation and performance measurement
    • Producing relevant, credible evaluation studies
  • There is a long-standing issue of linking policy development and implementation with scientific evidence
    • The Council of Science and Technology Advisors produced a report on Scientific Advice for Government Effectiveness (SAGE) that identified ways to improve linkages and the consideration of evidence
  • It is difficult to undertake horizontal, multi-department evaluations due to the location of evaluation expertise
  • Most S&T policy emphasis is linked to economic policy, with S&T as a driver for economic growth, and less consideration of the role of S&T in social policy and the public good
  • Government has cut back on training and on building evaluation capability in departments specific to the needs of government, with a negative effect on expertise and quality of work
  • Some evaluation studies identify results and benefits that have limited attribution to the program and do not identify other contributors
  • There is no regular independent / central agency review of the quality and credibility of evaluation reports

  7. Tracking National Performance – What are the Correct Indicators for International Comparisons?
  • Comparisons require international agreement on indicators and access to credible, consistent data
  • Standard comparisons include:
    • Resources
      • GERD (gross domestic expenditure on R&D; funding by sector, contribution of government and industry)
      • Highly qualified personnel (engineers, scientists, PhDs)
    • Outputs
      • Scientific publications (English-language journals)
      • Patents
  • Example for biotechnology:
    • Statistics Canada has developed an international standard for the measurement of biotechnology activities that is being adopted by OECD countries, including the U.S.
  • More work needs to be done on identifying indicators and developing data sources
    • This needs the support of Statistics Canada and other agencies (RSA)
  • Comparison of national innovation systems and public / private sector interactions
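  As a small illustration of the arithmetic behind such comparisons, the sketch below computes GERD intensity (GERD as a share of GDP) and publications per thousand researchers from a tiny table of figures; the country names and values are placeholders, not actual national statistics.

```python
# Hypothetical indicator values (billions for GERD/GDP) -- placeholders, not real data
countries = {
    "Country A": {"gerd": 30.0, "gdp": 1800.0, "publications": 55000, "researchers": 160000},
    "Country B": {"gerd": 12.0, "gdp": 900.0, "publications": 21000, "researchers": 70000},
}

for name, d in countries.items():
    gerd_intensity = 100 * d["gerd"] / d["gdp"]                  # GERD as % of GDP
    pubs_per_1000 = 1000 * d["publications"] / d["researchers"]  # output per 1,000 researchers
    print(f"{name}: GERD intensity {gerd_intensity:.2f}%, "
          f"{pubs_per_1000:.0f} publications per 1,000 researchers")
```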

  8. Most Promising Evaluation Methods
  • There has been a general realization of the need to extend the logic model concept to explicitly identify:
    • The group or individuals (participants, clients, beneficiaries) that the program intends to influence to change their behaviour in order to achieve objectives
    • The partners, collaborators and stakeholders whose participation and support are required for the program to fully succeed
  • In a given program, there may be different groups at different stages of the program, or participating for different purposes and objectives:
    • Research – university and government scientists
    • Development – engineers, innovative firms
    • Commercialization – venture capital, banks, companies
  • Examination of actual compared to intended clients of a program is an important issue:
    • Profile of clients and participants
  • Identification of key partners and their level of involvement is also important in examining the “pathway to success”:
    • Network analysis
    • Analysis of roles and relationships
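  A minimal sketch of what such a network analysis of program partners could look like, assuming the networkx library and a purely illustrative set of partners; degree centrality is used here only as a simple stand-in for how connected each partner is in the delivery network.

```python
import networkx as nx

# Hypothetical program delivery network -- node names are illustrative only
G = nx.Graph()
G.add_edges_from([
    ("Program", "University lab"),        # research-stage partner
    ("Program", "Innovative firm"),       # development-stage participant
    ("Program", "Venture capital fund"),  # commercialization-stage partner
    ("University lab", "Innovative firm"),
    ("Innovative firm", "Venture capital fund"),
])

# Degree centrality as a rough indicator of each partner's level of involvement
centrality = nx.degree_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```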

  9. Generic Program Logic Model*
  Program Objective: high-level strategic purpose. The model reads from Resources (HOW?) through Reach (WHO / WHERE?) to Results (WHAT do we want? WHY?), covering activities, outputs, direct outcomes, intermediate outcomes and ultimate impacts.
  • Activities (HOW?): Program / Service Delivery; Client Management; Policy & Issue Management; Financial Management; Human Resources Management; Asset Management
  • Outputs: program deliverables; policy guidelines, regulations; communications plans (internal communications, promotion, info transfer, consultations, meetings/events); funding; service outputs
  • Reach (WHO / WHERE?): Primary Targets (clients, ultimate beneficiaries); Co-delivery Agents; Other Stakeholders (users / clients / co-deliverers / beneficiaries)
  • Direct outcomes: Client Service (addresses needs, meets / exceeds expectations, service quality); Behavioural Influence (awareness, understanding, attitude / perception, support)
  • Intermediate outcomes: new knowledge; improved capability; improved decision making; target group changes in behaviour / other outcomes
  • Ultimate impacts: Sector / Industry / Regional Impact; Economic / Environmental / Societal Impact; contribution to organizational objective
  *reference: www.pmn.net

  10. Spheres of Influence*
  Three nested spheres of influence, expanding outward over time:
  • Operational – your operational environment; you have direct control over the behaviours within this sphere
  • Behavioural Change – your environment of direct influence, e.g. people and groups in direct contact with your programs and staff (i.e. clients, target audience, co-delivery partners)
  • State – your environment of indirect influence, e.g. industrial sectors, government decision makers, other communities of interest where you do not make direct contact
  *reference: S. Montague, www.pmn.net

  11. References
  • Treasury Board RMAF and other evaluation policies and guidelines: www.tbs-sct.gc.ca/eval
  • Council of Science and Technology Advisors, SAGE and other reports: www.csta-cest.ca
  • Evaluation and performance measurement courses and consulting services: www.pmn.net
