
Success in the workplace



Presentation Transcript


  1. Success in the workplace: Performing to make a difference to children and their families. Jo Fox, Child Centred Practice

  2. How much do you understand about…

  3. What does success look like to you? How much have you thought about being successful in your workplace? Do you have an internal script for success? Do you expect to be successful? Do you expect to be recognised for being successful? By whom?

  4. What does success look like to service users? What expectations do they have of us? What expectations do they have of themselves? Do we understand and talk about each other's purpose enough?

  5. Understanding failure • What is your failure policy? • How do you define failure? • What does failure mean to you? • How do you use failure? • What is your organisation's failure policy? • How is failure defined? • What does it mean? • How do they use failure?

  6. Where do our goals overlap? How strongly do you identify with your organisation's goals and values? Are you proud to be a part of your organisation? Are the functions you carry out what the organisation and service user need you to do? Do you believe that your task is worthwhile and significant? What benefits will you gain personally from achieving your task professionally? Does it matter to you if your organisation fails? Does it matter to your organisation if you are successful at carrying out your tasks? Does it matter to the service user if you are successful in your work or not? HOW DO YOU KNOW?

  7. Think about your most valuable failure. What did it tell you about yourself? About the people you worked for? About the people you worked with?

  8. Links between performance and outcomes

  9. The '60s • After WWII, local authorities responded to the fact that some areas required services that cut across the traditional departments. • Councillors were concerned with budgets; performance was the domain of the professional. Standards and accountability were maintained through professional integrity, and scrutiny of any kind seemed like an affront.

  10. The '80s • The Audit Commission started in 1983. • In 1986, it published a well-regarded toolkit for performance review which introduced the idea of the three Es (economy, efficiency and effectiveness). • Councils were encouraged to compare value for money and performance against a range of indicators, though these were mainly cost-based at this stage.

  11. The '80s • It was also a period which saw an increased focus on what Professor John Stewart (Birmingham University's INLOGOV) described as the "wicked issues": demands and problems which cut across traditional departmental and service "silos" and which demanded a more corporate approach.

  12. The '80s • Some councils (a notable example being the London Borough of Bexley) introduced systematic processes for corporate planning and review. • Typically, these included a high-level corporate plan (setting out the main council aims and priorities) supplemented with service plans reviewed annually. • Many councils set up performance review committees (prompted by local auditors and the work of the Audit Commission), although it was many years before these became an effective agent of change within organisations.

  13. The late '80s • The next decade saw an increasing challenge to the traditional belief that council services were delivered to people by professional experts whose judgements should not be questioned. • In the late 1980s, York City Council pioneered the idea of published "citizen charters", setting out, in plain language, the standards of service that users should expect and what to do if services failed to meet expectations. • This idea was taken up by John Major's Government, which launched the national Citizen's Charter movement in 1991, incorporating a Charter Mark award scheme and, from 1993, the first set of statutory performance indicators (organised and validated by the Audit Commission).

  14. This same period also saw a massive expansion across local government in the use of market research to explore attitudes of the local population to council services and the way they were delivered. This had previously been resisted by many councillors who saw it as a threat to their traditional view of themselves as the voice of the local community.

  15. The noughties • The early part of the twenty-first century can be seen as a logical continuation of these trends: the citizens' charter and compulsory competitive tendering (CCT) morphing into the best value regime and then into the comprehensive performance assessment (CPA). • Gradually, there was growing awareness that Stewart's "wicked issues" were not going away; indeed, they went far beyond the collective powers of the entire council and required cooperative working with outside agencies such as the health service, police, and the voluntary and business sectors. CPA became (albeit briefly) the comprehensive area assessment (CAA). • The Total Place initiative has started to explore in depth the extent to which public bodies (both national and local) have the same clients and objectives and, individually, have considerable resources, yet lack the organisational and political capacity to operate in an effective "joined up" way to deliver services which are cost-effective and which meet service user and community needs.

  16. The new • The coalition government has dismantled the old performance frameworks, and the Audit Commission is to be abolished. • In its place is an expectation that: • local authorities are responsible for their own performance and for leading the delivery of improved outcomes for their area • local authorities are accountable to their local communities

  17. Context - preparing for leaner times • In March 2010, the Audit Commission summarised the challenge facing councils over the next decade or so as follows: "The financial impact of the recession has been manageable for most councils up to January 2010. The government has honoured the three-year grant settlement up to 2010/11; on average, grant is two-thirds of council income. • Staff pay increased by 1 per cent in 2009/10, less than expected. • Many councils received a windfall VAT refund. • Many councils, especially districts, have enough reserves to cover short-term funding pressures. • However, some councils, often districts, have been hit hard by falling local income. Development-related income has reduced; planning applications are down by 22 per cent. • Investment income fell by £544 million (43 per cent) in 2008/09, and the fall continued in 2009/10. • Capital receipts are down from over £3.5 billion in 2007 to just £800 million in the first three quarters of 2009. • Some districts that rely heavily on local income are struggling."

  18. Each individual needs to think carefully about how they are spending their time and resources. Performance is no longer something done by "funny little people in small dark cupboards" that has nothing to do with the real world. It is what you do every day to ensure your efforts do not go to waste.

  19. The benefits and risks • Good councils will flourish as they focus on local priorities with less national prescription; but coasting or defensive organisations may flounder • We will have to publish a lot of data, but will be set far fewer associated national targets – again this is an opportunity for good councils but also a threat if councils do not rigorously drill down on key performance metrics

  20. Change management: the essentials for change, each paired with the symptom that appears when it is missing. • BURNING PLATFORM – missing: apathy & complacency • VISION – missing: lack of direction or coherence, so change fizzles out • LEADERSHIP – missing: poor alignment & inertia • CAPACITY & CAPABILITY – missing: anxiety & frustration • COMMUNICATE & ENGAGE – missing: people feel the change won't affect them • OWNERSHIP AT ALL LEVELS – missing: poor design that won't last • QUICK WINS – missing: cynicism and disbelief that change is possible • PERSONAL IMPACT – missing: lack of individual commitment • EMBED CHANGE SO IT'S BUSINESS AS USUAL – missing: reverting to the old ways

  21. Encouraging innovation • Targets: what, but not how; a specific call for innovation; tied to the strategic plan; 'stretch'; a clear case for need • Risk-taking: emotional support; balanced assessment; learning from failure; trying new things • Information: wide-scope search; uncensored, unfiltered, unsummarised; free-flowing • Resources: funding; time; authority to act • Capacity & capability: flexibility; process; training; encouragement for skills development; tools; aligned with organisational goals • Recognition: intrinsic motivation; individualised rewards; honouring everyone's input • Relationships: diversity; a trusting, open environment; team-based work

  22. Key performance indicators • Can these be usefully re-framed as “is what I do each day making a good difference to the lives of children and their families?” • What would your key performance indicators look like? • What evidence base do you have for your performance indicators?

  23. Unpicking indicators • The indicator: completing an initial assessment within 10 days. • The evidence and messages from research: poor decision-making; children in drift; no services, only assessment; child protection action taken without understanding the issues and needs of the child.
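
To make the indicator concrete, here is a minimal sketch of how the headline figure might be computed from case records; the record fields, the treatment of the 10-day window as calendar days, and the sample data are illustrative assumptions, not a prescribed data model. Note what the number cannot show: a case can hit the 10-day target while the problems listed above (drift, assessment without services) go unmeasured.

```python
# Minimal sketch: computing "% of initial assessments completed within
# 10 days" from case records. Field names and data are assumptions.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class Referral:
    child_id: str
    received: date                   # date the referral was received
    assessment_done: Optional[date]  # None if the assessment is outstanding

def pct_within_10_days(referrals: List[Referral]) -> float:
    """Percentage of referrals whose initial assessment was completed
    within 10 days of receipt."""
    if not referrals:
        return 0.0
    on_time = sum(
        1 for r in referrals
        if r.assessment_done is not None
        and (r.assessment_done - r.received).days <= 10
    )
    return 100.0 * on_time / len(referrals)

cases = [
    Referral("c1", date(2011, 3, 1), date(2011, 3, 8)),   # completed on time
    Referral("c2", date(2011, 3, 1), date(2011, 3, 20)),  # completed late
    Referral("c3", date(2011, 3, 2), None),               # drifting, never assessed
]
print(f"{pct_within_10_days(cases):.0f}% assessed within 10 days")  # prints 33%
```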

  24. How do we measure things?

  25. Outcome or output? • Outcomes are end results. They can describe different aspects of wellbeing for whole populations – for example, all children, as with the Every Child Matters (ECM) outcomes – or they can refer to the wellbeing of users of a particular service or intervention over time. Examples are a safe community, a clean environment, or a reduction in the number of looked-after children. These are outcomes, not outputs. • Outputs describe service specifications, delivery mechanisms and procedures. For example, a successful parenting support programme might deliver a specific number of training sessions and increase the number of trained facilitators and participating parents. These are outputs, not outcomes.
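
As a minimal sketch of this distinction in data terms, the snippet below models the slide's parenting-support example; the field names and figures are invented for illustration. The structural point is that outputs count activity delivered, while an outcome compares the same measure of wellbeing over time.

```python
# Sketch: outputs count activity; outcomes track a change in wellbeing.
# All names and numbers below are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Output:
    """Something the service delivered (activity)."""
    description: str
    quantity: int

@dataclass
class Outcome:
    """An end result: the same measure of wellbeing, compared over time."""
    description: str
    baseline: float
    current: float

outputs: List[Output] = [
    Output("parenting training sessions delivered", 24),
    Output("facilitators trained", 6),
    Output("parents participating", 80),
]

outcomes: List[Outcome] = [
    Outcome("children looked after in the area", baseline=410, current=385),
]

for o in outcomes:
    print(f"{o.description}: {o.baseline:g} -> {o.current:g}")
```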

  26. Measuring outcomes instead of activity • What should the measure be? • Who should we ask? • What should we ask? • When should we ask? • How can we share the results?

  27. IS IT AN OUTCOME, AN INDICATOR OR A PERFORMANCE MEASURE? 1. Safe Community 2. Crime Rate 3. Average Police Dept response time 4. A community without graffiti 5. % of surveyed buildings without graffiti 6. People have living wage jobs and income 7. % of people with living wage jobs and income 8. % of participants in job training who get living-wage jobs
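
This exercise mirrors the worked examples used in Mark Friedman's results-based accountability (RBA) framework, so one possible answer key, sketched below, follows RBA lines: outcomes are conditions of wellbeing for a whole population, indicators measure those conditions, and performance measures gauge how well a particular service performs. The mapping is offered as an assumption for discussion, not as the slide's official answers.

```python
# One possible answer key, assuming RBA-style distinctions apply.
classification = {
    "Safe Community": "outcome",
    "Crime Rate": "indicator",
    "Average Police Dept response time": "performance measure",
    "A community without graffiti": "outcome",
    "% of surveyed buildings without graffiti": "indicator",
    "People have living wage jobs and income": "outcome",
    "% of people with living wage jobs and income": "indicator",
    "% of participants in job training who get living-wage jobs":
        "performance measure",
}
for item, kind in classification.items():
    print(f"{kind:>19}: {item}")
```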

  28. The difficulty with assigning value The economic rationalist approach to understanding outcomes assumes that everything can have a value attached.

  29. What do we need to measure the right things? • Left brain: bureaucracy; systems; politics; planning; rule compliance; analysis; formality; order • Right brain: autonomy; ownership; risk-taking; rewriting rules; informal; synthesis; intuition; trust

  30. "Not everything worth counting can be counted; and not everything that can be counted counts" Measuring outcomes, not activity

  31. What are you counting? • How often do you measure what you do each day? • How often do you measure the work of your colleagues and peers? • How much faith do you have in the tools you are using? • What would you do differently? • How much responsibility are you taking for ensuring that you understand what 'doing a good job' means?

  32. The hidden dynamics

  33. Other models for valuing outcomes • Appreciative inquiry is a strategy for intentional change that identifies the best of 'what is' to pursue dreams and possibilities of 'what could be'; a cooperative search for the strengths, passions and life-giving forces that are found within every system and that hold potential for inspired, positive change. • It is a process of collaborative inquiry, based on interviews and affirmative questioning, that collects and celebrates 'good news stories' of a community; these stories serve to enhance cultural identity, spirit and vision. • Appreciative inquiry is an approach which focuses on a desired future or outcome and is different from a problem-solving approach.

  34. Appreciative inquiry • Its four guiding principles are: • every system works to some degree; seek out the positive, life-giving forces and appreciate the best of 'what is'. • knowledge generated by the inquiry should be applicable; look at what is possible and relevant. • systems are capable of becoming more than they are, and they can learn how to guide their own evolution – so consider provocative challenges and bold dreams of 'what might be'. • the process and outcome of the inquiry are interrelated and inseparable, so make the process a collaborative one.

  35. How would you make this work in Children's Services? What gets in the way? What helps?

  36. Cost benefit analysis in social care • Benefits: knowing how to allocate precious resources; being able to move services flexibly with the child's and family's needs; commissioning effectively to meet community needs. • Challenges: counting the wrong thing; confusing cost with value.
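
The "confusing cost with value" trap can be made concrete with a toy calculation: the unit cost of an intervention (cost per session delivered, an output) says nothing about its value (cost per sustained outcome for a child or family). The figures below are invented for illustration.

```python
# Toy sketch: cost per output versus cost per outcome. All figures invented.
total_cost = 60_000.0                      # annual cost of a family support service
sessions_delivered = 400                   # an output (activity)
families_with_sustained_improvement = 25   # an outcome (change in wellbeing)

cost_per_session = total_cost / sessions_delivered
cost_per_outcome = total_cost / families_with_sustained_improvement

print(f"cost per session (output):  £{cost_per_session:,.0f}")   # £150
print(f"cost per outcome achieved:  £{cost_per_outcome:,.0f}")   # £2,400
# A low cost per session does not imply good value: commissioning on
# outputs alone can reward activity that changes nothing for the child.
```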

  37. How could you make this work in Children’s Services? What gets in the way? What makes it work?

  38. Outcome-based results • Strengths: it makes us work towards something tangible; it should result in changes that are valuable to the child and their family. • Drawbacks: it is sometimes hard to evidence outcomes; the results of the work we do can take years for the child to show developmentally; we can miss the underlying causes if we always treat the symptom (short-term gain only).

  39. How would you make this work in children's services? What gets in the way? What helps?

  40. The most significant change • What ‘soft data’? • How do we collect it? • What gets in our way?

  41. Most Significant Change The most significant change (MSC) technique is a form of participatory monitoring and evaluation. It is participatory because many project stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data. It is a form of monitoring because it occurs throughout the program cycle and provides information to help people manage the program. It contributes to evaluation because it provides data on impact and outcomes that can be used to help assess the performance of the program as a whole. • When to use: program evaluation; organisational review and evaluation; building community ownership through participatory evaluation. • How to use: the process involves the collection of significant change (SC) stories from the field level, and the systematic selection of the most important of these by panels of designated stakeholders or staff. The designated staff and stakeholders are initially involved by 'searching' for project impact. Once changes have been captured, various people sit down together, read the stories aloud and have regular and often in-depth discussions about the value of the reported changes. When the technique is successfully implemented, whole teams of people begin to focus their attention on programme impact. • From: http://www.mande.co.uk/docs/MSCGuide.pdf by Rick Davies and Jess Dart

  42. These ten steps are usually included: • Raising interest at the start. • Defining the domains of change. • Defining the reporting period. • Collecting SC stories. • Selecting the most significant of the stories. • Feeding back the results of the selection process. • Verifying the stories. • Quantification. • Secondary analysis and meta-monitoring. • Revising the system.
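
As a concrete illustration of steps 4 and 5 (collecting SC stories and selecting the most significant in each domain), here is a minimal sketch. The domains, story fields and the simple vote-count selection rule are assumptions for illustration; as the guide describes, real selection happens through panel discussion rather than a mechanical count.

```python
# Sketch of the MSC pipeline: collect stories per domain of change,
# then keep the story each panel judged most significant. Field names
# and the vote-count rule are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Story:
    teller: str
    domain: str     # a domain of change, e.g. "children's wellbeing"
    text: str
    votes: int = 0  # panel votes for the story judged most significant

def select_most_significant(stories: List[Story]) -> Dict[str, Story]:
    """Return the highest-voted story in each domain of change."""
    best: Dict[str, Story] = {}
    for s in stories:
        if s.domain not in best or s.votes > best[s.domain].votes:
            best[s.domain] = s
    return best

stories = [
    Story("family worker", "children's wellbeing",
          "A withdrawn child now attends school every day.", votes=4),
    Story("foster carer", "children's wellbeing",
          "Contact between separated siblings was re-established.", votes=2),
    Story("team manager", "working practice",
          "Drift cases are now reviewed weekly, not quarterly.", votes=3),
]
for domain, story in select_most_significant(stories).items():
    print(f"{domain}: {story.text} ({story.votes} votes)")
```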
