Evaluating Organizational Change: How and Why?

Dr Kate Mackenzie Davey

Organizational Psychology

Birkbeck, University of London

k.mackenzie-davey@bbk.ac.uk

Aims
  • Examine the arguments for evaluating organizational change
  • Consider the limitations of evaluation
  • Consider different methods for evaluation
  • Consider difficulties of evaluation in practice
  • Consider costs and benefits in practice
Arguments for evaluating organizational change
  • Sound professional practice
  • Basis for organizational learning
  • Central to the development of evidence-based practice
  • Widespread cynicism about fads and fashions
  • To influence social or governmental policy
Research and evaluation
  • Research focuses on relations between theory and empirical material (data)
    • Theory should provide a base for policy decisions
    • Evidence can illuminate and inform theory
    • Show what does not work as well as what does
    • Highlight areas of uncertainty and confusion
    • Demonstrate the complexity of cause-effect relations
    • Understand, predict, control
Pragmatic Evaluation: what matters is what works
  • Why it works may be unclear
  • Knowledge increases complexity
  • Reflexive monitoring of strategy links to organizational learning (OL) and knowledge management (KM)
  • Evidence and cultural context
  • May be self-fulfilling
  • Tendency to seek support for policy
  • Extent of sound evidence unclear
Why is sound evaluation so rare?
  • Practice shows that evaluation is an extremely complex, difficult and highly political process in organizations.
  • The questions asked may be “how many?”, not “what works?”
Evaluation models
  • Pre-evaluation
  • Goal based (Tyler, 1950)
  • Realistic evaluation (Pawson & Tilley, 1997; Sanderson, 2002)
  • Experimental
  • Constructivist evaluation (Stake, 1975)
  • Contingent evaluation (Legge, 1984)
  • Action learning (Reason & Bradbury, 2001)
  • “A study should be technically sound, administratively convenient and politically defensible.” (Alec Rodger)
1.1 Pre-evaluation (Goodman & Dean, 1982): the extent to which it is likely that A has an impact on B
  • Scenario planning
  • Evidence-based practice
    • All current evidence thoroughly reviewed and synthesised
    • Meta-analysis (a pooling sketch follows below)
    • Systematic literature review
  • Formative vs. summative (Scriven, 1967)
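
A minimal sketch of what the meta-analysis step involves, assuming fixed-effect, inverse-variance pooling; the effect sizes and standard errors below are hypothetical, not taken from any study discussed here.

```python
# Fixed-effect meta-analysis: pool standardized effect sizes from
# prior studies, weighting each by the inverse of its variance.
# The (effect size, standard error) pairs below are hypothetical.
import math

studies = [(0.40, 0.15), (0.25, 0.10), (0.55, 0.20)]

weights = [1 / se ** 2 for _, se in studies]  # w_i = 1 / SE_i^2
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval around the pooled effect
low, high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled d = {pooled:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```
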
1.2 Pre-evaluation issues
  • Based on theory and past evidence: not clear it will generalise to the specific case
  • Formative: influences planning
  • Argument: to understand a system you must intervene (Lewin)
2.1 Goal-based evaluation (Tyler, 1950)
  • Objectives used to aid planned change
  • Can help clarify models
  • Goals from benchmarking, theory or pre-evaluation exercises
  • Predict changes
  • Measure pre and post intervention
  • Identify the interventions
  • Were objectives achieved? (a minimal check is sketched below)
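
A minimal sketch of the goal-based check this slide describes: compare post-intervention measures against the pre-set objectives. The metric names, targets and values are invented for illustration.

```python
# Goal-based evaluation in miniature: were the pre-set objectives
# achieved after the intervention? All numbers are hypothetical.
goals = {"staff_satisfaction": 0.70, "turnover_rate": 0.10}
pre = {"staff_satisfaction": 0.55, "turnover_rate": 0.18}
post = {"staff_satisfaction": 0.68, "turnover_rate": 0.09}

for metric, target in goals.items():
    # Turnover should fall below target; satisfaction should reach it
    met = post[metric] <= target if metric == "turnover_rate" else post[metric] >= target
    print(f"{metric}: {pre[metric]:.2f} -> {post[metric]:.2f} "
          f"(target {target:.2f}): {'met' if met else 'not met'}")
```
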
2.2 Difficulties with goal-based evaluation

Who sets the goals? How do you identify the intervention?

  • Tendency to managerialism (unitarist)
  • Failure to accommodate value pluralism
  • Over-commitment to scientific paradigm
  • What is measured gets done
  • No recognition of unanticipated effects
  • Focus on single outcome, not process
3.1 Realistic evaluation: conceptual clarity (Pawson & Tilley, 1997)
  • Evidence needs to be based on clear ideas about concepts
  • Measures may be derived from theory
  • Examine definitions used elsewhere
  • Consider specific examples
  • Ensure all aspects are covered
3.2 Realistic evaluation, towards a theory: what are you looking for?
  • Make assumptions and ideas explicit

What is your theory of cause and effect?

    • What are you expecting to change (outcome)?
    • How are you hoping to achieve this change (mechanism)?
    • What aspects of the context could be important?
3.3 Realistic evaluation: Context-mechanism-outcome
  • Context: What environmental aspects may affect the outcome?
    • What else may influence the outcomes?
    • What other effects may there be?
3.4 Realistic evaluation: Context-mechanism-outcome
  • Mechanism: What will you do to bring about this outcome?
    • How will you intervene (if at all)?
    • What will you observe?
    • How would you expect groups to differ?
    • What mechanisms do you expect to operate?
3.5 Realistic evaluation: Context-mechanism-outcome
  • Outcome: What effect or outcome do you aim for?
    • What evidence could show it worked?
    • How could you measure it?
4.1 Experimental evaluation:

Explain, predict and control by identifying causal relationships

  • A theory of causality makes predictions about variables, e.g. training increases productivity
  • Two randomly assigned, matched groups: experimental and control
  • One group experiences the intervention, one does not
  • Measure the outcome variable pre-test and post-test (longitudinal)
  • Analyse for statistically significant differences between the two groups
  • Outcomes are linked back to refine the theory
  • The “gold standard” of evaluation (a minimal sketch of this design follows)
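
A minimal sketch of this pre-test/post-test control-group design using simulated data: two randomly assigned groups, one receiving the intervention, and a test for a significant difference in change scores. The assumed five-point training effect and all scores are illustrative, not real data.

```python
# Experimental evaluation sketch: randomized treatment and control
# groups, measured before and after the intervention, compared on
# change scores. Simulated data stands in for real measures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 30  # hypothetical group size

# Pre-test: both groups drawn from the same population
pre_treat = rng.normal(50, 10, n)
pre_control = rng.normal(50, 10, n)

# Post-test: assume the intervention adds ~5 points on average
post_treat = pre_treat + rng.normal(5, 5, n)
post_control = pre_control + rng.normal(0, 5, n)

# Test for a statistically significant difference in change scores
change_treat = post_treat - pre_treat
change_control = post_control - pre_control
t, p = stats.ttest_ind(change_treat, change_control)
print(f"mean change: treated {change_treat.mean():.1f}, "
      f"control {change_control.mean():.1f}; t = {t:.2f}, p = {p:.4f}")
```
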
4.2 Difficulties with experimental evaluation in organizations
  • Difficult to achieve in organizations
  • Unitarist view
  • Leaves out unforeseen effects
  • Problems with continuous change processes
  • Summative not formative
  • Generally at best quasi-experimental
5.1 Constructivist or stakeholder evaluation
  • Responsive evaluation (Stake, 1975) or fourth-generation evaluation (Guba & Lincoln, 1989)
  • Constructivist, interpretivist, hermeneutic methodology
    • Based on stakeholder claims, concerns and issues
    • Stakeholders: agents, beneficiaries, victims
5.3 Constructivist evaluation issues
  • No one right answer
  • Demonstrates complexity of issues
  • Highlights conflicts of interests
  • Interesting for academics
  • Difficult for practitioners to resolve
6. A contingent approach to evaluation (Legge, 1984)
  • Do you want the proposed change programme to be evaluated? (Stakeholders)
  • What functions do you wish its evaluation to serve? (Stakeholders)
  • What are the alternative approaches to evaluation? (Researcher)
  • Which of the alternatives best matches the requirements? (Discussion)
7. Action research
  • Aims to identify good practice (Reason & Bradbury, 2001). Action research:
    • Responds to practical issues in organizations
    • Engages in collaborative relationships
    • Draws on diverse evidence
    • Value orientation - humanist
    • Emergent, developmental
Problems with realist models
  • Tendency towards managerialism
  • Over-commitment to scientific paradigm
  • Context stripping
  • Over-dependence on measures
  • Coerciveness: truth as non-negotiable
  • Failure to accommodate value pluralism
  • Every act of evaluation is a political act: it is not tenable to claim it is value-free
Problems with Constructionist approach
  • Who judges the evaluation, for whom, and in whose interests?
  • After identifying different views, then what?
  • Who has power?
  • Leaves decisions open
  • May lead to ambiguity
Why not evaluate?
  • Expensive in time and resources
  • De-motivating for individuals
  • Contradiction between “scientific” evaluation models and supportive, organizational learning models
  • Individual identification with activity
  • Difficulties in objectifying and maintaining commitment
  • ‘Off the shelf’ external evaluation may be inappropriate and unhelpful
Why evaluate? (Legge, 1984)

Overt reasons:
  • Aid decision making
  • Reduce uncertainty
  • Learn
  • Control

Covert reasons:
  • Rally support/opposition
  • Postpone a decision
  • Evade responsibility
  • Fulfil grant requirements
  • Surveillance
Conclusion
  • Evaluation is very expensive, demanding and complex
  • Evaluation is a political process: need for clarity about why you do it
  • Good evaluation always carries the risk of exposing failure
  • Therefore evaluation is an emotional process
  • Evaluation needs to be acceptable to the organization
Conclusion 2
  • Plan and decide which model of evaluation is appropriate
  • Identify who will carry out the evaluation and for what purpose
  • Do not overload the evaluation process: judgment or development?
  • Evaluation can give credibility and enhance learning
  • Informal evaluation will take place whether you plan it or not