MANAGING OD PROCESS & APPROACH

DIAGNOSIS/DISCOVERY
  • Readiness for Change
    • LO of OD are appropriate
    • Culture open to change
    • Key people
  • Layers of Analysis
    • Symptoms of problems
  • Political Climate
  • Resistance to Sharing Information
  • Interview as Joint Learning Event; change has begun
    • Pursue issues early on, don’t shy away
FEEDBACK
  • Funnel data into actionable items
  • Present the personal and organizational data on which recommendations can be based and implemented
  • Manage and control the feedback meeting
  • Focus on the present and on how the client is managing and dealing with the feedback
  • Don’t take reactions personally; it’s hard to own up to problems

INTERVENTION
  • Do not implement fads for fad’s sake
  • Interventions address the diagnosis
    • Depth of intervention should go only to the needed level
    • Be careful not to appease clients; some risk-taking may be necessary
  • Engage in top-down vs. bottom-up interventions
  • More participation than presentation
  • Allow for difficult situations to surface
  • Commitment to solution through choices
  • Dialogue on responsibility, purpose, meaning, & opportunities
  • Physical environment of intervention
PITFALLS
  • Client commitment to change
  • Power to influence change
  • Appeasing clients
  • Becoming the expert on content
  • Getting socialized into the organization’s culture and politics
  • Collusion/manipulated use of the practitioner
  • Providing confidential reports
  • Removing parts of reports so that others won’t know

ROLE MODELING
  • Self-awareness
  • Clear messages: words, feelings, & behaviors “fit”
  • Practice what you preach
  • Consultant team role models for organization’s teams
    • Communication
    • Roles
    • Goals
    • Action Research on OD process
  • Don’t model after the organization
SUBSTANCE & FEELINGS
  • Value the interpersonal relationship
  • Label feelings about the relationships
  • Verbalize data about relationships in order to reduce defensiveness

TERMINATING RELATIONSHIP
  • Deliverables include steps for ensuring client internalizes skills
  • End date in contract
  • Sense when assistance is no longer needed
    • Mourning of the old process was poorly facilitated (not ready for change)
    • Internal power struggles were not discovered early enough
    • Crises pulled the attention of key people away
      • Discovery: putting out fires vs. prevention
GUIDING PRINCIPLES OF OD PRACTITIONERS
  • Honesty
  • Openness
  • Voluntarism
  • Integrity
  • Confidentiality
  • Development of people
  • Development of consultant expertise
  • High standards
  • Self-awareness

NO ACTION WITHOUT RESEARCH, NO RESEARCH WITHOUT ACTION
  • Diagnosis: Collaborative process between organizational members and the OD consultant to collect pertinent information, analyze it, and draw conclusions for action planning and intervention.
  • Discovery: Consultant serves as “a guide through a process of discovery, engagement, & dialogue.”
  • Purpose: to mobilize action on a problem

DATA COLLECTION-FEEDBACK CYCLE
  • Core activities, in cycle order: Planning to Collect Data → Collecting Data → Analyzing Data → Feeding Back Data → Following Up

NEED FOR DIAGNOSTIC MODELS
  • Why are models important?
  • Insight: trust your intuition about where to spend data-collection time

OPEN SYSTEMS MODEL
  • Inputs: Information, Energy, People
  • Transformations: Social Component, Technological Component
  • Outputs: Goods, Services, Ideas
  • Feedback

PROPERTIES OF SYSTEMS
  • Inputs, Transformations, Outputs
  • Boundaries: limitations to the system
  • Feedback (i.e., information used to control future functioning) holds these parts together
  • Equifinality: different ways of achieving equally acceptable goals
  • Alignment: how well the various elements of the system support one another in achieving goals

UNIT OF ANALYSIS
  • Organization
  • Group
  • Individual
  • Can we cross levels of analysis when conducting research? E.g., can we study organizational effectiveness and presume that the findings are applicable to the individual level?

DIAGNOSIS AT ORGANIZATIONAL LEVEL
  • Intergroup processes
  • Culture
  • Technology in place
  • Structure of social system

DIAGNOSIS AT GROUP LEVEL
  • Group processes (e.g., communication)
  • Leadership
  • Team development and problem-solving

THINK ABOUT IT…
  • How would you determine these areas for improvement?
  • What methods would you use to diagnose areas for improvement/change?

4 METHODS FOR ORGANIZATIONAL DIAGNOSIS
  • Observations
  • Records
  • Interviews
  • Questionnaires

OBSERVATIONS
  • Block, p. 203: how are you treated?
  • Advantages
    • Real not symbolic behavior (no self-report bias)
    • Reveal patterns of individual behavior and interpersonal and group behaviors (e.g., in meetings)
    • Real-time behaviors, not distorted remembrance
    • Vary in degree of structure
    • Highly structured reduces interpretation bias
  • Disadvantages
    • Highly structured restricts potential information
    • Expensive
    • Obtrusive
    • Time-consuming
RECORDS
  • Documents, accounts, journals, legal & regulatory policies, newspapers, etc.
  • Advantages
    • Hard data (e.g., absenteeism, production, turnover)
    • Can be unobtrusive
    • Generally free from bias
    • Inexpensive
  • Disadvantages
    • Not always easy to retrieve
    • Poor quality
    • Errors of coding or interpretation
    • May violate informed consent
INTERVIEWS
  • Advantages
    • Structure & formality differ
    • Conduct with individuals or focus groups (SME)
    • Data are rich
    • Establish rapport with participants
    • Frank and honest replies
  • Disadvantages
    • Subject to bias from self-reports of participants and interpretations of interviews
    • Expensive (because of the amount of time consumed)
QUESTIONNAIRES
  • Advantages
    • Typically structured
    • Perceptual and attitudinal data – aggregated at group and organizational levels (see the sketch after this list)
    • Psychological tests – individual level
    • High reliability (if standardized)
    • Opportunity to construct norms
    • Custom-tailored to gather specific information from a company
    • Compare companies on a specific survey
    • Distribution to large random sample
    • Inexpensive
    • Easy to administer and score
  • Disadvantages
    • People often recycle surveys that are not applicable to other organizations
    • No opportunity to build rapport or provide explanation
    • Self-report bias
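Since questionnaire items are typically aggregated from individual responses up to the group and organizational levels, here is a minimal Python sketch of that aggregation; the survey items, groups, and ratings are hypothetical.

```python
from statistics import mean

# Hypothetical responses: each row is one employee's 1-5 ratings on two survey items.
responses = [
    {"group": "Sales",       "communication": 4, "workload": 2},
    {"group": "Sales",       "communication": 3, "workload": 3},
    {"group": "Engineering", "communication": 5, "workload": 4},
    {"group": "Engineering", "communication": 4, "workload": 5},
]

def aggregate(responses, item):
    """Average an item's ratings within each group, then across the organization."""
    by_group = {}
    for row in responses:
        by_group.setdefault(row["group"], []).append(row[item])
    group_means = {g: mean(vals) for g, vals in by_group.items()}
    org_mean = mean(row[item] for row in responses)
    return group_means, org_mean

print(aggregate(responses, "communication"))  # per-group means plus the organizational mean
```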
WEISBORD SIX-BOX MODEL
  • Purposes: What business are we in?
  • Structure: How do we divide up the work?
  • Rewards: Do all needed tasks have incentives?
  • Helpful Mechanisms: Have we adequate coordinating technologies?
  • Relationships: How do we manage conflict among people? With technologies?
  • Leadership: Does someone keep the boxes in balance? How are the various components (and presenting problems) managed?

REMEMBER:
  • What must we look for when diagnosing an organization, group(s), or individuals?
    • Positives and Negatives
    • Goals of each unit of analysis
ORGANIZATION-LEVEL DIAGNOSTIC MODEL
  • Inputs: General Environment, Industry Structure
  • Design Components: Strategy, Technology, Structure, Measurement Systems, HR Systems, Culture
  • Outputs: Organization Effectiveness

ORGANIZATIONAL LEVEL
  • Inputs
    • General environment: (in)direct forces; Social, technological, ecological, economic, political factors?
    • Industry structure: Customers, rivalry?
  • Design Components
    • YDS’s strategy (i.e., vision, mission, goal)
    • Technology, structure, measurement systems, and HR systems
    • School’s culture
  • Outputs
    • Financial performance: profits, profitability
    • Productivity: cost/employee, error rates, quality
    • Efficiency
    • Stakeholder satisfaction: employee satisfaction, compliance
  • Assessment
    • How good is the fit between the inputs and the design components?
    • How well do the design components align with one another?
GROUP-LEVEL DIAGNOSTIC MODEL
  • Inputs: Organization Design
  • Design Components: Goal Clarity, Task Structure, Team Functioning, Group Composition, Group Norms
  • Outputs: Team Effectiveness

GROUP LEVEL
  • Design Components
    • Goal Clarity: Objectives understood
    • Task structure: the way the group’s work is designed
    • Team functioning: quality of group dynamics among members
    • Group composition: Characteristics of group members
    • Group norms: unwritten rules that govern behavior
  • Outputs
    • Service Quality
    • Team Cohesiveness: commitment to group and organization
    • Member satisfaction/QWL
  • Assessment
    • How good is the fit between the inputs and the design components?
    • How well do the design components align with one another?
INDIVIDUAL-LEVEL DIAGNOSTIC MODEL
  • Inputs: Organization Design, Group Design, Personal Traits
  • Design Components: Skill Variety, Task Identity, Task Significance, Autonomy, Feedback about Results
  • Outputs: Individual Effectiveness

INDIVIDUAL LEVEL
  • Inputs
    • Design of the larger organization within which the individual jobs are embedded
    • Design of the group containing the individual jobs
    • Personal characteristics of jobholders
  • Job Dimensions
    • Skill variety: range of activities and abilities required for task completion
    • Task identity: Ability to see a “whole” piece of work
    • Task significance: impact of work on others
    • Autonomy: amount of freedom/discretion
    • Feedback about results: knowledge of task performance outcomes
  • Outputs
    • Employees’ attitudes and feelings toward YDS
    • Performance; absenteeism; personal development (growth)
  • Assessment
    • How good is the fit between the inputs and the job design components?
    • How well does the job design fit the personal characteristics of the jobholders?
GOOD TO KNOW
  • 95% of OD interventions use questionnaires and interviews
  • 80% rely on the consultant’s judgment

SAMPLING
  • How many people?
    • Size
    • Complexity
    • Quality of sample
    • Limiting resources
  • How do you select?
    • Random sample: each member, behavior, or record has an equal chance of being selected
    • Stratified sample: population members, events or records are segregated into subpopulations and a random sample from each subpopulation is taken
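A minimal Python sketch contrasting the two selection schemes; the employee roster, strata, and sample sizes are hypothetical.

```python
import random

# Hypothetical roster: (employee_id, department) pairs for a 100-person organization.
roster = list(enumerate(["Sales"] * 50 + ["Engineering"] * 30 + ["HR"] * 20))

def simple_random_sample(population, n, seed=42):
    """Random sample: every member has an equal chance of being selected."""
    return random.Random(seed).sample(population, n)

def stratified_sample(population, fraction, key, seed=42):
    """Stratified sample: segregate members into subpopulations by `key`,
    then draw a random sample from each subpopulation."""
    rng = random.Random(seed)
    strata = {}
    for member in population:
        strata.setdefault(key(member), []).append(member)
    sample = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sample.extend(rng.sample(members, k))
    return sample

print(simple_random_sample(roster, 10))                     # 10 people, ignoring departments
print(stratified_sample(roster, 0.10, key=lambda m: m[1]))  # roughly 10% from each department
```

The stratified draw guarantees that small departments still appear in the sample, which a simple random draw cannot promise.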
TECHNIQUES FOR ANALYZING DATA
  • Qualitative tools
    • Content Analysis: identify major themes (see the sketch after this list)
    • Force-Field Analysis (FFA)
      • Assumes the current condition is the result of opposing forces (forces for change and forces for maintaining the status quo)
  • Quantitative tools: numbers and graphs
    • Survey Feedback Programs (SFP)
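As a rough illustration of content analysis, the sketch below tallies how often hypothetical theme keywords appear across interview responses; in practice the coding scheme emerges from the data rather than being hard-coded.

```python
from collections import Counter

# Hypothetical coding scheme: theme -> keywords that signal it.
THEMES = {
    "communication": ["communicate", "information", "meeting"],
    "workload": ["overtime", "deadline", "understaffed"],
    "leadership": ["manager", "direction", "support"],
}

def code_responses(responses):
    """Count how many responses touch on each theme."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for theme, keywords in THEMES.items():
            if any(word in lowered for word in keywords):
                counts[theme] += 1
    return counts

responses = [
    "We never get information until after the deadline has passed.",
    "My manager gives clear direction and plenty of support.",
]
print(code_responses(responses).most_common())  # major themes, most frequent first
```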
7 STEPS OF FORCE FIELD ANALYSIS
  • Identify problem
  • Describe desired condition
  • Identify forces operating in current forcefield: driving and restraining forces
  • Examine each force for its strength, its influence, and whether it is under your control
  • Add driving forces, remove restraining forces; develop action plans
  • Implement action plans
  • What actions must be taken to stabilize the equilibrium at the desired conditions?
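A minimal sketch of steps 3–5: list the driving and restraining forces with strength ratings (the forces and scores below are made up), compare the totals, and flag the restraining forces an action plan might target first.

```python
# Hypothetical force field for one desired condition, scored 1 (weak) to 5 (strong).
driving = {"executive sponsorship": 4, "customer complaints": 3, "new technology available": 2}
restraining = {"fear of layoffs": 5, "unclear goals": 3, "past failed initiatives": 2}

def net_force(driving, restraining):
    """Positive: forces for change outweigh forces maintaining the status quo."""
    return sum(driving.values()) - sum(restraining.values())

def strongest_restrainers(restraining, top=2):
    """Candidate targets for action plans: weaken or remove these first."""
    return sorted(restraining.items(), key=lambda kv: kv[1], reverse=True)[:top]

print("Net force:", net_force(driving, restraining))        # 9 - 10 = -1: status quo is winning
print("Target first:", strongest_restrainers(restraining))  # fear of layoffs, unclear goals
```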
ENGAGING IN FFA
  • Who is the intervention agent?
  • How is it known that change is needed?
  • What change is needed?
  • What technology or activities are used?
  • How will this technology succeed in reaching the goals?
  • How will it be known if the goals are reached?
  • After success, then what? How long does the effect go on?

ENGAGING IN SFP
  • Who is the intervention agent?
  • How is it known that change is needed?
  • What change is needed?
  • What technology or activities are used?
  • How will this technology succeed in reaching the goals?
    • Top management
    • Data must be collected from all
    • Data are fed back from the top down
    • Data are discussed
    • Subordinates help interpret data
    • Plans are made for changes
    • Plans for introducing data to lower levels
    • Consultant serves as a resource
ENGAGING IN SFP
  • Characteristics of Effective Data
    • Data must be seen as valid
      • Relevant
      • Understandable
      • Descriptive
      • Verifiable
    • Group must accept responsibility
      • Significant
      • Comparative
    • Group must be committed to problem solution
      • Timely
      • Limited
      • Unfinalized
ENGAGING IN SFP
  • How will it be known if the goals are reached?
  • After success, then what? How long does the effect go on?

FIVE STEPS TO SFP
  • Members of the organization are involved in preliminary planning of the survey.
  • The survey instrument is administered to all members of the organization/department.
  • The OD consultant analyzes the data, tabulates results, suggests approaches to diagnosis, and trains the client to lead the feedback process with lower-level employees.
  • Data feedback begins at the top and cascades downward; only the information pertinent to each level is discussed.
  • Work with the data during feedback meetings: discuss strengths and weaknesses; develop action plans.

LIMITATIONS OF SFP
  • Ambiguity of purpose
  • Distrust (anonymity; confidentiality)
  • Unacceptable topics
  • Organizational disturbances: a survey alone can pique respondents’ expectations of change or surface issues that need to be resolved but that management will not resolve.
  • SFPs are the most widely used method, but they work best when augmented by other mechanisms.
  • People are inundated with surveys, which can make them ineffective.