Affordable Comfort Indianapolis: Assessing Energy Education Effectiveness
Author, “Poverty and the Public Utility”
Involved in notable assistance programs
Consultant - Service delivery infrastructure Program design and evaluation
Utility billing systems
Presented by: Kevin Monte de Ramos
May 18, 2005
Public Policy Analysis
Collaborative Strategic Planning
Where to start? The beginning, I suppose …
-- Sumerian Law Clerk, 1758 BC
The first office jerk to suffer the dire consequence of poor judgment and bad humor
“We already know energy education is effective.”
Q. Really, how do you know?
A. It has been studied before!
“We prefer to invest our money in program services.”
Q. Sounds smart. Which services will you invest in?
A. Energy education. We want to develop an in-home video.
Q. Nice. Are your customers asking for the video?
A. No, but they need help with programmable thermostats, filters, etc.
Q. Really, how did you choose the content for your video?
A. We identified these problems in field and through follow-up surveys.
Q. So you verify customer behavior and track program outcomes?
A. Sure, evaluations are mandated by our state commission.
“Sorry, we just don’t have the funds available for evaluation.”
Q. Would it help if we could provide some program statistics for your fund-raising activities?
A. Sure! Are you able to do that for us?
Learning/teaching activities, often interdisciplinary in nature, that focus on such topics as energy, resources, conversions, conservation, forms, uses, and issues – including both general and technical educational programs.
-- Thesaurus of Environmental Education Terms
Environmental Education and Training Partnership
Encourage measure installation, energy conservation behavior, increase awareness
Increase energy awareness, conduct home energy survey, influence head of household
Influence purchasing decisions, lower energy demand and use, transform market.
Increase energy awareness, promote energy efficiency programs, increase brand value
Encourage measure installation, energy conservation behavior, increase awareness
Payment compliance, timeliness, total dollars
Lower home energy use, encourage energy efficient behavior, heating system maintenance
Influence organizational decisions, impact energy demand and use
Legislative action, public support
Technology Adoption Programs
Energy Efficiency Lobby Efforts
WHERE YOU FIND THESE PROGRAM COMPONENTS
Utility-sponsored low-income assistance
State Weatherization Assistance Programs
Often associated with private hardship funds
Often linked to LIHEAP and other state cash grant assistance
Frequently used by REACh applicants/award recipients
All impact utility operations
All encourage participant behavioral modification
Services offered with an expected outcome
Frequently require third party assessment/verification
WHY WE ARE TALKING ABOUT THEM
Performance metrics still under development
Legislatively capped at 5% of LIHEAP expenditures
Cost effectiveness has been challenged
Education components difficult to study
“An Annotated Bibliography of Research-verified Energy Education Programs”
Version 2 – July 1994 from the Professional Assn of Consumer Energy Education
1984, Timothy Dunsworth attributed 4.3% savings to low-cost weatherization training provided by the Minneapolis Energy Office via the Neighborhood Energy Workshop program
1987, Tom Lent estimated a 7% incremental effect resulting from an in-home energy education visit by the Energy Coordinating Agency of Philadelphia via PA WAP
1989, Patti Witti and Martin Kushler found a similar 7% impact via Michigan's Low-Income
Weatherization Energy Education and Incentives Program
1991, Marilee Harrigan of the Alliance to Save Energy found an 8% incremental effect when 3 in-home education visits were added to PECO’s load management program
SELECTED FINDINGS (Cont’d)
1993, Marialena Selvaggio found a small but significant advantage of 'high intensity' education services over the 'low intensity' and 'medium intensity' offerings.
1992, Judy Gregory studied recipients of Ohio's Home Weatherization Assistance Program who participated in the Client Education Pilot Program (CEPP). She estimated the incremental effect of energy education to be 6.7%, while participants in the 1989 HWAP program realized just over a 3% incremental effect from energy education.
Taking both reports into consideration, Judy Gregory indicated that energy education without a follow-up visit may yield lower savings than programs that include a follow-up education visit.
1994, Financial Energy Management found modest, statistically insignificant decreases in energy use but significant increases in energy-efficient behavior resulting from tenant education in multi-unit, HUD-managed housing in Colorado.
SUMMARY OF APPROACHES USED
General approach was to look at energy use one year prior to and following participation in home weatherization programs.
Difference between groups participating in various levels of energy educational activities.
Educational effects ranged between 4% and 9% of pre-program consumption.
More recent studies suggest findings up to 12% of pre-program consumption levels.
Beginning to track energy efficient behaviors of customers.
Educational activities often associated with use of installed measures, such as timers, programmable thermostats, and weatherization kits.
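The pre/post comparison described above can be sketched as a simple calculation. The figures below are hypothetical, chosen only to illustrate how an incremental education effect is isolated from a weatherization-only comparison group:

```python
# Weather-normalized annual consumption (kWh) for two hypothetical groups.
# The education effect is the savings of the education group beyond the
# weatherization-only group, expressed as a share of pre-program use.

def pct_savings(pre, post):
    """Savings as a fraction of pre-program consumption."""
    return (pre - post) / pre

wx_only = pct_savings(pre=12000, post=10800)     # 10.0% from measures alone
wx_plus_ed = pct_savings(pre=12000, post=10200)  # 15.0% with education added

incremental_education_effect = wx_plus_ed - wx_only
print(f"{incremental_education_effect:.1%}")     # 5.0%, inside the 4%-9% range reported
```

In practice the studies cited above used a full year of billing data on each side of participation; this sketch only shows the arithmetic of the group comparison.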
PROBLEMS WITH THE APPROACHES USED
Educational effects inseparable from other program offerings
- Only small samples available to study
- Groups often interrelated
- Individual methods/messages cannot be studied
Measure effects confound educational effects
- Thermostat effects (default schedule, manual setback, customer specified)
- Timers (educational effect or just an untouched measure)
- Wx Kits (savings from installed measures or other client behaviors)
Participation effects confound educational effects
- Lighting (change in use patterns or simply more efficient bulbs)
- Interactive Effects (discussions with technicians vs. educational session)
- Raised Awareness (motivated to see savings following weatherization)
“…Evaluation reports have substantial design and implementation shortcomings that compromise the validity of [their] findings.”
“… other shortcomings … preclude an overall assessment of the projects’ effectiveness.”
“Every state project plan should be [able] to measure whether its activities are more cost-effective in the long term than energy assistance payments alone; none of the project evaluation reports provided such an analysis.”
“… most of the evaluation reports did not report lessons learned or best practices”
NOT WHAT WE WANT TO HEAR WHEN CONGRESS IS SCRUTINIZING ITS BUDGET EXPENDITURES AND 20% CUTS ACROSS THE BOARD ARE UNDER CONSIDERATION!!!
CHALLENGE NUMBER 1
If we wish to influence customer behavior, shouldn’t we first understand the process of behavioral change?
CHALLENGE NUMBER 2
Research suggests that information/education does NOT lead to behavioral changes! Are we to believe these findings?
CHALLENGE NUMBER 3
Positive change comes with a cost! Are they (our clients) ready
for change? Better yet, are we ready for change?
CHALLENGE NUMBER 4
Inertia is overcome only by an external force acting on an object already in motion! Where will this force come from? Evaluation firms are unlikely to abandon proven industry approaches. Will program managers consider approaches from other industries? Or must pressures first come from senior management, regulators, or government offices?
A Transtheoretical Approach: The Six Stages of Change
-- Drs. James Prochaska, John Norcross, and Carlo DiClemente
Recognized as one of the ‘most important developments’ in smoking cessation and health behavior change.
Widely adopted by the Centers for Disease Control for programs addressing AIDS and HIV.
Used by the National Health Service of Great Britain in their promotional campaigns against smoking.
Better than action-oriented programs at achieving long-term behavioral modification.
Proven to raise program participation rates and lower risk profiles of participants.
Environmental policy and energy efficiency programs also use promotional and informational campaigns across broad market segments
We struggle with program participation rates and long-term behavioral compliance.
Will be considered by Dr. Lori Megdal at IEPEC (August 2005) for use in assessing market transformation programs
KMDR Research will apply this approach within our industry to assess a broad range of educational programs, both processes and impacts.
Precontemplation: The stage before problem behaviors are acknowledged; therefore, no desire to change. Instead, the individual is frustrated and wants others to change.
Contemplation: The stage where the problem is first acknowledged and the individual begins to embrace the idea of a new self without the destructive behavior.
Preparation: The stage at which individuals begin planning specific actions, environmental adjustments, and modifications to their normal routines.
Action: A time when the first steps are being taken. Others begin to take notice, either assisting or resisting these newly adopted behaviors.
Maintenance: The stage whereby individuals fight relapsing to old behaviors by modifying their plans and developing new action items.
Termination: A stage where individuals are no longer tempted or have desires to return to problem behaviors.
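Because the six stages form an ordered progression, an evaluator can score each participant's position and track movement between assessments. This is an illustrative sketch of that bookkeeping, not part of the Prochaska/Norcross/DiClemente model itself; the scoring convention is an assumption:

```python
from enum import IntEnum

class Stage(IntEnum):
    # Ordered per the transtheoretical model's six stages of change
    PRECONTEMPLATION = 1
    CONTEMPLATION = 2
    PREPARATION = 3
    ACTION = 4
    MAINTENANCE = 5
    TERMINATION = 6

def stage_movement(before: Stage, after: Stage) -> int:
    """Positive = progress toward termination; negative = relapse."""
    return int(after) - int(before)

# A participant assessed at contemplation before the program and at
# action afterward has progressed two stages.
print(stage_movement(Stage.CONTEMPLATION, Stage.ACTION))  # 2
```

A metric like average stage movement per participant gives a program-level indicator that does not depend on isolating kWh savings.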
Theories: In this case, the scientific grounding upon which behavioral change is sought. Five major theories are relevant to change: psychoanalytic (Freud/Jung), humanistic/existentialism (Rogers/May), gestalt/experiential (Perls/Janov), cognitive (Ellis/Beck), behavioral (Skinner/Wolpe).
Processes: The strategies available to encourage and support behavioral modification. Broadly speaking, there are nine important processes.
Techniques: The specific tactics used within a given process. There is a limitless number of techniques, which adds to our confusion.
Stages: Identifiable mental states of individuals that adhere to an established model and follow well known patterns.
MUST WE ALL BECOME BEHAVIORAL PSYCHOLOGISTS? No, but we should be aware of the processes and when they are effectively used.
Addressing Session Objectives: Applying this approach to our efforts
Understanding the processes of change helps us understand whether or not the techniques employed by our educators will work.
Matching educational efforts with participants at varying stages of change can help us achieve greater cost-effectiveness. Educators employing techniques at appropriate times will be more effective than those who employ them in an untimely manner.
Can your educators identify the problem behaviors?
Are educators pointing out potential pitfalls and providing strategies to get around them?
Just how engaged are participants in the conversation? Are they emotionally involved? Have they abandoned their defense mechanisms?
In short, passive observation or ‘secret shoppers’ can be used to gauge the ability of the educator to effect behavioral modification.
How can you determine whether or not energy education is an effective component within your program?
Simply treating education as another energy saving measure will not resolve the question of cost-effectiveness. This is especially true when education involves the use/installation of other measures.
Before we can assign causative effects to energy education, we must first track the actions taken by our participants. New methods will be needed for this: sub-metering, follow-up surveys, and customer diaries.
By identifying the processes used within our programs, we can establish objective metrics to track program performance. Eventually, we can assign dollars to these metrics and quantify educational cost-effectiveness.
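Once a dollar figure is assigned to the education component and a savings estimate is attributed to it, the cost-effectiveness screen is straightforward. All figures below are illustrative assumptions, not program data:

```python
# Hypothetical cost-effectiveness screen: dollars spent on the education
# component per first-year kWh attributable to it.

education_cost_per_home = 150.0  # $ spent on in-home education (assumed)
pre_use_kwh = 12000              # pre-program annual consumption (assumed)
education_effect = 0.05          # incremental savings attributed to education (assumed)

kwh_saved = pre_use_kwh * education_effect
cost_per_kwh_saved = education_cost_per_home / kwh_saved
print(f"${cost_per_kwh_saved:.3f} per first-year kWh saved")  # $0.250
```

Comparing that ratio against the avoided cost of energy, or against the same ratio for installed measures, is one way to answer the cost-effectiveness question the slide poses.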
How do I know if I am connecting with my clients? How should we deal with varied learning styles, in the classroom and in the field?
Educators can monitor dialog and emotional states of participants to determine their level of influence. Dialog that moves from defense mechanisms to acknowledgement will serve as a positive indication. Similarly, emotional arousal puts customers into a receptive state. Mirroring of body language also suggests participant receptiveness.
Trainers have a more difficult task before them. Reading a group is like reading an individual. Acknowledging nods, uniform actions, and probative dialog are also demonstrative of participant receptiveness.
We do not need to be practicing behavioral psychologists to employ these methods. Many individuals can read persons instinctively. For the rest of us, there are many books available in the self-help/psychology sections of your favorite bookstore.
What are realistic savings expectations from client education?
There is no pat answer to this question. As indicated earlier, most studies conducted thus far do not adequately isolate the educational effects nor do they link achieved savings with specific behavioral changes. So, we have nothing to build upon.
We hear estimates ranging from 4% to 12% of pre-program consumption levels for the energy effect. Also, we have seen increased payment frequency for participants in arrearage forgiveness programs, resulting in larger annual contributions toward home heating.
One cause for concern is that only 20% of individuals seeking help with problems are ready to take action. If that statistic holds true for energy education, action-oriented participants would have to achieve savings 5 times that of stated program impacts. Is this level of savings realistic? We can calculate technical potential to validate the assumptions.
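The "5 times" figure above follows directly from the 20% readiness statistic: if all of a program's average savings come from the one-fifth of participants ready to act, that subgroup must save five times the program-wide average. A quick check, using the 4%-12% range cited earlier:

```python
# If only 20% of participants are ready to act, the savings achieved by
# that subgroup must be 1/0.20 = 5 times the program-wide average.

share_ready_to_act = 0.20

for program_avg in (0.04, 0.12):
    required_actor_savings = program_avg / share_ready_to_act
    print(f"program avg {program_avg:.0%} -> actors must save {required_actor_savings:.0%}")
# program avg 4% -> actors must save 20%
# program avg 12% -> actors must save 60%
```

Savings of 60% of pre-program consumption from behavior alone would exceed most homes' technical potential, which is why validating the assumptions against technical potential matters.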