The Evaluation, Measurement, & Verification Challenge
The Importance of Evaluating Energy Efficiency Program Effectiveness
Funding for energy efficiency (EE) has grown tremendously in recent years. A growing number of states have established energy efficiency resource standards (EERS) or renewable electricity standards (RES) that include EE components, prompting increased EE investment. According to the Consortium for Energy Efficiency, investment through the utility sector topped $8 billion in 2011. Some states provide financial incentives to utilities that achieve certain EE goals. The American Recovery and Reinvestment Act (ARRA) allocated $25 billion for EE projects, kick-starting numerous state and local programs. Private companies, public institutions and households invest in EE to reduce their energy bills, enhance their properties and improve environmental performance.
The increased financial and, in some cases, regulatory stakes have raised the level of scrutiny of EE projects and programs. Taxpayers, ratepayers, shareholders and property owners want to know if they are “getting their money’s worth” from investments in energy efficiency. Further, policy makers seek assurances that the programs are delivering energy savings and other benefits, such as air pollution and greenhouse gas reductions and enhanced electric grid reliability.
Determining Energy Savings
It is the role of evaluation, measurement and verification (EM&V) to answer these questions. This is a tough task. There is no way to connect a meter and measure energy not consumed because a given EE project was implemented. Instead, energy savings must be estimated using such approaches as field measurements, modeling, energy bill analysis, user surveys and assumed savings for certain equipment. EM&V practitioners must be attuned to issues of precision, accuracy, statistical significance, and cost. Often there are judgment calls. EM&V is technically complex, but it is an art as well as a science.
The EM&V practitioner must compare energy used after implementation of an EE measure with what would have been consumed without it. The EE measure may be installation of equipment or retrofit of a building, or it may be a behavioral measure, such as providing enhanced energy use and cost information and advice to utility customers or training for building operators. The practitioner must also differentiate the impacts of the EE measures from confounding factors such as weather and changes in business hours, production levels and building occupancy. For instance, how much of a building's reduced energy bill is from better windows rather than milder weather? Were changes in business hours considered in estimating savings from a lighting upgrade?
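One common way to construct the "what would have been consumed" baseline is a degree-day regression on monthly utility bills. The sketch below fits pre-retrofit consumption against heating degree days (HDD), then applies the fitted model to post-retrofit weather to estimate the counterfactual. All of the numbers are invented for illustration; real billing analyses involve more variables, data cleaning and statistical checks.

```python
# Sketch of weather-normalized billing analysis: regress monthly
# pre-retrofit energy bills on heating degree days (HDD), then use the
# fitted model to estimate what consumption would have been after the
# retrofit under the actual post-retrofit weather. All values hypothetical.

def fit_linear(x, y):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Twelve months of pre-retrofit data: HDD and metered kWh (hypothetical).
hdd_pre = [900, 750, 600, 300, 100, 0, 0, 0, 150, 400, 650, 850]
kwh_pre = [1450, 1300, 1150, 850, 650, 550, 550, 550, 700, 950, 1200, 1400]

base, slope = fit_linear(hdd_pre, kwh_pre)   # baseload + heating slope

# Post-retrofit: actual weather and actual metered use (hypothetical).
hdd_post = [950, 700, 580, 320, 90, 0, 0, 0, 140, 420, 700, 800]
kwh_post = [1150, 1000, 950, 700, 560, 480, 480, 480, 590, 780, 980, 1100]

# Counterfactual: what the old building would have used in this weather.
kwh_counterfactual = [base + slope * h for h in hdd_post]
savings = sum(kwh_counterfactual) - sum(kwh_post)
print(f"Estimated annual savings: {savings:.0f} kWh")
```

Note that the model attributes all weather sensitivity to heating; a real analysis would also handle cooling degree days, occupancy changes and statistical uncertainty in the fit.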
The interactions between installed efficiency measures can further complicate savings estimates. For example, compact fluorescent lamps (CFLs) emit less heat than equivalent incandescent bulbs, so summer air conditioning loads may decrease but winter heating needs may increase. EE projects may also affect behavior in different ways. One consumer may become more energy conscious, conserving more and applying additional EE measures. Another may raise the thermostat in a better insulated home or leave energy-efficient lights on longer.
Crediting Energy Savings
Beyond determining the amount of energy savings, evaluators may also have to attribute the savings to particular EE programs. Evaluations provide feedback that is useful in selecting and designing more effective programs. It is also increasingly important for utilities and others to demonstrate that their programs are achieving energy savings as they seek to comply with regulatory requirements and/or claim financial incentives. For example, hundreds of millions of dollars in performance-based compensation are at stake in California for that state's investor-owned electric utilities.
Here again, there are difficult questions and issues. Among these are free-ridership (would the program participant have performed the EE measure anyway, without the program?) and spillover (were savings induced beyond the program, such as a non-participant installing LED light bulbs after seeing a participating neighbor using them?). Multiple programs, sometimes run by multiple entities, may interact. For example, in some jurisdictions, multiple federal and state tax credits, utility rebates, state- or utility-subsidized energy audits, and loan and grant programs may be available to the consumer and affect their decisions. Also, it is easier to quantify the impacts of some types of projects and programs (such as direct equipment installation) than others (such as public service advertising or energy education in schools). Indeed, one concern is that over-emphasis on rigorously quantified results may lead to under-investment in public information, education and behavioral programs.
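Free-ridership and spillover are commonly folded into a net-to-gross (NTG) ratio that scales evaluated gross savings down (or up) to program-attributable net savings. The rates below are invented for illustration; real programs estimate them through participant surveys and econometric studies.

```python
# Hypothetical net-to-gross (NTG) adjustment. Gross savings are scaled to
# reflect free-ridership (participants who would have acted anyway) and
# spillover (savings induced outside the program). All values illustrative.

gross_savings_kwh = 1_000_000   # evaluated gross program savings (hypothetical)
free_ridership = 0.20           # 20% would have installed the measure anyway
spillover = 0.05                # 5% additional savings induced outside program

ntg_ratio = 1.0 - free_ridership + spillover
net_savings_kwh = gross_savings_kwh * ntg_ratio
print(f"NTG ratio: {ntg_ratio:.2f}, net savings: {net_savings_kwh:,.0f} kWh")
```

The simple additive form shown here is one convention; jurisdictions differ in how (and whether) they count spillover, which is itself a source of the inconsistency discussed below.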
Multiple Inconsistent Methods Are a Challenge
If different programs, states or other jurisdictions use different, inconsistent assumptions and approaches to EM&V, the credibility of programs and their energy savings claims may suffer.
There is a need for more consistent EM&V assumptions, methodologies and protocols, particularly with a possibility of future federal clean energy standards or greenhouse gas regulations that could allow crediting of EE program impacts. Also, some states allow interstate trading of energy savings credits to meet their EERS or RES requirements. Further, there is increasing interest in quantifying EE program savings for purposes of electric grid resource planning and for crediting avoided pollution emissions in air quality planning.
The table below illustrates how differing EM&V approaches, even for a simple case (replacing a living room incandescent light bulb with a 15-watt CFL), can make a big difference in claimed savings. The three states included in the table use different assumptions for per-lamp energy savings, hours of use, lifetime and other parameters, resulting in significantly different energy savings estimates. Depending on these assumptions, quite different levels of effort and cost are needed to achieve a given level of claimed energy savings.
Technical Resource Manual Comparisons: Energy Savings for a Living Room Installation of a 15 W CFL Replacing an Incandescent Light Bulb

The parameters compared across the three states' technical reference manuals:
- Change in watts per lamp
- Hours used per day
- Life in years of CFL
- Lifetime gross electricity savings (kWh)
- Lifetime net electricity savings (kWh)
- Lifetime net energy savings (kWh)
- Difference from California

*California Database for Energy Efficiency Resources (DEER). Note that the California example is for an average existing home in Oakland with central air conditioning and natural gas heating.
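The arithmetic behind such a comparison is simple: each assumption multiplies through, so small differences in inputs compound into large differences in claimed lifetime savings. The sketch below uses hypothetical placeholder values, not any state's actual TRM figures, to show the effect.

```python
# How a technical reference manual (TRM) turns per-lamp assumptions into
# lifetime savings. Parameter values are hypothetical placeholders, not
# drawn from any state's actual TRM; the point is that each assumption
# multiplies through, so small differences compound.

def lifetime_savings(delta_watts, hours_per_day, life_years, ntg_ratio):
    """Lifetime gross and net kWh savings for one replaced lamp."""
    gross_kwh = delta_watts / 1000 * hours_per_day * 365 * life_years
    return gross_kwh, gross_kwh * ntg_ratio

# Same 45 W reduction (60 W incandescent -> 15 W CFL), different assumptions:
for label, hours, life, ntg in [("State A", 2.5, 5, 0.8),
                                ("State B", 3.0, 7, 0.9)]:
    gross, net = lifetime_savings(45, hours, life, ntg)
    print(f"{label}: gross {gross:.0f} kWh, net {net:.0f} kWh")
```

With these invented inputs, a half hour more daily use, two extra years of lamp life and a higher net-to-gross ratio nearly double the claimed net savings per lamp, which mirrors the state-to-state spread the table describes.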
Improving EM&V and EE Program Credibility
EE program administrators, EM&V practitioners, utility regulators, and others have long known of these challenges, but attention has grown with the stakes.
The Alliance to Save Energy, in Scaling-Up Energy Efficiency Programs: The Measurement Challenge, recommends:
- that processes for EM&V design and review promote transparency and thorough debate over methods, data, and assumptions;
- improvement of those methods, data, and assumptions;
- increased consistency of methods and assumptions between regions and program types;
- enhancement of EM&V practitioner professional competency and integrity;
- management of stakeholder expectations of what EM&V can and cannot do; and
- reasonable budget allocations for EM&V.
Fortunately there are efforts underway that comport with the Alliance's recommendations. These include:
- the State and Local Energy Efficiency (SEE) Action Network’s EM&V work group, which develops relevant documents (including a revised Energy Efficiency Program Impact Evaluation Guide) and performs outreach;
- the Department of Energy’s Uniform Methods Project, which is developing model measurement and verification protocols for a number of EE measures;
- the EPA EM&V Webinar series;
- the Northeast Energy Efficiency Partnerships' (NEEP) EM&V Forum’s efforts to harmonize definitions and protocols among some of the Northeastern states;
- the Regional Technical Forum of the Northwest Power & Conservation Council;
- the Efficiency Valuation Organization (EVO) [developer of the International Performance Measurement and Verification Protocol (IPMVP)] and the Association of Energy Engineers (AEE), which offer training and certification for EM&V professionals;
- development of an M&V protocol for industry under the auspices of the DOE-supported Superior Energy Performance program; and
- open public information and participation provisions in some states’ EE program development and evaluation processes.
Federal and state policymakers need to be aware of these issues and attempts to advance the state-of-the-art. The Alliance will continue to work with stakeholders and policymakers on these issues.