Carbon Wars

Published: January 29, 2018
Updated: February 6, 2018, 1:34 p.m. ET

Most of the EPA’s pollution estimates are unreliable. So why is everyone still using them?

Downtown Houston shrouded in smog in 2008. (David J. Phillip/AP Photo)

Emission factors used to paint a broad picture of air quality often underestimate pollution, evidence shows

Introduction

Engineer Jim Southerland was hired by the U.S. Environmental Protection Agency in 1971 to join the nascent war on air pollution. He came to relish the task, investigating orange clouds from an ammunition plant in Tennessee and taking air samples from strip mines in Wyoming. Among his proudest accomplishments: helping the agency develop a set of numbers called emission factors — values that enable regulators to estimate atmospheric discharges from power plants, oil refineries, chemical plants and other industrial operations.

By the time Southerland left the EPA in 1996, he was “frustrated and ticked off,” he says, because the numbers he had helped develop were being misused. The original aim had been to paint a broad-brush picture of pollution. Instead, the numbers — meant to represent average emissions from industrial activities — were incorporated into permits stipulating how much pollution individual facilities could release. This happened despite EPA warnings that about half of these sites would discharge more than the models predicted. “These factors were not intended for permits,” says Southerland, now retired and living in Cary, North Carolina.

“This is what tells you what’s being put in the air and what you’re breathing. You don’t want those numbers to be wrong.”

Eric Schaeffer, executive director of the Environmental Integrity Project

Emission factors have proliferated since Southerland’s time; the EPA now uses 22,693 of them. The agency itself admits most are unreliable: It rates about 62 percent as “below average” or “poor,” and nearly 22 percent aren’t rated at all. Only about 17 percent earned grades of “average” or better, and just one in six has ever been updated. Common problems abound, such as poor accounting for emissions from aging equipment.

The upshot: in some cases, major polluters are using flawed numbers to calculate emissions of substances such as benzene, a carcinogen, and methane, a powerful greenhouse gas. Regulators at times are flying blind. The factors color everything we know about air quality and many of the decisions the EPA and state environmental agencies make, from risk assessment to rulemaking.

In an email, an EPA spokeswoman told the Center for Public Integrity that the agency has been working on the problem for a decade. “EPA believes it is important to develop emissions factors that are of high quality and reliable,” she wrote.

Some experts, however, say the agency hasn’t done enough. The unreliability of the numbers has been flagged over a period of decades by the EPA’s own internal watchdog and other government auditors. “This is what tells you what’s being put in the air and what you’re breathing,” says Eric Schaeffer, former head of civil enforcement at the EPA and now executive director of the Environmental Integrity Project, an advocacy group. “You don’t want those numbers to be wrong.”

Accuracy questions

Emission factors are based on company and EPA measurements as well as external studies. They are plugged into equations to estimate total emissions from industrial activities, such as the burning of coal in boilers.
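The underlying arithmetic is simple, which is part of the appeal. AP-42, the EPA’s master compilation, expresses the general relationship as E = A × EF × (1 − ER/100): emissions equal an activity rate times a factor, discounted for any pollution controls. The short Python sketch below illustrates that equation; the boiler throughput, factor and control efficiency shown are hypothetical numbers chosen for illustration, not actual AP-42 values.

```python
# A minimal sketch of the general AP-42 estimation equation:
#   E = A * EF * (1 - ER / 100)
# where A is an activity rate (e.g., tons of coal burned), EF is the
# emission factor (mass of pollutant per unit of activity) and ER is
# the percent efficiency of any control device.

def estimate_emissions(activity_rate, emission_factor, control_efficiency_pct=0.0):
    """Estimated emissions, in the mass units of the emission factor."""
    return activity_rate * emission_factor * (1 - control_efficiency_pct / 100)

# Hypothetical example: a boiler burning 10,000 tons of coal a year,
# a factor of 38 pounds of sulfur dioxide per ton of coal, and a
# scrubber assumed to remove 90 percent of the pollutant.
annual_so2_lb = estimate_emissions(10_000, 38, 90)
print(f"Estimated SO2: {annual_so2_lb:,.0f} lb/year")  # -> 38,000 lb/year
```

The same multiplication, repeated across thousands of facilities, is what feeds the inventories described below; if the factor is wrong, every estimate built on it is wrong, too.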

As early as the 1950s, regulators in places like Los Angeles were using emission factors to try to pinpoint the origins of dangerous smog episodes. The numbers allowed them to avoid “time-consuming, expensive testing programs and extensive surveys of individual sources,” according to a 1960 paper by the Los Angeles County Air Pollution Control District.

In 1965, the U.S. Public Health Service — which regulated air pollution at the time — released its first comprehensive list of factors, a document the agency would label “AP-42” in a 1968 update. The EPA, created two years later, kept revising the estimates as they became more widely used in emission inventories depicting pollution levels and sources around the country.

The EPA knew early on there were problems with the numbers. In 1989, for example, the Office of Technology Assessment — a now-defunct, nonpartisan science adviser to Congress — reported many U.S. metropolitan areas had not met their goals for controlling smog-forming ozone in part because of inaccurate emission inventories. In 1990 amendments to the Clean Air Act, Congress gave the agency six months to make sure all emissions contributing to ozone formation were assigned up-to-date, accurate factors, and directed the EPA to review the numbers every three years thereafter.

The EPA missed both deadlines. It has failed to do at least some of the three-year reviews. It claims to have created all the necessary ozone-related factors, but questions about their accuracy remain.

For decades, government watchdogs, including the EPA’s Office of Inspector General, have pointed out deficiencies in the factors, which drive actions ranging from enforcement cases to the drafting of regulations. “We believe the status of emission factor development … is a significant weakness that impedes achievement of major air program goals,” the IG wrote in a 1996 report. The EPA’s dependence on industry studies because of funding constraints could result in factors that minimized pollution, it warned. The U.S. General Accounting Office — now the Government Accountability Office — reported in 2001 that polluters rely on the estimates even though “facilities’ actual emissions can, and do, vary substantially from the published factors.” The EPA’s IG came back with a targeted reproach in 2014, questioning the validity of factors used to estimate methane emissions from some pipelines.

Still, there was little movement. Though emission factors are recognized as crucial tools for understanding air quality and underpinning inventories, they tend to be forgotten. “That foundation is buried to such an extent that it’s not often appreciated,” says David Mobley, who worked on emission factors in the 1990s. “The urgency is rarely there.”

Vehicles travel along Highway 225 near Shell Oil Company’s refinery and petrochemical facility in Deer Park, Texas. (David J. Phillip/AP File Photo)

Test case in Houston

Accurate pollution data matters. Consider what happened in the ozone-plagued city of Houston, a hub of oil refining and chemical manufacturing.

The city had been using emission inventories to guide its ozone-control strategy. Air monitoring by researchers in 2000 found that levels of volatile organic compounds, or VOCs — highly reactive ozone precursors such as benzene — were 10 to 100 times higher than previously estimated. The study — conducted by what was then the Texas Natural Resource Conservation Commission, the EPA and more than 40 other public, private and academic institutions — singled out VOCs such as ethylene, a flammable gas used mainly in the production of plastics, as culprits.

Houston, it turned out, had focused on controlling the wrong emissions from the wrong sources to lower its ozone levels, says Daniel Cohan, an associate professor of environmental engineering at Rice University. The city changed course, expanding VOC monitoring and developing rules to reduce emissions. Ozone production rates dropped by up to 50 percent in six years, Cohan and his colleagues found in a follow-up study. The study showed that reliance on emission factors alone is a bad idea, Cohan says. “We need scientists to measure these pollutants in the air to find out how much is really being emitted,” he said.

The underestimation problem surfaced at individual facilities as well, including Shell’s 1,500-acre petrochemical complex in the Houston suburb of Deer Park. A study begun by the City of Houston and the EPA in 2010 showed levels of benzene wafting from one Shell tank were 448 times higher than what the relevant emission factor had predicted. The discrepancy led to an EPA enforcement action; in a consent decree, Shell agreed to pay a $2.6 million fine and spend $115 million to control pollution from flaring — the burning of gas for economic or safety reasons — and other activities. Shell did not respond to requests for comment, but a spokeswoman told the Houston Chronicle in 2013 “the provisions of the settlement are consistent with Shell Deer Park’s objectives and ongoing activities to reduce emissions at the site and upgrade our flaring infrastructure.”

Despite the findings of these studies and others, the EPA didn’t update emission factors for the U.S. refinery and petrochemical sector until 2015, seven years after Houston petitioned the agency to do so and two years after environmental justice groups sued the agency.

Unreliable methane estimates

The low-balling of pollution isn’t limited to toxic chemicals. Many emission factors used to estimate releases of methane — a potent greenhouse gas associated with oil and natural-gas development — are “far too low,” says Robert Howarth, an ecology and environmental biology professor at Cornell University. Identifying how much methane these operations discharge can help scientists calculate the impact of natural gas — which in 2016 displaced coal as the nation’s biggest source of electric power generation — on global warming. This is crucial to preventing “runaway climate change,” Howarth says.

Much remains unknown. A 2015 study sponsored by the Environmental Defense Fund found methane releases from oil and gas production and processing in the Barnett Shale Formation in northern Texas were 90 percent higher than what the EPA’s Inventory of U.S. Greenhouse Gas Emissions had estimated.

About a third of the factors used to estimate pipeline leaks and other natural-gas emissions in the most recent inventory, for 2015, are based on a 1996 study by the EPA and an industry group then known as the Gas Research Institute. The EPA’s IG found in 2014 “there was significant uncertainty in the study data,” meaning the EPA’s assumptions on the amount of methane that spews from pipelines “may not be valid.”

The harm caused by faulty estimates extends beyond oil and gas. An emission factor designed to estimate ammonia releases from poultry farms, for example, “is probably far too low,” according to a report by the Environmental Integrity Project. These emissions contribute to problems like algae blooms, which can spread rapidly and kill marine life in waterways like the Chesapeake Bay.

‘Pandora’s box of problems’

The EPA, according to its spokeswoman, has begun executing a plan to improve the science that underlies emission factors and review the estimates more frequently. Among the changes: some companies now must report pollution data electronically to the agency.

The Trump administration proposed slashing the EPA’s budget by 31 percent for fiscal year 2018, although Congress has so far extended existing funding levels through a series of short-term resolutions. Progress on emission factors will hinge on “available resources,” the EPA spokeswoman wrote in an email, declining to specify a deadline for the project.

The agency said it does not plan to restrict emission factors to their original purpose — informing pollution inventories. That means, for example, that the numbers will still be used in permits.

Many in industry are fine with that. When the EPA asked in a 2009 Federal Register notice for suggestions on how to improve the system, companies from electric power generators to auto manufacturers argued for the status quo, saying emission factors were sometimes their only data option. Trade groups like the American Petroleum Institute and the American Chemistry Council argued their members should not be penalized if the EPA discovered a deficient factor had caused a permit to underestimate pollution. API said it worried that additional industry data supplied to the EPA to help it improve the numbers “could be misused for enforcement or other purposes.” Neither group responded to requests for comment.

Public health advocates, on the other hand, want more. Some companies game the system to avoid EPA permitting fees and civil penalties, says Neil Carman, clean air director for the Lone Star Chapter of the Sierra Club in Austin. “We don’t know what the emissions really are,” he says. “It’s a real Pandora’s box of problems.”

Carman and other advocates say they understand emission factors will have to be used in some circumstances, and that some types of pollution can be estimated with reasonable accuracy. They also maintain, however, that air monitoring should be more widely deployed. “Where you can do direct monitoring of emissions, that should be required,” says Schaeffer, of the Environmental Integrity Project.

Schaeffer faults the EPA for giving some companies an out. It allows operators of power plants, for example, to choose between continuous monitoring to measure fine particles and a combination of quarterly testing and emission factors. Some of these plants already have monitoring systems installed, Schaeffer says, but “it’s easier to mask noncompliance using emission factors.”

Shining a ‘bright light’ on pollution

California’s Bay Area Air Quality Management District changed its approach after studies showed leaks from oil refineries in the area — known as fugitive emissions — were likely underrepresented in emission factors. “We decided, based on that information, that we needed additional ways to better identify fugitive emissions and to shine a bright light on those fugitive emissions,” says Eric Stevenson, the district’s director of meteorology, measurement and rules.

In 2016, the district urged refineries to install “open path” monitoring systems — which use beams of light to detect the presence of gases like benzene — and make the data available to the public in real time. Chevron installed such a system on the perimeter of its refinery in Richmond, California, in 2013.

The company didn’t respond to specific questions about the monitoring but said its focus “on running the refinery efficiently and investing in new technologies” has significantly reduced air pollution since the 1970s. Denny Larson, executive director of the Community Science Institute-CSI for Health and Justice, an environmental group that helps the public test for pollution, says the system in Richmond shows levels of chemicals in the air at a given moment and can alert residents to emission spikes that can trigger asthma attacks and other serious health problems.

“It’s showing lots of pollution has been flying under the radar that’s extremely toxic and problematic,” Larson says. “We can prove what we’ve always known.”
