In the immediate aftermath of the devastation of the Philippines by Typhoon Haiyan, the media predictably ran to weather and other scientific experts to ask what if any role climate change may have played in the storm that killed more than 5,200 people and left millions still homeless.
The first cautious answers were: not much. Higher sea levels caused by climate change may have contributed to higher storm surges, but overall, as is often the case, most experts said it is difficult to connect any single weather event to climate change.
But, in a radio interview Monday on Public Radio International’s “The World,” Kerry Emanuel, an atmospheric scientist at the Massachusetts Institute of Technology and an expert on tropical storms, took a second look at that view, which he himself had advanced two weeks earlier.
Working with a computer model used to forecast wind speeds in tropical storms, Emanuel and his colleagues at MIT compared weather conditions from the 1980s, before the current warming of the climate, with those forming the background for Haiyan.
What they found, he said, is “that the wind speeds are about ten percent larger now.”
Warmer surface temperatures provide more fuel for tropical storms, Emanuel said, “so that really corresponds to something like 30 to 40 percent more damage than the same exact event might’ve done had it occurred in the thermal environment of the 1980s.”
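The arithmetic behind that estimate follows a common rule of thumb in storm science: the destructive potential of wind grows roughly with the cube of wind speed, so a modest increase in speed compounds into a much larger increase in damage. A minimal sketch of that scaling (the cubic relation is an illustrative assumption, not Emanuel's exact model):

```python
def damage_ratio(speed_increase_fraction):
    """Relative damage from winds (1 + f) times stronger, assuming
    damage scales roughly with the cube of wind speed (a rule of
    thumb, not Emanuel's actual model)."""
    return (1 + speed_increase_fraction) ** 3

# A 10 percent increase in wind speed:
ratio = damage_ratio(0.10)
print(f"{(ratio - 1) * 100:.0f}% more damage")  # about 33% more
```

A 10 percent bump in wind speed yields roughly a third more destructive potential under this assumption, which lands inside the 30 to 40 percent range Emanuel cites.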
While still cautious about the connection between climate change and the storm, Emanuel said climate change is the prime suspect.
Emanuel also contends that the U.S. is less well prepared for a storm the size of Haiyan because we simply aren’t used to them, compared to the Philippines where multiple typhoons occur yearly. We increase our vulnerability by continuing to promote development in high-risk coastal areas.
To drive home the point, he did a side-by-side comparison of Haiyan and Hurricane Katrina, imposing Haiyan on the Gulf Coast where Katrina struck.
The Emanuel interview aired the same day as news broke of a new study suggesting that U.S. emissions of methane, the second most important greenhouse gas after carbon dioxide, may be significantly underestimated by major national and international inventories.
This is important because while methane emissions overall are many times smaller than those of carbon dioxide, methane traps far more heat per ton — 34 times as much over a 100-year period, according to a recent report from the Intergovernmental Panel on Climate Change.
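That 34-to-1 ratio is how inventories put the two gases on a common scale: a methane mass is multiplied by its global warming potential (GWP) to get a carbon-dioxide equivalent. A minimal sketch using the 100-year figure cited above:

```python
# 100-year global warming potential of methane, per the IPCC
# figure cited above.
GWP_CH4_100YR = 34

def co2_equivalent(methane_tonnes):
    """CO2-equivalent tonnage for a given mass of methane."""
    return methane_tonnes * GWP_CH4_100YR

print(co2_equivalent(1.0))  # 1 tonne of methane counts as 34 tonnes CO2e
```

So even a comparatively small error in the methane inventory translates into a much larger error in warming terms.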
The study released Monday, done by a group of researchers at seven universities and research institutes, found that methane emissions across the U.S. may be 1.5-1.7 times higher than currently estimated by the Environmental Protection Agency or the Emissions Database for Global Atmospheric Research, aka EDGAR, which is compiled by the European Commission.
The difference between those two inventories and the study lies in their methodologies. The EPA and EDGAR estimates are based on a bottom-up method, which uses set “emission factors,” fixed amounts of methane assumed to be released by specific sources such as livestock or fossil fuel mining and refining, two major sources of methane emissions.
The new study used a top-down approach, measuring methane emissions actually in the air in 2007-2008 and then tracing them back to regional sources using meteorological data and statistical analysis.
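The contrast between the two methods can be sketched in a few lines: a bottom-up inventory multiplies source counts by fixed emission factors, a top-down estimate starts from the total actually measured in the atmosphere, and dividing one by the other gives the kind of adjustment factor the study reports. The numbers below are placeholders for illustration, not the study's data:

```python
# Hypothetical numbers for illustration only; not the study's actual data.

# Bottom-up (EPA/EDGAR style): source counts times fixed emission factors.
emission_factors = {"livestock": 0.08, "fossil_fuel": 0.12}  # Tg CH4 per unit
source_counts = {"livestock": 100, "fossil_fuel": 120}

bottom_up = sum(emission_factors[s] * source_counts[s] for s in emission_factors)

# Top-down: total methane inferred from atmospheric measurements.
top_down = 34.0  # Tg CH4, a placeholder observed total

print(f"implied adjustment factor: {top_down / bottom_up:.2f}")
```

With these made-up inputs the implied factor comes out near 1.5, the low end of the range the researchers found; the study's own analysis additionally traced the measured totals back to regional sources.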
Breaking down the results even further, the researchers found that regional methane emissions from fossil fuel extraction and processing — which would include the hydraulic fracturing, or fracking, used for natural gas and shale oil — could be five times higher than estimated. Emissions from livestock may be twice as high.
Significantly, the EPA recently lowered its estimates for methane emissions from natural gas by 25 percent to 30 percent, as noted by one fossil fuel industry official quoted in a USA Today article on the study.
Steve Everley of Energy In Depth, a research group launched in 2009 by the Independent Petroleum Association of America, said the time frame of the study, 2007-2008, ignored recent advances in mining and refining technology. His other criticism was that the study’s mathematical modeling could only provide an “educated guess” of emissions’ starting points.
The two science stories were particularly striking coming one day after the conclusion of the United Nations conference on climate change in Warsaw, where after two weeks of heel-dragging discussions, the delegates late on Saturday approved a watered-down declaration committing everyone to as little as possible.
Moving toward a 2015 deadline for an international agreement on combating climate change, the final agreement has the world’s developing and industrial nations each coming up with their own individual plans for greenhouse gas reductions by early 2015, without setting any global target for emissions cuts. If a plan is finally approved in 2015, it may not go into effect until 2020.
Similarly, the conference came out with a detailed agreement called the “Warsaw mechanism for loss and damage associated with climate change impacts,” which encourages stepping up efforts at developing adaptation strategies for countries such as the Philippines that are most vulnerable to extreme weather events.
The three-page agreement spends much time extolling efforts to share information and strengthen dialog and coordination, but provides no concrete funding mechanism, beyond requesting industrialized countries to provide “finance, technology and capacity building.”
The disconnect between science and politics on climate change is nothing new. But as the gap that so urgently needs to be bridged keeps getting wider, the options for finding a way across may grow fewer, more expensive and more unlikely.