World headed for irreversible climate change in five years, IEA warns

The world is likely to build so many fossil-fuelled power stations, energy-guzzling factories and inefficient buildings in the next five years that it will become impossible to hold global warming to safe levels, and the last chance of combating dangerous climate change will be “lost for ever”, according to the most thorough analysis yet of world energy infrastructure.

Anything built from now on that produces carbon will do so for decades, and this “lock-in” effect will be the single factor most likely to produce irreversible climate change, the world’s foremost authority on energy economics has found. If this is not rapidly changed within the next five years, the results are likely to be disastrous.

“The door is closing,” Fatih Birol, chief economist at the International Energy Agency, said. “I am very worried – if we don’t change direction now on how we use energy, we will end up beyond what scientists tell us is the minimum [for safety]. The door will be closed forever.”

If the world is to stay below 2C of warming, which scientists regard as the limit of safety, then the concentration of carbon dioxide in the atmosphere must be held to no more than 450 parts per million (ppm); the level is currently around 390ppm. But the world’s existing infrastructure is already committed to producing 80% of that “carbon budget”, according to the IEA’s analysis, published on Wednesday. This leaves an ever-narrowing gap in which to move the global economy on to a low-carbon footing.

If current trends continue, and we go on building high-carbon energy generation, then by 2015 at least 90% of the available “carbon budget” will be swallowed up by our energy and industrial infrastructure. By 2017, there will be no room for manoeuvre at all – the whole of the carbon budget will be spoken for, according to the IEA’s calculations.
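
A back-of-the-envelope sketch of that arithmetic is below. The 80%, 90% and 100% lock-in shares for 2011, 2015 and 2017 are the IEA figures quoted above; the straight-line interpolation between those milestones is purely our illustrative assumption.

    # Illustrative sketch only: how the IEA's quoted lock-in milestones translate
    # into remaining headroom in the 450 ppm "carbon budget".
    # The linear interpolation between milestone years is an assumption for this sketch.
    MILESTONES = {2011: 0.80, 2015: 0.90, 2017: 1.00}

    def locked_in_share(year: int) -> float:
        """Fraction of the carbon budget already committed by existing infrastructure."""
        years = sorted(MILESTONES)
        if year <= years[0]:
            return MILESTONES[years[0]]
        if year >= years[-1]:
            return MILESTONES[years[-1]]
        for y0, y1 in zip(years, years[1:]):
            if y0 <= year <= y1:
                t = (year - y0) / (y1 - y0)
                return MILESTONES[y0] + t * (MILESTONES[y1] - MILESTONES[y0])

    for y in range(2011, 2018):
        share = locked_in_share(y)
        print(f"{y}: {share:.0%} committed, {1 - share:.0%} of the budget left")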

Birol’s warning comes at a crucial moment in international negotiations on climate change, as governments gear up for the next fortnight of talks in Durban, South Africa, from late November. “If we do not have an international agreement, whose effect is put in place by 2017, then the door to [holding temperatures to 2C of warming] will be closed forever,” said Birol.

But world governments are preparing to postpone a speedy conclusion to the negotiations again. Originally, the aim was to agree a successor to the 1997 Kyoto protocol, the only binding international agreement on emissions, after its current provisions expire in 2012. But after years of setbacks, an increasing number of countries – including the UK, Japan and Russia – now favour postponing the talks for several years.

Both Russia and Japan have spoken in recent weeks of aiming for an agreement in 2018 or 2020, and the UK has supported this move. Greg Barker, the UK’s climate change minister, told a meeting: “We need China, the US especially, the rest of the Basic countries [Brazil, South Africa, India and China] to agree. If we can get this by 2015 we could have an agreement ready to click in by 2020.” Birol said this would clearly be too late. “I think it’s very important to have a sense of urgency – our analysis shows [what happens] if you do not change investment patterns, which can only happen as a result of an international agreement.”

Nor is this a problem of the developing world, as some commentators have sought to frame it. In the UK, Europe and the US, there are multiple plans for new fossil-fuelled power stations that would contribute significantly to global emissions over the coming decades.

The Guardian revealed in May an IEA analysis that found emissions had risen by a record amount in 2010, despite the worst recession for 80 years. Last year, a record 30.6 gigatonnes (Gt) of carbon dioxide poured into the atmosphere from burning fossil fuels, a rise of 1.6Gt on the previous year. At the time, Birol told the Guardian that constraining global warming to moderate levels would be “only a nice utopia” unless drastic action was taken.

The new research adds to that finding, by showing in detail how current choices on building new energy and industrial infrastructure are likely to commit the world to much higher emissions for the next few decades, blowing apart hopes of containing the problem to manageable levels. The IEA’s data is regarded as the gold standard in emissions and energy, and its outlook is widely seen as one of the most conservative – making the warning all the more stark. The central problem is that most industrial infrastructure currently in existence – the fossil-fuelled power stations, the emissions-spewing factories, the inefficient transport and buildings – is already contributing to the high level of emissions, and will do so for decades. Carbon dioxide, once released, stays in the atmosphere and continues to have a warming effect for about a century, and industrial infrastructure is built to have a useful life of several decades.

Yet, despite intensifying warnings from scientists over the past two decades, the new infrastructure even now being built is constructed along the same lines as the old, which means that there is a “lock-in” effect – high-carbon infrastructure built today or in the next five years will contribute as much to the stock of emissions in the atmosphere as previous generations.

The “lock-in” effect is the single most important factor increasing the danger of runaway climate change, according to the IEA in its annual World Energy Outlook, published on Wednesday.

Climate scientists estimate that global warming of 2C above pre-industrial levels marks the limit of safety, beyond which climate change becomes catastrophic and irreversible. Though such estimates are necessarily imprecise, warming of as little as 1.5C could cause dangerous rises in sea levels and a higher risk of extreme weather. The 2C limit is now inscribed in international accords, including the partial agreement signed at Copenhagen in 2009, by which the biggest developed and developing countries for the first time agreed to curb their greenhouse gas output.

Another factor likely to increase emissions is the decision by some governments to abandon nuclear energy, following the Fukushima disaster. “The shift away from nuclear worsens the situation,” said Birol. If countries turn away from nuclear energy, the result could be an increase in emissions equivalent to the current emissions of Germany and France combined. Much more investment in renewable energy will be required to make up the gap, but how that would come about is unclear at present.

Birol also warned that China – the world’s biggest emitter – would have to take on a much greater role in combating climate change. For years, Chinese officials have argued that because the country’s emissions per capita were much lower than those of developed countries, it was not required to take such stringent action on emissions. But the IEA’s analysis found that within about four years, China’s per capita emissions were likely to exceed those of the EU.

In addition, by 2035 at the latest, China’s cumulative emissions since 1900 are likely to exceed those of the EU, which will further weaken Beijing’s argument that developed countries should take on more of the burden of emissions reduction as they carry more of the responsibility for past emissions.

In a recent interview with the Guardian, China’s top climate change official, Xie Zhenhua, called on developing countries to take a greater part in the talks, while insisting that developed countries must sign up to a continuation of the Kyoto protocol – something only the European Union is willing to do. His words were greeted cautiously by other participants in the talks.

Continuing its gloomy outlook, the IEA report said: “There are few signs that the urgently needed change in direction in global energy trends is under way. Although the recovery in the world economy since 2009 has been uneven, and future economic prospects remain uncertain, global primary energy demand rebounded by a remarkable 5% in 2010, pushing CO2 emissions to a new high. Subsidies that encourage wasteful consumption of fossil fuels jumped to over $400bn (£250.7bn).”

Meanwhile, an “unacceptably high” number of people – about 1.3bn – still lack access to electricity. If people are to be lifted out of poverty, this must be solved – but providing people with renewable forms of energy generation is still expensive.

Charlie Kronick of Greenpeace said: “The decisions being made by politicians today risk passing a monumental carbon debt to the next generation, one for which they will pay a very heavy price. What’s seriously lacking is a global plan and the political leverage to enact it. Governments have a chance to begin to turn this around when they meet in Durban later this month for the next round of global climate talks.”

One close observer of the climate talks said the $400bn subsidies devoted to fossil fuels, uncovered by the IEA, were “staggering”, and the way in which these subsidies distort the market presented a massive problem in encouraging the move to renewables. He added that Birol’s comments, though urgent and timely, were unlikely to galvanise China and the US – the world’s two biggest emitters – into action on the international stage.

“The US can’t move (owing to Republican opposition) and there’s no upside for China domestically in doing so. At least China is moving up the learning curve with its deployment of renewables, but it’s doing so in parallel to the hugely damaging coal-fired assets that it is unlikely to ever want (to turn off in order) to meet climate targets in years to come.”

 

From The Guardian: http://www.guardian.co.uk/environment/2011/nov/09/fossil-fuel-infrastructure-climate-change

Climate scientists: concept of net zero is a dangerous trap

In this article, originally published on The Conversation, three scientists argue that the concept of net zero, which relies heavily on carbon capture and storage technologies, is a dangerous illusion.

By James Dyke, Senior Lecturer in Global Systems, University of Exeter, Robert Watson, Emeritus Professor in Environmental Sciences, University of East Anglia, and Wolfgang Knorr, Senior Research Scientist, Physical Geography and Ecosystem Science, Lund University


Sometimes realisation comes in a blinding flash. Blurred outlines snap into shape and suddenly it all makes sense. Underneath such revelations is typically a much slower-dawning process. Doubts at the back of the mind grow. The sense of confusion that things cannot be made to fit together increases until something clicks. Or perhaps snaps.

Collectively we three authors of this article must have spent more than 80 years thinking about climate change. Why has it taken us so long to speak out about the obvious dangers of the concept of net zero? In our defence, the premise of net zero is deceptively simple – and we admit that it deceived us.

The threats of climate change are the direct result of there being too much carbon dioxide in the atmosphere. So it follows that we must stop emitting more and even remove some of it. This idea is central to the world’s current plan to avoid catastrophe. In fact, there are many suggestions as to how to actually do this, from mass tree planting, to high tech direct air capture devices that suck out carbon dioxide from the air.

The current consensus is that if we deploy these and other so-called “carbon dioxide removal” techniques at the same time as reducing our burning of fossil fuels, we can more rapidly halt global warming. Hopefully around the middle of this century we will achieve “net zero”. This is the point at which any residual emissions of greenhouse gases are balanced by technologies removing them from the atmosphere.

This is a great idea, in principle. Unfortunately, in practice it helps perpetuate a belief in technological salvation and diminishes the sense of urgency surrounding the need to curb emissions now.

We have arrived at the painful realisation that the idea of net zero has licensed a recklessly cavalier “burn now, pay later” approach which has seen carbon emissions continue to soar. It has also hastened the destruction of the natural world by increasing deforestation today, and greatly increases the risk of further devastation in the future.

To understand how this has happened, how humanity has gambled its civilisation on no more than promises of future solutions, we must return to the late 1980s, when climate change broke out onto the international stage.

Steps towards net zero

On June 22 1988, James Hansen was the director of Nasa’s Goddard Institute for Space Studies – a prestigious appointment, but one that left him largely unknown outside of academia.

By the afternoon of the 23rd he was well on the way to becoming the world’s most famous climate scientist. This was as a direct result of his testimony to the US Congress, when he forensically presented the evidence that the Earth’s climate was warming and that humans were the primary cause: “The greenhouse effect has been detected, and it is changing our climate now.”

If we had acted on Hansen’s testimony at the time, we would have been able to decarbonise our societies at a rate of around 2% a year in order to give us about a two-in-three chance of limiting warming to no more than 1.5°C. It would have been a huge challenge, but the main task at that time would have been to simply stop the accelerating use of fossil fuels while fairly sharing out future emissions.

[Chart: © Robbie Andrew, CC BY]

Four years later, there were glimmers of hope that this would be possible. During the 1992 Earth Summit in Rio, all nations agreed to stabilise concentrations of greenhouse gases to ensure that they did not produce dangerous interference with the climate. The 1997 Kyoto Summit attempted to start to put that goal into practice. But as the years passed, the initial task of keeping us safe became ever harder given the continual increase in fossil fuel use.

It was around that time that the first computer models linking greenhouse gas emissions to impacts on different sectors of the economy were developed. These hybrid climate-economic models are known as Integrated Assessment Models. They allowed modellers to link economic activity to the climate by, for example, exploring how changes in investments and technology could lead to changes in greenhouse gas emissions.

They seemed like a miracle: you could try out policies on a computer screen before implementing them, saving humanity costly experimentation. They rapidly became key guidance for climate policy, a primacy they maintain to this day.

Unfortunately, they also removed the need for deep critical thinking. Such models represent society as a web of idealised, emotionless buyers and sellers and thus ignore complex social and political realities, or even the impacts of climate change itself. Their implicit promise is that market-based approaches will always work. This meant that discussions about policies were limited to those most convenient to politicians: incremental changes to legislation and taxes.

Around the time they were first developed, efforts were being made to secure US action on the climate by allowing it to count the carbon sinks of its forests. The US argued that if it managed its forests well, it would be able to store a large amount of carbon in trees and soil, which should be subtracted from its obligations to limit the burning of coal, oil and gas. In the end, the US largely got its way. Ironically, the concessions were all in vain, since the US Senate never ratified the agreement.

Postulating a future with more trees could, in effect, offset the burning of coal, oil and gas now. As models could easily churn out numbers that saw atmospheric carbon dioxide go as low as one wanted, ever more sophisticated scenarios could be explored which reduced the perceived urgency of cutting fossil fuel use. By including carbon sinks in climate-economic models, a Pandora’s box had been opened.

It’s here we find the genesis of today’s net zero policies.

That said, most attention in the mid-1990s was focused on increasing energy efficiency and energy switching (such as the UK’s move from coal to gas) and the potential of nuclear energy to deliver large amounts of carbon-free electricity. The hope was that such innovations would quickly reverse increases in fossil fuel emissions.

But by around the turn of the new millennium it was clear that such hopes were unfounded. Given their core assumption of incremental change, it was becoming more and more difficult for economic-climate models to find viable pathways to avoid dangerous climate change. In response, the models began to include more and more examples of carbon capture and storage, a technology that could remove the carbon dioxide from coal-fired power stations and then store the captured carbon deep underground indefinitely.

This had been shown to be possible in principle: compressed carbon dioxide had been separated from fossil gas and then injected underground in a number of projects since the 1970s. These Enhanced Oil Recovery schemes were designed to force gases into oil wells in order to push oil towards drilling rigs and so allow more to be recovered – oil that would later be burnt, releasing even more carbon dioxide into the atmosphere.

Carbon capture and storage offered a twist: instead of using the carbon dioxide to extract more oil, the gas would be left underground and thus removed from the atmosphere. This promised breakthrough technology would allow “climate-friendly” coal, and so the continued use of this fossil fuel. But long before the world would witness any such schemes, the hypothetical process had been included in climate-economic models. In the end, the mere prospect of carbon capture and storage gave policy makers a way out of making the much-needed cuts to greenhouse gas emissions.

The rise of net zero

When the international climate change community convened in Copenhagen in 2009 it was clear that carbon capture and storage was not going to be sufficient for two reasons.

First, it still did not exist. There were no carbon capture and storage facilities in operation on any coal-fired power station, and no prospect that the technology was going to have any impact on rising emissions from increased coal use in the foreseeable future.

The biggest barrier to implementation was essentially cost. The motivation to burn vast amounts of coal is to generate relatively cheap electricity. Retrofitting carbon scrubbers on existing power stations, building the infrastructure to pipe captured carbon, and developing suitable geological storage sites required huge sums of money. Consequently the only application of carbon capture in actual operation then – and now – is to use the trapped gas in enhanced oil recovery schemes. Beyond a single demonstrator, there has never been any capture of carbon dioxide from a coal-fired power station chimney with that captured carbon then being stored underground.

Just as important, by 2009 it was becoming increasingly clear that it would not be possible to make even the gradual reductions that policy makers demanded. That was the case even if carbon capture and storage was up and running. The amount of carbon dioxide that was being pumped into the air each year meant humanity was rapidly running out of time.

With hopes for a solution to the climate crisis fading again, another magic bullet was required. A technology was needed not only to slow down the increasing concentrations of carbon dioxide in the atmosphere, but actually reverse it. In response, the climate-economic modelling community – already able to include plant-based carbon sinks and geological carbon storage in their models – increasingly adopted the “solution” of combining the two.

So it was that Bioenergy Carbon Capture and Storage, or BECCS, rapidly emerged as the new saviour technology. By burning “replaceable” biomass such as wood, crops, and agricultural waste instead of coal in power stations, and then capturing the carbon dioxide from the power station chimney and storing it underground, BECCS could produce electricity at the same time as removing carbon dioxide from the atmosphere. That’s because growing biomass such as trees sucks in carbon dioxide from the atmosphere. By planting trees and other bioenergy crops and storing the carbon dioxide released when they are burnt, more carbon could be removed from the atmosphere.

With this new solution in hand the international community regrouped from repeated failures to mount another attempt at reining in our dangerous interference with the climate. The scene was set for the crucial 2015 climate conference in Paris.

A Parisian false dawn

As its general secretary brought the 21st United Nations conference on climate change to an end, a great roar issued from the crowd. People leaped to their feet, strangers embraced, tears welled up in eyes bloodshot from lack of sleep.

The emotions on display on December 13, 2015 were not just for the cameras. After weeks of gruelling high-level negotiations in Paris a breakthrough had finally been achieved. Against all expectations, after decades of false starts and failures, the international community had finally agreed to do what it took to limit global warming to well below 2°C, preferably to 1.5°C, compared to pre-industrial levels.

The Paris Agreement was a stunning victory for those most at risk from climate change. Rich industrialised nations will be increasingly impacted as global temperatures rise. But it’s the low lying island states such as the Maldives and the Marshall Islands that are at imminent existential risk. As a later UN special report made clear, if the Paris Agreement was unable to limit global warming to 1.5°C, the number of lives lost to more intense storms, fires, heatwaves, famines and floods would significantly increase.

But dig a little deeper and you could find another emotion lurking within delegates on December 13. Doubt. We struggle to name any climate scientist who at that time thought the Paris Agreement was feasible. We have since been told by some scientists that the Paris Agreement was “of course important for climate justice but unworkable” and “a complete shock, no one thought limiting to 1.5°C was possible”. Rather than holding warming to 1.5°C, one senior academic involved in the IPCC concluded, we were heading beyond 3°C by the end of this century.

Instead of confronting our doubts, we scientists decided to construct ever more elaborate fantasy worlds in which we would be safe. The price to pay for our cowardice: having to keep our mouths shut about the ever-growing absurdity of the required planetary-scale carbon dioxide removal.

Taking centre stage was BECCS because at the time this was the only way climate-economic models could find scenarios that would be consistent with the Paris Agreement. Rather than stabilise, global emissions of carbon dioxide had increased some 60% since 1992.

Alas, BECCS, just like all the previous solutions, was too good to be true.

Across the scenarios produced by the Intergovernmental Panel on Climate Change (IPCC) with a 66% or better chance of limiting temperature increase to 1.5°C, BECCS would need to remove 12 billion tonnes of carbon dioxide each year. BECCS at this scale would require massive planting schemes for trees and bioenergy crops.

The Earth certainly needs more trees. Humanity has cut down some three trillion of them since we first started farming some 13,000 years ago. But rather than allow ecosystems to recover from human impacts and forests to regrow, BECCS generally refers to dedicated industrial-scale plantations regularly harvested for bioenergy, rather than to carbon stored away in forest trunks, roots and soils.

Currently, the two most efficient biofuels are sugarcane for bioethanol and palm oil for biodiesel – both grown in the tropics. Endless rows of such fast-growing monoculture trees or other bioenergy crops harvested at frequent intervals devastate biodiversity.

It has been estimated that BECCS would demand between 0.4 and 1.2 billion hectares of land. That’s 25% to 80% of all the land currently under cultivation. How could that be achieved while also feeding 8-10 billion people around the middle of the century, and without destroying native vegetation and biodiversity?
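
As a rough consistency check on those numbers (a sketch only: the roughly 1.5 billion hectares of global cropland assumed here is an outside, approximate figure, not one given in the article):

    # Rough check of the BECCS land-demand figures quoted above.
    beccs_land_low_ha = 0.4e9    # lower estimate from the article, hectares
    beccs_land_high_ha = 1.2e9   # upper estimate from the article, hectares
    global_cropland_ha = 1.5e9   # assumed global cultivated area, hectares (approximate)

    print(f"Low:  {beccs_land_low_ha / global_cropland_ha:.0%} of cultivated land")
    print(f"High: {beccs_land_high_ha / global_cropland_ha:.0%} of cultivated land")
    # Prints roughly 27% and 80%, consistent with the 25%-80% range quoted above.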

Growing billions of trees would consume vast amounts of water – in some places where people are already thirsty. Increasing forest cover in higher latitudes can have an overall warming effect because replacing grassland or fields with forests means the land surface becomes darker. This darker land absorbs more energy from the Sun and so temperatures rise. Focusing on developing vast plantations in poorer tropical nations comes with real risks of people being driven off their lands.

And it is often forgotten that trees and the land in general already soak up and store away vast amounts of carbon through what is called the natural terrestrial carbon sink. Interfering with it could both disrupt the sink and lead to double accounting.

As these impacts are becoming better understood, the sense of optimism around BECCS has diminished.

Pipe dreams

Given the dawning realisation of how difficult Paris would be in the light of ever rising emissions and limited potential of BECCS, a new buzzword emerged in policy circles: the “overshoot scenario”. Temperatures would be allowed to go beyond 1.5°C in the near term, but then be brought down with a range of carbon dioxide removal by the end of the century. This means that net zero actually means carbon negative. Within a few decades, we will need to transform our civilisation from one that currently pumps out 40 billion tons of carbon dioxide into the atmosphere each year, to one that produces a net removal of tens of billions.

Mass tree planting, for bioenergy or as an attempt at offsetting, had been the latest attempt to stall cuts in fossil fuel use. But the ever-increasing need for carbon removal was calling for more. This is why the idea of direct air capture, now being touted by some as the most promising technology out there, has taken hold. It is generally more benign to ecosystems because it requires significantly less land to operate than BECCS, including the land needed to power it using wind or solar panels.

Unfortunately, it is widely believed that direct air capture, because of its exorbitant costs and energy demand, will not be able to compete with BECCS – with its voracious appetite for prime agricultural land – even if it ever becomes feasible to deploy at scale.

It should now be clear where this journey is heading. As the mirage of each magical technical solution disappears, another equally unworkable alternative pops up to take its place. The next is already on the horizon – and it’s even more ghastly. Once we realise net zero will not happen in time, or even at all, geoengineering – the deliberate and large-scale intervention in the Earth’s climate system – will probably be invoked as the solution to limit temperature increases.

One of the most researched geoengineering ideas is solar radiation management – the injection of millions of tons of sulphuric acid into the stratosphere that will reflect some of the Sun’s energy away from the Earth. It is a wild idea, but some academics and politicians are deadly serious, despite significant risks. The US National Academies of Sciences, for example, has recommended allocating up to US$200 million over the next five years to explore how geoengineering could be deployed and regulated. Funding and research in this area is sure to significantly increase.

Difficult truths

In principle there is nothing wrong or dangerous about carbon dioxide removal proposals. In fact developing ways of reducing concentrations of carbon dioxide can feel tremendously exciting. You are using science and engineering to save humanity from disaster. What you are doing is important. There is also the realisation that carbon removal will be needed to mop up some of the emissions from sectors such as aviation and cement production. So there will be some small role for a number of different carbon dioxide removal approaches.

The problems come when it is assumed that these can be deployed at vast scale. This effectively serves as a blank cheque for the continued burning of fossil fuels and the acceleration of habitat destruction.

Carbon reduction technologies and geoengineering should be seen as a sort of ejector seat that could propel humanity away from rapid and catastrophic environmental change. Just like an ejector seat in a jet aircraft, it should only be used as the very last resort. However, policymakers and businesses appear to be entirely serious about deploying highly speculative technologies as a way to land our civilisation at a sustainable destination. In fact, these are no more than fairy tales.

The only way to keep humanity safe is immediate and sustained radical cuts to greenhouse gas emissions, made in a socially just way.

Academics typically see themselves as servants to society. Indeed, many are employed as civil servants. Those working at the climate science and policy interface desperately wrestle with an increasingly difficult problem. Similarly, those that champion net zero as a way of breaking through barriers holding back effective action on the climate also work with the very best of intentions.

The tragedy is that their collective efforts were never able to mount an effective challenge to a climate policy process that would only allow a narrow range of scenarios to be explored.

Most academics feel distinctly uncomfortable stepping over the invisible line that separates their day job from wider social and political concerns. There are genuine fears that being seen as advocates for or against particular issues could threaten their perceived independence. Scientists are one of the most trusted professions. Trust is very hard to build and easy to destroy.

But there is another invisible line, the one that separates maintaining academic integrity and self-censorship. As scientists, we are taught to be sceptical, to subject hypotheses to rigorous tests and interrogation. But when it comes to perhaps the greatest challenge humanity faces, we often show a dangerous lack of critical analysis.

In private, scientists express significant scepticism about the Paris Agreement, BECCS, offsetting, geoengineering and net zero. Apart from some notable exceptions, in public we quietly go about our work, apply for funding, publish papers and teach. The path to disastrous climate change is paved with feasibility studies and impact assessments.

Rather than acknowledge the seriousness of our situation, we instead continue to participate in the fantasy of net zero. What will we do when reality bites? What will we say to our friends and loved ones about our failure to speak out now?

The time has come to voice our fears and be honest with wider society. Current net zero policies will not keep warming to within 1.5°C because they were never intended to. They were and still are driven by a need to protect business as usual, not the climate. If we want to keep people safe then large and sustained cuts to carbon emissions need to happen now. That is the very simple acid test that must be applied to all climate policies. The time for wishful thinking is over.

Biomass Falsely Counted As Carbon Neutral

This article by Saul Elbein was originally published on 29 July 2020 in Mongabay. In it, Elbein describes the outdated ideas behind ‘biomass’ energy and illuminates the harm caused by creating even more CO2.


By Saul Elbein/Mongabay.com

  • An outdated Kyoto Climate Agreement policy, grandfathered into the 2015 Paris Agreement, counts electrical energy produced by burning biomass — wood pellets — as carbon neutral. However, new science demonstrates that burning forests for energy is dirtier than coal and not carbon neutral in the short-term.
  • But with the carbon accounting loophole still on the books, European Union nations and other countries are rushing to convert coal plants to burn wood pellets, and to count giant biomass energy facilities as carbon neutral — valid on paper even as they add new carbon emissions to the atmosphere. The forest industry argues otherwise.
  • It too is capitalizing on the loophole, building large new wood pellet factories and logging operations in places like the U.S. Southeast — cutting down forests, pelletizing trees, and exporting biomass. A case in point: the two giant plants now being built by the Enviva Corporation in Lucedale, Mississippi and Epes, Alabama.
  • Enviva and other firms can only make biomass profitable by relying on government subsidies. In the end, forests are lost, carbon neutrality takes decades to achieve, and while communities may see a short-term boost in jobs, they suffer air pollution and the risk of sudden economic collapse if and when the carbon loophole is closed.

When biomass manufacturer Enviva completes its two newest U.S. Gulf Coast plants on opposite sides of the Alabama-Mississippi state line, likely by 2021, they will be the largest “biomass for energy” manufacturing plants on the planet.

Every year, the two factories will grind the equivalent of a hundred square miles of forest into 2.7 million metric tons of combustible wood pellets, to be burned at former coal plants in Europe and Asia — with all the resulting carbon released into the atmosphere.

These U.S. biomass plants, and the wood pellets they churn out, will thrive atop a shaky Jenga tower of political, economic and environmental paradoxes, according to environmentalists. Unable to compete with carbon fuels like coal or natural gas on price, Enviva’s wood pellet plants will stay afloat because of direct and implicit subsidies coming from the European Union, whose members agreed to derive 32% of their energy from renewables by 2030 — a category that they deemed to include biomass.

The EU endorsed this policy even though recent science has shown unequivocally that burning wood pellets releases even more CO2 than coal.

Rule of thumb: to get from the 2.7 million metric tons of wood pellets produced annually to the amount of CO2 released from smokestacks, multiply roughly by four. That means the pellets the two new Gulf Coast mills produce, when burned abroad, could pump a little over 10 million tons of CO2 into the atmosphere — the equivalent of 55,000 railroad cars of coal — all while soaking up subsidies that might otherwise go to traditional renewables like wind, tidal, or solar energy, according to Duncan Brack of the Chatham House international NGO and think tank.
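
That rule of thumb is easy to reproduce; the sketch below simply applies the article’s multiply-by-four factor, so treat the result as an order-of-magnitude estimate rather than a precise figure.

    # Back-of-the-envelope: annual pellet output -> CO2 released when burned.
    pellets_mt = 2.7              # output of the two Gulf Coast plants, million metric tons/year
    co2_per_ton_of_pellets = 4.0  # the article's rule of thumb: ~4 tons of CO2 per ton of pellets

    co2_mt = pellets_mt * co2_per_ton_of_pellets
    print(f"Roughly {co2_mt:.1f} million metric tons of CO2 per year")  # ~10.8, "a little over 10 million"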

Those subsidies, say scientists, are based on now debunked research first conducted and used as guidance for making policy incorporated into the Kyoto Climate Agreement, a policy then grandfathered into the 2015 Paris Agreement. They say the mistake that makes biomass economically viable today is the contention that burning up the world’s forests to produce energy is carbon neutral, an inconvenient untruth that, critics contend, the United Nations has dodged facing at every annual international meeting since Paris.


You can read the whole, original article here:

https://news.mongabay.com/2020/07/burning-down-the-house-envivas-giant-u-s-wood-pellet-plants-gear-up/