Net Zero Plans Are Largely Meaningless

Editor’s note: “75 of the world’s largest 114 fossil fuel companies have now made net zero by 2050 commitments, yet not a single fossil fuel company has committed to phasing out oil and gas production by 2050 nor have any committed to ending exploration for new oil and gas fields or halting the extraction of existing reserves.”

Real Zero, not greenwashed ‘net zero,’ is essential. As the Corporate Accountability report concludes, it’s time to reject the big polluters’ agenda and implement programs that rapidly phase out fossil fuels and truly eliminate greenhouse gas emissions.

We “obsess” over reaching “Net Zero,” the point at which yearly CO2 increases in the atmosphere stop. The Moderates in Climate Science THEORIZE that when this happens, the GMST will IMMEDIATELY stop going up and level off.

DOES IT LOOK LIKE “NET ZERO” is going to happen?

If your child is born this year, they will likely live through +1.5°C of warming by the time they are 25. That warming is likely to cause a 40% to 50% drop in the global food supply and a reduction of 2.5 to 4 billion in the global population by 2050, at a minimum.

The overshoot myth of bargaining: you can’t keep burning fossil fuels and expect scientists of the future to get us back to 1.5°C

Melting Antarctic glacier.
Shutterstock/Bernhard Staehli

James Dyke, University of Exeter; Robert Watson, University of East Anglia, and Wolfgang Knorr, Lund University

Record-breaking fossil fuel production, all-time-high greenhouse gas emissions and extreme temperatures. Like the proverbial frog in the heating pan of water, we refuse to respond to the climate and ecological crisis with any sense of urgency. Under such circumstances, claims from some that global warming can still be limited to no more than 1.5°C take on a surreal quality.

For example, at the start of 2023’s international climate negotiations in Dubai, conference president, Sultan Al Jaber, boldly stated that 1.5°C was his goal and that his presidency would be guided by a “deep sense of urgency” to limit global temperatures to 1.5°C. He made such lofty promises while planning a massive increase in oil and gas production as CEO of the Abu Dhabi National Oil Company.

We should not be surprised to see such behaviour from the head of a fossil fuel company. But Al Jaber is not an outlier. Scratch at the surface of almost any net zero pledge or policy that claims to be aligned with the 1.5°C goal of the landmark 2015 Paris agreement and you will reveal the same sort of reasoning: we can avoid dangerous climate change without actually doing what this demands – which is to rapidly reduce greenhouse gas emissions from industry, transport, energy (70% of total) and food systems (30% of total), while ramping up energy efficiency.

A particularly instructive example is Amazon. In 2019 the company established a 2040 net zero target, which was then verified by the UN-backed Science Based Targets initiative (SBTi), which has been leading the charge in getting companies to establish climate targets compatible with the Paris agreement. But over the next four years Amazon’s emissions went up by 40%. Given this dismal performance, the SBTi was forced to act and removed Amazon and over 200 companies from its Corporate Net Zero Standard.

This is also not surprising, given that net zero and even the Paris agreement have been built around the perceived need to keep burning fossil fuels, at least in the short term. Not doing so would threaten economic growth, given that fossil fuels still supply over 80% of total global energy. The trillions of dollars of fossil fuel assets at risk from rapid decarbonisation have also served as powerful brakes on climate action.

Overshoot

The way to understand this doublethink – that we can avoid dangerous climate change while continuing to burn fossil fuels – is that it relies on the concept of overshoot. The promise is that we can overshoot past any amount of warming, with the deployment of planetary-scale carbon dioxide removal dragging temperatures back down by the end of the century.

This not only cripples any attempt to limit warming to 1.5°C, but risks catastrophic levels of climate change, as it locks us into energy- and material-intensive solutions which for the most part exist only on paper.

To argue that we can safely overshoot 1.5°C, or any amount of warming, is saying the quiet bit out loud: we simply don’t care about the increasing amount of suffering and deaths that will be caused while the recovery is worked on.


A key element of overshoot is carbon dioxide removal. This is essentially a time machine – we are told we can turn back the clock of decades of delay by sucking carbon dioxide directly out of the atmosphere. We don’t need rapid decarbonisation now, because in the future we will be able to take back those carbon emissions. If or when that doesn’t work, we are led to believe that even more outlandish geoengineering approaches such as spraying sulphurous compounds into the high atmosphere in an attempt to block out sunlight – which amounts to planetary refrigeration – will save us.

The 2015 Paris agreement was an astonishing accomplishment. The establishment of 1.5°C as the internationally agreed ceiling for warming was a success for those people and nations most exposed to climate change hazards. We know that every fraction of a degree matters. But at the time, believing warming could really be limited to well below 2°C required a leap of faith when it came to nations and companies putting their shoulder to the wheel of decarbonisation. What has happened instead is that the net zero approach of Paris has become detached from reality, relying increasingly on science-fiction levels of speculative technology.

There is arguably an even bigger problem with the Paris agreement. By framing climate change in terms of temperature, it focuses on the symptoms, not the cause. 1.5°C or any amount of warming is the result of humans changing the energy balance of the climate by increasing the amount of carbon dioxide in the atmosphere. This traps more heat. Changes in the global average temperature are the established way of measuring this increase in heat, but no one experiences this average.

Climate change is dangerous because of weather that affects particular places at particular times. Simply put, this extra heat is making weather more unstable. Unfortunately, having temperature targets makes solar geoengineering seem like a sensible approach because it may lower temperatures. But it does this by not reducing, but increasing our interference in the climate system. Trying to block out the sun in response to increasing carbon emissions is like turning on the air conditioning in response to a house fire.

In 2021 we argued that net zero was a dangerous trap. Three years on and we can see the jaws of this trap beginning to close, with climate policy being increasingly framed in terms of overshoot. The resulting impacts on food and water security, poverty, human health, the destruction of biodiversity and ecosystems will produce intolerable suffering.

The situation demands honesty, and a change of course. If this does not materialise then things are likely to deteriorate, potentially rapidly and in ways that may be impossible to control.

Au revoir Paris

The time has come to accept that climate policy has failed, and that the 2015 landmark Paris agreement is dead. We let it die by pretending that we could both continue to burn fossil fuels and avoid dangerous climate change at the same time. Rather than demand the immediate phase-out of fossil fuels, the Paris agreement proposed end-of-century temperature targets which could be met by balancing the sources and sinks of carbon. Within that ambiguity net zero flourished. And yet, apart from the COVID economic shock in 2020, emissions have increased every year since 2015, reaching an all-time high in 2023.

Despite there being abundant evidence that climate action makes good economic sense (the cost of inaction vastly exceeds the cost of action), no country strengthened its pledges at the last three COPs (the annual UN international meetings), even though it was clear that the world was on course to sail past 2°C, let alone 1.5°C. The Paris agreement should be producing a 50% reduction in greenhouse gas emissions by 2030, but current policies mean that emissions are on track to be higher than they are today.

Greenhouse gas emissions continue to rise.
Catazul/Pixabay, CC BY

Editor’s note: DGR knows that “renewable” technologies are not sustainable and that the only transition will be to a future that does not include civilization.

We do not deny that significant progress has been made with renewable technologies. Rates of deployment of wind and solar have increased each year for the past 22 years, and carbon emissions are going down in some of the richest nations, including the UK and the US. But this is not happening fast enough. A central element of the Paris agreement is that richer nations need to lead decarbonisation efforts to give lower income nations more time to transition away from fossil fuels. Despite some claims to the contrary, the global energy transition is not in full swing. In fact, it hasn’t actually begun, because the transition demands a reduction in fossil fuel use. Instead, fossil fuel use continues to increase year on year.

And so policymakers are turning to overshoot in an attempt to claim that they have a plan to avoid dangerous climate change. A central plank of this approach is that the climate system in the future will continue to function as it does today. This is a reckless assumption.

2023’s warning signs

At the start of 2023, Berkeley Earth, NASA, the UK Met Office, and Carbon Brief predicted that 2023 would be slightly warmer than the previous year but unlikely to set any records. Twelve months later, all four organisations concluded that 2023 was by some distance the warmest year ever recorded. In fact, between February 2023 and February 2024 the global average temperature exceeded the Paris target of 1.5°C.

The extreme weather events of 2023 give us a glimpse of the suffering that further global warming will produce. A 2024 report from the World Economic Forum concluded that by 2050 climate change may have caused over 14 million deaths and US$12.5 trillion in loss and damages.

Currently we cannot fully explain why global temperatures have been so high for the past 18 months. Changes in dust, soot and other aerosols are important, and there are natural processes such as El Niño that will be having an effect.

But it appears that there is still something missing in our current understanding of how the climate is responding to human impacts. This includes changes in the Earth’s vital natural carbon cycle.

Around half of all the carbon dioxide humans have put into the atmosphere over the whole of human history has gone into “carbon sinks” on land and the oceans. We get this carbon removal “for free”, and without it, warming would be much higher. Carbon dioxide from the air dissolves in the oceans (making them more acidic, which threatens marine ecosystems). At the same time, increasing carbon dioxide promotes the growth of plants and trees, which locks up carbon in their leaves, roots, and trunks.

All climate policies and scenarios assume that these natural carbon sinks will continue to remove tens of billions of tons of carbon from the atmosphere each year. There is evidence that land-based carbon sinks, such as forests, removed significantly less carbon in 2023. If natural sinks begin to fail – something they may well do in a warmer world – then the task of lowering global temperatures becomes even harder. The only credible way of limiting warming to any amount is to stop putting greenhouse gases into the atmosphere in the first place.

Science fiction solutions

It’s clear that the commitments countries have made to date as part of the Paris agreement will not keep humanity safe while carbon emissions and temperatures continue to break records. Indeed, proposing to spend trillions of dollars over this century to suck carbon dioxide out of the air, or the myriad other ways to hack the climate is an acknowledgement that the world’s largest polluters are not going to curb the burning of fossil fuels.

Direct Air Capture (DAC), Bio Energy Carbon Capture and Storage (BECCS), enhanced ocean alkalinity, biochar, sulphate aerosol injection, cirrus cloud thinning – the entire wacky races of carbon dioxide removal and geoengineering only makes sense in a world of failed climate policy.

Is ‘cloud thinning’ really a possibility?
HarmonyCenter/Pixabay, CC BY

Over the coming years we are going to see climate impacts increase. Lethal heatwaves are going to become more common. Storms and floods are going to become increasingly destructive. More people are going to be displaced from their homes. National and regional harvests will fail. Vast sums of money will need to be spent on efforts to adapt to climate change, and perhaps even more on compensating those who are most affected. We are expected to believe that while all this and more unfolds, new technologies that will directly modify the Earth’s atmosphere and energy balance will be successfully deployed.

What’s more, some of these technologies may need to operate for three hundred years in order for the consequences of overshoot to be avoided. Rather than quickly slowing down carbon-polluting activities and increasing the chances that the Earth system will recover, we are instead going all in on net zero and overshoot in an increasingly desperate hope that untested science fiction solutions will save us from climate breakdown.

We can see the cliff edge rapidly approaching. Rather than slam on the brakes, some people are instead pushing their foot down harder on the accelerator. Their justification for this insanity is that we need to go faster in order to be able to make the jump and land safely on the other side.

We believe that many who advocate for carbon dioxide removal and geoengineering do so in good faith. But good faith does not make the proposals credible. They include refreezing the Arctic by pumping up sea water onto ice sheets to form new layers of ice and snow. These are interesting ideas to research, but there is very little evidence this would have any effect on the Arctic, let alone the global climate. These are the sorts of knots that people tie themselves up in when they acknowledge the failure of climate policy, but refuse to challenge the fundamental forces behind such failure. They are unwittingly slowing down the only effective action: rapidly phasing out fossil fuels.

That’s because proposals to remove carbon dioxide from the air or geoengineer the climate promise a recovery from overshoot, a recovery that will be delivered by innovation, driven by growth. That this growth is powered by the same fossil fuels that are causing the problem in the first place doesn’t feature in their analysis.

The bottom line here is that the climate system is utterly indifferent to our pledges and promises. It doesn’t care about economic growth. And if we carry on burning fossil fuels then it will not stop changing until the energy balance is restored. By which time millions of people could be dead, with many more facing intolerable suffering.

Major climate tipping points

Even if we assume that carbon removal and even geoengineering technologies can be deployed in time, there is a very large problem with the plan to overshoot 1.5°C and then lower temperatures later: tipping points.

The science of tipping points is rapidly advancing. Late last year one of us (James Dyke) along with over 200 academics from around the world was involved in the production of the Global Tipping Points Report. This was a review of the latest science about where tipping points in the climate system may be, as well as exploring how social systems can undertake rapid change (in the direction that we want) thereby producing positive tipping points. Within the report’s 350 pages is abundant evidence that the overshoot approach is an extraordinarily dangerous gamble with the future of humanity. Some tipping points have the potential to cause global havoc.

The melt of permafrost could release billions of tons of greenhouse gases into the atmosphere and supercharge human-caused climate change. Fortunately, this seems unlikely under the current warming. Unfortunately, the chance that ocean currents in the North Atlantic could collapse may be much higher than previously thought. If that were to materialise, weather systems across the world, but in particular in Europe and North America, would be thrown into chaos. Beyond 1.5°C, warm-water coral reefs are heading towards annihilation. The latest science concludes that by 2°C global reefs would be reduced by 99%. The devastating bleaching event unfolding across the Great Barrier Reef follows multiple mass mortality events. To say we are witnessing one of the world’s greatest biological wonders die is insufficient. We are knowingly killing it.

We may have even already passed some major climate tipping points. The Earth has two great ice sheets, Antarctica, and Greenland. Both are disappearing as a consequence of climate change. Between 2016 and 2020, the Greenland ice sheet lost on average 372 billion tons of ice a year. The current best assessment of when a tipping point could be reached for the Greenland ice sheet is around 1.5°C.

This does not mean that the Greenland ice sheet will suddenly collapse if warming exceeds that level. There is so much ice (some 2,800 trillion tons) that it would take centuries for all of it to melt over which time sea levels would rise seven metres. If global temperatures could be brought back down after a tipping point, then maybe the ice sheet could be stabilised. We just cannot say with any certainty that such a recovery would be possible. While we struggle with the science, 30 million tons of ice is melting across Greenland every hour on average.
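As a rough sanity check on the figures quoted above, the melt rate, total ice mass, and sea-level contribution can be tied together in a few lines. This is a linear extrapolation for illustration only; it ignores the acceleration and self-sustaining dynamics that make tipping points dangerous, and the 30-million-tons-per-hour rate is a recent average:

```python
# Rough arithmetic on the figures quoted in the text (illustration only).

ice_total_t = 2_800e12        # ~2,800 trillion tonnes of ice on Greenland
melt_per_hour_t = 30e6        # ~30 million tonnes melting per hour, on average
sea_level_total_m = 7.0       # sea-level rise if the whole sheet melted

melt_per_year_t = melt_per_hour_t * 24 * 365
years_at_current_rate = ice_total_t / melt_per_year_t
mm_per_year = sea_level_total_m * 1000 / years_at_current_rate

print(f"{years_at_current_rate:,.0f} years to melt at today's rate")  # ~10,650
print(f"~{mm_per_year:.2f} mm of sea-level rise per year")            # ~0.66
```

The point is not the ten-thousand-year figure itself, which assumes today's rate holds forever, but that even a "slow" collapse commits future generations to metres of sea-level rise once a tipping point is crossed.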

Ice sheets in Greenland and Antarctica are being affected by global warming.
Pexels from Pixabay, CC BY

The take home message from research on these and other tipping points is that further warming accelerates us towards catastrophe. Important science, but is anyone listening?

It’s five minutes to midnight…again

We know we must urgently act on climate change because we are repeatedly told that time is running out. In 2015, Professor Jeffrey Sachs, the UN special adviser and director of The Earth Institute, declared:

The time has finally arrived – we’ve been talking about these six months for many years but we’re now here. This is certainly our generation’s best chance to get on track.

In 2019 (then) Prince Charles gave a speech in which he said: “I am firmly of the view that the next 18 months will decide our ability to keep climate change to survivable levels and to restore nature to the equilibrium we need for our survival.”

“We have six months to save the planet,” exhorted International Energy Agency head Fatih Birol – one year later in 2020. In April 2024, Simon Stiell, executive secretary of the United Nations Framework Convention on Climate Change said the next two years are “essential in saving our planet”.

Either the climate crisis has a very fortunate feature that allows the countdown to catastrophe to be continually reset, or we are deluding ourselves with endless declarations that time has not quite run out. If you can repeatedly hit snooze on your alarm clock and roll over back to sleep, then your alarm clock is not working.

Or there is another possibility. Stressing that we have very little time to act is intended to focus attention on climate negotiations. It’s part of a wider attempt to not just wake people up to the impending crisis, but generate effective action. This is sometimes used to explain how the 1.5°C threshold of warming came to be agreed. Rather than a specific target, it should be understood as a stretch goal. We may very well fail, but in reaching for it we move much faster than we would have done with a higher target, such as 2°C. For example, consider this statement made in 2018:

Stretching the goal to 1.5 degrees celsius isn’t simply about speeding up. Rather, something else must happen and society needs to find another lever to pull on a global scale.

What could this lever be? New thinking about economics that goes beyond GDP? Serious consideration of how rich industrialised nations could financially and materially help poorer nations to leapfrog fossil fuel infrastructure? Participatory democracy approaches that could help birth the radical new politics needed for the restructuring of our fossil fuel powered societies? None of these.

The lever in question is Carbon Capture and Storage (CCS), because the above quote comes from an article written by Shell in 2018. In this advertorial, Shell argues that we will need fossil fuels for many decades to come. CCS allows the promise that we can continue to burn fossil fuels and avoid carbon dioxide pollution by trapping the gas before it leaves the chimney. Back in 2018, Shell was promoting its Sky Scenario, heavy on carbon removal and offsets, an approach described as “a dangerous fantasy” by leading climate change academics because it assumed massive carbon emissions could be offset by tree planting.

Since then Shell has further funded carbon removal research within UK universities presumably in efforts to burnish its arguments that it must be able to continue to extract vast amounts of oil and gas.

Shell is far from alone in waving carbon capture magic wands. Exxon is making great claims for CCS as a way to produce net zero hydrogen from fossil gas – claims that have been subject to pointed criticism from academics with recent reporting exposing industry wide greenwashing around CCS.

But the rot goes much deeper. All climate policy scenarios that propose to limit warming to near 1.5°C rely on the largely unproven technologies of CCS and BECCS. BECCS sounds like a good idea in theory. Rather than burn coal in a power station, burn biomass such as wood chips. This would initially be a carbon neutral way of generating electricity if you grew as many trees as you cut down and burnt. If you then add scrubbers to the power station chimneys to capture the carbon dioxide, and then bury that carbon deep underground, then you would be able to generate power at the same time as reducing concentrations of carbon dioxide in the atmosphere.
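The accounting logic of the paragraph above can be sketched in a few lines. All numbers here are assumptions chosen for illustration; real BECCS plants, capture rates, and regrowth timescales vary widely:

```python
# Toy carbon accounting for BECCS (all numbers assumed, for illustration).
# Net atmospheric CO2 change per tonne of biomass burned:
#   + combustion CO2 that escapes capture at the chimney
#   - CO2 re-absorbed by regrowing the harvested biomass

def beccs_net_co2(co2_per_t_biomass=1.8,   # t CO2 released per t biomass (assumed)
                  capture_rate=0.9,        # fraction of chimney CO2 captured
                  regrowth_fraction=1.0):  # share of harvested biomass regrown
    released = co2_per_t_biomass * (1 - capture_rate)
    reabsorbed = co2_per_t_biomass * regrowth_fraction
    return released - reabsorbed           # negative => net removal

print(f"{beccs_net_co2():.2f}")            # -1.62: net removal in the ideal case
print(f"{beccs_net_co2(capture_rate=0.5, regrowth_fraction=0.5):.2f}")
# 0.00: with poor capture and partial regrowth there is no climate benefit
```

In other words, BECCS is carbon-negative only under the idealised assumptions in the text: every harvested tree regrown, high capture rates, and permanent storage of the captured gas.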

Unfortunately, there is now clear evidence that in practice, large-scale BECCS would have very adverse effects on biodiversity, and food and water security given the large amounts of land that would be given over to fast growing monoculture tree plantations. The burning of biomass may even be increasing carbon dioxide emissions. Drax, the UK’s largest biomass power station now produces four times as much carbon dioxide as the UK’s largest coal-fired power station.

Five minutes to midnight messages may be motivated to try to galvanise action, to stress the urgency of the situation and that we still (just) have time. But time for what? Climate policy only ever offers gradual change, certainly nothing that would threaten economic growth, or the redistribution of wealth and resources.

Despite the mounting evidence that globalised, industrialised capitalism is propelling humanity towards disaster, five minutes to midnight does not allow time and space to seriously consider alternatives. Instead, the solutions on offer are techno-fixes that prop up the status quo and insist that fossil fuel companies such as Shell must be part of the solution.

That is not to say there are no good faith arguments for 1.5°C. But being well motivated does not alter reality. And the reality is that warming will soon pass 1.5°C, and that the Paris agreement has failed. In that light, repeatedly asking people not to give up hope that we can avoid a now-unavoidable outcome risks becoming counterproductive. Because if you insist on the impossible (burning fossil fuels and avoiding dangerous climate change), then you must invoke miracles. And there is an entire fossil fuel industry quite desperate to sell such miracles in the form of CCS.

Four suggestions

Humanity has enough problems right now, what we need are solutions. This is the response we sometimes get when we argue that there are fundamental problems with the net zero concept and the Paris agreement. It can be summed up with the simple question: so what’s your suggestion? Below we offer four.

1. Leave fossil fuels in the ground

The unavoidable reality is that we need to rapidly stop burning fossil fuels. The only way we can be sure of that is by leaving them in the ground. We have to stop exploring for new fossil fuel reserves and the exploitation of existing ones. That could be done by stopping fossil fuel financing.

At the same time we must transform the food system, especially the livestock sector, given that it is responsible for nearly two thirds of agricultural emissions. Start there and then work out how best the goods and services of economies can be distributed. Let’s have arguments about that based on reality not wishful thinking.

2. Ditch net zero crystal ball gazing targets

The entire framing of mid and end-century net zero targets should be binned. We are already in the danger zone. The situation demands immediate action, not promises of balancing carbon budgets decades into the future. The SBTi should focus on near-term emissions reductions. By 2030, global emissions need to be half of what they are today for any chance of limiting warming to no more than 2°C.

It is the responsibility of those who hold most power – politicians and business leaders – to act now. To that end we must demand twin targets – all net zero plans should include a separate target for actual reductions in greenhouse gas emissions. We must stop hiding inaction behind promises of future removals. It’s our children and future generations that will need to pay back the overshoot debt.

3. Base policy on credible science and engineering

All climate policies must be based on what can be done in the real world now, or in the very near future. If it is established that a credible amount of carbon can be removed by a proposed approach – which includes capture and its safe permanent storage – then and only then can this be included in net zero plans. The same applies to solar geoengineering.

Speculative technologies must be removed from all policies, pledges and scenarios until we are sure of how they will work, how they will be monitored, reported and validated, and what they will do to not just the climate but the Earth system as a whole. This would probably require a very large increase in research. As academics we like doing research. But academics need to be wary that concluding “needs more research” is not interpreted as “with a bit more funding this could work”.

4. Get real

Finally, around the world there are thousands of groups, projects, initiatives, and collectives that are working towards climate justice. But while there is a Climate Majority Project, and a Climate Reality Project, there is no Climate Honesty Project (although People Get Real does come close). In 2018 Extinction Rebellion was formed and demanded that governments tell the truth about the climate crisis and act accordingly. We can now see that when politicians were making their net zero promises they were also crossing their fingers behind their backs.

We need to acknowledge that net zero and now overshoot are becoming used to argue that nothing fundamental needs to change in our energy intensive societies. We must be honest about our current situation, and where we are heading. Difficult truths need to be told. This includes highlighting the vast inequalities of wealth, carbon emissions, and vulnerability to climate change.

The time for action is now

We rightly blame politicians for failing to act. But in some respects we get the politicians we deserve. Most people, even those that care about climate change, continue to demand cheap energy and food, and a constant supply of consumer products. Reducing demand by simply making things more expensive risks plunging people into food and energy poverty, so policies to reduce emissions from consumption need to go beyond market-based approaches. The cost of living crisis is not separate from the climate and ecological crisis. Together they demand that we radically rethink how our economies and societies function, and whose interests they serve.

To return to the boiling frog predicament at the start, it’s high time for us to jump out of the pot. You have to wonder why we did not start decades ago. It’s here that the analogy offers valuable insights into net zero and the Paris agreement. Because the boiling frog story as typically told misses out a crucial fact. Regular frogs are not stupid. While they will happily sit in slowly warming water, they will attempt to escape once it becomes uncomfortable. The parable as told today is based on experiments at the end of the 19th century that involved frogs that had been “pithed” – a metal rod had been inserted into their skulls that destroyed their higher brain functioning. These radically lobotomised frogs would indeed float inert in water that was cooking them alive.

Promises of net zero and recovery from overshoot are keeping us from struggling to safety. They assure us nothing too drastic needs to happen just yet. Be patient, relax. Meanwhile the planet burns and we see any sort of sustainable future go up in smoke.

Owning up to the failures of climate change policy doesn’t mean giving up. It means accepting the consequences of getting things wrong, and not making the same mistakes. We must plan routes to safe and just futures from where we are, rather than from where we would wish to be. The time has come to leap.



James Dyke, Associate Professor in Earth System Science, University of Exeter; Robert Watson, Emeritus Professor in Environmental Sciences, University of East Anglia, and Wolfgang Knorr, Senior Research Scientist, Physical Geography and Ecosystem Science, Lund University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Dominick A. DellaSala: The Importance of Fire in Resilient Ecosystems

Editor’s note: The following is the testimony of Dr. Dominick A. DellaSala, Chief Scientist of Geos Institute, Ashland, Oregon, before the U.S. House of Representatives Natural Resources Committee, Subcommittee on Oversight and Investigations, “Exploring Solutions to Reduce Risks of Catastrophic Wildfire and Improve Resilience of National Forests,” on September 27, 2017.

Chairman Westerman, Ranking member Hanabusa, and subcommittee members, thank you for the opportunity to discuss wildfires on national forests. I am the Chief Scientist of the nonprofit organization, Geos Institute in Ashland, Oregon. Geos Institute works with agencies, landowners, and decision makers in applying the best science to climate change planning and forest management. As a scientist, I have published in peer-reviewed journals on fire ecology and climate change, I am on the editorial board of several leading journals and encyclopedias, and I have been on the faculty of Oregon State University and Southern Oregon University. A recent book I co-authored with 28 other scientists outlined the ecological importance of mixed-severity fires in maintaining fire-resilient ecosystems, including ways to coexist with wildfire (DellaSala and Hanson 2015).

Wildfires are necessary natural disturbance processes that forests need to rejuvenate. Most wildfires in pine and mixed-conifer forests of the West burn in mixed fire intensities at the landscape scale that produce large and small patches of low to high tree mortality. This tapestry of burned patches is associated with extraordinary plant and wildlife diversity, including habitat for many big game and bird species that thrive in the newly established forests. From an ecosystem perspective, natural disturbances like wildfires are not an ecological catastrophe. However, given there are now 46 million homes in naturally fire-prone areas (Rasker 2015), and no end in sight for new development, we must find ways to coexist with natural disturbance processes as they are increasing in places due to climate change.

In my testimony today, I will discuss how proposals that call for increased logging and decreased environmental review in response to wildfires and insect outbreaks are not science driven, in many cases may make problems worse, and will not stem rising wildfire suppression costs. I will also discuss what we know about forest fires and beetle outbreaks in relation to climate change, limitations of thinning and other forms of logging in relation to wildfire and insect management, and I will conclude with recommendations for moving forward based on best available science.

I. WHAT WE KNOW ABOUT RECENT FOREST FIRE INCREASES

Recent increases in acres burned are mainly due to a changing climate – Scientists have known for some time that fire activity tracks regional weather patterns, which in turn are governed by global climatic forces such as the Pacific Decadal Oscillation (PDO), a recurring, long-lived El Niño-like pattern of Pacific climate variability (see chart 1). For instance, the very active fire seasons of the 1910s-1930s occurred during prolonged drought cycles driven by the PDO, which resulted in much larger areas burning historically than today (Powell et al. 1994; Interagency Federal Wildland Fire Policy Review Working Group 2001; Egan 2010) (chart 1). In fact, compared to the historic warm PDO phase of the early 1900s, most of the West is actually experiencing a fire deficit (Littell et al. 2009, Parks et al. 2012). However, with warming temperatures, early spring snowmelt, and longer fire seasons over the past few decades, more acres are burning each year (Westerling et al. 2006; Littell et al. 2009) (chart 1).

For instance, wildfire season in the West has lengthened from an average of five months to seven, and the number of large wildfires (>1,000 acres) has increased since the 1980s (Dennison et al. 2014) from 140 to 250 per year (UCS 2017). This is occurring as average annual temperature in the West has risen by nearly 2 degrees F since the 1970s and winter snowpack has declined (UCS 2017). If measures are not taken to stem greenhouse gas emissions, wildfire acres are projected to increase further in dry areas as annual temperatures are expected to rise another 2.5 to 6.5 degrees F by mid-century (UCS 2017). Some researchers estimate that more than half of the increase in acres burned over the past several decades is related to climate change (Abatzoglou and Williams 2016). This increase is expected to continue with additional warming, leading to even greater suppression costs if the agencies continue to suppress fires across the landscape (Schoennagel et al. 2017).

Increasing Human Development is Lengthening Wildfire Seasons and Adding to Fire Ignitions – The direct role of human access via roads and development in the Wildland Urban Interface (WUI) is increasing wildfire activity. Scientists recently evaluated over 1.5 million government records of wildfires nationwide from 1992 to 2012 (Balch et al. 2015). During that time, human-caused ignitions vastly expanded the spatial and seasonal occurrence of fire, accounting for 84 percent of all wildfire starts and 44 percent of the total area burned nationally. We now have the phenomenon of a human-caused fire season, which was three times longer than the lightning-caused fire season and added an average of 40,000 wildfires per year across the US over this 20-year period. Ignitions caused by people – whether accidental or arson – have substantial economic costs. This will only worsen as continued development of the WUI adds to the 46 million homes (Rasker 2015) already in these fire-prone areas.

Thus, given the expansion of homes in the WUI, the best way to limit damage to homes is to reduce fire risks by working from the home outward instead of from the wildlands inward (Syphard et al. 2013). For instance, if a firebrand travels miles ahead of a fire and lands on a flammable roof, that home is very likely to burn compared to a home with a fire-resistant roof and cleared vegetation within a narrow defensible space of 100-200 feet immediately surrounding it (Cohen 2000). Logging outside of this narrow zone does not change home ignition factors.

II. WHAT WE KNOW ABOUT FIRE AND FOREST MANAGEMENT

Wilderness and other protected areas are not especially prone to forest fires – proposals to remove environmental protections and increase logging, based on the assumption that unmanaged or protected areas burn more intensely, are misplaced. For instance, scientists (Bradley et al. 2016, of which I was a co-author) recently examined the intensity of 1,500 forest fires affecting over 23 million acres during the past four decades in 11 western states. We tested the common perception that forest fires burn hottest (most intensely) in wilderness and national parks while burning cooler (less intensely) or not at all in areas where logging had occurred. We found the opposite: fires burned most intensely in previously logged areas, while they burned in natural fire-mosaic patterns in wilderness, parks, and roadless areas, thereby maintaining resilient forests (see chart 2). Consequently, there is no reason to reduce environmental protections.

State lands are not at lower wildfire risk compared to federal lands – there is much discussion about whether state lands are being managed in a way that reduces fire occurrence and intensity. However, in a recent report on wildfire risk (which included acres likely to burn), scientists (Zimmerman and Livesay 2017) used the West Wide Wildfire Risk Assessment model, an important assessment tool of the Council of Western State Foresters and the Western Forestry Leadership Coalition. They evaluated risk for western states based on historical fire data, topography, vegetation, tree cover, climate, and other factors. According to the Center for Western Priorities analysis, state (22%) and federal (23%) lands have approximately equivalent levels of fire risk in the West, and in some states risks were higher on state than on federal lands. Notably, allegations of higher fire risk based solely on the number of federal acres burned in a fire season are misleading, as there is over seven times as much federal land (362 million acres) in the 11 Western states as state-owned land (49 million acres) (Zimmerman and Livesay 2017).

Thinning is Ineffective in Extreme Fire Weather – thinning/logging is most often proposed to reduce fire risk and lower fire intensity. Thinning-from-below of small-diameter trees followed by prescribed fire in certain forest types can reduce fire severity (Brown et al. 2004, Kalies and Kent 2016), but only in the absence of extreme fire weather (Moritz et al. 2014, Schoennagel et al. 2017). Fires occurring during extreme fire weather (high winds, high temperatures, low humidity, low fuel moisture) will burn over large landscapes regardless of thinning, and in some cases can burn hundreds or thousands of acres in just a few days (Stephens et al. 2015, Schoennagel et al. 2017). Fires driven by fire weather are unstoppable, are unsafe for firefighters to attempt to put out, and, as discussed, are more likely under a changing climate.

Further, there is a very low probability of a thinned site actually encountering a fire during the narrow window when tree density is lowest. For example, the probability of a fire hitting an area that has been thinned is about 3-8% on average, and thinning would need to be repeated every 10-15 years (depending on site productivity) to keep fuels at a minimum (Rhodes and Baker 2008).
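The compounding of a small annual encounter probability over a treatment’s effective window can be sketched as follows (the annual burn rate used here is hypothetical, chosen only so the result lands in the 3-8% range cited above; real rates vary by landscape):

```python
# Hypothetical annual burn rate; not a figure from Rhodes and Baker (2008).
annual_burn_probability = 0.004

for window_years in (10, 15):
    # Chance of at least one fire encounter during the effectiveness window
    p_encounter = 1 - (1 - annual_burn_probability) ** window_years
    print(f"{window_years}-year window: {p_encounter:.1%} chance of encountering fire")
```

Even over the longer 15-year window, the odds that a given thinned stand actually meets a fire while fuels are still reduced remain low, which is why treatment placement matters so much.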

Thinning too many overstory trees in a stand, especially removing large fire-resistant trees, can increase the rate of fire spread by opening tree canopies and letting in more wind; it can also damage soils, introduce invasive species that increase flammable understory fuels, and impact wildlife habitat (Brown et al. 2004). Thinning also requires an extensive and expensive road network that can degrade water quality by altering hydrological functions, including adding chronic sediment loads.

Post-disturbance salvage logging reduces forest resilience and can raise fire hazards – commonly practiced after natural disturbances like fires or insect outbreaks, post-disturbance logging hinders forest resilience by compacting soils, killing the natural regeneration of conifer seedlings and shrubs associated with forest renewal, increasing fine fuels from slash left on the ground that aids the spread of fire, removing the most fire-resistant large live and dead trees, and degrading fish and wildlife habitat. Furthermore, the roads it requires increase sediment flow to streams, triggering widespread water quality problems (Lindenmayer et al. 2008).

III. WHAT WE KNOW ABOUT BEETLE-KILLED FORESTS AND FOREST MANAGEMENT

Beetle-Killed Forests are Not More Susceptible to Forest Fires – forests in the West are being affected by the largest outbreaks of bark beetles in decades, which has caused concern about forest resilience and wildfire risk and led to proposals for widespread tree removals. Such proposals stem in part from the rationale that bark beetle outbreaks increase wildfire risks due to dead trees, and that logging in beetle-affected forests would therefore lower such risks. However, beetle-killed forests are not more susceptible to forest fires (Bond et al. 2009, Hart et al. 2015, Meigs et al. 2016). This is mainly because when conifers die from drought or native bark beetles, the combustible oils in the needles quickly dissipate, and the needles and small twigs begin to fall to the ground. Without the fine fuels that facilitate fire spread, the potential for crown fires is actually lowered in forests with beetle mortality (Donato et al. 2013). The beetle-killed standing dead trees (snags) are the least flammable part of the forest; they act more like a large log in a campfire than like the kindling that actually spreads fire.

In fact, studies of beetle-killed forests in the West found that when fires occurred during or immediately after the pulse of snag recruitment from beetle kill, fire severity consistently declined in the stands with high snag densities in the following decades (Meigs et al. 2016). In pine and mixed-conifer forests of the San Bernardino National Forest (CA), fires occurred immediately after a large pulse of snag recruitment from drought and beetles. However, scientists (Bond et al. 2009) found “no evidence that pre-fire tree mortality influenced fire severity.” In studies of beetles and wildfires across the western U.S., scientists (Hart et al. 2015) stated “contrary to the expectation of increased wildfire activity in recently infested red-stage stands, we found no difference between observed area and expected area burned in red-stage or subsequent gray-stage stands during three peak years of wildfire activity, which account for 46 percent of area burned during the 2002–2013 period.” And finally, in a comprehensive review of fire-beetle relations in mixed-conifer and ponderosa pine forests of the Pacific Northwest, scientists (Meigs et al. 2016) found: “in contrast to common assumptions of positive feedbacks, we find that insects generally reduce the severity of subsequent wildfires. Specific effects vary with insect type and timing, but insects decrease the abundance of live vegetation susceptible to wildfire at multiple time lags. By dampening subsequent burn severity, native insects could buffer rather than exacerbate fire regime changes expected due to land use and climate change.”

Most importantly, climate change is allowing more insects to survive the winter, triggering the rash of recent outbreaks (Meigs et al. 2016).

Thinning cannot limit or contain beetle outbreaks – once beetle populations reach widespread epidemic levels, thinning treatments aimed at stopping them do not reduce outbreak susceptibility, as beetles overrun natural forest defenses with or without thinning (Black et al. 2013).

IV. CLOSING REMARKS AND RECOMMENDATIONS

In sum,

- Recent increases in wildfires and insect outbreaks are a result of a changing climate coupled with human activities, including expansion of homes and roads into the WUI, that will only continue to drive up fire suppression costs.
- Policies should be examined that discourage continued growth in the WUI; any new development must include defensible space and construction from non-flammable materials.
- The most effective way to protect homes is to create defensible space in the immediate 100 feet of a structure and to use non-flammable materials. Wildland fire policy should fund defensible space, not more logging and thinning miles away from communities.
- No amount of logging can stop insect outbreaks or large fires under extreme fire weather. Logging may, in fact, increase the amount of unnatural disturbance by homogenizing landscapes with more even-aged trees, leaving residual slash on the ground, and compounding cumulative impacts to ecosystems.
- Thinning of small trees in certain forest types, maintaining canopy closure and combined with prescribed fire, can reduce fire intensity, but treatment efficacy is limited in extreme fire weather and by the small chance that a thinned site will encounter a fire during the narrow window when fuels are lowest.

CITATIONS

Balch, J. K., B.A. Bradley, J.T. Abatzoglou et al. 2016. Human-started wildfires expand the fire niche across the United States. PNAS 114: 2946-2951.

Black, S.H., D. Kulakowski, B.R. Noon, and D.A. DellaSala. 2013. Do bark beetle outbreaks increase wildfire risks in the Central U.S. Rocky Mountains: Implications from Recent Research. Natural Areas Journal 33:59-65.

Bond, M.L., D.E. Lee, C.M. Bradley, and C.T. Hanson. 2009. Influence of pre-fire tree mortality on fire severity in conifer forests of the San Bernardino Mountains, California. The Open Forest Science Journal 2:41-47.

Bradley, C.M., C.T. Hanson, and D.A. DellaSala. 2016. Does increased forest protection correspond to higher fire severity in frequent-fire forests of the western United States? Ecosphere 7:1-13.

Brown, R.T., J.K. Agee, and J.F. Franklin. 2004. Forest restoration and fire: principles in the context of place. Conservation Biology 18:903-912.

Cohen, J.D. 2000. Preventing disaster: home ignitability in the wildland-urban interface. Journal of Forestry 98: 15-21.

DellaSala, D.A., and C.T. Hanson. 2015. The ecological importance of mixed-severity fires: nature’s phoenix. Elsevier: Boston, MA.

Dennison, P., S. Brewer, J. Arnold, and M. Moritz. 2014. Large wildfire trends in the western United States, 1984-2011. Geophysics Research Letters 41:2928-2933.

Donato, D.C., B.J. Harvey, W.H. Romme, M. Simard, and M.G. Turner. 2013. Bark beetle effects on fuel profiles across a range of stand structures in Douglas-fir forests of Greater Yellowstone. Ecological Applications 23:3-20.

Egan, T. 2010. The Big Burn. Houghton Mifflin Harcourt: Boston.

Hart, S.J., T.T. Veblen, N. Mietkiewicz, and D. Kulakowski. 2015. Negative feedbacks on bark beetle outbreaks: widespread and severe spruce beetle infestation restricts subsequent infestation. PLoS ONE: DOI:10.1371/journal.pone.0127975

Kalies, E.I., and L.L. Yocom Kent. 2016. Tamm Review: Are fuel treatments effective at achieving ecological and social objectives? A systematic review. Forest Ecology and Management 375:84-95.

Lindenmayer, D.B., P.J. Burton, and J.F. Franklin. 2008. Salvage logging and its ecological consequences. Island Press: Washington, D.C.

Littell, J.S., D. McKenzie, D.L. Peterson, and A.L. Westerling. 2009. Climate and wildfire area burned in western U.S. ecoprovinces, 1916-2003. Ecological Applications 19:1003-1021.

Meigs, G.W., H.S.J. Zald, J. L. Campbell, W.S. Keeton, and R.E. Kennedy. 2016. Do insect outbreaks reduce the severity of subsequent forest fires? Environmental Research Letters 11 doi:10.1088/1748-9326/11/4/045008.

Moritz, M.A., E. Batllori, R.A. Bradstock, A.M. Gill, J. Handmer, P.F. Hessburg, J. Leonard, S. McCaffrey, D.C. Odion, T. Schoennagel, and A.D. Syphard. 2014. Learning to coexist with wildfire. Nature 515: 58-66.

Parks, S.A., C. Miller, M.A. Parisien, L.M. Holsinger et al. 2012. Wildland fire deficit and surplus in the western United States, 1984-2012.

Powell, D.S., J.L. Faulkner, D.R. Darr, et al. 1994. Forest resources of the United States, 1992. USDA Forest Service General Technical Report RM-234 (revised).

Rasker, R. 2015. Resolving the increasing risk from wildfires in the American West. www.thesolutionsjournal.org; March-April 2015 p. 55- 62.

Rhodes, J.J., and W.L. Baker. 2008. Fire probability, fuel treatment effectiveness and ecological tradeoffs in western U.S. public forests. The Open Forest Science Journal 1: 1-7.

Schoennagel, T., J.K. Balch, H. Brenkert-Smith, P.E. Dennison, et al. 2017. Adapt to more wildfire in western North American forests as climate changes. PNAS 114:4582-4590.

Stephens, S.L., M.P. North, and B.M. Collins. 2015. Large wildfires in forests: what can be done? ActionBioscience April 15.

Syphard, A. D., A. Bar Massada, V. Butsic, and J. E. Keeley. 2013. Land use planning and wildfire: development policies influence future probability of housing loss. PLoS ONE 8(8):e71708

Union of Concerned Scientists (UCS). 2017. Western wildfires and climate change. http://www.ucsusa.org/…/infographic-wildfires-climate-chang…

Westerling, A.L., H.G. Hidalgo, D.R. Cayan, and T.W. Swetnam. 2006. Warming and earlier spring increase western U.S. forest wildfire activity. Science 313:940-943.

Zimmerman, G., and L. Livesay. 2017. Fire lines: comparing wildfire risk on state and U.S. public lands. Center for Western Priorities.
http://westernpriorities.org/…/fire-lines-comparing-wildfi…/

Sea levels rising 60% faster than projected by IPCC

By Institute of Physics

Sea-levels are rising 60 per cent faster than the Intergovernmental Panel on Climate Change’s (IPCC) central projections, new research suggests.

While temperature rises appear to be consistent with the projections made in the IPCC’s fourth assessment report (AR4), satellite measurements show that sea-levels are actually rising at a rate of 3.2 mm a year compared to the best estimate of 2 mm a year in the report.
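The headline “60 per cent faster” figure follows directly from the two rates quoted above; a quick arithmetic check:

```python
# AR4 best-estimate projection vs. the satellite-observed rate cited above.
ipcc_rate_mm_per_yr = 2.0
satellite_rate_mm_per_yr = 3.2

# Fractional excess of the observed rate over the projection
excess = (satellite_rate_mm_per_yr - ipcc_rate_mm_per_yr) / ipcc_rate_mm_per_yr
print(f"Observed rise is {excess:.0%} faster than projected")  # → 60%
```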

These findings, which have been published today, 28 November, in IOP Publishing’s journal Environmental Research Letters, are timely as delegates from 190 countries descend on Doha, Qatar, for the United Nations’ 18th Climate Change Conference this week.

The researchers, from the Potsdam Institute for Climate Impact Research, Tempo Analytics and Laboratoire d’Etudes en Géophysique et Océanographie Spatiales, state that the findings are important for keeping track of how well past projections match the accumulating observational data, especially as projections made by the IPCC are increasingly being used in decision making.

The study involved an analysis of global temperatures and sea-level data over the past two decades, comparing them both to projections made in the IPCC’s third and fourth assessment reports.

Results were obtained by taking averages from the five available global land and ocean temperature series.

After removing the three known phenomena that cause short-term variability in global temperatures – solar variations, volcanic aerosols and El Niño/Southern Oscillation – the researchers found that the overall warming trend at the moment is 0.16°C per decade, which closely follows the IPCC’s projections.

Satellite measurements of sea-levels showed a different picture, however, with current rates of increase being 60 per cent faster than the IPCC’s AR4 projections.

Satellites measure sea-level rise by bouncing radar waves back off the sea surface and are much more accurate than tide gauges as they have near-global coverage; tide gauges only sample along the coast. Tide gauges also include variability that has nothing to do with changes in global sea level, but rather with how the water moves around in the oceans, such as under the influence of wind.

The study also shows that it is very unlikely that the increased rate is down to internal variability in our climate system and also shows that non-climatic components of sea-level rise, such as water storage in reservoirs and groundwater extraction, do not have an effect on the comparisons made.

Lead author of the study, Stefan Rahmstorf, said: “This study shows once again that the IPCC is far from alarmist, but in fact has under-estimated the problem of climate change. That applies not just for sea-level rise, but also to extreme events and the Arctic sea-ice loss.”

New research finds that atmospheric carbon levels are highly correlated with economic growth


By Diane Swanbrow / University of Michigan

To slow down global warming, we’ll either have to put the brakes on economic growth or transform the way the world’s economies work.

That’s the implication of an innovative University of Michigan study examining the evolution of atmospheric CO₂, the most likely cause of global warming.

The study, conducted by José Tapia Granados and Edward Ionides of U-M and Óscar Carpintero of the University of Valladolid in Spain, was published online in the peer-reviewed journal Environmental Science and Policy. It is the first analysis to use measurable levels of atmospheric carbon dioxide to assess fluctuations in the gas, rather than estimates of CO₂ emissions, which are less accurate.

“If ‘business as usual’ conditions continue, economic contractions the size of the Great Recession or even bigger will be needed to reduce atmospheric levels of CO₂,” said Tapia Granados, who is a researcher at the U-M Institute for Social Research.

For the study, the researchers assessed the impact of four factors on short-run, year-to-year changes in atmospheric concentrations of CO₂, widely considered the most important greenhouse gas. Those factors included two natural phenomena believed to affect CO₂ levels—volcanic eruptions and the El Niño Southern Oscillation—and also world population and the world economy, as measured by worldwide gross domestic product.

Tapia Granados and colleagues found no observable relation between short-term growth of world population and CO₂ concentrations, and they show that recent incidents of volcanic activity coincided with global recessions, which brings into question the reductions in atmospheric CO₂ previously ascribed to these volcanic eruptions.

In years of above-trend world GDP from 1958 to 2010, the researchers found greater increases in CO₂ concentrations. For each trillion U.S. dollars that world GDP deviates from trend, CO₂ levels deviate from their trend by about half a part per million, they found. Concentrations of CO₂ were estimated at 200-300 ppm during preindustrial times. They are presently close to 400 ppm, and levels around 300 ppm are considered safe for keeping a stable climate.
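A minimal sketch of the stated relationship (the function name and the sample GDP deviation are illustrative, not taken from the study):

```python
# The study's headline relationship: CO2 deviates from its trend by about
# 0.5 ppm per trillion US dollars that world GDP deviates from its trend.
PPM_PER_TRILLION_USD = 0.5

def co2_deviation_ppm(gdp_deviation_trillions: float) -> float:
    """Expected CO2 deviation from trend for a given GDP deviation from trend."""
    return PPM_PER_TRILLION_USD * gdp_deviation_trillions

print(co2_deviation_ppm(2.0))   # a year $2 trillion above trend: 1.0 ppm above trend
print(co2_deviation_ppm(-2.0))  # a Great-Recession-scale contraction: -1.0 ppm
```

This is only the short-run, detrended correlation the authors report, not a full model of the carbon cycle.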

To break the economic habits contributing to a rise in atmospheric CO₂ levels and global warming, Tapia Granados says that societies around the world would need to make enormous changes.

“Since the 1980s, scientists like James Hansen have been warning us about the effects global warming will have on the earth,” Tapia Granados said. “One solution that has promise is a carbon tax levied on any activity producing CO₂ in order to create incentives to reduce emissions. The money would be returned to the population on a per capita basis so the tax would not mean any extra fiscal burden.”

From the University of Michigan News Service: http://ns.umich.edu/new/releases/20369-global-warming-new-research-emphasizes-the-role-of-global-economic-growth

Ocean Acidification: What Does It Mean?


Editor’s Note: In this essay, Carl (one of our editors) describes the process of ocean acidification and how it relates to other ecological crises.



First we need to know what an acid is. An acid is any substance whose molecules or ions can donate a hydrogen ion (a proton, H+) to another substance in aqueous solution. The opposite of an acid is a base: a substance whose molecules or ions can accept a hydrogen ion from an acid. Acidic substances are usually identified by their sour taste, while bases taste bitter. The quantitative measure of how acidic or basic a substance is is the “potential of hydrogen” (pH), or “power of hydrogen.” It is expressed on a logarithmic scale from 0 to 14 that inversely indicates the activity of hydrogen ions in solution: as the concentration of hydrogen ions increases, the pH decreases toward 0 (acidic); as the concentration of hydrogen ions decreases, the pH increases toward 14 (basic). A value of 7 is neutral, which is where pure distilled water falls on the scale. Acidification, then, means an increase in hydrogen ions.
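The inverse logarithmic relationship can be shown in a few lines (a minimal sketch; pH is approximated here from molar concentration rather than true ion activity):

```python
import math

def pH(h_ion_molar: float) -> float:
    """pH is the negative base-10 logarithm of the hydrogen-ion concentration."""
    return -math.log10(h_ion_molar)

print(round(pH(1e-7), 2))  # neutral water, pH near 7
print(round(pH(1e-6), 2))  # ten times more H+: pH falls a full unit, toward acidic
print(round(pH(1e-8), 2))  # ten times fewer H+: pH rises a full unit, toward basic
```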

Basic (or alkaline) properties can be associated with the presence of hydroxide ions (OH−) in aqueous solution, and the neutralization of acids (H+) by bases can be explained in terms of the reaction of these two ions to give the neutral molecule water (H+ + OH− → H2O).

Small Drop in pH Means Big Change in Acidity

For millions of years the average pH of the ocean held around 8.2, on the basic side of the scale. But since industrial development that number has dropped to slightly below 8.1 – not acidic, but moving in that direction. While this may not seem like much, remember that the scale is logarithmic and measures the amount of hydrogen ions present: a change of 1 pH unit is equivalent to a tenfold change in the concentration of H+ ions. So a drop of 0.11 units represents roughly a 30% increase in H+ ions over the relatively stable preindustrial state. Ocean acidification is this increase in dissolved hydrogen ions (H+) in the water.
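The roughly 30% figure follows directly from the logarithmic scale; a quick check:

```python
# A drop of 0.11 pH units multiplies the hydrogen-ion concentration
# by 10**0.11, i.e. roughly a 30% increase.
delta_pH = 8.2 - 8.09            # preindustrial vs. present-day surface ocean
increase = 10 ** delta_pH - 1    # fractional rise in H+ concentration
print(f"H+ ions increased by about {increase:.0%}")
```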

What is causing this decrease in pH?

Oceans absorb carbon dioxide (CO2) from the atmosphere through wave action. Before industrialization there was a balance between the CO2 going into the water and the CO2 coming out, so pH was stable in this narrow range, and life in the oceans evolved to survive in that balanced condition. Industrialization, through the burning of fossil fuels, has released increased amounts of CO2 into the atmosphere, causing the oceans to absorb more CO2. Here is where the chemistry comes into play: as CO2 dissolves in water (H2O), the two combine to form carbonic acid (H2CO3).

CO2 + H2O ⇌ H2CO3

This dissociates easily into hydrogen carbonate (bicarbonate) ions (HCO3) and H+ ions.

H2CO3 ⇌ HCO3 + H+

Hydrogen ions break off the carbonic acid, so more CO2 means more H+ ions, which means increased acidity.

And this is where the problem lies. Shells are formed primarily of calcium carbonate (CaCO3). But carbonate (CO3) binds more readily with H+ (CO3 + H+ → HCO3) than with calcium (Ca). This takes away carbonate that would otherwise have bonded with calcium for shell production. Calcium is relatively constant, so it is the concentration of carbonate that determines the formation of calcium carbonate. Less available carbonate makes it more difficult for corals, mollusks, echinoderms, calcareous algae and other shelled organisms to form CaCO3, their major mineral building block. Also, when carbonate concentrations fall too low, already-formed CaCO3 starts to dissolve. So marine organisms have a harder time making new shells and maintaining the ones they already have, causing decreased calcification. (For comparison, normal human body pH averages 7.4, which is one of the main reasons swimming pool pH should be maintained around 7.5.)

The acid-base balance of the oceans has been critical in maintaining the Earth’s habitability and allowing the emergence of early life.

“Scientists have long known that tiny marine organisms – phytoplankton (microscopic aquatic plants) – are central to cooling the world by emitting an organic compound known as dimethylsulphide (DMS). Researchers tested acidification’s effect on phytoplankton in the laboratory by lowering the pH (i.e. acidifying) in plankton-filled water tanks and measuring DMS emissions. When they set ocean acidification to the levels expected by 2100 (under a moderate greenhouse gas scenario), they found that cooling DMS emissions fell.”

Given the importance of plankton, the life-support system of the planet without which humanity cannot survive, the resulting effects will be disastrous. These organisms produce 50% of the world’s oxygen (every other breath animals take) and form the basis of the food web. Covering more than 70 percent of the Earth’s surface, the oceans, the planet’s lungs, are in peril.

“Over the past 200 years, the oceans have absorbed approximately half of the carbon dioxide (CO2) emitted by human activities, providing long-term carbon storage. Without this sink, the greenhouse gas concentration in the atmosphere would be much higher, and the planet much warmer.”

But absorbing the CO2 causes changes in ocean chemistry, namely lowering pH and decreasing carbonate (CO3) concentrations.

On a human time scale these changes have been slow and steady relative to that baseline. But on a geological time scale this change is more rapid than any change documented over the last 300 million years. So organisms that have evolved tolerance to a certain range of conditions may encounter increasingly stressful or even lethal conditions in the coming decades.

We know this through the study of ice cores, which offer scientists the best source of historical climate data. Deep-sea sediment cores from the ocean floor are also used to detail the Earth’s history.

Our changing ocean

Estimates of future carbon dioxide levels, based on business-as-usual emission scenarios, indicate that by the end of this century the surface waters of the ocean could have a pH around 7.8. The last time ocean pH was that low was during the middle Miocene, 14-17 million years ago, when the Earth was several degrees warmer and a major extinction event was occurring. Animals take millions of years to evolve; without an adequate timeframe to adapt to changes in habitat, they go extinct. Ocean acidification is currently affecting the entire ocean, including coastal estuaries and waterways. Billions of people worldwide rely on food from the ocean as their primary source of protein, and many jobs and economies in the U.S. and around the world depend on the fish and shellfish that live in the ocean.

By absorbing increased carbon dioxide from the atmosphere, the ocean reduces the warming impact those emissions would have had if they had remained in the atmosphere. Strikingly, only about 1 percent of the extra trapped heat has ended up in the atmosphere; nearly 90 percent of it has gone into the ocean. There, it is setting ocean heat records year after year and driving increasingly severe marine heat waves. As ocean temperature has risen, its ability to absorb CO2 has decreased: colder ocean water dissolves more CO2, absorbing more from the atmosphere, yet we have steadily increased carbon emissions. About 30 percent of current emissions end up sequestered in the oceans.

It is unknown if this uptake can be sustained. What might happen to the Earth’s atmosphere if the ocean is unable to absorb continued increased carbon dioxide?

“If the seas are warmer than usual, you can expect higher air temperatures too,” says Tim Lenton, professor of climate change at Exeter University. Most of the extra heat trapped by the build-up of greenhouse gases has gone into warming the surface ocean, he explains. That extra heat tends to get mixed downwards towards the deeper ocean, but movements in ocean currents – like El Niño – can bring it back to the surface.

Surface ocean waters favor carbonate mineral formation; in deeper waters, those minerals dissolve.

We have entered a new epoch, the Pyrocene

So it is obvious that industrializing the oceans with offshore wind farms and deep-sea mining, what capitalism calls the Blue Economy, will contribute to continued acidification. But it will have even wider ramifications, because it will directly impact the species that live there and the habitats where “raw” materials are extracted.

Regions of the ocean where plankton communities utilize organic matter more efficiently, such as the deep sea, are places where the ocean has a naturally lower capacity to absorb some of the carbon dioxide produced by humans. “So understanding how zooplankton (small aquatic animals) communities process carbon, which, to them, represents food and energy, helps us to understand the role of the ocean in absorbing carbon dioxide in the atmosphere,” says Conner Shea, doctoral student in the UH Mānoa School of Ocean and Earth Science and Technology (SOEST) Department of Oceanography.

We are headed for a Blue Ocean Event by 2030 – that is, for the first time since ancient humans started roaming the Earth several million years ago, an ice-free Arctic Ocean in summer. Water, instead of ice, will absorb the sun’s heat rather than reflecting it back, increasing sea temperatures and disrupting the jet stream. This is basically what solar panels and wind turbines do: they make the earth hotter. Wind turbines extract their energy from cooling breezes, the opposite of a fan. Miles and miles of solar panels destroy habitat and absorb heat.

Continued industrialization will have devastating effects – threats to food supplies, loss of coastal protection, diminished biodiversity and disruption of carbon cycling – all arising from these chemical reactions. This story involves a fundamental change within the largest living space on the planet, changes that are happening fast, and right now.

The oceans will find a new balance hundreds of thousands of years from now, but between now and then marine organisms and environments will suffer.

What causes climate change?

The Earth’s temperature cycles, glacial and interglacial, are primarily driven by periodic changes in the Earth’s orbit: three distinct orbital cycles called Milankovitch cycles. The Serbian scientist Milutin Milanković calculated that Ice Ages occur approximately every 41,000 years, and subsequent research confirms that they did occur at 41,000-year intervals between one and three million years ago. But about 800,000 years ago, the cycle of Ice Ages lengthened to 100,000 years, matching the cycle of Earth’s orbital eccentricity, its deviation from a circular orbit. While various theories have been proposed to explain this transition, scientists do not yet have a clear answer. So CO2 has historically not initiated climate change; it increased in the atmosphere during warmer periods and decreased during colder ones, with feedback loops amplifying the changes initiated by orbital variations. But it is now humans who are increasing the amount of CO2 in the atmosphere by burning fossil fuels.

Strictly from an anthropocentric point of view, humanity could adapt to global warming and extreme weather changes. It will not survive the extinction of most marine plants and animals. The destruction of nature is more dangerous than climate change. It is sad that, in the effort to save the climate while continuing business as usual, we are destroying the environment. All of life came from the sea; it would be unwise to harm the birthplace of all species.

Photo by Ant Rozetsky on Unsplash