Editor’s note: The following is the testimony of Dr. Dominick A. DellaSala, Chief Scientist of Geos Institute, Ashland, Oregon, before the U.S. House of Representatives Natural Resources Committee, Subcommittee on Oversight and Investigations, “Exploring Solutions to Reduce Risks of Catastrophic Wildfire and Improve Resilience of National Forests,” on September 27, 2017.
Chairman Westerman, Ranking Member Hanabusa, and subcommittee members, thank you for the opportunity to discuss wildfires on national forests. I am the Chief Scientist of the nonprofit organization Geos Institute in Ashland, Oregon. Geos Institute works with agencies, landowners, and decision makers in applying the best science to climate change planning and forest management. As a scientist, I have published in peer-reviewed journals on fire ecology and climate change, I am on the editorial board of several leading journals and encyclopedias, and I have been on the faculty of Oregon State University and Southern Oregon University. A recent book I co-authored with 28 other scientists outlined the ecological importance of mixed-severity fires in maintaining fire-resilient ecosystems, including ways to coexist with wildfire (DellaSala and Hanson 2015).
Wildfires are necessary natural disturbance processes that forests need to rejuvenate. Most wildfires in pine and mixed-conifer forests of the West burn in mixed fire intensities at the landscape scale that produce large and small patches of low to high tree mortality. This tapestry of burned patches is associated with extraordinary plant and wildlife diversity, including habitat for many big game and bird species that thrive in the newly established forests. From an ecosystem perspective, natural disturbances like wildfires are not an ecological catastrophe. However, given there are now 46 million homes in naturally fire-prone areas (Rasker 2015), and no end in sight for new development, we must find ways to coexist with natural disturbance processes as they are increasing in places due to climate change.
In my testimony today, I will discuss how proposals that call for increased logging and decreased environmental review in response to wildfires and insect outbreaks are not science driven, in many cases may make problems worse, and will not stem rising wildfire suppression costs. I will also discuss what we know about forest fires and beetle outbreaks in relation to climate change, limitations of thinning and other forms of logging in relation to wildfire and insect management, and I will conclude with recommendations for moving forward based on best available science.
I. WHAT WE KNOW ABOUT RECENT FOREST FIRE INCREASES
Recent increases in forest acres burned are mainly due to a changing climate – Scientists have known for some time that fire activity tracks regional weather patterns, which, in turn, are governed by global climatic forces such as the Pacific Decadal Oscillation (PDO – a recurring, long-lived, El Niño-like pattern of Pacific climate variability; see chart 1). For instance, the very active fire seasons of the 1910s-1930s occurred during prolonged drought cycles determined by the PDO, which resulted in much larger areas burning historically than today (Powell et al. 1994; Interagency Federal Wildland Fire Policy Review Working Group 2001; Egan 2010) (chart 1). In fact, compared to the historic warm PDO phase of the early 1900s, most of the West is actually experiencing a fire deficit (Littell et al. 2009, Parks et al. 2012). However, with warming temperatures, early spring snowmelt, and longer fire seasons over the past few decades, more acres are burning each year (Westerling et al. 2006; Littell et al. 2009) (chart 1).
For instance, the wildfire season in the West has lengthened from an average of five months to seven, and the number of large wildfires (>1,000 acres) has increased since the 1980s (Dennison et al. 2014) from 140 to 250 per year (UCS 2017). This is occurring as the average annual temperature in the West has risen by nearly 2 degrees F since the 1970s and winter snowpack has declined (UCS 2017). If measures are not taken to stem greenhouse gas emissions, wildfire acreage is projected to increase further in dry areas as annual temperatures are expected to rise another 2.5 to 6.5 degrees F by mid-century (UCS 2017). Some researchers estimate that more than half of the increase in acres burned over the past several decades is related to climate change (Abatzoglou and Williams 2016). This increase is expected to continue with additional warming, leading to even greater suppression costs if the agencies continue to suppress fires across the landscape (Schoennagel et al. 2017).
Increasing Human Development is Lengthening Wildfire Seasons and Adding to Fire Ignitions – The direct role of human access via roads and development in the Wildland-Urban Interface (WUI) is increasing wildfire activity. Scientists recently evaluated over 1.5 million government records of wildfires nationwide from 1992 to 2012 (Balch et al. 2016). During that time, human-caused fire ignitions vastly expanded the spatial and seasonal occurrence of fire, accounting for 84 percent of all wildfire starts and 44 percent of the total area burned nationally. We now have the phenomenon of a human-caused fire season, which was three times longer than the lightning-caused fire season and added an average of 40,000 wildfires per year across the US over this 20-year period. Ignitions caused by people – whether accidental or arson – have substantial economic costs. This will only worsen as continued development of the WUI adds to the 46 million homes (Rasker 2015) already in these fire-prone areas.
Thus, given the expansion of homes in the WUI, the best way to limit damage to homes is to reduce fire risks by working from the home outward instead of from the wildlands inward (Syphard et al. 2013). For instance, if a firebrand travels miles ahead of a fire and lands on a flammable roof, that home is very likely to burn, compared to a home that has a fire-resistant roof and cleared vegetation within a narrow defensible space of 100-200 feet immediately surrounding the home (Cohen 2000). Logging outside of this narrow zone does not change home ignition factors.
II. WHAT WE KNOW ABOUT FIRE AND FOREST MANAGEMENT
Wilderness and other protected areas are not especially prone to forest fires – proposals to remove environmental protections and increase logging over wildfire concerns, based on the assumption that unmanaged – or protected – areas burn more intensely, are misplaced. For instance, scientists (Bradley et al. 2016, of which I was a co-author) recently examined the intensity of 1,500 forest fires affecting over 23 million acres during the past four decades in 11 western states. We tested the common perception that forest fires burn hottest (most intensely) in wilderness and national parks while burning cooler (less intensely) or not at all in areas where logging had occurred. What we found was the opposite – fires burned most intensely in previously logged areas, while they burned in natural fire mosaic patterns in wilderness, parks, and roadless areas, thereby maintaining resilient forests (see chart 2). Consequently, there is no reason to reduce environmental protections.
State lands are not at lower wildfire risk than federal lands – there is much discussion about whether state lands are being managed in a way that reduces fire occurrence and intensity. In a recent report on wildfire risk (which included acres likely to burn), scientists (Zimmerman and Livesay 2017) used the West Wide Wildfire Risk Assessment model, an important assessment tool of the Council of Western State Foresters and Western Forestry Leadership Coalition. They evaluated risk for western states based on historical fire data, topography, vegetation, tree cover, climate, and other factors. According to the Center for Western Priorities analysis, state (22%) and federal (23%) lands have approximately equivalent levels of fire risk in the West, and in some states, risk on state lands was higher than on federal lands. Notably, allegations of higher fire risk based solely on the number of federal acres burned in a fire season are misleading, as there is over 7 times as much federal land (362 million acres) in the 11 Western states as state-owned land (49 million acres) (Zimmerman and Livesay 2017).
Thinning is Ineffective in Extreme Fire Weather – thinning/logging is most often proposed to reduce fire risk and lower fire intensity. Thinning-from-below of small-diameter trees followed by prescribed fire in certain forest types can reduce fire severity (Brown et al. 2004, Kalies and Kent 2016), but only when there is not extreme fire weather (Moritz et al. 2014, Schoennagel et al. 2017). Fires occurring during extreme fire weather (high winds, high temperatures, low humidity, low fuel moisture) will burn over large landscapes regardless of thinning, and in some cases can burn hundreds or thousands of acres in just a few days (Stephens et al. 2015, Schoennagel et al. 2017). Fires driven by fire weather are unstoppable, are unsafe for firefighters to attempt to put out, and, as discussed, are more likely under a changing climate.
Further, there is a very low probability of a thinned site actually encountering a fire during the narrow window when tree density is lowest. For example, the probability of a fire hitting an area that has been thinned is about 3-8% on average, and thinning would need to be repeated every 10-15 years (depending on site productivity) to keep fuels at a minimum (Rhodes and Baker 2008).
Thinning too much of the overstory in a stand, especially removing large fire-resistant trees, can increase the rate of fire spread by opening tree canopies and letting in more wind; it can also damage soils, introduce invasive species that increase flammable understory fuels, and impact wildlife habitat (Brown et al. 2004). Thinning also requires an extensive and expensive road network that can degrade water quality by altering hydrological functions, including through chronic sediment loads.
Post-disturbance salvage logging reduces forest resilience and can raise fire hazards – commonly practiced after natural disturbances like fires or insect outbreaks, post-disturbance logging hinders forest resilience by compacting soils, killing natural regeneration of conifer seedlings and shrubs associated with forest renewal, increasing fine fuels from slash left on the ground that aids the spread of fire, removing the most fire-resistant large live and dead trees, and degrading fish and wildlife habitat. Further, the roads it requires increase sediment flow to streams, triggering widespread water quality problems (Lindenmayer et al. 2008).
III. WHAT WE KNOW ABOUT BEETLE-KILLED FORESTS AND FOREST MANAGEMENT
Beetle-Killed Forests Are Not More Susceptible to Forest Fires – forests in the West are being affected by the largest outbreaks of bark beetles in decades, which has caused concern about forest resilience and wildfire risk and led to proposals for widespread tree removals. Such proposals stem in part from the rationale that bark beetle outbreaks increase wildfire risks due to dead trees and that logging in beetle-affected forests would therefore lower such risks. However, beetle-killed forests are not more susceptible to forest fires (Bond et al. 2009, Hart et al. 2015, Meigs et al. 2016). This is mainly because when conifers die from drought or native bark beetles, the combustible oils in the needles quickly begin to dissipate, and needles and small twigs begin to fall to the ground. Without the fine fuels that facilitate fire spread, the potential for crown fires is actually lowered in forests with beetle mortality (Donato et al. 2013). The beetle-killed standing dead trees (snags) are the least flammable part of the forest, acting more like a large log in a campfire than like the kindling that actually drives fire spread.
In fact, studies of beetle-killed forests in the West found that when fires occurred during or immediately after the pulse of snag recruitment from beetle kill, fire severity consistently declined in the stands with high snag densities in the following decades (Meigs et al. 2016). In pine and mixed-conifer forests of the San Bernardino National Forest (CA), fires occurred immediately after a large pulse of snag recruitment from drought and beetles. However, scientists (Bond et al. 2009) found “no evidence that pre-fire tree mortality influenced fire severity.” In studies of beetles and wildfires across the western U.S., scientists (Hart et al. 2015) stated “contrary to the expectation of increased wildfire activity in recently infested red-stage stands, we found no difference between observed area and expected area burned in red-stage or subsequent gray-stage stands during three peak years of wildfire activity, which account for 46 percent of area burned during the 2002–2013 period.” And finally, in a comprehensive review of fire-beetle relations in mixed-conifer and ponderosa pine forests of the Pacific Northwest, scientists (Meigs et al. 2016) found: “in contrast to common assumptions of positive feedbacks, we find that insects generally reduce the severity of subsequent wildfires. Specific effects vary with insect type and timing, but insects decrease the abundance of live vegetation susceptible to wildfire at multiple time lags. By dampening subsequent burn severity, native insects could buffer rather than exacerbate fire regime changes expected due to land use and climate change.”
Most importantly, climate change is allowing more insects to survive the winter, triggering the rash of recent outbreaks (Meigs et al. 2016).
Thinning cannot limit or contain beetle outbreaks – once beetle populations reach widespread epidemic levels, thinning treatments aimed at stopping them do not reduce outbreak susceptibility, as beetles overrun natural forest defenses with or without thinning (Black et al. 2013).
IV. CLOSING REMARKS AND RECOMMENDATIONS
Recent increases in wildfires and insect outbreaks are a result of a changing climate coupled with human activities, including the expansion of homes and roads into the WUI, that will only continue to drive up fire suppression costs.
Policies should be examined that discourage continued growth in the WUI; any new development must include defensible space and construction from non-flammable materials.
The most effective way to protect homes is to create defensible space in the immediate 100 feet of a structure and use of non-flammable materials. Wildland fire policy should fund defensible space, not more logging and thinning miles away from communities.
No amount of logging can stop insect outbreaks or large fires under extreme fire weather. Logging may, in fact, increase the amount of unnatural disturbance by homogenizing landscapes with more even-aged trees and residual slash left on the ground, compounding cumulative impacts to ecosystems.
Thinning of small trees in certain forest types, done while maintaining canopy closure and in combination with prescribed fire, can reduce fire intensity, but treatment efficacy is limited in extreme fire weather and by the small chance that a thinned site will encounter a fire during the narrow window when fuels are lowest.
Balch, J. K., B.A. Bradley, J.T. Abatzoglou et al. 2016. Human-started wildfires expand the fire niche across the United States. PNAS 114: 2946-2951.
Black, S.H., D. Kulakowski, B.R. Noon, and D.A. DellaSala. 2013. Do bark beetle outbreaks increase wildfire risks in the Central U.S. Rocky Mountains: Implications from Recent Research. Natural Areas Journal 33:59-65.
Bond, M.L., D.E. Lee, C.M. Bradley, and C.T. Hanson. 2009. Influence of pre-fire tree mortality on fire severity in conifer forests of the San Bernardino Mountains, California. The Open Forest Science Journal 2:41-47.
Bradley, C.M., C.T. Hanson, and D.A. DellaSala. 2016. Does increased forest protection correspond to higher fire severity in frequent-fire forests of the western United States? Ecosphere 7:1-13.
Brown, R.T., J.K. Agee, and J.F. Franklin. 2004. Forest restoration and fire: principles in the context of place. Conservation Biology 18:903-912.
Cohen, J.D. 2000. Preventing disaster: home ignitability in the wildland-urban interface. Journal of Forestry 98: 15-21.
DellaSala, D.A., and C.T. Hanson. 2015. The ecological importance of mixed-severity fires: nature’s phoenix. Elsevier: Boston, MA.
Dennison, P., S. Brewer, J. Arnold, and M. Moritz. 2014. Large wildfire trends in the western United States, 1984-2011. Geophysical Research Letters 41:2928-2933.
Donato, D.C., B.J. Harvey, W.H. Romme, M. Simard, and M.G. Turner. 2013. Bark beetle effects on fuel profiles across a range of stand structures in Douglas-fir forests of Greater Yellowstone. Ecological Applications 23:3-20.
Egan, T. 2010. The Big Burn. Houghton Mifflin Harcourt: Boston.
Hart, S.J., T.T. Veblen, N. Mietkiewicz, and D. Kulakowski. 2015. Negative feedbacks on bark beetle outbreaks: widespread and severe spruce beetle infestation restricts subsequent infestation. PLoS ONE DOI:10.1371/journal.pone.0127975
Kalies, E.L., and L.L. Yocom Kent. 2016. Tamm Review: Are fuel treatments effective at achieving ecological and social objectives? A systematic review. Forest Ecology and Management 375:84-95.
Lindenmayer, D.B., P.J. Burton, and J.F. Franklin. 2008. Salvage logging and its ecological consequences. Island Press: Washington, D.C.
Littell, J.S., D. McKenzie, D.L. Peterson, and A.L. Westerling. 2009. Climate and wildfire area burned in western U.S. ecoprovinces, 1916-2003. Ecological Applications 19:1003-1021.
Meigs, G.W., H.S.J. Zald, J. L. Campbell, W.S. Keeton, and R.E. Kennedy. 2016. Do insect outbreaks reduce the severity of subsequent forest fires? Environmental Research Letters 11 doi:10.1088/1748-9326/11/4/045008.
Moritz, M.A., E. Batllori, R.A. Bradstock, A.M. Gill, J. Handmer, P.F. Hessburg, J. Leonard, S. McCaffrey, D.C. Odion, T. Schoennagel, and A.D. Syphard. 2014. Learning to coexist with wildfire. Nature 515: 58-66.
Parks, S.A., C. Miller, M.A. Parisien, L.M. Holsinger et al. 2012. Wildland fire deficit and surplus in the western United States, 1984-2012.
Powell, D.S., J.L. Faulkner, D.R. Darr, et al. 1994. Forest resources of the United States, 1992. USDA Forest Service General Technical Report RM-234 (revised).
Rasker, R. 2015. Resolving the increasing risk from wildfires in the American West. www.thesolutionsjournal.org, March-April 2015, p. 55-62.
Rhodes, J.J., and W.L. Baker. 2008. Fire probability, fuel treatment effectiveness and ecological tradeoffs in western U.S. public forests. The Open Forest Science Journal 1: 1-7.
Schoennagel, T., J.K. Balch, H. Brenkert-Smith, P.E. Dennison, et al. 2017. Adapt to more wildfire in western North American forests as climate changes. PNAS
Stephens, S.L., M.P. North, and B.M. Collins. 2015. Large wildfires in forests: what can be done? ActionBioscience, April 2015.
Syphard, A. D., A. Bar Massada, V. Butsic, and J. E. Keeley. 2013. Land use planning and wildfire: development policies influence future probability of housing loss. PLoS ONE 8(8):e71708
Union of Concerned Scientists (UCS). 2017. Western wildfires and climate change. http://www.ucsusa.org/…/infographic-wildfires-climate-chang…
Westerling, A.L., H.G. Hidalgo, D.R. Cayan, and T.W. Swetnam. 2006. Warming and earlier spring increase western U.S. forest wildfire activity. Science 313:940-943.
Zimmerman, G., and L. Livesay. 2017. Fire lines: comparing wildfire risk on state and U.S. public lands. Center for Western Priorities.
By Institute of Physics
Sea-levels are rising 60 per cent faster than the Intergovernmental Panel on Climate Change’s (IPCC) central projections, new research suggests.
While temperature rises appear to be consistent with the projections made in the IPCC’s fourth assessment report (AR4), satellite measurements show that sea-levels are actually rising at a rate of 3.2 mm a year compared to the best estimate of 2 mm a year in the report.
These findings, which have been published today, 28 November, in IOP Publishing’s journal Environmental Research Letters, are timely as delegates from 190 countries descend on Doha, Qatar, for the United Nations’ 18th Climate Change Conference this week.
The researchers, from the Potsdam Institute for Climate Impact Research, Tempo Analytics and Laboratoire d’Etudes en Géophysique et Océanographie Spatiales, state that the findings are important for keeping track of how well past projections match the accumulating observational data, especially as projections made by the IPCC are increasingly being used in decision making.
The study involved an analysis of global temperatures and sea-level data over the past two decades, comparing them both to projections made in the IPCC’s third and fourth assessment reports.
Results were obtained by taking averages from the five available global land and ocean temperature series.
After removing the three known phenomena that cause short-term variability in global temperatures – solar variations, volcanic aerosols and El Niño/Southern Oscillation – the researchers found that the overall warming trend at the moment is 0.16°C per decade, which closely follows the IPCC’s projections.
Satellite measurements of sea-levels showed a different picture, however, with current rates of increase being 60 per cent faster than the IPCC’s AR4 projections.
Satellites measure sea-level rise by bouncing radar waves back off the sea surface and are much more accurate than tide gauges as they have near-global coverage; tide gauges only sample along the coast. Tide gauges also include variability that has nothing to do with changes in global sea level, but rather with how the water moves around in the oceans, such as under the influence of wind.
The study also shows that it is very unlikely that the increased rate is down to internal variability in our climate system and also shows that non-climatic components of sea-level rise, such as water storage in reservoirs and groundwater extraction, do not have an effect on the comparisons made.
Lead author of the study, Stefan Rahmstorf, said: “This study shows once again that the IPCC is far from alarmist, but in fact has under-estimated the problem of climate change. That applies not just for sea-level rise, but also to extreme events and the Arctic sea-ice loss.”
By Diane Swanbrow / University of Michigan
To slow down global warming, we’ll either have to put the brakes on economic growth or transform the way the world’s economies work.
That’s the implication of an innovative University of Michigan study examining the evolution of atmospheric CO₂, the most likely cause of global warming.
The study, conducted by José Tapia Granados and Edward Ionides of U-M and Óscar Carpintero of the University of Valladolid in Spain, was published online in the peer-reviewed journal Environmental Science and Policy. It is the first analysis to use measurable levels of atmospheric carbon dioxide to assess fluctuations in the gas, rather than estimates of CO₂ emissions, which are less accurate.
“If ‘business as usual’ conditions continue, economic contractions the size of the Great Recession or even bigger will be needed to reduce atmospheric levels of CO₂,” said Tapia Granados, who is a researcher at the U-M Institute for Social Research.
For the study, the researchers assessed the impact of four factors on short-run, year-to-year changes in atmospheric concentrations of CO₂, widely considered the most important greenhouse gas. Those factors included two natural phenomena believed to affect CO₂ levels—volcanic eruptions and the El Niño/Southern Oscillation—and also world population and the world economy, as measured by worldwide gross domestic product.
Tapia Granados and colleagues found no observable relation between short-term growth of world population and CO₂ concentrations, and they show that recent incidents of volcanic activity coincided with global recessions, which brings into question the reductions in atmospheric CO₂ previously ascribed to these volcanic eruptions.
In years of above-trend world GDP, from 1958 to 2010, the researchers found greater increases in CO₂ concentrations. For each trillion in U.S. dollars that the world GDP deviates from trend, CO₂ levels deviate from trend about half a part per million, they found. Concentrations of CO₂ were estimated to be 200-300 ppm during preindustrial times. They are presently close to 400 ppm, and levels around 300 ppm are considered safe to keep a stable climate.
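The study's rule of thumb is easy to apply. A minimal Python sketch, using the 0.5-ppm-per-trillion-dollar figure from the study; the scenario values are made up for illustration:

```python
# Rule of thumb from the study: each $1 trillion that world GDP runs above
# (or below) its trend shifts atmospheric CO2 about 0.5 ppm above (or below)
# its own trend.
PPM_PER_TRILLION_USD = 0.5

def co2_trend_deviation_ppm(gdp_deviation_trillions):
    """CO2 deviation from trend (ppm) for a given world-GDP deviation ($T)."""
    return PPM_PER_TRILLION_USD * gdp_deviation_trillions

# A boom year running $2T above trend adds about 1 ppm above trend;
# a recession running $1.5T below trend subtracts about 0.75 ppm.
print(co2_trend_deviation_ppm(2.0))    # 1.0
print(co2_trend_deviation_ppm(-1.5))   # -0.75
```

The linearity here is the study's short-run statistical estimate, not a physical law, which is why deviations of either sign scale the same way.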
To break the economic habits contributing to a rise in atmospheric CO₂ levels and global warming, Tapia Granados says that societies around the world would need to make enormous changes.
“Since the 1980s, scientists like James Hansen have been warning us about the effects global warming will have on the earth,” Tapia Granados said. “One solution that has promise is a carbon tax levied on any activity producing CO₂ in order to create incentives to reduce emissions. The money would be returned to the population on a per capita basis so the tax would not mean any extra fiscal burden.”
From the University of Michigan News Service: http://ns.umich.edu/new/releases/20369-global-warming-new-research-emphasizes-the-role-of-global-economic-growth
Editor’s Note: In this essay, Carl (one of our editors) describes the process of ocean acidification, and how it relates with other ecological crises.
First we need to know what an acid is. An acid is any substance whose molecules or ions can donate a hydrogen ion, or proton (H+), to another substance in aqueous solution. The opposite of an acid is a base: a substance whose molecules or ions can accept a hydrogen ion from an acid. Acidic substances are usually identified by their sour taste, while bases taste bitter. The quantitative measure of how acidic or basic a substance is is its “potential of hydrogen” (pH), or “power of hydrogen.” This is expressed on a logarithmic scale from 0 to 14 that inversely indicates the activity of hydrogen ions in solution: the more hydrogen ions present, the lower the pH (toward 0, more acidic); the fewer hydrogen ions present, the higher the pH (toward 14, more basic). In other words, pH values are inverse to the number of hydrogen ions present. As the concentration of hydrogen ions increases, the pH decreases (acidic); as the concentration decreases, the pH increases (basic). A value of 7 is neutral, which is where pure distilled water falls on the scale. Acidification, then, is an increase in hydrogen ions.
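The inverse, logarithmic relationship described above is usually written pH = −log10[H+]. A minimal Python sketch (concentrations in mol/L; the example values are illustrative):

```python
import math

def ph_from_h(h_ion_molar):
    """pH is the negative base-10 logarithm of the hydrogen-ion
    concentration (strictly, activity) in mol/L."""
    return -math.log10(h_ion_molar)

# Pure water at 25 C has [H+] = 1e-7 mol/L, i.e. pH 7 (neutral).
print(round(ph_from_h(1e-7), 2))   # 7.0
# Ten times more hydrogen ions lowers the pH by exactly one unit (more acidic).
print(round(ph_from_h(1e-6), 2))   # 6.0
```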
Basic (or alkaline) properties can be associated with the presence of hydroxide ions (OH−) in aqueous solution, and the neutralization of acids (H+) by bases can be explained in terms of the reaction of these two ions to give the neutral molecule water (H+ + OH− → H2O).
Small Drop in pH Means Big Change in Acidity
For millions of years the average pH of the ocean held at around 8.2, on the basic side of the scale. But since industrial development that number has dropped to slightly below 8.1 – not acidic, but moving in that direction. While this may not seem like a lot, remember that the scale is logarithmic and measures the amount of hydrogen ions present. A change of 1 pH unit is equivalent to a tenfold change in the concentration of H+ ions. The drop of about 0.11 units therefore represents roughly a 30% increase in H+ ions over the relatively stable preindustrial state. Ocean acidification is this increase in dissolved hydrogen ions (H+) in the water.
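The roughly 30% figure follows directly from the logarithmic scale and can be checked in a couple of lines of Python (the 0.11-unit drop is the one cited above):

```python
drop = 0.11                       # pH fell from ~8.2 to just below 8.1
factor = 10 ** drop               # H+ concentration scales by 10^(pH drop)
increase_pct = (factor - 1) * 100
print(round(increase_pct))        # 29, i.e. roughly the 30% cited above
```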
What is causing this decrease in pH?
Oceans absorb carbon dioxide (CO2) from the atmosphere through wave action. Before industrialization there was a balance between the CO2 entering the water and the CO2 leaving it, and the pH was stable in this narrow range. Life in the oceans has evolved to survive in that balanced condition. Industrialization, through the burning of fossil fuels, has released increasing amounts of CO2 into the atmosphere, causing the oceans to absorb more CO2. So here is where the chemistry comes into play. As CO2 dissolves in water (H2O), the two combine to form carbonic acid (H2CO3).
CO2 + H2O ⇌ H2CO3
This dissociates readily into bicarbonate (hydrogen carbonate) ions (HCO3−) and H+ ions.
H2CO3 ⇌ HCO3− + H+
Hydrogen ions break off the carbonic acid. So more CO2 means more H+ ions, which means increased acidity.
And this is where the problem is. Shells are formed primarily of calcium carbonate (CaCO3). But carbonate (CO3) binds more readily with H+ (forming bicarbonate, HCO3−) than with calcium (Ca). This removes carbonate that would otherwise have bonded with calcium for shell production. Calcium is relatively constant, so it is the concentration of carbonate that determines the formation of calcium carbonate. Less available carbonate makes it more difficult for corals, mollusks, echinoderms, calcareous algae, and other shelled organisms to form calcium carbonate (CaCO3), their major mineral building block. And when carbonate concentrations fall too low, already-formed CaCO3 starts to dissolve. So marine organisms have a harder time making new shells and maintaining the ones they already have, causing decreased calcification. (For comparison, normal human body pH averages 7.4, which is one of the main reasons the pH in swimming pools is maintained around 7.5.)
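The shift from carbonate toward bicarbonate can be illustrated with the second dissociation equilibrium of carbonic acid. This is a sketch only, assuming the textbook freshwater constant pKa2 of about 10.33 at 25 C; the effective value in seawater is lower, around 9, but the relative change works out the same:

```python
# [CO3]/[HCO3] = Ka2/[H+] = 10**(pH - pKa2): lowering the pH shifts the
# balance away from carbonate and toward bicarbonate.
PKA2 = 10.33   # assumed freshwater second dissociation constant, 25 C

def carbonate_to_bicarbonate_ratio(ph):
    """Equilibrium ratio of carbonate to bicarbonate ions at a given pH."""
    return 10 ** (ph - PKA2)

pre = carbonate_to_bicarbonate_ratio(8.20)    # preindustrial average ocean pH
post = carbonate_to_bicarbonate_ratio(8.09)   # roughly present-day pH
loss_pct = (1 - post / pre) * 100
print(round(loss_pct))   # ~22% less carbonate relative to bicarbonate
```

Note the ratio depends only on the pH change, not on the assumed pKa2, which is why the same 0.11-unit drop gives the same relative carbonate loss in fresh water or seawater.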
The acid-base balance of the oceans has been critical in maintaining the Earth’s habitability and allowing the emergence of early life.
“Scientists have long known that tiny marine organisms – phytoplankton (microscopic aquatic plants) – are central to cooling the world by emitting an organic compound known as dimethylsulphide (DMS). Researchers study how acidification affects phytoplankton in the laboratory by lowering the pH (i.e. acidifying) of plankton-filled water tanks and measuring DMS emissions. When they set ocean acidification to the levels expected by 2100 (under a moderate greenhouse gas scenario), they found that cooling DMS emissions fell.”
Given the importance of plankton, the fact that they are the life-support system for the planet, and that humanity cannot survive without them, the resulting effects would be disastrous. These organisms produce 50% of the world’s oxygen (every other breath animals take) and form the basis of the food web. Covering more than 70 percent of the Earth’s surface, the oceans, the planet’s lungs, are in peril.
“Over the past 200 years, the oceans have absorbed approximately half of the carbon dioxide (CO2) emitted by human activities, providing long-term carbon storage. Without this sink, the greenhouse gas concentration in the atmosphere would be much higher, and the planet much warmer.”
But absorbing the CO2 causes changes in ocean chemistry, namely lowering pH and decreasing carbonate (CO3) concentrations.
On a human time scale these changes have been slow and steady relative to that baseline. But on a geological time scale this change is more rapid than any change documented over the last 300 million years. So organisms that have evolved tolerance to a certain range of conditions may encounter increasingly stressful or even lethal conditions in the coming decades.
We know this through the analysis of ice cores, which offer scientists the best source of historical climate data, and of deep-sea sediment cores from the ocean floor, which are used to detail the Earth’s history.
Estimates of future carbon dioxide levels, based on business-as-usual emission scenarios, indicate that by the end of this century the surface waters of the ocean could have a pH around 7.8. The last time ocean pH was that low was during the middle Miocene, 14-17 million years ago, when the Earth was several degrees warmer and a major extinction event was occurring. Animals take millions of years to evolve; without an adequate timeframe to adapt to changes in habitat, they go extinct. Ocean acidification is currently affecting the entire ocean, including coastal estuaries and waterways. Billions of people worldwide rely on food from the ocean as their primary source of protein, and many jobs and economies in the U.S. and around the world depend on the fish and shellfish that live in the ocean.
By absorbing increased carbon dioxide from the atmosphere, the ocean reduces the warming those emissions would have caused had they remained there. Shockingly, only about 1 percent of the excess heat trapped by greenhouse gases has ended up in the atmosphere; nearly 90 percent of it has gone into the ocean. There it is setting ocean heat records year after year and driving increasingly severe marine heat waves. As ocean temperatures have risen, the ocean’s ability to absorb CO2 has decreased: colder water dissolves more CO2, drawing more of it out of the atmosphere. Yet we have steadily increased carbon emissions. About thirty percent of current emissions end up sequestered in the oceans.
It is unknown if this uptake can be sustained. What might happen to the Earth’s atmosphere if the ocean is unable to absorb continued increased carbon dioxide?
“If the seas are warmer than usual, you can expect higher air temperatures too,” says Tim Lenton, professor of climate change at Exeter University. Most of the extra heat trapped by the build-up of greenhouse gases has gone into warming the surface ocean, he explains. That extra heat tends to get mixed downwards toward the deeper ocean, but movements in ocean currents – like El Niño – can bring it back to the surface.
Surface waters favor the formation of carbonate minerals; in deeper waters those minerals dissolve.
We have entered a new epoch, the Pyrocene.
So it is obvious that industrializing the oceans with offshore wind farms and deep-sea mining, what capitalism calls the Blue Economy, will continue the acidification. But the ramifications go further, because industrialization directly affects the species that live there and the habitats from which “raw” materials are extracted.
Regions of the ocean where plankton communities utilize organic matter more efficiently, such as the deep sea, are places where the ocean has a naturally lower capacity to absorb some of the carbon dioxide produced by humans. “So understanding how zooplankton (small aquatic animals) communities process carbon, which, to them, represents food and energy, helps us to understand the role of the ocean in absorbing carbon dioxide from the atmosphere,” says Conner Shea, a doctoral student in the UH Mānoa School of Ocean and Earth Science and Technology (SOEST) Department of Oceanography.
We are headed for a Blue Ocean Event by 2030: for the first time since ancient humans began roaming the Earth several million years ago, an ice-free Arctic Ocean in summer. Open water, instead of ice, will absorb the sun’s heat rather than reflecting it back, increasing sea temperatures and disrupting the jet stream. This is basically what solar panels and wind turbines do: they make the earth hotter. Wind turbines extract the cooling breezes for their energy, the opposite of a fan, and miles and miles of solar panels destroy habitat and absorb heat.
Continued industrialization will have devastating effects: threats to food supplies, loss of coastal protection, diminished biodiversity, and disruption of the carbon cycle, all arising from these chemical reactions. This story involves a fundamental change within the largest living space on the planet, and these changes are happening fast, right now.
The oceans will find a new balance hundreds of thousands of years from now but between now and then marine organisms and environments will suffer.
What causes climate change?
The earth’s temperature cycles, glacial and interglacial, are primarily driven by periodic changes in the Earth’s orbit, three distinct orbital cycles called Milankovitch cycles. The Serbian scientist Milutin Milanković calculated that Ice Ages occur approximately every 41,000 years, and subsequent research confirms that they did occur at 41,000-year intervals between one and three million years ago. But about 800,000 years ago, the cycle of Ice Ages lengthened to 100,000 years, matching the cycle of Earth’s orbital eccentricity, its deviation from a circular orbit. Various theories have been proposed to explain this transition, but scientists do not yet have a clear answer. Historically, then, CO2 has not initiated climate change: its concentration in the atmosphere rose during warmer periods and fell during colder ones, and feedback loops amplified changes initiated by orbital variations. But it is now humans who are increasing the amount of CO2 in the atmosphere by burning fossil fuels.
Strictly from an anthropocentric point of view, humanity could adapt to global warming and extreme weather changes. It will not survive the extinction of most marine plants and animals. The destruction of nature is more dangerous than climate change, and it is sad that in the effort to save the climate while continuing business as usual, we are destroying the environment. All of life came from the sea; it would be unwise to harm the birthplace of all species.
Photo by Ant Rozetsky on Unsplash
Editor’s Note: Scientists have been known to make conservative predictions when it comes to ecological crises, which is one reason many predictions come true long before the expected timeline. The following article looks at some recent events to argue that climate collapse has already begun.
By José Seoane/Globetrotter
In 2023, a series of climatic anomalies set new historical records in the tragic progression of climate change at the global level.
Thus, in June, the surface temperature of the North Atlantic reached a record 1.3 degrees Celsius above preindustrial values, and the average temperature of the seas at the global level increased in a similar direction, though by smaller amounts. Meanwhile, the retreat of Antarctic sea ice set a new record, matching the historic low of 2016 but several months earlier, in the middle of the cold season.
The combination of these records has led scientists who follow these processes to warn of the danger of a profound change in the currents that regulate temperature and life in the oceans and globally. The heat waves recorded on the coasts of a large part of the world—in Ireland, Mexico, Ecuador, Japan, Mauritania, and Iceland—may, in turn, be proof of this.
These phenomena, of course, are not limited to the seas. On Thursday, July 6, the global air temperature (measured at two meters above the ground) reached 17.23 degrees Celsius, unprecedented in recent centuries and 1.68 degrees Celsius above preindustrial values; the preceding June was already the warmest on record. Meanwhile, temperatures on the continents, particularly in the North, also broke records: 40 degrees Celsius in Siberia, 50 degrees Celsius in Mexico, and the warmest June in England since the historical series began in 1884.
Heat’s counterpart is drought, such as the one plaguing Uruguay, where a shortage of fresh water since May has forced increasing use of brackish water sources, making tap water undrinkable for the inhabitants of the Montevideo metropolitan area, home to 60 percent of the country’s population. If the drought continues, it could leave this region without drinking water, making Montevideo the first city in the world to suffer such a catastrophe.
But the stifling heat and the droughts also bring with them voracious fires, such as the boreal forest fire that has been raging across Canada for weeks, with more than 500 outbreaks scattered in different regions of the country, many of them uncontrollable, and the widespread images of an apocalyptic New York darkened and stained red under a blanket of ashes.
This accumulation of tragic evidence, against all the denialist narratives, makes it undeniable that the climate crisis is already here, among us. It also indicates the absolute failure of the policies and initiatives adopted to reduce the emission or presence of greenhouse gases in the atmosphere. In May 2023, the levels of carbon dioxide (CO2) measured at NOAA’s global reference observatory in Hawaii reached an all-time high of 424 parts per million (ppm), more than 50 percent above preindustrial levels; the January–May 2023 average was 0.3 percent higher than the same period of 2022 and 1.6 percent higher than in 2019. According to the latest report of the United Nations Intergovernmental Panel on Climate Change (IPCC), the global surface temperature has risen faster since 1970 than in any other 50-year period for at least the last 2,000 years, the very period in which international agreements and national initiatives to combat the causes of climate change were deployed. The failure of these policies is also reflected, in our present, in the persistence and strength of fossil capitalism and its plundering and socio-environmental destruction.
Not only have these so-called mitigation policies failed; the so-called adaptation policies, aimed at minimizing the foreseeable impacts of climate change, are also weak or even absent.
In the same vein, the annual report of the World Meteorological Organization (WMO, Global Annual to Decadal Climate Update) released in May 2023 warned that it is likely (66 percent probability) that the annual average global temperature will exceed 1.5 degrees Celsius above preindustrial levels in at least one of the next five years (2023–2027); that it is possible (32 percent probability) that the five-year average itself will exceed 1.5 degrees Celsius; and that it is almost certain (98 percent probability) that at least one of the next five years, as well as the five-year period as a whole, will be the warmest on record. The IPCC has estimated serious consequences if this threshold is exceeded permanently.
How close to this point will the arrival of the El Niño phenomenon bring us this year and possibly in the coming years? El Niño is a climatic event expressed in the warming of the eastern equatorial Pacific Ocean, manifesting in cycles of three to eight years. It was observed as early as the 19th century; in 1924 climatologist Gilbert Walker coined the term “Southern Oscillation” to identify it, and in 1969 meteorologist Jacob Bjerknes suggested that this unusual warming in the eastern Pacific could unbalance the trade winds and push warm waters eastward, toward the intertropical coasts of South America.
But this is no longer simply a traditional meteorological phenomenon recurring at irregular intervals. It cannot be treated as a purely natural phenomenon, however many attempts are made, time and again, to render invisible or deny its social causes. On the contrary, in recent decades the dynamics of the climate crisis have increased both its frequency and its intensity. In early 2023 the third consecutive La Niña episode concluded, only the third time since 1950 that it has extended over three years, and with increasing intensity. Likewise, in 2016 El Niño drove the planet to its then-record average temperature. Different scientists now estimate that such a Super El Niño may repeat, with unknown consequences given current greenhouse gas levels and the dynamics of the climate crisis.
The banners of change inspired by social and climate justice, and the effective paths of socio-ecological transition raised by popular movements, are more imperative and urgent today than ever. It is possible to propose an emergency popular plan of mitigation and adaptation. But to make these alternatives socially audible, and to break with the ecological blindness that seeks to impose itself, it is first necessary to break the epistemological construction that inscribes these catastrophes, repeatedly and persistently, in a world of supposedly pure nature, a presumably external field, alien to and outside human social control.
This is a matrix of naturalization that, while absolving social groups and the mode of socioeconomic organization of any responsibility for the current crises, turns those crises into unpredictable and unknowable events that leave only the options of resignation, religious alienation, or individual resilience. The questioning of these views is inscribed not only in discourses but also in practices and emotions, in responding to catastrophe with the (re)construction of bonds and values of affectivity, collectivity, and solidarity, indispensable supports for emancipatory change.
Photo by NASA on Unsplash