Editor’s note: The International Day for Biodiversity, observed on May 22, commemorates the adoption of the Convention on Biological Diversity, a global treaty. What lessons have we learned from undoing past harms and conserving biodiversity for our planet’s future?
Global efforts to restore forests are gathering pace, driven by promises of combating climate change, conserving biodiversity and improving livelihoods. Yet a recent review published in Nature Reviews Biodiversity warns that the biodiversity gains from these initiatives are often overstated — and sometimes absent altogether.
Restoration has typically prioritized utilitarian goals such as timber production, carbon sequestration or erosion control. This bias is reflected in the widespread use of monoculture plantations or low-diversity agroforests. Nearly half the forest commitments in the Bonn Challenge to restore degraded and deforested landscapes consist of commercial plantations of exotic species, a trend that risks undermining biodiversity rather than enhancing it.
Scientific evidence shows that restoring biodiversity requires more than planting trees. Methods like natural regeneration — allowing forests to recover on their own — can often yield superior biodiversity outcomes, though they face social and economic barriers. By contrast, planting a few fast-growing species may sequester carbon quickly but offers little for threatened plants and animals.
Biodiversity recovery is influenced by many factors: the intensity of prior land use, the surrounding landscape and the species chosen for restoration. Recovery is slow, often measured in decades, and tends to lag for rare and specialist species. Alarmingly, most projects stop monitoring after just a few years, long before ecosystems stabilize.
Scientists underline that while proforestation, reforestation and forest rewilding can contribute to curbing climate change and biodiversity loss, they have their limits and must be combined with deep carbon emissions cuts and conservation of existing forests and wilderness.
Edward Faison, an ecologist at the Highstead Foundation, stood quietly in a patch of forest that stretched for miles in all directions. Above him swayed the needles of white pines, a species common in the Adirondack Forest Preserve in northern New York state. He stepped past downed wood and big, broken snags, observing how the forest functioned with minimal interference.
“These forests have been essentially unmanaged for over 125 years. To see them continue to thrive and accumulate carbon, recover from natural disturbances and develop complexity without our help reveals just how resilient these systems are,” Faison says.
Protected from logging in 1894 by an act of the New York Legislature, the Adirondack Forest Preserve (AFP) is a model of natural forest growth, or letting forests simply “get on with it.” The largest trees, white pines (Pinus strobus), are more than a century old, stand more than 150 feet tall and measure 4-5 feet in diameter.
The AFP, the largest wilderness preserve in the eastern United States, is a prime example of what researchers have come to call “proforestation.” Coined in 2019 by Tufts University professor William Moomaw and Trinity College professor of applied science Susan Masino, the term proforestation describes the process of allowing existing forests to continue growing without human interference until they achieve their full ecological potential for carbon sequestration and biological diversity.
Proforestation is considered a natural climate solution, i.e., a strategy to steward the Earth’s vegetation to increase the removal of carbon dioxide (CO2) from the atmosphere. According to Faison, a forest naturally develops greater complexity over time, with a diversity of tree sizes and heights as well as large standing dead trees and downed logs. This complexity provides habitat for various animals, plants and fungi, which make the forest more resilient to disturbances associated with climate change.
Proforestation is distinct from reforestation, which can involve planting new trees in deforested areas to restore them (or allowing deforested areas to naturally regenerate). It is also different from afforestation, which is the process of planting new forests in previously unforested areas. Proforestation’s merit lies in inaction: simply leaving old forests undisturbed, allowing for continuous growth to maximize carbon accumulation over time. As forests mature and trees grow larger, they sequester greater amounts of carbon.
“The largest 1% diameter trees in a mature multiage forest hold half the carbon,” according to Moomaw. “It’s the existing forests that we have that are doing the work.” Existing forests remove almost 30% of the CO2 that humans put into the atmosphere every year by burning fossil fuels.
Older is better
In Mohawk Trail State Forest in Massachusetts, Moomaw studied the tallest grove of white pine trees in New England, aged between 150 and 200 years, observing how the trees grew. When comparing them with younger trees of the same type growing under similar conditions, he found that “the amount of carbon added by these trees between 100 and 150 years of age is greater than the amount added between zero and 50.”
In addition to carbon storage capabilities, old forests are pivotal in controlling regional and global water cycles through a process called evapotranspiration, by which water is transferred from the land to the atmosphere. Due to deeper and more complex root systems as well as larger canopies and leaves, old forests capture more water and release it as vapor into the atmosphere.
“Old forests have the genetic competence to do this work,” Masino says. “It’s not done by meadows. It’s not done by grassy areas. It’s not done as effectively by forests that have been cut or planted. It’s these ancient systems that have the complexity to bring water to themselves. And in doing that, they’re bringing it to the rest of the landscape. Once you start cutting the landscape, you’re drying it out.”
Masino, who also has a joint appointment in neuroscience and psychology at Trinity College, emphasizes the importance of designating natural areas appropriately and allowing more room for proforestation.
“It’s urgent to decide where we intend to prioritize natural processes, where we are doing research, and what areas we are dedicating for our resource needs,” she says. “Nature needs room to breathe. We can’t leave everything open to manipulation and extraction. It’s deadly.”
She says that planting trees on streets, on campuses or in parks is good for temperature regulation, flood protection and creating habitat, but these trees don’t grow up in a web of life. Planting trees in a forest, too, can risk disrupting the dynamic complexity of evolved and evolving genetic knowledge.
Wildlife dependent on old growth
Over on the West Coast, University of Oregon professor emeritus Beverly Law has studied forests for decades. She recalls waiting at an intersection on her bike in the late 1980s, on her way to work at the university, as three logging trucks passed in procession, each with a giant log from a single old tree strapped to the back, a frequent occurrence at the time.
“There are plant and animal species that rely on these old forests for their survival. You take away the forest, and they’re gone,” Law says. “It’s important to have diverse genetics in the forest. Some of them will be more genetically able to withstand climate change than others. You don’t know which ones they will be. That is why genetic diversity within species is important.”
Mature forests are crucial to the survival of certain critically endangered animals that rely on the connected canopies or the soil-rich forest floor. Preserving the biodiversity of the Pacific Northwest, which hosts forests more than a thousand years old, is especially urgent. According to a 2022 paper published in Environmental Chemistry Letters, old-growth forests retain a number of species from both the top and bottom of the food chain, such as the Olympic salamander (Rhyacotriton olympicus), the Del Norte salamander (Plethodon elongatus) and the two species of tailed frog (Ascaphidae). Losing them forever could kick off a cascade effect and result in severe consequences for the environment.
The spotted owl (Strix occidentalis), too, depends on old-growth forests in the Pacific Northwest, requiring the specific environment for roosting and nesting, and remains a central figure in forest management debates.
Such hulking ancient trees are the eyes of the woods, having stood through changing years and the changing climate.
“Ten to 12% of old-growth forests are left [in the US], and it’s insane that people are still trying to cut them down,” Law says. “They are the only survivors of American handiwork. Is it man’s dominion over the forest? We should have reverence, considering they’re all that’s left.”
Banner image: Pine cone of a white pine (Pinus strobus). Image by Denis Lifanov via Flickr (CC BY-NC-SA 2.0).
In 1915, General Electric released a silent promotional film titled The Home Electrical, offering a glimpse into a gleaming, frictionless future. The film walks viewers through a model electric home: lights flicked on at the wall, meals cooked without fire, laundry cleaned without soap and muscle. A young wife smiles as she moves effortlessly through her day, assisted by gadgets that promised to eliminate drudgery and dirt. This was not a documentary—it was a vision, a fantasy, a sales pitch. At the time, only a small fraction of American households had electricity at all, and nearly 90% of rural families still relied on oil lamps, wood stoves, hand pumps, and washboards. But the message was clear: to be modern was to be electric—and anything less was a kind of failure.
At the dawn of the 20th century, electricity was still a symbol of wealth, not a tool of survival. Most urban households that had it used it only for lighting; refrigeration, electric stoves, or washing machines were luxuries among luxuries. In rural America, most farms and small towns remained off-grid through the 1920s. The electric grid simply didn’t go there. Private utilities, driven by profit, had no interest in building costly infrastructure where it wouldn’t quickly pay off.
And yet, propaganda told a different story. In magazines, World’s Fairs, and promotional pamphlets, electricity was shown as the cornerstone of health, cleanliness, efficiency, and modern womanhood. Electric appliances promised to save time, reduce labor, and lift families—especially women—into the new century. But this future was just out of reach for most people. A growing divide opened up: between those who lived by the rhythms of sun and fire, and those whose lives were quietly reshaped by the flick of a switch.
To live without electricity meant pumping water by hand, chopping and hauling wood for heat and cooking, cleaning clothes with a washboard, and preserving food with salt, smoke, or ice if you had it. It meant darkness after sundown unless you had oil or candles. These were difficult, time-consuming tasks—but also deeply embedded in older, place-based ways of life. People were less dependent on centralized systems. They mended clothes instead of buying new ones, and their food came from the land, not refrigerated trucks.
Yet the narrative of “progress” didn’t tolerate this complexity. By the 1920s and ‘30s, utilities and appliance manufacturers framed non-electric life as backward, dirty, and even unpatriotic. Their message: to be modern was to be electric.
This vision of electrified modernity wasn’t just implicit; it was relentlessly promoted through the dazzling spectacles of world’s fairs and the persuasive language of print advertising. Electricity was framed not only as a technological advance but as a moral and social imperative—a step toward cleanliness, order, and even national progress. At places like the 1904 St. Louis World’s Fair, entire palaces were built to glorify electricity, their glowing facades and futuristic interiors turning utility into fantasy. Meanwhile, companies like Western Electric and General Electric saturated early 20th-century magazines with ads that equated electric appliances with a better life—especially for women. These messages didn’t merely advertise products; they manufactured desire, anxiety, and aspiration. To remain in the dark was no longer quaint—it was backward.
At the 1904 St. Louis World’s Fair, the Palace of Electricity was more than an exhibit—it was theater. Illuminated by thousands of electric bulbs, the building itself was proof of concept: a monument to the power and promise of electrification. Inside, visitors encountered displays of the latest electric appliances and power systems, all framed as marvels of human ingenuity. Nearby, the Edison Storage Battery Company showcased innovations in energy storage, while massive dynamos hummed behind glass. The fair suggested not just that electricity was useful, but that it was destiny.
Louisiana Purchase Exposition, St. Louis, 1904. The Library of Congress, via Wikimedia Commons.
This theatrical framing of electricity as progress carried into everyday life through print advertisements. A 1910 issue of Popular Electricity magazine illustrated a physician using electric light in surgery, suggesting that even health depended on electrification. In a 1920 ad for the Hughes Electric Range, a beaming housewife is pictured relaxing while dinner “cooks itself,” thanks to the miracle of electricity. Likewise, a Western Electric ad from the same year explained how to build an “electrical housekeeping” system—one that offered freedom from drudgery, but only if the right appliances were purchased.
These messages targeted emotions as much as reason. They played on fears of being left behind, of being an inadequate housewife, of missing out on modernity. Electricity was no longer merely about illumination—it became a symbol of transformation. The more it was portrayed as essential to health, domestic happiness, and national strength, the more it took on the aura of inevitability. A home without electricity was not simply unequipped; it was a failure to progress. Through ads, exhibits, and films, electricity was sold not just as a convenience, but as a moral good.
And so the groundwork was laid—not only for mass electrification, but for the idea that to live well, one must live electrically.
Before the Toaster: Industry Was the First Beneficiary of Electrification
While early 20th-century advertisements showed electricity as a miracle for housewives, the truth is that industry was the first and most powerful customer of the electric age. Long before homes had refrigerators or lightbulbs, factories were wiring up to electric motors, electric lighting, and eventually, entire assembly lines driven by centralized power. Electricity made manufacturing more flexible, more scalable, and less tied to water or steam—especially important in urban areas where land was tight and labor plentiful.
By the 1890s, industries like textiles, metalworking, paper mills, and mining were early adopters of electricity, replacing steam engines with electric motors that could power individual machines more efficiently. Instead of a single massive steam engine turning shafts and belts throughout a factory, electric motors allowed decentralized control and faster adaptation to different tasks. Electric lighting also extended working hours and improved productivity, particularly in winter months.
Electrification offered not just operational efficiency but competitive advantage—and companies knew it. By the 1910s and 1920s, large industrial users began lobbying both utilities and governments for better access to power, lower rates, and more reliable service. Their political and economic influence helped shape early utility regulation and infrastructure investment. Many state utility commissions were lobbied heavily by industrial users, who often negotiated bulk discounts and prioritized service reliability over residential expansion.
This dynamic led to a kind of two-tiered system: electrification for factories was seen as economically essential, while electrification for homes was framed as aspirational—or even optional. In rural areas especially, private utilities refused to extend lines unless they could first serve a profitable industrial customer nearby, like a lumber mill or mine.
Meanwhile, companies that produced electrical equipment—like General Electric, Westinghouse, and Allis-Chalmers—stood to gain enormously. They pushed for industrial electrification through trade shows, engineering conferences, and direct lobbying. Publications like Electrical World and Power magazine ran glowing stories about new industrial applications, highlighting speed, productivity, and cost savings. GE and Westinghouse didn’t just sell light bulbs and home gadgets—they also built turbines, dynamos, and entire systems for industrial-scale customers.
And industry didn’t just demand electricity—industry helped finance it. Many early power plants, particularly in the Midwest and Northeast, were built explicitly to serve one or more large factories, and only later expanded to provide residential service. These plants often operated on a model of “load factor optimization”: power usage by factories during the day and homes at night ensured a steady demand curve, which maximized profits.
By the 1920s, the logic was clear: industry came first, homes came second—but both served the larger vision of an electrified economy. And this industrial-first expansion became one of the justifications for public electrification programs in the 1930s. If electricity had become so essential to national productivity, how could it remain out of reach for most rural Americans?
Niagara Falls Power Plant: Built for Industry
In 1895, the Niagara Falls Power Company, led by industrialist Edward Dean Adams and with technological help from Westinghouse Electric and Nikola Tesla, completed the Adams Power Plant—one of the first large-scale hydroelectric plants in the world.
Eight of the ten 1,875 kW transformers at the Adams Power Plant Transformer House, 1904. Public domain.
This plant didn’t exist to power homes. Its primary purpose was to serve nearby industries: electrochemical, electrometallurgical, and manufacturing firms that required vast amounts of energy. The ability to harness hydropower made Niagara Falls a magnet for energy-intensive factories.
Founded in 1891, the Carborundum Company relocated to Niagara Falls in 1895 to take advantage of the abundant hydroelectric power. It manufactured silicon carbide abrasives, known as “carborundum,” using electric furnaces that operated at high heat. The company was the second to contract with the Niagara Falls Power Company, underscoring the plant’s role in attracting energy-intensive industries.
The promise of abundant cheap power made Niagara Falls the world capital of electro-chemical and electro-metallurgical industries, which included such companies as the Aluminum Company of America (ALCOA), Carborundum (which developed the world’s hardest abrasive as well as graphite), Union Carbide, American Cyanamid, Auto-Lite Battery, and Occidental Petroleum. These were enterprises that depended upon abundant cheap power. At its industrial peak, in 1929, Niagara Falls was the leading manufacturer in the world of products using abrasives, carbon, chlorine, and ferro-alloys.
Among the earliest arrivals was the Pittsburgh Reduction Company (later Alcoa), which, like Carborundum, set up operations at the falls to capitalize on the cheap and plentiful electricity.
Even food companies jumped on the opportunity for abundant electricity. The founder of the Shredded Wheat Company (maker of both Shredded Wheat and Triscuit), Henry Perky, built a large factory directly at Niagara Falls, choosing the site precisely because of its access to cheap, abundant hydroelectric power. When the Triscuit cracker was first produced in 1903, the factory was powered entirely by electricity—a key marketing point. Early ads bragged that Triscuits were “Baked by Electricity,” which was a novel and futuristic idea at the time.
However, this rapid industrial growth came at a significant environmental cost. The freedom afforded to early industry in Niagara Falls meant that area waterways became dumps for chemicals and other toxic substances. By the 1920s, Niagara Falls was home to a dynamic and thriving chemical sector that produced vast amounts of industrial-grade chemicals via hydroelectric power. This included the production of chlorines, degreasers, explosives, pesticides, plastics, and myriad other chemical agents.
The success at Niagara set a precedent: electricity could fuel industrial expansion, and factories began lobbying for access to centralized electric power. States and cities recognized that electrification attracted investment, jobs, and tax revenue. This created political pressure to expand grids and build new generation capacity—not to homes first, but to industrial parks and cities with manufacturing bases.
The environmental impact was profound. In 1986, Canadian researchers discovered that the mist from the falls contained cancer-causing chemicals, leading both the U.S. and Canada to promise cleanup efforts. Moreover, the Love Canal neighborhood in Niagara Falls became infamous as the site of one of the worst environmental disasters involving chemical waste in U.S. history. The area was used as a dumping ground for nearly 22,000 tons of chemical waste, leading to severe health issues for residents and the eventual evacuation of the area.
This historical example underscores the complex legacy of electrification—while it spurred industrial advancement and economic growth, it also led to environmental degradation and public health crises.
The Salesman of the Grid: Samuel Insull and the Corporate Vision of a Public Good
Even as electricity was still being marketed as a lifestyle upgrade—offering clean kitchens, lighted parlors, and “freedom from drudgery”—Samuel Insull was reshaping the electrical industry behind the scenes in ways that would bring electricity to both homes and factories on an unprecedented scale. A former secretary to Thomas Edison, Insull became the president of Chicago Edison (later Commonwealth Edison) and transformed the electric utility into a regional power empire. He championed centralized generation, long-distance transmission, and, most importantly, load diversity: the idea that combining industrial and residential customers would create a steadier, more profitable demand curve.
Industry, after all, consumed massive amounts of electricity during the day, while households peaked in the evenings. By blending these demands, utilities could justify larger power plants that ran closer to capacity around the clock—making electricity cheaper to produce per unit and more profitable to sell.
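The economics come down to a simple ratio utilities call the load factor: average demand divided by peak demand. The sketch below is a minimal, purely illustrative Python example with invented hourly figures (not data from any historical utility); it shows how blending a daytime-heavy industrial profile with an evening-heavy residential one produces a steadier combined curve and a higher load factor.

```python
# Purely illustrative: invented hourly demand (in megawatts) for a factory
# district and a residential neighborhood, showing why serving both raises
# a plant's "load factor" (average demand divided by peak demand).

industrial = [2, 2, 2, 2, 2, 4, 8, 10, 10, 10, 10, 10,
              10, 10, 10, 10, 8, 4, 2, 2, 2, 2, 2, 2]   # daytime-heavy
residential = [1, 1, 1, 1, 1, 1, 2, 3, 2, 1, 1, 1,
               1, 1, 1, 2, 3, 5, 7, 8, 7, 5, 3, 2]      # evening-heavy

def load_factor(demand):
    """Average demand over peak demand: higher means capacity is used more evenly."""
    return (sum(demand) / len(demand)) / max(demand)

combined = [i + r for i, r in zip(industrial, residential)]

print(f"industrial only : {load_factor(industrial):.2f}")   # ~0.57
print(f"residential only: {load_factor(residential):.2f}")  # ~0.32
print(f"combined        : {load_factor(combined):.2f}")     # ~0.63
```

In this toy example, the blended customer base uses the same generating capacity far more evenly across the day, which is the arithmetic behind Insull’s case for ever-larger central stations.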
Insull’s holding companies and financial structures helped finance this expansion, often using consumer payments to support new infrastructure. This helped expand the grid outward—to serve not just wealthy homes and big factories, but small towns and middle-class neighborhoods. Electrification became a virtuous cycle: the more customers (especially industrial ones) you had, the more power you could afford to generate, which brought in more customers. The industrial appetite for power and the domestic aspiration for comfort were two sides of the same system.
By the early 20th century, Insull had consolidated dozens of smaller electric companies into massive holding corporations, effectively inventing the modern utility monopoly. His genius wasn’t technical but financial: he pioneered the use of long-term bonds and ratepayer-backed financing to build expansive infrastructure, including coal-fired power plants and transmission lines that could serve entire cities and suburbs.
Insull also understood that to secure profits, electricity had to become not a luxury, but a public necessity. He lobbied for—and helped shape—state-level utility commissions that regulated rates but guaranteed companies a return on investment. He promoted a pricing model in which larger customers subsidized smaller residential ones, making electricity seem affordable while expanding the customer base. In speeches and newspaper campaigns, Insull insisted that electricity was a public service best delivered by private enterprise—so long as that enterprise was shielded from competition and supported by the state.
But Insull’s vision had limits. His business model was urban, corporate, and capital-intensive. It thrived in cities where growth and profits were assured—but left rural America behind. Even by the late 1920s, nearly 90% of rural households still had no electricity, and private utilities had little interest in changing that. When Insull’s financial empire collapsed during the Great Depression—leaving thousands of investors penniless—it triggered a wave of backlash and set the stage for Roosevelt’s 1930s public electrification programs.
The failure of Insull’s empire didn’t just expose the risks of private monopolies; it also reframed electricity as too essential to be left entirely in corporate hands. If the promise of electrification was to reach beyond city limits, it would take more than advertising. It would take state power.
Electricity as a Public “Good”
Franklin D. Roosevelt’s New Deal ushered in that power—both literally and figuratively. Federal programs like the Tennessee Valley Authority (TVA), the Rural Electrification Administration (REA), and the Works Progress Administration (WPA) tackled electrification as a national mission. The TVA aimed to transform one of the poorest regions in the country through public power and flood control. The REA extended loans to rural cooperatives to build distribution lines where private utilities refused to go. The WPA, though more broadly focused on employment and infrastructure, supported the building of roads, dams, and even electric grids that tied into the new public utilities.
But these were not just engineering projects—they were nation-building efforts, wrapped in the language and imagery of progress. Government-sponsored films, posters, and exhibits cast electrification as a patriotic duty and a moral good. In The TVA at Work (1935), a TVA propaganda film, darkness and floods give way to light as electricity reaches the rural South, promising flood control, education, health, and hope.
Posters issued by the REA featured glowing farmhouses surrounded by darkness, their light a beacon of the federal government’s benevolence. Electrification was no longer a luxury product to be sold—it was a public right to be delivered. And propaganda helped recast the electric switch as not just a convenience, but a symbol of democratic progress.
In the early decades of the 20th century, the business of providing electricity was largely in private hands, dominated by powerful industrialists who operated in a fragmented and often exploitative landscape. Rates varied wildly, service was inconsistent, and rural areas were left behind entirely. Out of this chaos emerged a slow, contested movement to treat electricity not as a luxury good for profit but as a regulated public utility—something closer to a right.
Roosevelt’s electrification programs—especially the TVA and the REA—aimed to provide public benefits rather than private profit. But in reality, most rural Americans didn’t vote on where dams and coal-fired power plants would go, how the landscape would be transformed, or who would manage the power. The decision-making remained highly centralized, and the voice of the people was filtered through federal agencies, engineers, and bureaucrats. If this was democracy, it was a technocratic form—focused on distributing benefits, not sharing power.
Still, for many rural communities, the arrival of electricity felt like democratic inclusion: a recognition by the federal government that their lives mattered too. New Deal propaganda leaned into this feeling. Posters, pamphlets, and films portrayed electrification as a patriotic triumph—uniting the country, modernizing the nation, and bringing light to all Americans, not just the urban elite.
FDR fiercely criticized utility companies for their opposition to these efforts. In one speech, he called out their “selfish purposes,” accusing them of spreading propaganda and corrupting public education to protect their profits. His administration’s Public Utility Holding Company Act of 1935 was designed to break up massive utility holding companies, increase transparency, and limit the abusive practices that had flourished under Insull’s system.
By the end of the 1930s, electricity had changed in the eyes of the law and the public. It was no longer a commodity like soap or phonographs. It was essential—a regulated utility, under public scrutiny, increasingly expected to reach all people regardless of profit margins.
How Rural Communities Organized for Electricity
Reaching everyone required more than federal mandates; it required rural people—many of whom had never flipped a light switch—to believe electricity was not just possible, but necessary. New Deal propaganda didn’t just promote electrification; it made it feel like a patriotic obligation. In posters, films, and traveling exhibits, electricity was depicted as a force of national renewal, radiating from power plants and wires like sunlight over a darkened land. Farmers who had once relied on kerosene lanterns saw glowing visions of electric barns, modern kitchens, and clean, running water. The message was clear: this wasn’t charity—it was justice.
The REA offered low-interest loans to communities willing to organize themselves into cooperatives. But before wires could be strung, people had to organize—drawing maps, knocking on doors, pooling resources. That kind of coordination didn’t happen spontaneously. It was sparked, in large part, by persuasive media.
REA films like Power and the Land (1940) dramatized the transformation of farm life through electricity. Traveling REA agents brought these short films and illustrated pamphlets to town halls, church basements, and grange meetings, showing everyday people that their neighbors were already forming co-ops—and thriving. REA’s Rural Electrification News magazine featured testimonials from farm wives, who praised electric irons, cream separators, and the ability to read after sunset. Electrification wasn’t just about comfort; it was about dignity and opportunity.
A TVA poster from the period shows power lines carrying power to farm fields, homes, and factories. The subtext was unmistakable: electricity was the pulse of a modern democracy. You didn’t wait for it. You organized for it.
And people did. Between 1935 and 1940, rural electrification—driven by this blend of policy and persuasion—expanded rapidly. By 1940, more than 1.5 million rural homes had electricity, up from barely 300,000 just five years earlier. The wires came not just because the government built them, but because people demanded them, formed cooperatives, and rewired their lives around a new kind of infrastructure—one they now believed they deserved.
When FDR created the REA in 1935, fewer than 10% of rural homes had electricity. By 1953, just under two decades after the REA’s launch, over 90% of U.S. farms had electric service, much of it delivered through cooperatives that had become symbols of rural self-determination.
The Federal Power Act
In 1935, the same year Roosevelt signed the executive order establishing the Rural Electrification Administration, Congress passed the Federal Power Act—an often-overlooked but foundational shift in how electricity was governed in the United States. At the time, only about 60% of American homes had electricity, and the vast majority of rural households remained off the grid. Industry was rapidly becoming reliant on continuous, 24/7 electric power to run increasingly complex machinery and production lines, making reliable electricity essential not just for homes but for the nation’s economic engine.
The Act expanded the jurisdiction of the Federal Power Commission, granting it authority to regulate interstate transmission and wholesale sales of electricity. This marked a decisive move away from the era of laissez-faire monopolies toward public oversight. Industry players, eager for dependable and affordable power to sustain growth and competition, played a subtle but important role in pushing for federal regulation that would stabilize the market and ensure widespread, reliable access. The Act framed electricity not as a luxury commodity but as a vital service that required accountability and coordination. In tandem with the New Deal electrification programs, it laid the legal groundwork for treating electricity as a public good—setting the stage for how electricity would be mobilized, mythologized, and mass-produced during wartime.
Electricity as Patriotic Duty
As the nation edged closer to war, the story of electricity changed again. The gleaming kitchens and “eighth wonder of the world” dams of New Deal posters gave way to a new message: power meant patriotism. Electricity was no longer just a household convenience or symbol of rural uplift—it was fuel for victory.
Even before the U.S. formally entered World War II, government and industry launched campaigns urging Americans to think of their energy use as a form of service. Factories were electrified at full tilt to produce planes, tanks, and munitions. Wartime posters and advertisements called on citizens to “Do Your Part”—to conserve power at home so it could be redirected to the front. Lights left on unnecessarily weren’t just wasteful; they were unpatriotic.
One striking 1942 poster from the U.S. Office of War Information featured a light switch with the message: “Switch off that light! Less light—more planes.” Another encouraged energy conservation by asking people to switch lights off promptly because “coal is vital to victory” (at the time, 56% of the total electricity on U.S. grids was generated from coal).
For women, especially, electricity was again positioned as a moral responsibility. Earlier ads had promised electric gadgets to free housewives from drudgery; now, propaganda reminded them that their efficient use of electric appliances was part of the national war strategy. The same infrastructure built by New Deal programs now helped turn the rural power grid into an engine of military supply.
Electricity had become inseparable from national identity and survival. To use it wisely was to serve the country. To waste it was to betray the war effort. This was no longer a story of gadgets and progress—it was a story of sacrifice, duty, and unity under the banner of light.
Nowhere was this message clearer than in the materials produced by the Bonneville Power Administration (BPA), which managed the massive hydroelectric output of the Columbia River dams in the Pacific Northwest. In the early 1940s, the BPA commissioned a series of posters to dramatize the link between public power and wartime production. One of the most iconic, “Bonneville Fights Time,” shows a welder in a protective mask, sparks flying, framed by dynamic lines of electricity and stylized clock hands. The message: electric power enabled faster, more precise welding—crucial for shipbuilding, aircraft, and munitions production.
The poster’s bold composition connected modernist design with national urgency. Bonneville’s electricity wasn’t just flowing to light bulbs—it was flowing to the war factories of the Pacific coast, to the shipyards of Portland and Seattle, and to the aluminum plants that turned hydroelectric power into lightweight warplanes. These images promoted more than technical efficiency; they sold a vision of democratized power mobilized for total war.
Through such propaganda, the promise of public power was reimagined—not just as a civic good, but as a weapon that could help win World War II.
Electrifying the American Dream
When the war ended, the messaging around electricity shifted again—from sacrifice to surplus. Wartime rationing gave way to a marketing explosion, and the same electrified infrastructure that had powered victory was now poised to power prosperity. With factories retooled for peacetime commerce, and veterans returning with GI Bill benefits and dreams of suburban life, the home became the new front line of American identity—and electric gadgets were its weaponry.
The postwar boom fused electricity with consumption, convenience, and class mobility. Advertisements no longer asked families to conserve power for the troops; they encouraged them to buy electric dishwashers, toasters, vacuum cleaners, televisions. Owning a full suite of appliances became a marker of success, a tangible reward for patriotism and patience. Electricity was no longer just a utility—it was the lifeblood of modern living, sold with the same glamour and intensity once reserved for luxury cars or perfumes.
Utilities and manufacturers teamed up to keep the vision alive. The Live Better Electrically campaign, launched in 1956 and endorsed by celebrities like Ronald Reagan, urged Americans to “go all-electric”—not just for lighting and appliances, but for heating, cooking, and even air conditioning. The campaign painted a glowing picture of total electrification, backed by images of smiling housewives, sparkling kitchens, and obedient gadgets. In one ad, a mother proudly paints a heart on her electric range as her children and husband laugh and smile. The future, once uncertain, had been domesticated.
Nowhere was the all-electric ideal more vividly branded than in the Gold Medallion Home, a product of The Live Better Electrically campaign. These homes were awarded a literal gold medallion by utilities if they met a full checklist: electric heat, electric water heater, electric kitchen appliances, and sufficient wiring to support a future of plugged-in living. Promoted through glossy ads and celebrity endorsements, the Medallion Home symbolized upward mobility, domestic modernity, and patriotic participation in a high-energy future. It was a propaganda campaign that blurred the line between consumer aspiration and infrastructure planning. Today’s “electrify everything” efforts—encouraging heat pumps, EVs, induction stoves, and smart panels—echo this strategy. Once again, homes are being refashioned as sites of technological virtue and national progress, marketed through a familiar mix of lifestyle promise and utility coordination. The medallion has changed shape, but the message remains: the future lives here.
This was propaganda of abundance. And behind it was an unspoken truth: electrification had won. What had once been sold as fantasy—glimpsed in world’s fair palaces or GE films—was now embedded in daily life. The flick of a switch no longer symbolized hope. It had become habit.
Ruralite
Ruralite magazine is the flagship publication of Pioneer Utility Resources, a not-for-profit communications cooperative serving rural electric cooperatives (or co-ops) across the western United States. It was—and remains—a shared publication platform for dozens of small, locally owned utility co-ops that formed in the wake of the REA.
Each electric co-op—often based in small towns or rural counties—can customize part of the magazine with local news, board updates, outage reports, and community features. But the bulk of the magazine is centrally produced, offering ready-made content: stories about electric living, energy efficiency, co-op values, new technologies, and the benefits of belonging to a cooperative utility system.
In this sense, Ruralite functions as a kind of regional PR organ: a hybrid of lifestyle magazine, customer newsletter, and soft-sell propaganda tool. It is funded by and distributed through electric co-ops themselves, landing monthly in the homes of hundreds of thousands of rural residents.
Though it debuted in 1954—well after the apex of New Deal electrification programs—Ruralite can be seen as a direct descendant of that era’s propaganda infrastructure, repackaged for peacetime and consumer prosperity. The TVA had its posters, the REA had its pamphlets, and Ruralite had glossy photo spreads of farm wives with gleaming electric ranges.
Where New Deal propaganda had rallied Americans to support rural electrification as a national project of fairness and modernity, Ruralite shifted the tone toward comfort, aspiration, and consumer loyalty. It picked up the baton of electrification as cultural transformation, reinforcing the idea that electric living wasn’t just a right—it was the new rural ideal.
Clipped from “For the Curious Ruralite,” tips to encourage electricity use from the December 1954 edition of Ruralite Magazine
Ruralite framed rural electrification not as catching up to the cities, but as leading the way in a new era—one where rural values, ingenuity, and resourcefulness would power the country forward. In this way, co-ops and their members became symbols of progress, not just beneficiaries of it.
This was propaganda not by posters or patriotic slogans, but through community storytelling. Ruralite grounded its messaging in local personalities, recipes, and relatable anecdotes, while embedding calls to adopt more appliances, update homes, and trust in the local co-op as a benevolent, forward-looking institution.
The first Ruralite recipe, for which you need an electric refrigerator, published in Ruralite Magazine, June 1954.
Today, Ruralite remains rooted in local storytelling, but its tone aligns more with contemporary consumer lifestyle media. Sustainability, renewables, and energy efficiency now appear alongside nostalgic rural features and recipes. Yet despite the modern packaging, the core narrative remains consistent: electricity is integral to the good life. That through-line—from a beacon of modernization to a pillar of local identity—demonstrates how the publication has adapted without abandoning its propagandistic roots.
In the current energy landscape, Ruralite plays a quiet but significant role in advancing the “electrify everything” agenda—the 21st-century push to decarbonize buildings, transportation, and infrastructure by transitioning away from fossil fuels to electric systems.
While Ruralite doesn’t use overtly political language, it steadily normalizes new electric technologies like heat pumps, EVs, induction stoves, and solar arrays. Features on homeowners who upgraded to electric water heaters, profiles of co-ops launching EV charging stations, or DIY guides for energy audits all reinforce the idea that the electric future is practical, responsible, and here. The message is aspirational but grounded in small-town pragmatism: this isn’t Silicon Valley hype—it’s your neighbor electrifying their barn or replacing a propane furnace or reminiscing about life without electricity.
Ruralite continues the legacy of New Deal-era propaganda by promoting ever-greater electricity use—now through electric vehicles and heat pumps instead of fridges and space heaters—reinforcing the idea that progress always means more power, more consumption, and more infrastructure. Its storytelling still serves a strategic function—ensuring electricity remains not just accepted, but desired, in every American home.
Postwar Peak and Decline of Electrification Propaganda
By the 1960s, most American homes—urban and rural—had been electrified. The major battle to electrify the country was won. As a result, the overt electrification-as-progress propaganda that had dominated the New Deal era and postwar boom faded. Electricity became mundane: a background utility, no longer something that needed to be sold as revolutionary.
During the 1970s and early 1980s, the focus of public discourse shifted toward energy crises and conservation. Rather than expanding electrification, the government and utilities started encouraging Americans to use less, not more—a notable, if temporary, reversal. The 1973 oil shock, Three Mile Island (1979), and rising distrust in institutions tempered the earlier utopian energy messaging.
1970s energy conservation poster, via Low Carbon Institute, in the personal collection of Russell Davies.
However, electrification propaganda never vanished entirely. It just narrowed. Publications like Ruralite and utility co-ops continued localized campaigns, pushing upgrades (like electric water heaters or electric stoves) in rural areas and maintaining a cultural narrative of electric life as modern and efficient.
The Renewables-Era Revival of Electrification Propaganda
In the late 1990s and especially the 2000s, a new wave of electrification propaganda began to emerge, but this time under the banner of climate action. Instead of promoting electricity as luxury or convenience, the new message was: electrify everything to save the planet.
This “green” electrification push encourages:
Electric vehicles (EVs) to replace gasoline cars
Heat pumps to replace fossil fuel heating systems
Induction stoves over gas ranges
Grid modernization and massive renewable build-outs (wind, solar, batteries)
Glossy, optimistic, uncritical propaganda pushing electricity from Ruralite Magazine, December 2023.
The messaging echoes earlier propaganda in tone—glossy, optimistic, often uncritical—but reframes the moral purpose: not modernization for its own sake, but decarbonization. The tools remain similar: media campaigns, federal incentives, public-private partnerships, and co-op publications like Ruralite, which has evolved to reflect this new narrative.
Typical imagery promoting “clean energy.” This image is used on a League of Conservation Voters initiative, Clean Energy for All.
Modern utility outreach events like co-op utility Orcas Power and Light Cooperative’s (OPALCO) EV Jamboree—where electric vehicles are showcased, test drives offered, and electrification is framed as exciting and inevitable—echo the strategies of the REA’s mid-century traveling circuses. Just as the REA brought portable demonstrations of electric appliances and farm equipment to rural fairs to sell the promise of a brighter, cleaner, more efficient life, today’s utilities stage events to generate enthusiasm for electric vehicles, heat pumps, and smart appliances. In both cases, the goal is not just education but persuasion—selling a future tied to deeper dependence on the electric grid.
Advertisement for an EV Jamboree, propaganda for electric vehicles, boats, bikes, etc.
One of the most striking revivals is the push for nuclear power, long dormant after public backlash in the 1980s. Once considered politically radioactive and dangerous, nuclear is now rebranded as a clean energy savior. The Biden administration has supported small modular reactor (SMR) development and extended funding for existing nuclear plants. More recently, President Donald Trump announced plans to reinvest in nuclear infrastructure, positioning it as a strategic national asset and imperative for national security and industry. The messaging is clear: nuclear is back, and it’s being sold not just as a technology, but as a patriotic imperative.
The Green Delusion and the Digital Demand: Modern Propaganda for an Electrified Future
In the 21st century, electrification propaganda has been reborn—not as a tool to bring light to rural homes or sell refrigerators, but as a moral and technological mandate. This time, it’s cloaked in the language of sustainability, innovation, and decarbonization. Utilities, tech giants, and government agencies now present an electrified future as inevitable and ethical. But beneath the rhetoric lies a powerful continuity with the past: electricity must still be sold to the public, and propaganda remains the vehicle of persuasion.
The contemporary campaign is driven by a potent mix of actors. Investor-owned utilities plaster their websites with wind turbines and solar panels, promoting the idea that they are leading the charge toward a cleaner future. Federal and state governments offer rebates and incentives for EVs, solar panels, heat pumps, and induction stoves, framing these changes not only as personal upgrades, but as civic duties. Corporate giants like Google, Microsoft, and Amazon amplify the message, touting their commitment to “100% renewable” operations—while quietly brokering deals for bespoke gas and nuclear plants to keep their operations online, and selling their digital services to fossil fuel companies.
Deceptive practices are proliferating alongside the expansion of renewable energy infrastructure. Companies developing utility-scale solar projects often mislead communities about the scale, impact, and permanence of proposed developments—if they engage with them at all. Local residents frequently report being excluded from the planning process, receiving vague or misleading information, or being outright lied to about how the projects will alter their environment. As Dunlap et al. document in their paper ‘A Dead Sea of Solar Panels’: Solar Enclosure, Extractivism and the Progressive Degradation of the California Desert, such tactics are not anomalies but part of a systemic pattern:
[W]e would flat out ask them [the company] questions and their answers were not honest … [it] led me to believe they really didn’t care about us. They had charts of where lines were going to be, and later, we found out that it wasn’t necessarily the truthful proposal. And you’re thinking: ‘why do you have to deceive us?’
— Desert Center resident, quoted in ‘A Dead Sea of Solar Panels’: Solar Enclosure, Extractivism and the Progressive Degradation of the California Desert, by Dunlap et al.
These projects, framed publicly as green progress, often mask an extractive logic—one that mirrors the practices of fossil fuel development, only cloaked in the language of sustainability.
At the heart of this new energy push lies a paradox: the renewable future requires more electricity than ever before. Electrifying transportation, heating, and industry demands a massive expansion of grid infrastructure—new transmission lines, more generation, and more raw materials. But increasingly, the driver of this expansion is data.
Artificial intelligence, cloud computing, and cryptocurrency mining are extraordinarily power-hungry. Modern AI models require vast data centers, each consuming megawatts of electricity—often 24/7. In his May 2025 Executive Order promoting nuclear energy, President Donald Trump made this explicit: “Advanced nuclear reactors will power data centers, AI infrastructure, and critical defense operations.” Here, electricity isn’t just framed as a public good—it’s a strategic asset. The demand for clean, constant energy is now justified not by light bulbs or quality of life, but by national security and economic dominance in the digital age.
This shift has profound implications. The public is once again being asked to accept massive infrastructure projects—new power generation plants and transmission corridors, subsidies for private companies, and increased energy bills—as the price of progress. Utilities and politicians assure us that this growth is green, even as the material and ecological costs of building out renewables and data infrastructure are hidden from view. The new propaganda is sleeker, data-driven, and more morally charged—but at its core, it performs the same function as its 20th-century predecessors: to justify a massive increase in power use.
A particularly insidious thread in this new wave of propaganda is the claim that artificial intelligence will “solve” climate change. This narrative, repeated by CEOs, media outlets, and government officials, frames AI as a kind of techno-savior: capable of optimizing energy use, designing better renewables, and fixing broken supply chains. But while these applications are technically possible, they are marginal compared to the staggering energy footprint of building and running large-scale AI systems. Training a single frontier model can consume as much power as a small town. Once operational, the server farms that host these models run 24/7, devouring electricity and water—often in drought-prone areas—and prompting utilities to fire up old coal and gas plants to meet projected demand.
Under the guise of “solving” the climate crisis, the AI boom is accelerating it. And just like earlier propaganda campaigns, the messaging is carefully crafted: press releases about “green AI” and “green-by-AI” along with glossy reports touting efficiency gains distract from the physical realities of extraction, combustion, and carbon emissions. The promise of virtual solutions is being used to justify real-world expansion of energy-intensive infrastructure. If previous generations were sold the dream of electrified domestic bliss, today’s consumers are being sold a dream of digital salvation—packaged in clean fonts and cloud metaphors, but grounded in the same old logic of growth at all costs.
The Material Reality of “Electrify Everything”
While the language of “smart grids,” “clean energy,” and “electrify everything” suggests a sleek, seamless transition to a more sustainable future, the material realities tell a very different story. Every CPU chip, electric vehicle, solar panel, wind turbine, and smart meter is built from a global chain of extractive processes—mined lithium, cobalt, copper, rare earth elements, steel, silicon, and more—often sourced under environmentally destructive and socially exploitative conditions. Expanding the grid to support these technologies requires not just energy but immense physical infrastructure: transmission lines slicing through forests and deserts, substations and data centers devouring land and power, and constant maintenance of an aging, overstretched network.
Yet this reality is largely absent from public-facing narratives. Instead, we’re fed slogans like “energy humanism” and “clean electrification”—terms that obscure the industrial scale and catastrophic impacts of what’s being proposed. Like the early electrification propaganda that portrayed hydropower as endlessly abundant and benevolent (salmon and rivers be damned), today’s messaging continues to erase the costs of extraction, land use, and energy consumption, promoting technological salvation without acknowledging the planetary toll.
Propaganda for “green minerals” extraction in Zambia
The scale of extraction required to electrify everything is staggering. According to the International Energy Agency (IEA), reaching global climate goals by 2040 could require a massive increase in demand for minerals like lithium, cobalt, and nickel. For lithium alone, the World Bank estimates production must at least quadruple by 2040 to meet EV and battery storage needs. Copper—essential for wiring and grid infrastructure—faces a predicted shortfall of 6 million metric tons per year by 2031, even as global demand continues to surge with data centers, EVs, and electrification programs.
If you just paint your mining equipment green and use more electricity to mine, somehow that will make mining “sustainable”? Illustration from the paper Advancing toward sustainability: The emergence of green mining technologies and practices published in Green and Smart Mining Engineering
Mining companies have seized the moment to rebrand themselves as climate heroes. Lithium Americas, which plans to operate the massive Thacker Pass lithium mine in Nevada, is described as “a cornerstone for the clean energy transition” and touts itself as a boon for local employment, even while the company destroys thousands of acres of critical habitat. The company promises jobs, school funding, and tax revenue—classic propaganda borrowed from 20th-century industrial playbooks. But local resistance, including from communities like the Fort McDermitt Paiute and Shoshone Tribe, underscores the deeper truth: these projects degrade ecosystems, threaten sacred sites, and deplete water resources in arid regions.
Another mining giant, Rio Tinto, has aggressively marketed its “green” copper and lithium projects in Serbia, Australia, and the U.S. as “supporting the green energy revolution,” while downplaying community opposition, pollution risks, and the company’s long history of environmental destruction. Their PR materials highlight “sustainable mining,” “low-carbon futures,” and “partnering with communities,” despite persistent local protests and growing global awareness of mining’s high environmental costs.
What’s missing from these narratives is any serious reckoning with the energy required to mine, transport, refine, and manufacture these materials, along with the energy needed to power the growing web of electrified infrastructure. As the demand for data centers, EV fleets, AI training clusters, and smart grids accelerates, we are rapidly expanding industrialization in the name of sustainability, substituting fossil extractivism with mineral extractivism rather than questioning the ever-increasing energy and material throughput of modern society.
Across the U.S., utilities are aggressively promoting electric vehicles, heat pumps, and “smart” appliances as part of their electrification campaigns—often framed as climate solutions. Pacific Gas & Electric (PG&E) in California, for example, offers rebates on EVs and encourages customers to electrify their homes and transportation. Yet at the very same time, utilities like PG&E also warn that the electric grid is under strain and must expand dramatically to meet rising demand. This contradiction is rarely acknowledged. Instead, utilities position grid expansion as inevitable and green, framing it as “modernization” or “resilience.” What’s omitted is that electrifying everything doesn’t reduce energy use—it shifts and increases it, requiring vast new infrastructure, more centralized control, and continued extractivism.
The public is told that using more electricity will save the planet, while being asked to accept more pollution and destroyed environments along with new transmission lines, substations, and higher rates to pay for it all.
From Luxury to Necessity: Total Dependence on a Fragile Grid
The stability of the electricity grid requires electricity supply to constantly meet electricity demand, which, in turn, requires the numerous entities that operate different components of the grid to coordinate with each other.
Over the last century, electricity has shifted from a shimmering novelty to an unspoken necessity—so deeply embedded in daily life that its absence feels like a crisis. This transformation did not happen organically; it was engineered through decades of propaganda, from World’s Fairs and government-backed campaigns to glossy co-op magazines and modern “electrify everything” initiatives. What began as a promise of convenience became a system of total dependence.
OPALCO pushes EVs, electric appliances and heat pumps, while at the same time publishing articles about how the grid is under strain.
Today, every layer of modern life—communication, healthcare, finance, water delivery, food preservation, transportation, and farming—relies on a constant, invisible stream of electrons. Yet the grid that supplies them is increasingly strained and precarious. As utilities push electric vehicles, heat pumps, and AI-fueled growth, and states (like Washington State) offer tax incentives to electricity-hungry industries, they simultaneously warn that the grid must expand rapidly to avoid collapse. The public is told this expansion is progress. But the more electrified our lives become, the more vulnerable we are to its failures.
This was laid bare in April 2025, when a massive blackout across Spain and Portugal left tens of millions of people without power and was linked to at least seven deaths. Train systems halted. ATMs stopped working. Hospitals ran on limited backup power. Food spoiled, water systems faltered, and thousands were stranded in elevators and subways. The cause? A chain of technical failures made worse by infrastructure stretched thin by new demands and the rapid expansion of renewables. Spanish officials called it a “wake-up call.” But for many, it was a terrifying glimpse into just how brittle the electric scaffolding of modern life has become.
Contrast that with life just 130 years ago, when the vast majority of Americans lived without electricity. Homes were lit by kerosene and heated by wood. Water was drawn from wells. Food was preserved with salt or root cellars. Communities were far more self-reliant, and daily life, while harder in some ways, was not exposed to the singular point of failure that defines today’s electrified society.
Before widespread electrification, communities were more tightly knit by necessity. Without the conveniences of refrigeration, electric heating, or instant communication, people relied on one another. Neighbors shared food, labor, stories, and tools. Social life centered around common spaces—markets, churches, schools, porches. Mutual aid was not a political slogan but a basic survival strategy. Electricity helped alleviate certain physical burdens, but it also enabled a more atomized existence: private appliances replace shared labor, television and now Netflix replace neighborhood gatherings, and online connection supplants physical community.
The electrification of everything, sold as liberation, has created a new form of total dependence. We have not simply added electricity to our lives—we have rewired life itself to require it. And as the grid stretches to accommodate AI servers, data centers, electric fleets, and “smart” everything, the question we must ask is no longer how much we can electrify—but how much failure we can endure.
It’s hard to imagine life today without electricity—yet just 130 years ago, almost no one had it, and communities thrived in very different ways. Our deepening dependence on the grid is not simply our choice; technologies like AI and massive data centers are being imposed upon us, often without real consent or public debate.
As we barrel toward ecological collapse—pervasive pollution, climate chaos, biodiversity loss, and the sixth mass extinction—our blind faith in endless electrification risks bringing us back to a state not unlike that distant past, but under far more desperate circumstances. Now more than ever, we must question the costs we ignore and face the difficult truth: the future we’re building may demand everything we take for granted, and then some.
Public Works Administration Project, U.S. Army Corps of Engineers, Bonneville Power and Navigation Dam in Oregon, Columbia River, 40 miles East of Portland, “Downstream side of Blocks 7 and 8 of North Half of Spillway Dam and Piers 9 to 12. Inclusive of South Half of Dam”. Oct 24, 1936. National Archives and Records Administration.
Editor’s note: “What if you could save the climate while continuing to pollute it?” If that sounds too good to be true, that’s because it is. But corporations across the globe are increasingly trying to answer this question with the same shady financial tool: carbon offsets.
To understand what’s going on with the carbon market, it’s important to know the terms (term-oil), vocabulary and organizations involved. For starters, a carbon credit is different from a carbon offset. A carbon credit represents a metric ton of carbon dioxide or the equivalent of other climate-warming gases kept out of the atmosphere. If a company (or individual, or country) uses that credit to compensate for its emissions — perhaps on the way to a claim of reduced net emissions — it becomes an offset.
“We need to pay countries to protect their forests, and that’s just not happening,” Mulder said. But the problem with carbon credits is they are likely to be used as offsets “to enable or justify ongoing emissions,” she said. “The best-case scenario is still not very good. And the worst-case scenario is pretty catastrophic, because we’re just locking in business as usual.”
“Offsetting via carbon credits is another way to balance the carbon checkbook. The idea first took hold in the 1980s and picked up in the following decade. Industrialized countries that ratified the 1997 Kyoto Protocol became part of a mandatory compliance market, in which a cap-and-trade system limited the quantity of greenhouse gases those countries could emit. An industrialized country emitting over its cap could purchase credits from another industrialized country that emitted less than its quota. Emitters could also offset CO2 by investing in projects that reduced emissions in developing countries, which were not required to have targets.”
Yet, the truth is far darker. Far from being an effective tool, carbon credits have become a convenient smokescreen that allows polluters to continue their damaging practices unchecked. As a result, they’re hastening our descent into environmental and societal breakdown.
The entire framework of carbon credits is based on a single, fatal assumption: that “offsets” can substitute for actual emissions reductions. But instead of cutting emissions, companies and countries are using carbon credits as a cheap alternative to meaningful action. This lack of accountability is pushing us closer to catastrophic climate tipping points, with the far-reaching impacts of climate change and resource depletion threatening the lives of everyone on this planet.
Brazilian prosecutors are calling for the cancellation of the largest carbon credit deal in the Amazon Rainforest, saying it breaks national law and risks harming Indigenous communities.
While marketed as a solution to mitigate climate change, carbon markets have been criticized as a facade for continued extractivism and corporate control of minerals in Africa.
Africa’s vast forests, minerals, and land are increasingly commodified under the guise of carbon offset projects. Global corporations invest in these projects, claiming to “offset” their emissions while continuing business as usual in their countries. This arrangement does little to address emissions at the source and increases exploitation in Africa, where land grabs, displacement, and ecological degradation often accompany carbon offset schemes.
“But beginning in January 2023, The Guardian, together with other news organizations, have published a series of articles that contend the majority of carbon credit sales in their analysis did not lead to the reduction of carbon in the atmosphere. The questions have centered on concepts such as additionality, which refers to whether a credit represents carbon savings over and above what would have happened without the underlying effort, and other methods used to calculate climate benefits.
The series also presented evidence that a Verra-approved conservation project in Peru promoted as a success story for the deforestation it helped to halt resulted in the displacement of local landowners. Corporations like Chevron, the second-largest fossil fuel company in the U.S., purchase carbon credits to bolster their claims of carbon neutrality. But an analysis by the watchdog group Corporate Accountability found that these credits were backed by questionable carbon capture technologies and that Chevron is ignoring the emissions that will result from the burning of the fossil fuels it produces.”
Since 2009, Tesla has had a tidy little side hustle selling the regulatory credits it collects for shifting relatively huge numbers of EVs in markets like China, Europe and California. The company earns the credits selling EVs and then sells them to automakers whose current lineup exceeds emission rules set out in certain territories. This business has proven quite lucrative for Tesla, as Automotive News explains:
The Elon Musk-led manufacturer generated $1.79 billion in regulatory credit revenue last year, an annual filing showed last week. That brought the cumulative total Tesla has raked in since 2009 to almost $9 billion.
“Tesla shouldn’t be considered a car manufacturer: they’re a climate movement profiteer. Most of their profits come from carbon trading. Car companies would run afoul of government regulations and fines for producing high emissions vehicles, but thanks to carbon credits, they can just pay money to companies like Tesla to continue churning out gas guzzlers. In other words, according to Elon Musk’s business model: no gas guzzlers, no Tesla.” – Peter Gelderloos
A LICENSE TO POLLUTE
The carbon offset market is an integral part of efforts to prevent effective climate action
In early November 2023, shortly before the COP28 summit opened in Dubai, a hitherto obscure UAE firm attracted significant media attention around news of their prospective land deals in Africa.
Reports suggested that Blue Carbon—a company privately owned by Sheikh Ahmed al-Maktoum, a member of Dubai’s ruling family—had signed deals promising the firm control over vast tracts of land across the African continent. These deals included an astonishing 10 percent of the landmass in Liberia, Zambia and Tanzania, and 20 percent in Zimbabwe. Altogether, the area equaled the size of Britain.
Blue Carbon intended to use the land to launch carbon offset projects, an increasingly popular practice that proponents claim will help tackle climate change. Carbon offsets involve forest protection and other environmental schemes that are equated to a certain quantity of carbon “credits.” These credits can then be sold to polluters around the world to offset their own emissions. Prior to entering into the negotiations of the massive deal, Blue Carbon had no experience in either carbon offsets or forest management. Nonetheless the firm stood to make billions of dollars from these projects.
Environmental NGOs, journalists and activists quickly condemned the deals as a new “scramble for Africa”—a land grab enacted in the name of climate change mitigation. In response, Blue Carbon insisted the discussions were merely exploratory and would require community consultation and further negotiation before formal approval.
Regardless of their current status, the land deals raise concerns that indigenous and other local communities could be evicted to make way for Blue Carbon’s forest protection plans. In Kenya, for example, the indigenous Ogiek People were driven out of the Mau Forest in November 2023, an expulsion that lawyers linked to ongoing negotiations between Blue Carbon and Kenya’s president, William Ruto. Protests have also followed the Liberian government’s closed-door negotiations with Blue Carbon, with activists claiming the project violates the land rights of indigenous people enshrined within Liberian law. Similar cases of land evictions elsewhere have led the UN Special Rapporteur on the Rights of Indigenous Peoples, Francisco Calí Tzay, to call for a global moratorium on carbon offset projects.
Beyond their potentially destructive impact on local communities, Blue Carbon’s activities in Africa point to a major shift in the climate strategies of Gulf states. As critics have shown, the carbon offsetting industry exists largely as a greenwashing mechanism, allowing polluters to hide their continued emissions behind the smokescreen of misleading carbon accounting methodologies while providing a profitable new asset class for financial actors. As the world’s largest exporters of crude oil and liquified natural gas, the Gulf states are now positioning themselves across all stages of this new industry—including the financial markets where carbon credits are bought and sold. This development is reconfiguring the Gulf’s relationships with the African continent and will have significant consequences for the trajectories of our warming planet.
False Accounting and Carbon Laundering
There are many varieties of carbon offset projects. The most common involves the avoided deforestation schemes that make up the bulk of Blue Carbon’s interest in African land. In these schemes, land is enclosed and protected from deforestation. Carbon offset certifiers—of which the largest in the world is the Washington-based firm, Verra—then assess the amount of carbon these projects prevent from being released into the atmosphere (measured in tons of CO2). Once assessed, carbon credits can be sold to polluters, who use them to cancel out their own emissions and thus meet their stated climate goals.
Superficially attractive—after all, who doesn’t want to see money going into the protection of forests?—such schemes have two major flaws. The first is known as “permanence.” Buyers who purchase carbon credits gain the right to pollute in the here and now. Meanwhile, it takes hundreds of years for those carbon emissions to be re-absorbed from the atmosphere, and there is no guarantee that the forest will continue to stand for that timeframe. If a forest fire occurs or the political situation changes and the forest is destroyed, it is too late to take back the carbon credits that were initially issued. This concern is not simply theoretical. In recent years, California wildfires have consumed millions of hectares of forest, including offsets purchased by major international firms such as Microsoft and BP. Given the increasing incidence of forest fires due to global warming, such outcomes will undoubtedly become more frequent.
The second major flaw with these schemes is that any estimation of carbon credits for avoided deforestation projects rests on an imaginary counterfactual: How much carbon would have been released if the offset project were not in place? Again, this estimate depends on an unknowable future, opening up significant profit-making opportunities for companies certifying and selling carbon credits. By inflating the estimated emissions reductions associated with a particular project, it is possible to sell many more carbon credits than are actually warranted. This scope for speculation is one reason why the carbon credit market is so closely associated with repeated scandals and corruption. Indeed, according to reporting in the New Yorker, after one massive carbon fraud was revealed in Europe, “the Danish government admitted that eighty per cent of the country’s carbon-trading firms were fronts for the racket.”[1]
These methodological problems are structurally intrinsic to offsetting and cannot be avoided. As a result, most carbon credits traded today are fictitious and do not result in any real reduction in carbon emissions. Tunisian analyst Fadhel Kaboub describes them as simply “a license to pollute.”[2] One investigative report from early 2023 found that more than 90 percent of rainforest carbon credits certified by Verra were likely bogus and did not represent actual carbon reductions. Another study conducted for the EU Commission reported that 85 percent of the offset projects established under the UN’s Clean Development Mechanism failed to reduce emissions. A recent academic study of offset projects across six countries, meanwhile, found that most did not reduce deforestation, and for those that did, the reductions were significantly lower than initially claimed. Consequently, the authors conclude, carbon credits sold for these projects were used to “offset almost three times more carbon emissions than their actual contributions to climate change mitigation.”[3]
Despite these fundamental problems—or perhaps because of them—the use of carbon offsets is growing rapidly. The investment bank Morgan Stanley predicts that the market will be worth $250 billion by 2050, up from about $2 billion in 2020, as large polluters utilize offsetting to sanction their continued carbon emissions while claiming to meet net zero targets. In the case of Blue Carbon, one estimate found that the amount of carbon credits likely to be accredited through the firm’s projects in Africa would equal all of the UAE’s annual carbon emissions. Akin to carbon laundering, this practice allows ongoing emissions to disappear from the carbon accounting ledger, swapped for credits that have little basis in reality.
Monetizing Nature as a Development Strategy
For the African continent, the growth of these new carbon markets cannot be separated from the escalating global debt crisis that has followed the Covid-19 pandemic and the war in Ukraine. According to a new database, Debt Service Watch, the Global South is experiencing its worst debt crisis on record, with one-third of countries in Sub-Saharan Africa spending over half their budget revenues on servicing debt. Faced with such unprecedented fiscal pressures, the commodification of land through offsetting is now heavily promoted by international lenders and many development organizations as a way out of the deep-rooted crisis.
The African Carbon Markets Initiative (ACMI), an alliance launched in 2022 at the Cairo COP27 summit, has emerged as a prominent voice in this new development discourse. ACMI brings together African leaders, carbon credit firms (including Verra), Western donors (USAID, the Rockefeller Foundation and Jeff Bezos’ Earth Fund) and multilateral organizations like the United Nations Economic Commission for Africa. Along with practical efforts to mobilize funds and encourage policy changes, ACMI has taken a lead role in advocating for carbon markets as a win-win solution for both heavily indebted African countries and the climate. In the words of the organization’s founding document, “The emergence of carbon credits as a new product allows for the monetization of Africa’s large natural capital endowment, while enhancing it.”[4]
ACMI’s activities are deeply tied to the Gulf. One side to this relationship is that Gulf firms, especially fossil fuel producers, are now the key source of demand for future African carbon credits. At the September 2023 African Climate Summit in Nairobi, Kenya, for example, a group of prominent Emirati energy and financial firms (known as the UAE Carbon Alliance) committed to purchasing $450 million worth of carbon credits from ACMI over the next six years. The pledge immediately confirmed the UAE as ACMI’s biggest financial backer. Moreover, by guaranteeing demand for carbon credits for the rest of this decade, the UAE’s pledge helps create the market today, driving forward new offset projects and solidifying their place in the development strategies of African states. It also helps legitimize offsetting as a response to the climate emergency, despite the numerous scandals that have beset the industry in recent years.
Saudi Arabia is likewise playing a major role in pushing forward carbon markets in Africa. One of ACMI’s steering committee members is Saudi businesswoman Riham ElGizy, who heads the Regional Voluntary Carbon Market Company (RVCMC). Established in 2022 as a joint venture between the Public Investment Fund (Saudi Arabia’s sovereign wealth fund) and the Saudi stock exchange, Tadawul, RVCMC has organized the world’s two largest carbon auctions, selling more than 3.5 million tons worth of carbon credits in 2022 and 2023. Seventy percent of the credits sold in these auctions were sourced from offset projects in Africa, with the 2023 auction taking place in Kenya. The principal buyers of these credits were Saudi firms, led by the largest oil company in the world, Saudi Aramco.
The Emirati and Saudi relationships with ACMI and the trade in African carbon credits illustrate a notable development when it comes to the Gulf’s role in these new markets. Beyond simply owning offset projects in Africa, the Gulf states are also positioning themselves at the other end of the carbon value chain: the marketing and sale of carbon credits to regional and international buyers. In this respect, the Gulf is emerging as a key economic space where African carbon is turned into a financial asset that can be bought, sold and speculated upon by financial actors across the globe.
Indeed, the UAE and Saudi Arabia have each sought to establish permanent carbon exchanges, where carbon credits can be bought and sold just like any other commodity. The UAE set up the first such trading exchange following an investment by the Abu Dhabi-controlled sovereign wealth fund, Mubadala, in the Singapore-based AirCarbon Exchange (ACX) in September 2022. As part of this acquisition, Mubadala now owns 20 percent of ACX and has established a regulated digital carbon trading exchange in Abu Dhabi’s financial free zone, the Abu Dhabi Global Market. ACX claims the exchange is the first regulated exchange of its kind in the world, with the trade in carbon credits beginning there in late 2023. Likewise, in Saudi Arabia the RVCMC has partnered with US market technology firm Xpansiv to establish a permanent carbon credit exchange set to launch in late 2024.
Whether these two Gulf-based exchanges will compete or prioritize different trading instruments, such as carbon derivatives or Shariah-compliant carbon credits, remains to be seen. What is clear, however, is that major financial centers in the Gulf are leveraging their existing infrastructures to establish regional dominance in the sale of carbon. Active at all stages of the offsetting industry—from generating carbon credits to purchasing them—the Gulf is now a principal actor in the new forms of wealth extraction that connect the African continent to the wider global economy.
Entrenching a Fossil-Fueled Future
Over the past two decades, the Gulf’s oil and especially gas production has grown markedly, alongside a substantial eastward shift in energy exports to meet the new hydrocarbon demand from China and East Asia. At the same time, the Gulf states have expanded their involvement in energy-intensive downstream sectors, notably the production of petrochemicals, plastics and fertilizers. Led by Saudi Aramco and the Abu Dhabi National Oil Company, Gulf-based National Oil Companies now rival the traditional Western oil supermajors in key metrics such as reserves, refining capacity and export levels.
In this context—and despite the reality of the climate emergency—the Gulf states are doubling down on fossil fuel production, seeing much to be gained from hanging on to an oil-centered world for as long as possible. As the Saudi oil minister vowed back in 2021, “every molecule of hydrocarbon will come out.”[5] But this approach does not mean the Gulf states have adopted a stance of head-in-the-sand climate change denialism. Rather, much like the big Western oil companies, the Gulf’s vision of expanded fossil fuel production is accompanied by an attempt to seize the leadership of global efforts to tackle the climate crisis.
One side to this approach is their heavy involvement in flawed and unproven low carbon technologies, like hydrogen and carbon capture. Another is their attempts to steer global climate negotiations, seen in the recent UN climate change conferences, COP27 and COP28, where the Gulf states channeled policy discussions away from effective efforts to phase out fossil fuels, turning these events into little more than corporate spectacles and networking forums for the oil industry.
The carbon offset market should be viewed as an integral part of these efforts to delay, obfuscate and obstruct addressing climate change in meaningful ways. Through the deceptive carbon accounting of offset projects, the big oil and gas industries in the Gulf can continue business as usual while claiming to meet their so-called climate targets. The Gulf’s dispossession of African land is key to this strategy, ultimately enabling the disastrous specter of ever-accelerating fossil fuel production.
This statement, published on July 2, 2024, responds to the growing efforts of corporations to greenwash their greenhouse gas emissions by buying “credits” for supposed emission reductions elsewhere. It is signed by more than 80 leading civil society organizations.
Editor’s note: “I think we’re in the midst of a collapse of civilization, and we’re definitely in the midst of the end of the American empire. And when empires start to fail, a lot of people get really crazy. In The Culture of Make Believe, I predicted the rise of the Tea Party. I recognized that in a system based on competition and where people identify with the system, when times get tough, they wouldn’t blame the system, but instead, they would indicate it’s the damn Mexicans’ fault or the damn black people’s fault or the damn women’s fault or some other group. The thing that I didn’t predict was that the Left would go insane in its own way. I anticipated the rise of an authoritarian Right, but not authoritarianism more generally, to which the Left is not immune. The collapse of empire results in increased insecurity and the demand for stability. The cliché about Mussolini is that he made the trains run on time, that he brought about stability.” – Derrick Jensen
It’s not just stupid people. People can be very smart as individuals, but collectively we are stupid. Postmodernism is a case in point. It starts with a great idea, that we are influenced by the stories we’re told and the stories we’re told are influenced by history. It begins with the recognition that history is told by the winners and that the history we were taught through the 1940s, 50s and 60s was that manifest destiny is good, civilization is good, expanding humanity is good. Exemplary is the 1962 film How the West Was Won. It’s extraordinary in how it regards the building of dams and expansion of agriculture as simply great. Postmodernism starts with the insight that such a story is influenced by who has won, which is great, but then it draws the conclusion that nothing is real and there are only stories.
“This is the cult-like behavior of the postmodern left: if you disagree with any of the Holy Commandments of postmodernism/queer theory/transgender ideology, you must be silenced on not only that but on every other subject. Welcome to the death of discourse, brought to you by the postmodern left.”
I once asked a group of my students if they knew what the term postmodernism meant: one replied that it’s when you put everything in quotation marks. It wasn’t such a bad answer, because concepts such as “reality”, “truth” and “humanity” are invariably put under scrutiny by thinkers and “texts” associated with postmodernism.
Postmodernism is often viewed as a culture of quotations.
Take Matt Groening’s The Simpsons (1989–). The very structure of the television show quotes the classic era of the family sitcom. While the misadventures of its cartoon characters ridicule all forms of institutionalised authority – patriarchal, political, religious and so on – it does so by endlessly quoting from other media texts.
This form of hyperconscious “intertextuality” generates a relentlessly ironic or postmodern worldview.
Relationship to modernism
The difficulty of defining postmodernism as a concept stems from its wide usage in a range of cultural and critical movements since the 1970s. Postmodernism describes not only a period but also a set of ideas, and can only be understood in relation to another equally complex term: modernism.
Modernism was a diverse art and cultural movement in the late 19th and early 20th centuries whose common thread was a break with tradition, epitomised by poet Ezra Pound’s 1934 injunction to “make it new!”.
The “post” in postmodern suggests “after”. Postmodernism is best understood as a questioning of the ideas and values associated with a form of modernism that believes in progress and innovation. Modernism insists on a clear divide between art and popular culture.
But like modernism, postmodernism does not designate any one style of art or culture. On the contrary, it is often associated with pluralism and an abandonment of conventional ideas of originality and authorship in favour of a pastiche of “dead” styles.
Postmodern architecture
The shift from modernism to postmodernism is seen most dramatically in the world of architecture, where the term first gained widespread acceptance in the 1970s.
One of the first to use the term, architectural critic Charles Jencks suggested the end of modernism can be traced to an event in St Louis on July 15, 1972 at 3:32pm. At that moment, the derelict Pruitt-Igoe public housing project was demolished.
Built in 1951 and initially celebrated, it became proof of the supposed failure of the whole modernist project.
Jencks argued that while modernist architects were interested in unified meanings, universal truths, technology and structure, postmodernists favoured double coding (irony), vernacular contexts and surfaces. The city of Las Vegas became the ultimate expression of postmodern architecture.
Famous theorists
Theorists associated with postmodernism often used the term to mark a new cultural epoch in the West. For philosopher Jean-François Lyotard, the postmodern condition was defined as “incredulity towards metanarratives”; that is, a loss of faith in science and other emancipatory projects within modernity, such as Marxism.
Marxist literary theorist Fredric Jameson famously argued postmodernism was “the cultural logic of late capitalism” (by which he meant post-industrial, post-Fordist, multi-national consumer capitalism).
The features Jameson identified included, to paraphrase: the substitution of pastiche for the satirical impulse of parody; a predilection for nostalgia; and a fixation on the perpetual present.
In Jameson’s pessimistic analysis, the loss of historical temporality and depth associated with postmodernism was akin to the world of the schizophrenic.
Postmodern visual art
In the visual arts, postmodernism is associated with a group of New York artists – including Sherrie Levine, Richard Prince and Cindy Sherman – who were engaged in acts of image appropriation, and have since become known as The Pictures Generation after a 1977 show curated by Douglas Crimp.
By the 1980s postmodernism had become the dominant discourse, associated with “anything goes” pluralism, fragmentation, allusions, allegory and quotations. It represented an end to the avant-garde’s faith in originality and the progress of art.
But the origins of these strategies lay with Dada artist Marcel Duchamp, and the Pop artists of the 1960s in whose work culture had become a raw material. After all, Andy Warhol was the direct progenitor of the kitsch consumerist art of Jeff Koons in the 1980s.
Postmodern cultural identity
Postmodernism can also be a critical project, revealing the cultural constructions we designate as truth and opening up a variety of repressed other histories of modernity, such as those of women, homosexuals and the colonised.
The modernist canon itself is revealed as patriarchal and racist, dominated by white heterosexual men. As a result, one of the most common themes addressed within postmodernism relates to cultural identity.
American conceptual artist Barbara Kruger’s statement that she is “concerned with who speaks and who is silent: with what is seen and what is not” encapsulates this broad critical project.
Australia has been theorised by Paul Taylor and Paul Foss, editors of the influential journal Art & Text, as already postmodern, by virtue of its culture of “second-degree” – its uniquely unoriginal, antipodal appropriations of European culture.
If the language of postmodernism waned in the 1990s in favour of postcolonialism, the events of 9/11 in 2001 marked its exhaustion.
Surfing YouTube, I came across an interview of Ezra Klein by Stephen Colbert. He was promoting a new book called Abundance, basically arguing that scarcity is politically manufactured by “both sides,” and that if we get our political act together, everybody can have more. Planetary limits need not apply. I’ve often been impressed by Klein’s sharp insights on politics, yet can’t reconcile how someone so smart misses the big-picture perspectives that grab my attention.
He’s not alone: tons of sharp minds don’t seem to be at all concerned about planetary limits or metastatic modernity, which for me has been a source of perennial puzzlement.
The logical answer is that I’m not the sharpest tool in the shed. Indeed, many of these folks could run cognitive/logical circles around me. And maybe that’s the end of the story. Yet it’s not the end of this post, as I try to work out what accounts for the disconnect, and (yet again) examine my own assuredness.
Imagined Basis
What is the basis of pundit-level rejection of my premise? Oh yeah: my premise is that modernity is a fleeting, patently unsustainable mode of life on Earth that will self-terminate on a historically relevant (i.e., brief) timescale—likely to convincingly crest the peak this century. Modernity can’t last.
I will reconstruct how I think an ultra-smart person might react, were I to present in conversation the premise that modernity can’t last—based on past interactions with such folks. Two branches stand out.
One branch would be the unwittingly spot-on admission of “I don’t see why not.” I could not have identified the core problem any better, and would be tempted to say: “Wow—what a courageous first step in recognizing our limited faculties. That humble confession is very big of you.” My not having the wit to prove conclusively to such folks that modernity can’t work (and I would say that no human possesses such mental powers) says very little about the complex reality of our future—operating without giving a flip as to what happens in human brains. But it’s also quite far from demonstrating convincingly how something as unsustainable as modernity—dependent on one-time exploitation of non-renewable resources—might possibly address the host of interacting elements that will contribute to its crumbling.
That branch aside, the common reply I want to spend more time on goes something like: “Just look at the past. No one could have foreseen the amazingness of today, and we ought to recognize that we are likewise ill-equipped to speculate on the future. In other words, anyone expressing your premise in the last 10,000 years would have turned out to be wrong [well, so far]. Chances are: so are you.”
Damn. Blistering. How can one get up from that knockout? And the thing is, it’s a completely valid bit of logic. I also appreciate the intellectual humility involved. Why, then, am I so stubborn on this point? Is it because I want to be popular or rich? Then I’m even stupider than I thought, because those things are basically guaranteed to be incompatible with such a message. Is it because I crave end-times, having been dealt a bad hand and never “good at the game”? Nope: I thrived as an all-in astrophysicist and had/have a rather privileged and comfortable life that I would personally, selfishly prefer not to have disrupted. Is it out of fear of collapse? Getting warmer: that was a big early motivation—the alarming prospect of losing what I held until recently to be a glorious civilization. But at this point all I can say is that based on multiple lines of evidence I really think it’s the truth, and can’t easily or honestly argue myself out of this difficult spot. Denial, anger, bargaining, and depression don’t help us come to terms with the hard reality.
Returning to the putative response: I’ll name it as lazy. It’s superficial. It’s a shortcut, sidling up to: “Collapse hasn’t happened yet—in fact quite the opposite—and thus most likely will not.” It declines to examine the constituent pieces and arguments, falling back on a powerful and persuasive bit of logic straight out of the left brain. It has all the hallmarks: certain, crisp, abstract, decontextualized, logical, clever.
It carries the additional dual advantage of simultaneously avoiding unpleasant confrontation of a scary prospect and inviting starry-eyed wonder at magic the future might bring. No wonder it’s so magnetically attractive as a go-to response! We’re both driven to it and attracted by it! The very smartest among us, in fact, often have the most to lose, and may therefore be among the most psychologically attached to modernity. We mustn’t forget that every human has a psychology, and is capable of impressive levels of denial for any number of reasons.
Some Metaphors
It’s time for a few metaphors that help to frame my approach. I offer two related ones, because neither is perfect. Together, they might work well enough for our purposes.
TAKE ONE
Imagine that someone tees up a golf ball in an indoor space full of hard objects: concrete walls and steel shelves—maybe loaded with heavy glass goblets and vases, etc. Poised to deliver a smashing blow to the ball with an over-sized driver, they ask me: “What do you think will happen if I hit this ball?” Imagining a comical movie scene where the ball makes a series of wild-ass bounces shattering priceless collectables as it goes, it might seem impossible to guess what all might or might not happen. So, I “cheat” and say: “The ball will come to rest.”
And guess what: I’m right! No matter how crazy the flight, it is guaranteed that in fairly short order, the ball will no longer be moving. I could even put a timescale on it: stopped within 10 seconds, or maybe even 5—depending on the dimensions of the room. I can say this because each collision will remove a fair bit of energy from the ball, and the smaller the room, the shorter the time between energy-sapping events.
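The timescale claim is easy to sanity-check with a few lines of arithmetic. The sketch below is mine, not part of the original argument: the room size, launch speed, and energy lost per collision are assumed numbers, but any similar values give the same qualitative answer—geometric energy loss shuts the show down in seconds.

```python
# Rough illustrative sketch (assumed numbers): treat each collision as removing
# a fixed fraction of the ball's kinetic energy and ask how long the mayhem lasts.
import math

room_size_m = 8.0          # assumed spacing between energy-sapping collisions
speed = 70.0               # m/s, roughly a hard driver strike
loss_per_hit = 0.4         # assumed fraction of kinetic energy lost per collision
havoc_threshold = 7.0      # m/s; below ~1% of initial energy, call the mayhem over

t, hits = 0.0, 0
while speed > havoc_threshold:
    t += room_size_m / speed                 # time to reach the next collision
    speed *= math.sqrt(1.0 - loss_per_hit)   # KE scales as v^2, so speed scales by sqrt
    hits += 1

print(f"~{hits} collisions and ~{t:.1f} s until the ball stops wreaking havoc")
# With these numbers: about 10 collisions and roughly 5 seconds.
```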
During the middle of the experiment, it is clear that mayhem is happening, and it’s essentially impossible to predict what’s next. That’s where we are in modernity. So, yes: some intellectual humility is called for. We could not have predicted any of the particulars, after all. But one can still stand by the prediction that the ball will come to rest, much as one can say modernity will wind itself down.
TAKE TWO
The golf ball metaphor does 80% of the work, but I don’t fully embrace it because the ball is at maximum destructive capacity at the very beginning, its damage-potential decaying from the first moment. Modernity took some time to accelerate to present speed, now at a fever pitch. For this, I think of a rock tumbling down a slope.
I do a fair bit of hiking, sometimes off trail where—careful as I am—I might occasionally dislodge a rock on a steep slope. What happens next is entirely unpredictable (even if deterministic given initial conditions). Most of the time the rock slides just a few centimeters; sometimes it will lazily tumble a few meters; or more rarely it will pick up speed and hurtle hundreds of meters down the slope in a kinetic spectacle. Kilometer scales are not entirely out of the question in some locations.
Still, for all these scenarios, I am sure of one thing: the rock will come to rest—possibly in multiple fragments. I can also put a reasonable timescale on it, mid-journey, based on its behavior to that point. I can tell if it’s picking up speed. I can evaluate if the slope is moderating or will soon come to an end. It’s not impossible to make a decent guess for how long it might go, even if unable to predict what hops, collisions, or deflections it might execute along the way.
Maybe the phrase “a rolling stone gathers no moss” can be re-interpreted as: kinetic mayhem is no basis for a healthy, relational ecology. If tumbling boulders were the normal/default state of things, mountains would not last long (or more to the point: never come into being!). Likewise, one species driving millions of others to extinction in mere centuries is not a normal, sustainable state of affairs. That $#!+ has to stop.
Modernity’s Turn
Modernity is far more complex than a tumbling rock. But one side effect of this is a multitude of facets to consider. When many of them line up to tell a similar story…well, that story becomes more compelling. I offer a few, here.
POPULATION
Global human population growth has been super-exponential, in that the annual growth rate as a percentage of the total has steadily climbed through the millennia and centuries (0.04% after agriculture began, up to 2% in the 1960s). It is no shock to anyone that we may be straining (or overtaxing) what the planet can support. Indeed, the growth rate has been decreasing for the last 60 years, and the drop appears to be accelerating lately. Almost any model predicts a global peak before this century is over, and possibly as soon as the next 15–20 years. This is, of course, highly relevant to modernity. Economies will shrink and possibly collapse (being predicated on growth) as population falls from a peak. Such a turn could precipitate a whole new phase that “no one could have seen coming.” I’m looking at you, pundits!
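To illustrate how quickly a falling growth rate turns into a peak, here is a toy projection. It is not a demographic model: the starting population, current growth rate, and pace of decline are my rough assumptions, and the only point is that almost any steadily declining growth rate puts the peak within decades.

```python
# Toy projection under assumed inputs; not a forecast.
population = 8.1e9      # assumed current population (people)
growth_rate = 0.009     # assumed current annual growth rate (~0.9%/yr)
annual_drop = 0.0003    # assumed yearly fall in the growth rate (0.03 points/yr)

year = 2025
while growth_rate > 0:
    population *= 1 + growth_rate
    growth_rate -= annual_drop
    year += 1

print(f"Peak of roughly {population / 1e9:.1f} billion around {year}")
# These assumptions put the peak near mid-century at ~9 billion; a faster fall
# in the growth rate pulls it into the next 15-20 years.
```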
The argument of “just look to the past” and imagining some sort of extrapolation begins to seem dubious or even outright silly in the context of a plummeting population. Let’s face it: we don’t know how it plays out. Loss of modern technological capabilities is not at all a mental stretch, even if such “muscles” are rarely exercised.
RESOURCES
Modernity hungry! Fossil fuels have played a huge role in the dramatic acceleration of the past few centuries. We all know this is a limited-time prospect. Oil discoveries peaked over a half-century ago, so the writing is on the wall for production decline on a timescale of decades. Pretending that solar and wind will sweep in as substitutes involves a fair bit of magical thinking and ignorance of myriad practical details (back to the “I don’t see why not” response). We face an unprecedented transition as fossil fuels wane, so that the acceleration of the past is very likely to run out of steam. Even holding steady involves an unsubstantiated leap of faith—never fleshed out as to how it all could possibly work. “I don’t see why not” is about the best one can expect.
Mined materials are likewise non-renewable and being consumed at an all-time-high rate. Ore grade has fallen dramatically, so that we now must pursue increasingly marginal and deeper deposits and thus impact more land, while discharging an ever-increasing volume of mine tailings. This happened fast: most material extraction has occurred in the last century (or even 50 years). We would be foolish to imagine an extrapolation of the past or even maintaining similar levels of activity for any long duration. More realistically, these practices will be undercut by declining population and energy availability. I’ve spent plenty of time pointing out that recycling can at best stretch out the timeline, but not by orders of magnitude.
WATER/AGRICULTURE
Agricultural productivity has also steadily increased, but on the back of “mining” non-renewable resources like ground water and soils—not to mention an extraordinary dependence on finite fossil fuels. Okay: at least water and soils can renew on long timescales, but our rate of depletion far outstrips replenishment. Land turned to desert by overuse stops even trying to maintain soils, while also suppressing water replenishment by squelching rainfall. This is yet another domain where the fact that the past has involved a steady march in one direction is quite far from guaranteeing that direction as a constant of nature. Its very “success” is what hastens its failure. The simple logic of “hasn’t happened yet” blithely bypasses a lot of context sitting in plain sight.
CLIMATE CHANGE
I don’t usually stress climate change, because I view it as one symptom of a more general disease. Moreover, should we magically eliminate climate change in a blink, my assessment is hardly altered since so many other factors are contributing to the overall phenomenon of modernity’s unsustainability. I include climate change here because it seems to be the one element that has percolated to the attention of the pundit-class as a potential existential threat. It isn’t yet clear how modernity trucks on without fossil fuels. Yet, even if we were to curtail their use by 2050, the climate damage may be great enough to reverse modernity’s fortunes (actually, the most catastrophic legacy of CO2 emissions may be ocean acidification rather than climate change). Again, the “logic” of extrapolation becomes rather dubious. The faith-based assumption is that we will “technology” our way out of the crisis, which becomes perfectly straightforward if ignoring all the other factors at play. Increased materials demand to “technofix” our ills (and the associated mining, habitat destruction, pollution) puts a fly in the ointment. But most concerning to me is what we already do with energy. Answer: initiate a sixth mass extinction by running a resource-hungry, human supremacist, global market economy. Most climate change “solutions” assign top priority to maintaining the destructive juggernaut at full speed—without question.
ECOLOGICAL COLLAPSE
This brings me to the ultimate peril. As large, hungry, high-maintenance mammals on this planet, we are utterly dependent on a healthy, vibrant, biodiverse ecology—in ways we can’t begin to fathom. It’s beyond our meat-brain capacity to appreciate. Long-term survival at the hands of evolution has never once required cognitive comprehension of the myriad subtle relationships necessary for a stable community of life. An amoeba, mayfly, newt, or hedgehog gets on just fine without such knowledge. What is required is fitting into the niches and interrelationships patiently worked out through the process of evolution. Guess what: in a flash, we jumped the tracks into a patently non-ecological lifestyle not vetted by evolution to be viable. It appears to be not even close.
This is not just a theoretical concern. Biologists are pretty clear that a sixth mass extinction is underway as a direct result of modernity. The dots are not particularly hard to connect. We mine and spew/dispose materials alien to the community of life into the environment. Good luck, critters! We eliminate or shatter wild space in favor of “developed” land: exterminating, eradicating, displacing, and impoverishing the life that depends on that land and its resident web of life. The struggle can take decades to resolve as populations ebb—generation after generation—on the road to inevitable failure. Even this decades-long process is effectively instant compared to the millions of years over which the intricate web was crafted.
I have pointed out a number of times that we are now down to 2.5 kg of wild land mammal mass per human on the planet. It was 80 kg per person in 1800 and 50,000 kg per person before the start of the agricultural revolution—when humans held a roughly proportionate share of mammal biomass compared to the other mammal species. In my lifetime (born 1970), the average decline in vertebrate populations has been roughly 70%. Fish, insects, birds decline at 1–2% per year, which compounds quickly. Extinction rates are now hundreds of times higher than the background, almost all of which has transpired in the last century.
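A quick compounding check shows how fast those per-year losses stack up. The arithmetic below is mine; the 1970 baseline and the 1–2% rates are taken from the figures just cited.

```python
# Compounding check: steady annual declines since 1970 land in the same
# neighborhood as the ~70% average vertebrate decline cited above.
years = 2024 - 1970
for annual_decline in (0.01, 0.015, 0.02):
    remaining = (1 - annual_decline) ** years
    print(f"{annual_decline:.1%}/yr over {years} yr -> {1 - remaining:.0%} total decline")
# Roughly 42%, 56%, and 66% total decline for 1%, 1.5%, and 2% per year.
```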
Just like the golf ball in the room or the rock tumbling down the mountainside, these figures allow us to place approximate, relevant timescales on the phenomenon of ecological collapse—and that timescale is at the sub-century level. We’re watching its opening act, and the rate is alarming. The consequences, however, are easily brushed aside in ignorance. Try it yourself: mention to someone that humans can’t survive ecological collapse and—Family Feud style—I’d put my money on “I don’t see why not” being among the most frequent responses.
So, Don’t Give Me That…
I think you can see why I’m not swayed by the tidy and fully-decontextualized lazy logic of extrapolation offered by some of the smartest people. This psychologically satisfying logic can have such a powerfully persuasive pull that it short-circuits serious considerations of the counterarguments. This is especially true when the relevant subjects are uncomfortable, inconvenient, unfamiliar, and also happen to be beyond our capacity to cognitively master. Just because we can’t understand something doesn’t render it non-existent. Seeking answers from within our brains gets what it deserves: garbage in—garbage out.
We used the metaphors of a golf ball or rolling stone necessarily coming to rest. Likewise, a thrown rock will return to the ground, or a flying contraption not based on the aerodynamic principles of sustainable flight will fail to stay aloft. Modernity has no ecological context (no rich set of evolved interrelationships and co-dependencies with the rest of the community of life) and is rapidly demonstrating its unsustainable nature on many parallel, interconnected fronts. This would seem to make the default position clear: modernity will come to rest on a century-ish timescale, the initial reversal possibly becoming evident in mere decades. [Correction: I think it will likely be mostly stopped on a century timescale, but it may take millennia to fully melt into whatever mode comes next.]
Retreating to the logic of extrapolation or basic unpredictability amounts to a faith-based approach that deflects any actual analysis: a cowardly dodge. Given the multi-layer, parallel concerns all pointing to a temporary modernity, it would seem to put the burden of proof that “the unsustainable can be sustained” squarely on the collapse-deniers. The default position is that unsustainable systems fail; that non-ecological modes lack longevity; that unprecedented and extreme departures do not become the rule; that no species is capable of going-it alone. Arguing the extraordinary obverse demands extraordinary evidence, which of course is not availing itself.
When logic suggests an attractive bypass, recognize that logic is only a narrow and disconnected component of a more complete, complex reality. Most importantly, the logic of extrapolation only serves to throw up a cautionary flag, without even bothering to address the relevant dynamics. That particular flag is later recognized as a misfire once the appropriate elements are given due consideration: this time is different, because modernity is outrageously different from the larger temporal and ecological context. Pretending otherwise requires turning a spider’s-worth of blind eyes to protect a short-term, ideological, emotionally “safe” agenda. Pretend all you want: it won’t change what’s real.
Editor’s note: A new report finds that microplastic pollution is hampering photosynthesis in plants, and that the result is the loss of some 10% of the world’s primary productivity, including food crops. We now risk blotting out the planetary photosynthesis machine, just because we think that stopping the growth of the plastics industry is a subversive idea. But the report gets something in reverse: it is not that these effects “extend from food security into planetary health.” It is the opposite. That changes little, though, in a situation in which nothing changes, except for the desperate attempt to solve problems by killing the messenger, that is, “driving a dagger into the climate change religion.”
We got rid of acid rain. Now something scarier is falling from the sky. Here’s why you should never, ever drink the rain. A number of studies have documented microplastics in rain falling all over the world — even in remote, unpopulated regions. Plastic particles have infiltrated the entire planet, from the summit of Mount Everest to the deepest oceans. Also, the microscopic shards of plastic found in every corner of the planet may be exacerbating antibiotic resistance, a new study has found.
Plastic Pollution: So Much Bigger Than Straws
by Jackie Nuñez, The Revelator
March 14, 2025
Over the past couple of weeks we’ve seen the current U.S. administration grasping at straws, mocking restrictions on single-use plastics, and trying to distract from the real issue: Plastic poisons people and the planet, and the industries that produce it need to stop making so much of it.
When I started “The Last Plastic Straw” movement in 2011, the sole purpose was to bring attention to a simple, tangible issue, raise awareness about the absurdity of single-use plastic items, and engage people to take action.
So what are the real problems with plastic? Plastics don’t break down, they break up: Unlike natural materials that decompose, they fragment into smaller and smaller pieces, never benignly degrading but remaining forever plastic. All plastic items shed plastic particles called microplastics and even smaller nanoplastics, which we inhale, ingest, and absorb into our bodies. Plastics, depending on their manufacturing composition, contain a mixture of more than 16,000 chemicals, at least 4,200 of which are known hazards to human health. When we use plastic straws, cups, plates, utensils, and food packaging, we are literally swallowing those toxic plastic particles and chemicals.
Plastic particles have also been found in placenta and breast milk, so children today are being born plasticized. This is a toxic burden that today’s youth should not have to bear.
🧠 A new study found that the amount of microplastics & nanoplastics in human brains increased by 50% between 1997 and 2024. Researchers found that in people with dementia, plastic particles were six times more numerous than in people without dementia. Plastic particles in human livers were also found to be increasing over time.
🔬 75% of the plastics found in human tissue samples were one of the most common types of plastic: polyethylene (PE), which is used in everyday items like food packaging, bottles, bags, toys, and more.
📈 With microplastics and nanoplastics building up in our bodies, it’s time to put plastic-free solutions in place, for people and the planet.
Source: Nature Medicine
It goes without saying that plastic’s harms to our health come at an enormous cost to those of us who must suffer through the heartbreaking and painful diseases it causes. It’s estimated that every 30 seconds, someone dies from plastic pollution in the Global South, an area overburdened by mountains of plastic pollution that is shipped away from the Global North under the guise of “recycling” only to be dumped and often burned, releasing additional toxic pollution. Financially too, plastics are expensive: The chemicals in plastic alone cost the U.S. healthcare system $250 billion in just one year.
We can’t recycle our way out of this. Plastic was never made to be recycled and is still not made to be recycled.
Our leaders who support continued or even increased plastic production seem ignorant of the facts about plastic pollution. Let us enlighten them: All plastic pollutes, and single-use plastic items like straws are not only hazardous to our health, they’re especially wasteful.
We could all save money if our government prioritized building up plastic-free reuse and refill systems, where we hold on to our stuff rather than continuously buy it and throw it away. Such reuse and refill systems were the reality before single-use plastic was mass-produced and marketed. And they worked. Most U.S. voters support reducing plastic production, along with national policies that reduce single-use plastic, increase the use of reusable packaging and foodware, and protect people who live in neighborhoods harmed by plastic production facilities.
To change this nightmare scenario, our leaders need to support policies that reduce plastic production, not grow it. This means curbing wasteful plastic production and supporting plastic- and toxic-free, regenerative materials and systems of reuse and refill.
As the advocacy and engagement manager at Plastic Pollution Coalition, I continue to work in support of solutions to this massive global crisis: strong policies that focus on plastic pollution prevention, better business practices, and a culture shift. We work together with our allied coalition organizations, businesses, scientists, notables, and individual members every single day to make these solutions a reality — no matter how much the U.S. administration or other leaders try to undermine, belittle, or dismiss efforts to minimize the use of straws and other quickly disposed plastic products that poison our planet and our bodies.
Plastic never was and never will be disposable, and neither are people.
This article first appeared on The Revelator and is republished here under a Creative Commons license.
Banner Credit: Taklamacuwv Lamia on Wikimedia Commons