Wednesday, March 30, 2022

The 1619 Project Unrepentantly Pushes Junk History

Nikole Hannah-Jones' new book sidesteps scholarly critics while quietly deleting previous factual errors.

By Phillip W. Magness.

"I too yearn for universal justice," wrote Zora Neale Hurston in her autobiography, Dust Tracks on a Road, "but how to bring it about is another thing." The black novelist's remarks prefaced a passage where she grappled with the historical legacy of slavery in the African-American experience. Perhaps unexpectedly, Hurston informed her readers that she had "no intention of wasting my time beating on old graves with a club."

Hurston did not aim to bury an ugly past but to search for historical understanding. Her 1927 interview with Cudjoe Lewis, among the last living survivors of the 1860 voyage of the slave ship Clotilda, contains an invaluable eyewitness account of the middle passage as told by one of its victims. Yet Hurston saw only absurdity in trying to find justice by bludgeoning the past for its sins. "While I have a handkerchief over my eyes crying over the landing of the first slaves in 1619," she continued, "I might miss something swell that is going on in" the present day.

Hurston's writings present an intriguing foil to The New York Times' 1619 Project, which the newspaper recently expanded into a book-length volume. As its subtitle announces, the book aims to cultivate a "new origin story" of the United States where the turmoil and strife of the past are infused into a living present as tools for attaining a particular vision of justice. Indeed, it restores The 1619 Project's original aim of displacing the "mythology" of 1776 "to reframe the country's history, understanding 1619 as our true founding." This passage was quietly deleted from The New York Times' website in early 2020 just as the embattled journalistic venture was making a bid for a Pulitzer Prize. After a brief foray into self-revisionism in which she denied ever making such a claim, editor Nikole Hannah-Jones has now apparently brought this objective back to the forefront of The 1619 Project.

Vacillating claims about The 1619 Project's purpose have come to typify Hannah-Jones' argumentation. In similar fashion, she selectively describes the project as a work either of journalism or of scholarly history, as needed. Yet as the stealth editing of the "true founding" passage revealed, these pivots are often haphazardly executed. So too is her attempt to claim the mantle of Hurston. In a recent public spat with Andrew Sullivan, Hannah-Jones accused the British political commentator of "ignorance" for suggesting that "Zora Neale Hurston's work sits in opposition to mine." She was apparently unaware that Dust Tracks on a Road anticipated and rejected the premise of The 1619 Project eight decades prior to its publication.

On the surface, The 1619 Project: A New Origin Story (One World) expands the short essays from The New York Times print edition into almost 600 pages of text, augmented by additional chapters and authors. The unmistakable subtext is an opportunity to answer the barrage of controversies that surrounded the project after its publication in August 2019. "We wanted to learn from the discussions that surfaced after the project's publication and address the criticisms some historians offered in good faith," Hannah-Jones announces in the book's introduction, before devoting the majority of her ink to denouncing the blusterous critical pronouncements of the Trump administration after it targeted The 1619 Project in the run-up to the 2020 presidential election. Serious scholarly interlocutors of the original project are largely sidestepped, and factual errors in the original text are either glossed over or quietly removed.

While the majority of the public discussion around The 1619 Project has focused on Hannah-Jones' lead essay, its greatest defects appear in the Princeton sociologist Matthew Desmond's essay on "Capitalism." Hannah-Jones' writings provide the framing for the project, but Desmond supplies its ideological core—a political charge to radically reorient the basic structure of the American economy so as to root out an alleged slavery-infused brutality from capitalism.

Hannah-Jones' prescriptive call for slavery reparations flows seamlessly from Desmond's argument, as does her own expanded historical narrative—most recently displayed in a lecture series for MasterClass in which she attempted to explain the causes of the 2008 financial crisis by faulting slavery. "The tendrils of [slavery] can still be seen in modern capitalism," she declared, where banking companies "were repackaging risky bonds and risky notes…in ways [that] none of us really understood." The causal mechanism connecting the two events remained imprecise, save for allusions to "risky slave bonds" and a redesignation of the cotton industry as "too big to fail."

Making what appears to be a muddled reference to the Panic of 1837, she confidently declared that "what happened in 1830 is what happened in 2008." The claimed connection aimed to prove that the "American capitalist system is defined today by the long legacy and shadow of slavery." This racist, brutal system "offers the least protections for workers of all races," she said, and it thus warrants a sweeping overhaul through the political instruments of the state. To this end, Hannah-Jones appends an expanded essay to The 1619 Project book, endorsing a Duke University study's call for a "vast social transformation produced by the adoption of bold national policies."

"At the center of those policies," she declared, "must be reparations."

Uncorrected Errors

What are we to make of The 1619 Project's anti-capitalism in light of the new book's expanded treatment? For context, let's consider how Desmond handles the defects of his original argument.

In his quest to tie modern capitalism to slavery, Desmond began with a genealogical claim. Antebellum plantation owners employed double-entry accounting and record-keeping practices, some of them quite sophisticated. A more careful historian might note that such practices date back to the Italian banking families of the late Middle Ages, or point out that accounting is far from a distinctively capitalist institution. After all, even the central planners of the Soviet Union attempted to meticulously track raw material inputs, labor capacity, and multi-year productivity goals. Does this make the gulags a secret bastion of free market capitalism? Though seemingly absurd, such conclusions are the logical extension of Desmond's argument. "When an accountant depreciates an asset to save on taxes or when a midlevel manager spends an afternoon filling in rows and columns on an Excel spreadsheet," he wrote in the original newspaper edition, "they are repeating business procedures whose roots twist back to slave-labor camps."

Setting aside this unusual leap of logic, the claim rests upon a basic factual error. Desmond attributed this genealogy to the University of California, Berkeley, historian Caitlin Rosenthal's 2018 book on plantation financial record keeping, Accounting for Slavery. Yet Rosenthal warned against using her work as an "origin story" for modern capitalism. She "did not find a simple path," she wrote, by which plantation accounting books "evolved into Microsoft Excel." Desmond, it appears, made a basic reading error.

When I first pointed out this mistake to Jake Silverstein, the editor in chief of The New York Times Magazine, in early 2020, he demurred on making any correction. After consulting with Rosenthal, the Times passed off this inversion of phrasing as an interpretive difference between the two authors. In the new book version of Desmond's essay, the troublesome Microsoft Excel line disappears without any explanation, although Desmond retains anachronistic references to the plantation owners' "spreadsheets." As with other controversies from The 1619 Project, the revisions pair a cover-up of an error with haphazard execution.

This pattern persists and compounds through the meatier parts of Desmond's expanded thesis. His original essay singles out American capitalism as "peculiarly brutal"—an economy characterized by aggressive price competition, consumerism, diminished labor union power, and soaring inequality. This familiar list of progressive grievances draws on its own array of suspect sources. For example, Desmond leans heavily on the empirical work of the U.C. Berkeley economists Emmanuel Saez and Gabriel Zucman to depict a society plagued by the growing concentration of wealth among the "top 1 percent." Data from the Federal Reserve suggest that these two authors overstate the rise in wealth concentration since 1990 by nearly a factor of two. Desmond's own twist is to causally link this present-day talking point with the economic legacy of slavery.

To do so, he draws upon recent statistical analysis that showed a 400 percent expansion in cotton production from 1800 to 1860. In Desmond's telling, this growth stems from the capitalistic refinement of violence to extract labor out of human chattel. "Plantation owners used a combination of incentives and punishments to squeeze as much as possible out of enslaved workers," he declared—a carefully calibrated and systematized enterprise of torture to maximize production levels. In the original essay, Desmond sourced this thesis to Cornell historian Edward E. Baptist, whose book The Half Has Never Been Told essentially revived the old "King Cotton" thesis of American economic development that the Confederacy embraced on the eve of the Civil War. Baptist's book is a foundational text of the "New History of Capitalism" (NHC) school of historiography. The 1619 Project, in turn, leans almost exclusively on NHC scholars for its economic interpretations.

But Baptist's thesis fared poorly after its publication in 2014, mainly because he misrepresented the source of his cotton growth statistics. The numbers come from a study by the economists Alan L. Olmstead of the University of California, Davis, and Paul W. Rhode, then with the University of Arizona, who empirically demonstrated the 400 percent production increase before the Civil War but then linked it to a very different cause. Cotton output did not grow because of refinements in the calibrated torture of slaves, but rather as a result of improved seed technology that increased the plant's yield. In 2018, Olmstead and Rhode published a damning dissection of the NHC literature that both disproved the torture thesis and documented what appear to be intentional misrepresentations of evidence by Baptist, including his treatment of their own numbers. Olmstead and Rhode in no way dispute the horrific brutality of slavery. They simply show that beatings were not the causal mechanism driving cotton's economic expansion, as the NHC literature claims.

As with Desmond's other errors, I brought these problems to the attention of Silverstein with a request for a factual correction in late 2019. Almost two years later I finally received an answer: Desmond replied that "Baptist made a causal claim linking violence to productivity on cotton plantations," whereas his "article did not make such a casual [sic] claim." I leave the reader to judge the accuracy of this statement against The 1619 Project's original text, including its explicit attribution of the argument to Baptist.

Even more peculiar is how Desmond handled the "calibrated torture" thesis in the book edition. In the paragraph where he previously named Baptist as his source, he now writes that "Alan Olmstead and Paul Rhode found that improved cotton varieties enabled hands to pick more cotton per day." But this is far from a correction. Desmond immediately appends this sentence with an unsubstantiated caveat: "But advanced techniques that improved upon ways to manage land and labor surely played their part as well." In excising Baptist's name, he simply reinserts Baptist's erroneous claim without attribution, proceeding as if it has not meaningfully altered his argument.

In these and other examples, we find the defining characteristics of The 1619 Project's approach to history. Desmond and Hannah-Jones initiate their inquiries by adopting a narrow and heavily ideological narrative about our nation's past. They then enlist evidence as a weapon to support that narrative, or its modern-day political objectives. When that evidence falters under scrutiny, The 1619 Project's narrative does not change or adapt to account for a different set of facts. Instead, its authors simply swap out the discredited claim for another and proceed as if nothing has changed—as if no correction is necessary.

Ignoring the Fact-Checkers

We see the same pattern in how Hannah-Jones handles the most controversial claim in the original 1619 Project. Her opening essay there declared that "one of the primary reasons the colonists decided to declare their independence from Britain was because they wanted to protect the institution of slavery." In early 2020, Silverstein begrudgingly amended the passage online to read "some of the colonists" (emphasis added) after Northwestern University historian Leslie M. Harris revealed that she had cautioned Hannah-Jones against making this claim as one of the newspaper's fact-checkers, only to be ignored.

The ensuing litigation of this passage across editorial pages and Twitter threads unintentionally revealed an unsettling defect of the Times' venture. The 1619 Project was not a heterodox challenge to conventional accounts of American history, as its promotional material insinuated. An endeavor of this sort could be commendable, if executed in a scholarly fashion. Instead, the original essays by Hannah-Jones and Desmond betray a deep and pervasive unfamiliarity with their respective subject matters.

When subject-matter experts pointed out that Hannah-Jones exaggerated her arguments about the Revolution, or that Britain was not, in fact, an existential threat to American slavery in 1776 as she strongly suggested (the British Empire would take another 58 years before it emancipated its West Indian colonies), she unleashed a barrage of personally abusive derision toward the critics. Brown University's Gordon S. Wood and other Revolutionary War experts were dismissed as "white historians" for questioning her claims. When Princeton's James M. McPherson, widely considered the dean of living Civil War historians, chimed in, Hannah-Jones lashed out on Twitter: "Who considers him preeminent? I don't."

The 1619 Project did not simply disagree with these subject-matter experts. Its editors and writers had failed to conduct a basic literature review of the scholarship around their contentions, and subsequently stumbled their way into unsupported historical arguments. While some academic historians contributed essays on other subjects, none of The 1619 Project's feature articles on the crucial period from 1776 to 1865 came from experts in American slavery. Journalists such as Hannah-Jones took the lead, while highly specialized topics such as the economics of slavery were assigned to nonexperts like Desmond, whose scholarly résumé contained no prior engagement with that subject.

The book's revised introduction is less a corrective to the defects of the original than a mad scramble to retroactively paint a scholarly veneer over its weakest claims. Hannah-Jones leans heavily on secondary sources to backfill her own narrative with academic footnotes, but the product is more an exercise in cherry-picking than a historiographical analysis.

Consider the book's treatment of Somerset v. Stewart, the landmark 1772 British legal case that freed an enslaved captive aboard a ship in the London docks. Hannah-Jones appeals to the University of Virginia historian Alan Taylor, who wrote that "colonial masters felt shocked by the implication" of the case for the future of slavery in North America. Yet Taylor's elaboration focused narrowly on the case's negative reception in Virginia, while Hannah-Jones generalizes that into a claim that "the colonists took the ruling as an insult, as signaling that they were of inferior status" and threatening their slave property. Curiously missing from her discussion is the not-insignificant reaction of Benjamin Franklin, who complained to his abolitionist friend Anthony Benezet that Somerset had not gone far enough. Britain, he wrote, had indulged a hypocrisy, and "piqued itself on its virtue, love of liberty, and the equity of its courts, in setting free a single negro" while maintaining a "detestable commerce by laws for promoting the Guinea trade" in slaves.

To sustain her contention that a defense of slavery weighed heavily on the Revolutionary cause, Hannah-Jones now latches her essay to the University of South Carolina historian Woody Holton—a familiar secondary source from graduate school seminars who appears to have crossed her path only after the initial controversy. Since its publication, Holton has united his efforts with The 1619 Project, focusing in particular on Lord Dunmore's proclamation of 1775 to argue that the document's promise of emancipation to the slaves of rebellious colonists had a galvanizing effect on the American cause.

Dunmore's decree—which offered freedom to slaves who fought for the crown—came about as a move of desperation to salvage his already-faltering control over the colony of Virginia. Holton and Hannah-Jones alike exaggerate its purpose beyond recognition. Holton has taken to calling it "Dunmore's Emancipation Proclamation," hoping to evoke President Abraham Lincoln's more famous document, and The 1619 Project book repeats the analogy. But all sense of proportion is lost in the comparison. Lincoln's measure, though military in nature, reflected his own longstanding antislavery beliefs. It freed 50,000 people almost immediately, and extended its reach to millions as the war progressed. Dunmore, by contrast, was a slaveowner with a particularly brutal reputation of his own. His decree likely freed no more than 2,000 slaves, primarily out of the hope that it would trigger a broader slave revolt, weaken the rebellion, and allow him to reassert British rule with the plantation system intact. Hannah-Jones also haphazardly pushes her evidence beyond even Holton's misleading claims. "For men like [George] Washington," she writes, "the Dunmore proclamation ignited the turn to independence." This is a curious anachronism, given that Washington was appointed commander of the Continental Army on June 15, 1775—some five months before Dunmore's order of November 7, 1775.

Fringe Scholars and Ideological Cranks

The same self-defeating pairing of aggressive historical claims and slipshod historical methodology extends into Desmond's expanded essay. Moving its modern-day political aims to the forefront, Desmond peddles a novel theory about the history of the Internal Revenue Service. "Progressive taxation remains among the best ways to limit economic inequality" and to fund an expansive welfare state, he asserts. Yet in Desmond's rendering, again invoking debunked statistical claims from Saez and Zucman, "America's present-day tax system…is regressive and insipid." The reason? He contends that the IRS is still hobbled by slavery—a historical legacy that allegedly deprives the tax collection agency of "adequate financial backing and administrative support."

It is true that slavery forced several compromises during the Constitutional Convention, including measures that constrained the allocation of the federal tax burden across the states. Yet Desmond's rendering of this history borders on incompetence. He declares that the Constitution's original privileging of import tariffs "stunted the bureaucratic infrastructure of the nation"—apparently oblivious to the fact that Alexander Hamilton's Treasury Department set up one of the first true national bureaucracies through the federal customs house system. To Desmond, the United States was a relative latecomer to income taxation because of a reactionary constitutional design that impeded democratic pressures for redistribution in the late 19th century. This too is in error. In fact, comparative analyses of historical tax adoption strongly suggest that less democratic countries with lower levels of enfranchisement were the first movers in the international shift toward income taxation. When the U.S. Congress proposed the 16th Amendment in 1909 to permit a federal income tax (it was ratified in 1913), the first wave of ratifications came from the states of the old Confederacy, who saw it as a means of transferring the federal tax burden onto the Northeast.

At this point, Desmond's narrative veers from the fringes of academic discourse into ideological crankery. After a misplaced causal attribution of 19th century development to the economic prowess of King Cotton, he turns his attention to what he sees as the true fault of American slavery: It allegedly enabled "capitalists" to leverage race "to divide workers—free from unfree, white from Black—diluting their collective power." This fracture among an otherwise natural class-based alliance is said to have impeded the emergence of a strong and explicitly socialistic labor movement in the United States, leading to "conditions for worker exploitation and inequality that exist to this day."

Desmond's theory makes sense only if one accepts the historical methodology of hardcore Marxist doctrine. History is supposed to progress toward the ascendance of the laboring class; thus, any failure of the proletarian revolution to materialize must arise from some ruling-class imposition. To Desmond, that imposition is slavery: "What should have followed [industrialization], Karl Marx and a long list of other political theorists predicted, was a large-scale labor movement. Factory workers made to log long hours under harsh conditions should have locked arms and risen up against their bosses, gaining political power in the formation of a Labor Party or even ushering in a socialist revolution."

After waxing about the "democratic socialism" of European welfare states, Desmond thus laments that "socialism never flourished here, and a defining feature of American capitalism is the country's relatively low level of labor power." This he considers slavery's legacy for the present day.

This thesis is bizarre, not to mention historically tone-deaf. The 19th century abolitionist rallying cry of "free soil, free labor, free men" reflected an intellectual alliance between free market theory and emancipation. Nowhere was this more succinctly captured than in the words of pro-slavery theorist George Fitzhugh, who declared in 1854 that the doctrine of laissez faire was "at war with all kinds of slavery."

Desmond's historical narrative is not original to The 1619 Project. It revives a line of argument first made in 1906 by the then-Marxist (and later National Socialist) philosopher Werner Sombart. Asking why socialism never took hold in the United States, Sombart offered an answer: "the Negro question has directly removed any class character from each of the two [American political] parties," causing power to allocate on geographic rather than economic lines. Desmond both credits and expands upon Sombart's thesis, writing: "As Northern elites were forging an industrial proletariat of factory workers…Southern elites…began creating an agrarian proletariat." Slavery's greatest economic fault, in this rendering, was not its horrific violation of individual liberty and dignity but its alleged intrusion upon a unified laboring class consciousness.

The great tragedy of the original 1619 Project was its missed opportunity to add detail, nuance, and reflection to our historical understanding of slavery and its legacy. That opportunity was lost not upon publication but in the aftermath, when The New York Times met its scholarly critics with insult and derision. The ensuing controversies, initially confined to Hannah-Jones' and Desmond's essays, came to overshadow the remainder of the project, including its other historical contributions as well as its literary and artistic sections.

The book version continues down this path, obscuring existing errors through textual sleights of hand and compounding them with fringe scholarship. The unifying theme of it all is not historical discovery or retrospection, but the pursuit of political power: less a historical reimagining of slavery's legacy than an activist manual for taxation and redistribution. Here again, Hurston's words offer a fitting warning to those who would rectify the injustices of the past with the politics of the present: "There has been no proof in the world so far that you would be less arrogant if you held the lever of power in your hands."

The UK is sitting on a gas gold mine, while Putin has Europe’s energy market by the throat

It’s madness not to frack

By Matt Ridley.

"The price of gas is through the roof thanks to Vladimir Putin, who has Europe’s energy market by the throat. Britain is on track to spend a staggering £2BILLION on imported liquefied natural gas from Russia this year as war rages in Ukraine.

Household bills will skyrocket even more than they already were — and could hit £3,000 a year. This is what happens when you rely on imported foreign energy. And what makes it more maddening is that we don’t need to do this. We have supplies here.

Under Lancashire and Yorkshire lies one of the best reservoirs of natural gas in the world, known as the Bowland Shale. At current prices, just ten per cent of this gas is worth several trillion pounds and could keep Britain supplied with gas for five decades. And we will need gas for decades whatever happens: To back up wind farms, heat homes and make vital chemicals for industry.

Last year I asked a Texan gas expert, who has drilled into the Bowland Shale, how it compares with American shale gas reserves. “It’s much better than what we have in the US,” he replied, “better than the Haynesville in Louisiana or the Marcellus in Pennsylvania, thicker and richer in gas”.

The technology to get the gas out is proven, safe and improving all the time. So why don’t we tap this treasure? Because wealthy, posh southerners went up North to protest, and the Government caved in.

The technology is usually referred to as “fracking” but that’s misleading. Hydraulic fracturing has been happening in oil and gas wells, including in Britain, for decades. What changed in the past decade was that it was combined with horizontal drilling and became cleaner and more effective. The latest technology promises to tap shale gas without fracking at all.

In 1997 Nick Steinsberger, of Mitchell Energy, almost by mistake tried cracking shale rocks a mile underground with water, instead of gel, and discovered a recipe for getting gas to flow from the very source rocks of gas, the shales.

The anti-frackers like to call this recipe “toxic chemicals” and imply it could poison aquifers (areas of rock underground that absorb and hold water), but that’s nonsense. The water is mixed with sand and a small amount of soap and bleach, of the kind you keep under your kitchen sink. It is pumped about a mile down, way below the aquifers, and into rocks that are, by definition, full of methane, ethane and petroleum, so they are already “toxic”.

The result of Steinsberger’s breakthrough was that, in a few short years, America became the biggest gas producer in the world, overtaking Russia. It went from importing gas to exporting it and gave itself some of the lowest gas prices in the world — now less than one-quarter of ours.

When I first visited the Marcellus Shale site in 2011 to understand what was happening, experts here were saying this shale boom was a flash in the pan, would not last and could not cope with low gas prices.

They were wrong.

A few years later I was back in Colorado watching Liberty Oil & Gas producing gas profitably and much more quietly from new wells at low prices. The site was right next to a housing estate. “Aren’t the residents worried about tremors and noise?” I asked. I was told they set up monitors and requested to be informed when the fracking would start, then called back a few days after to say: “Why did you not start when you said you would?”

“But we did,” replied the gas company, “didn’t you detect anything?”

It’s a myth that the American shale gas production happens in the middle of nowhere: Steinsberger started it in the suburbs of Fort Worth, Texas. Almost everything Friends of the Earth and its eco-luvvie rent-a-crowd say about shale gas is a myth. It does not cause water to catch fire, poison aquifers, spill contaminated waste water, increase radioactivity or cause “earthquakes”.

Small tremors do happen during any kind of underground work, but in Britain the shale gas firms such as Cuadrilla were told to stop if they caused a 0.5 tremor on the Richter scale, equivalent to somebody sitting down hard in a chair, and far fainter than what the coal mining or geothermal — or indeed road and rail transport industries — cause all the time.

Why the double standard? The very people who protest about shale gas are often fans of wind farms. But these pour more concrete (a carbon-intensive material), use more steel (ditto), spoil more views, require more subsidies and, above all, take up far more land.

A single shale drilling pad with 40 wells fanning out in all directions covers a few acres. For a wind farm to produce that much energy it would have to be 1,500 times larger — and it’s useless on a still day.

Britain imports shale gas from America, but — unlike oil — shipping it adds massively to the cost of gas, as well as the carbon footprint.

The Government was wrong to order a moratorium on shale gas, to order the wells plugged and to repeat its dogmatic objections to developing Britain’s shale treasure at a time when war in Europe is reinforcing the need for energy security.


Tuesday, March 29, 2022

Democrats for Higher Gas Prices

Senators propose a windfall-profits tax to reduce oil production 

WSJ editorial

"You knew it was coming. Even as President Biden begs OPEC to pump more oil, Senate Democrats are threatening to punish U.S. oil companies with a windfall-profits tax if they increase production. The contradiction nicely summarizes progressive energy policy.

“Putin’s war is driving up gas prices—and Big Oil companies are raking in record profits,” Elizabeth Warren tweeted Thursday. To curb what she calls “Big Oil profiteering,” she and 11 other Senate Democrats have introduced legislation to impose the new tax. 

“We need to hold large oil and gas companies accountable” and “urgently need to invest in America’s clean energy economy,” says Colorado Sen. Michael Bennet. Accountable for what? Making money in a legal business? Meeting obvious consumer demand?

The Senators’ plan would require companies that produce or import at least 300,000 barrels of oil per day (or did so in 2019) to pay a per-barrel tax equal to 50% of the difference between the current price and the average price from 2015 through 2019 (about $57 a barrel). They say smaller companies would be exempt so the giants can’t raise prices without losing market share.
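As a rough illustration of how the proposed formula would work (this is a sketch based on the description above, not the bill's text; the $57 baseline and the assumption that no tax applies below it are taken from the summary):

```python
# Sketch of the proposed windfall tax: 50% of the gap between the
# current price and the 2015-2019 average price (roughly $57/barrel).
BASELINE_PRICE = 57.0  # approximate 2015-2019 average, dollars per barrel

def windfall_tax_per_barrel(current_price: float) -> float:
    """Per-barrel tax owed; assumed to be zero at or below the baseline."""
    return max(0.0, 0.5 * (current_price - BASELINE_PRICE))

# At $100 a barrel: 0.5 * (100 - 57) = $21.50 owed per barrel produced.
print(windfall_tax_per_barrel(100.0))  # 21.5
# At $50 a barrel, below the baseline, nothing is owed.
print(windfall_tax_per_barrel(50.0))   # 0.0
```

At recent prices near $100 a barrel, that works out to roughly $21.50 per barrel, which is the marginal penalty on each additional barrel a large producer brings to market.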

But oil companies don’t set prices, as the Federal Trade Commission has found time and again. Supply, demand and market expectations do. Crude prices fell $20 a barrel on Thursday after the United Arab Emirates said it would encourage fellow OPEC members to increase production. Imagine how much oil prices might fall if President Biden announced a moratorium on climate regulation that punishes fossil fuels. Instead, Democrats are threatening to hurt producers for producing more.

Not long ago climate progressives argued that declining oil profits showed that companies needed to move away from fossil fuels. That was what last year’s ExxonMobil board battle was supposedly all about. Liberals also say asset managers should divest from oil companies because their profits are doomed to decline as the world embraces green energy.

But now Democrats say oil companies are too profitable and blame them for benefiting from the tighter oil supply and higher prices that political hostility to fossil fuels has exacerbated. Rhode Island Sen. Sheldon Whitehouse says “oil companies never let a good crisis go to waste.” Neither do Democrats.

The windfall-tax proposal shows that Democrats don’t want U.S. companies to produce more oil so gasoline prices fall. They want higher gas prices so reluctant consumers buy more electric vehicles. They can’t say this directly because it would be politically suicidal in an election year with the average gas price above $4 a gallon, so they do it indirectly via taxes and regulation.

It’s hard to believe President Biden would back the windfall tax, but with the influence of the climate lobby in this Administration, you never know."
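The per-barrel levy described in the editorial can be sketched in a few lines of Python. This is a minimal illustration, not the bill's actual text: the function names and the $107 example price are hypothetical, while the roughly $57 baseline and the 300,000-barrel-per-day threshold come from the editorial.

```python
# Hypothetical sketch of the proposed windfall levy: 50% of the gap
# between the current price and the 2015-2019 average (about $57/bbl).
BASELINE_AVG = 57.0  # approx. 2015-2019 average price, per the editorial

def windfall_tax_per_barrel(current_price: float) -> float:
    """50% of the excess of the current price over the baseline average."""
    return max(0.0, 0.5 * (current_price - BASELINE_AVG))

def daily_tax_bill(current_price: float, barrels_per_day: float) -> float:
    """Daily tax for a producer over the 300,000 bbl/day coverage threshold."""
    return windfall_tax_per_barrel(current_price) * barrels_per_day

# Example: at a hypothetical $107/bbl, the levy is 0.5 * (107 - 57) = $25/bbl.
print(windfall_tax_per_barrel(107.0))   # 25.0
print(daily_tax_bill(107.0, 300_000))   # 7500000.0 (dollars per day)
```

Note how the levy scales with price rather than with production, which is the editorial's point: producing an extra barrel at a high price triggers more tax, a disincentive to expand output.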

Are meatpackers unfairly raising prices?

See Cattle Ranchers Take Aim at Meatpackers’ Dominance: Nebraska cattlemen plan to build their own butchering plant to bypass America’s meat-processing giants, which they say underpay for livestock by Patrick Thomas of The WSJ. Excerpts: 

"The average price for live cattle was up 5% in 2021 from 2019, according to figures from the Livestock Marketing Information Center and Agriculture Department, while the average price of boxed beef—cuts that packaging plants box to ship to retailers—was up 26%."

"Some cattlemen have pushed Washington to tighten antitrust rules for the four biggest meatpackers, JBS USA Holdings Inc., Tyson Foods Inc., Cargill Inc. and National Beef Packing Co. The four together process around 85% of the country’s cattle."

"Some officials at the big meatpackers said that because many processing plants remain short-staffed, they can’t buy and process as many cattle, reducing demand for livestock and pushing prices lower for ranchers and feedlot owners. Those factors are constraining production of ground beef, steaks and brisket, they said, while demand from grocery stores and reopening restaurants hasn’t let up, driving wholesale beef prices higher."

"The North American Meat Institute, a trade group representing meat companies big and small, said that the four major beef processors’ share of the cattle market has held steady for 25 years and that any effects from added processing plants will likely take years to materialize."

"Cattlemen have tried to create their own plants before, but few have been successful, said Bill Rupp, a Sustainable Beef board member and former consultant at a firm that helped set up such ventures. New plants often struggle to keep costs low, he said, because they don’t have the economies of scale to price their meat competitively with the big four."

"Some economists say cattlemen have shared responsibility for sliding livestock prices by expanding their herds. From 2015 through 2016, prices ranchers received for beef cattle dropped about 32%, USDA data show, generally remaining around those levels since. That period coincided with low grain prices and rising U.S. beef exports, which led many U.S. ranchers to raise more cattle than before—putting more beef on the market and helping lower prices, these economists say.

U.S. cattle production grew by about 6 million head to about 95 million from 2014 through 2019 and the number and size of slaughtering plants didn’t change much, said Don Close, an analyst at agriculture lender Rabobank. That shifted market power toward meatpackers that weren’t able to process the oversupply of cattle, and the recent plant disruptions and labor shortages exacerbated the issue, he said.

Demand surge

Those constraints, combined with resurgent demand from restaurants over the past year, also explain why wholesale beef prices have surged, meat-industry officials have said. Beef operating income at Tyson, the largest U.S. meat company by sales, more than doubled in 2021 from the prior year, while profit margins from the business expanded to 18% from 10%. Revenue from JBS’s U.S. beef business rose 28% through the first nine months of 2021 compared with the first nine months of 2020. Privately held Cargill’s 2021 sales grew 17% to $134.4 billion from 2020.

Tyson said on its recent quarterly earnings call that raising wages and expanding benefits to recruit and retain staff, and higher transportation and feed costs are what is driving meat prices higher. A Tyson spokesman said its margins were affected by idled plants in 2020 because of the pandemic. In 2021, he said, margins were affected by ample market-ready cattle supplies, strong customer and export demand for beef, and constrained product supplies due to the labor shortage."

Monday, March 28, 2022

A Biden Regulator Is Mugged by Energy Reality

FERC reverses itself on a climate rule for approving natural gas pipelines 

WSJ editorial.

"Federal regulators rarely backtrack, especially as quickly as the Federal Energy Regulatory Commission. Yet on Thursday Democratic commissioners hit pause on a new policy requiring a greenhouse-gas analysis for natural gas pipelines and export projects. Good for them.

The commission last month voted 3-2 with Republicans in dissent to revise its policy for approving gas pipelines and export terminals. FERC by law must certify that projects are in the public interest and won’t have a significant environmental impact, but Democratic commissioners added greenhouse-gas emissions to their permission analyses. Their excuse was a court ruling that they claimed required the change, though it really doesn’t.

Indirect emissions from upstream production and downstream consumption could also have to be tabulated under the rule, even though these are impossible to quantify reliably. Who knows how much more gas will be produced if a pipeline is built? Rest assured, progressives will claim every new pipeline massively increases emissions.

On one point, they are right: Pipelines encourage more energy production. A dearth of U.S. pipeline capacity, especially in Appalachia, has suppressed investment in supply. Crude can be transported by rail or truck. Natural gas can’t. The U.S. also can’t send more liquefied gas to Europe without more pipelines and export terminals. FERC’s climate policy was a gift to Vladimir Putin.

His bloody war on Ukraine and weaponization of Russian gas against Europe illuminate the stakes. Perhaps Democratic commissioners were mugged by energy reality like the Europeans. We’re told Democratic commissioners were also stunned by the backlash in Congress, especially from Joe Manchin.

The West Virginia Senator blasted the commissioners from his own party at a hearing this month: “I believe you all took the direction from the court and applied it far more broadly than you needed to, setting in motion a process that will serve to further shut down the infrastructure we desperately need as a country and further politicize energy development in our country.”

Chairman Richard Glick on Thursday did a soft retreat by recasting last month’s policy statements as “drafts” on which FERC will seek public comment. “In light of concerns that the policy statements created further confusion about the Commission’s approach to the siting of natural gas projects, the Commission decided it would be helpful to gather additional comments from all interested stakeholders, including suggestions for creating greater certainty, before implementing the new policy statements,” he said.

We’ll take the reprieve. But the better way to eliminate investment uncertainty would be for FERC to rescind the proposal in toto."

Greed and profiteering are not the cause of high gas prices

See Both parties neglect to propose a solution that might actually lower gas prices by Catherine Rampell. Excerpts:

"Aside from Vladimir Putin, whose contributions to high energy prices are relatively recent, Democrats’ preferred villain is “corporate greed” or “profiteering.” This explanation polls well, which might be why Biden and others on the left keep citing it. But it does little to explain why gas prices are up so much — or what could help to bring them down.

Corporations didn’t suddenly remember that they’re supposed to be greedy. And “profiteering” might sound like an incriminating accusation, but it has little meaning other than “prices are higher than politicians want them to be.” Blaming “high prices” on “profiteering” isn’t an explanation; it’s a tautology."

"Far worse are proposals such as a “windfall-profits tax” on oil. This might sound good if you genuinely believe that “profiteering” is the problem. But the last time Congress tried a similar policy, it reduced oil production. Which means it could well drive gas prices higher."

"We want to know why supply hasn’t ramped up, even in the face of prices that would normally encourage a lot more investment. And here the explanation is complicated.

OPEC countries have deliberately kept their production low. U.S. producers have indeed added more oil rigs in recent months, but there’s a multi-month lag between when one gets leased and set up and when more oil from that site becomes commercially available.

Additionally, these U.S. energy companies have been expanding a little more slowly than in previous cycles when oil prices were also high. Why? As in many other industries, worker shortages and other supply-chain issues are challenges. More important, shareholders and lenders are cautious about financing more production. Lots of people lost their shirts when prices plummeted in 2020. They’re nervous that something similar could happen again — that by the time they get a new rig up and running, prices will have returned to Earth, and they’ll go bust again."

Sunday, March 27, 2022

Petty Thieves Plague San Francisco. ‘These Last Two Years Have Been Insane.’

Small-business owners have been hit particularly hard by additional security and repair costs

By Zusha Elinson of The WSJ. Excerpts: 

"Among the 25 largest U.S. cities, San Francisco has had the highest property-crime rate in four of the most recent six years for which data is available, bucking the long-term national decline in such crimes that began in the 1990s. Property crimes declined in San Francisco during the first year of the pandemic, but rose 13% in 2021. Burglaries in the city are at their highest levels since the mid-1990s. There were 20,663 thefts from vehicles last year—almost 57 a day—a 39% increase from the prior year, although still below the record of 31,398 in 2017, according to the police.

Smashed storefronts are so common that the city launched a program to fix them with public money. Car owners leave notes declaring there is nothing of value in their vehicles, or leave their windows open to save themselves from broken glass. Videos of shoplifters hauling goods out of drugstores such as Walgreens have gone viral, and a smash-and-grab robbery by 20 to 40 people at a Louis Vuitton store last November made the national news.

Owners of small businesses say the costs of security and repairs are eating into profits already diminished by the Covid-19 pandemic. In the Castro, the neighborhood where Cliff’s is located, shops have recorded nearly 100 instances of smashed windows and doors that cost $170,000 to repair since the beginning of 2020, according to the neighborhood’s merchant association."

"District Attorney Chesa Boudin, who took office in 2020 as part of the national “progressive prosecutor” movement and has de-emphasized the prosecution of low-level offenses, will face a recall election in June.

“Nothing is more important than to make sure that people who live in this city, people who work in the city, people who visit San Francisco, feel safe,” Democratic Mayor London Breed said at a news conference last month. “The fact is, that does require police officers.”

Some former police officials and business owners blame Mr. Boudin’s focus on keeping people who commit small-scale crimes out of prison. His office, for example, discourages filing charges in cases where suspects are pulled over for traffic infractions and officers find small amounts of drugs."

"Businesses have been affected in every corner of San Francisco, even traditionally low-crime areas such as the Sunset District, where commercial and residential burglaries rose 80% between 2019 and 2021."

"Some former police officials said in interviews that officers don’t feel it is worth making an arrest in low-level cases because they assume the district attorney won’t file charges. They also point to a statewide ballot measure passed in 2014—Proposition 47—that raised the dollar amount at which theft can be prosecuted as a felony from $400 to $950."

"[C]onsequences aren’t severe enough for repeat offenders. Police investigators have a list of 48 people arrested five or more times for burglaries in recent years, he said, and more than half of them are no longer behind bars."

"Last year, Mayor Breed declared a state of emergency because of overdoses in the city’s Tenderloin neighborhood."

Blame Sacramento, Not Moscow, for California’s Energy Crisis

A fixation on renewables and underinvestment in fossil fuels are causing real economic pain in the state

By Robert Bryce. Excerpts:

"Like Europe, California overinvested in renewables, underinvested in hydrocarbons, prematurely shuttered its baseload power plants, and relied too heavily on imported energy. Now, as Europe is ensnared in Vladimir Putin’s energy trap, Californians watch as the state’s energy prices head toward the stratosphere.

On Feb. 25, the day after Russia invaded Ukraine, the Energy Information Administration reported that the all-sector price of electricity in California jumped by 9.8% last year to 19.76 cents per kilowatt-hour. Residential prices increased even more, jumping 11.7% to an average of 22.85 cents per kilowatt-hour. California residential users are now paying about 66% more for electricity than homeowners in the rest of the U.S., who pay an average of 13.72 cents per kilowatt-hour.

California’s rates are rising far faster than those in the rest of the country. Last year, California’s all-sector electricity prices increased 1.7 times as fast as the rest of the U.S., and residential prices grew 2.7 times as fast as in the rest of the country."
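The price comparisons quoted above can be checked with a couple of lines of arithmetic. This is a quick verification sketch using only the EIA figures cited in the article (cents per kilowatt-hour); the variable names are my own.

```python
# Figures cited in the article (EIA, cents per kWh)
ca_residential = 22.85       # California residential average, 2021
us_rest_residential = 13.72  # rest-of-U.S. residential average

# California's premium over the rest of the country
premium = ca_residential / us_rest_residential - 1
print(f"California residential premium: {premium:.1%}")  # 66.5%

# Working backward from the reported 9.8% all-sector jump to 19.76 cents
ca_all_sector_2021 = 19.76
ca_all_sector_2020 = ca_all_sector_2021 / 1.098
print(f"Implied prior-year all-sector price: {ca_all_sector_2020:.2f} cents/kWh")
```

The 66.5% result matches the article's "about 66% more" claim.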

"The surging energy costs show yet again the ruinously regressive effect of Sacramento’s decarbonization policies, which include a requirement for 100% zero-carbon electricity and an economywide goal of carbon neutrality by 2045."

"On Feb. 10, the California Public Utilities Commission unanimously approved a scheme that would add more than 25 gigawatts of renewables and 15 gigawatts of batteries to the state’s grid by 2032 at an estimated cost of $49.3 billion. Also last month, the California Independent System Operator released a draft plan to upgrade the state’s transmission grid at a cost of $30.5 billion. The combined cost of those two schemes is about $80 billion. Dividing that sum among 39 million residents works out to about $2,050 for every Californian."

"For proof of how big public-works projects can exceed initial estimates, consider California’s high-speed train. That project is now expected to cost $105 billion, which is 2½ times the $42 billion Californians were told they would pay when it was launched in 2008."

"The average price of regular gasoline in the state is now $5.72 a gallon, according to the American Automobile Association."

"When it comes to electricity production, climate activists never tire of claiming that weather-dependent renewables are cheaper than fossil fuels. But the state’s rising electricity prices, and Europe’s energy crisis, prove those claims false. The hard truth is that California policy makers are providing a case study on how not to manage an electric grid."

Healthy Children Don’t Need Covid Vaccines

Florida is right. Especially for kids under 12, the risks are trivial. And most have natural antibodies.

By Allysia Finley. Excerpts:

"abundant scientific evidence that Covid-19 poses a negligible risk to healthy children, which makes it impossible to know if the benefit of vaccination outweighs the risk."

"the infection fatality rate for those under 18 at between 0.0023% and 0.0085%—meaning 2.3 to 8.5 of every 100,000 children who get infected will die. Rates are lowest among those 5 to 11." 

"there were 66 Covid-19 deaths among children 5 to 11 between Oct. 3, 2020, and Oct. 2, 2021—exactly the same number as died from suicide"

"there were 969 deaths in this age group from unintentional injury and 207 from homicide in 2019."

"Polio paralyzes 1 in 200 infected children, and the fatality rate for measles ranges between 0.1% and 0.3%. That’s why childhood vaccinations are recommended for both. The risk of hospitalization from the flu for children 5 to 11 is 50% higher than from Covid and the related multisystem inflammatory syndrome combined. MIS in rare instances can cause gastrointestinal and cardiovascular symptoms after infection."

"Hospitalizations among children did increase during the Omicron wave relative to previous surges, but they still remained very low—80% lower than among young adults"

"Pfizer’s vaccine for children 5 to 11 [was authorized] after a small trial (about 1,500 kids received Covid shots) found it was 90% effective at preventing symptomatic illness. But the vaccine’s efficacy rapidly waned, even more so than in adults, especially as the Omicron variant spread."

"vaccines showed efficacy against hospitalization among 5- to 11-year-olds, though it declined from 100% in mid-December to 48% by the end of January. But because that risk is so low to begin with, the protection against it doesn’t amount to much."

"The vast majority have already been infected. The CDC estimates that 58% of children under 18 had infection-induced antibodies as of January,"

"63% of children under 18 who tested positive for the virus on PCR tests didn’t generate antibodies in their blood."

"Germany, Norway and Sweden don’t recommend vaccines for healthy children under 12, and the Danish Pediatric Society has urged its government to follow suit."

Yale Law Students for Censorship

Maybe those who try to shout down speakers shouldn’t get judicial clerkships.

WSJ editorial.

"Students at Yale Law School recently disrupted speakers in an example of cancel culture that is common on so many university campuses these days. Now comes a senior federal judge advising his judicial colleagues against hiring the protesting students for clerkships.

The March 10 panel was intended as a debate over civil liberties. It was hosted by the Yale Federalist Society and featured Monica Miller of the progressive American Humanist Association and Kristen Waggoner of the Alliance Defending Freedom, a conservative outfit that promotes religious liberty. The two broadly agree on protecting free speech, despite their differences on other issues.

A hundred or so students heckled and tried to shout down the panel and Federalist Society members in attendance. One protester told a member of the conservative legal group she would “literally fight you, bitch,” according to the Washington Free Beacon, which obtained an audio and videotape of the ruckus. The speakers were escorted from the event by police for their safety. It’s not too much to say that the students were a political mob.

No punishment seems forthcoming from Yale Law School, despite its ostensible policy barring protests that disrupt free speech. But the event prompted Senior Judge Laurence Silberman of the Court of Appeals for the D.C. Circuit to write the following letter to all of his fellow Article III judges last week: 

“The latest events at Yale Law School in which students attempted to shout down speakers participating in a panel discussion on free speech prompts me to suggest that students who are identified as those willing to disrupt any such panel discussion should be noted. All federal judges—and all federal judges are presumably committed to free speech—should carefully consider whether any student so identified should be disqualified for potential clerkships.”

That should get some attention at Yale and other law schools. The woke young men and women might not care about the First Amendment, but they care about their careers. Judicial clerkships are plum post-graduate positions that open a path to jobs at prominent law firms, in state and federal government, and later to powerful judgeships. Appellate-court clerkships in particular are highly prized and are often a stepping stone to clerk for a Supreme Court Justice.

Some readers may think these students should be forgiven the excesses of youth. But these are adults, not college sophomores. They are law students who will soon be responsible for protecting the rule of law. The right to free speech is a bedrock principle of the U.S. Constitution. If these students are so blinkered by ideology that they can’t tolerate a debate over civil liberties on campus, the future of the American legal system is in jeopardy.

Individual judges choose their clerks, and no doubt some will figure they can educate these progressive protesters. But Judge Silberman’s letter should, if nothing else, warn these students that there may be consequences for becoming campus censors."

Buying Votes With Gas Tax Rebates

First raise gas taxes. Then promise ‘relief’ with a one-time check.

WSJ editorial.

"State government coffers are swelling even as many Americans are feeling poorer amid surging inflation and gasoline prices. But instead of cutting taxes, some governors want to spend their surpluses buying political relief before the November election.

California Gov. Gavin Newsom last week said he planned to “put money back in the pockets of Californians to address rising gas prices” by sending checks to drivers—more details to come. The state’s gasoline prices are averaging $5.79 per gallon compared to $4.29 nationwide. Californians can blame hefty taxes and climate mandates.

In 2010 Californians were paying a mere 25 to 30 cents a gallon more than the national average. Then Democrats established a low-carbon fuel standard and cap-and-trade program to reduce fossil-fuel consumption and finance electric cars. These add at least 46 cents a gallon to gas prices, according to the Western States Petroleum Association.

Democrats also raised the state excise gas tax by 12 cents a gallon in 2017 and indexed it for inflation, purportedly to repair roads and bridges. Much of the proceeds have gone instead to mass transit. Californians now pay upward of 70 cents a gallon in state and local gasoline and sales taxes versus about 20 cents in Texas and Arizona. 

While projecting a $46 billion surplus, Democrats in Sacramento have rejected a gas tax cut. They also refuse to relax climate regulations. “One thing we cannot do is repeat the mistakes of the past by embracing polluters,” Mr. Newsom says. Gentry climate progressives favor high gas prices because they make electric cars more attractive.

But moderate Democratic legislators are feeling election-year pressure to ease rising prices for low- and middle-class voters. Enter Mr. Newsom’s proposal for a “gas tax rebate.” Unlike broad-based tax cuts, such direct payments let Democrats distribute money to select voter groups. Maybe electric vehicle owners will get a check.

Illinois Democratic Gov. J.B. Pritzker is also promising to “alleviate some pressure on Illinois’ working families,” after having doubled the state gas tax to 38 cents a gallon in 2019. His proposal: Suspend this year’s inflation-adjusted gas tax increase (two cents a gallon) and send $300 property tax credits to middle-income homeowners.

That’s about as much as inflation is costing the average household in a single month, and it doesn’t come close to offsetting higher property taxes from increasing housing values and pension payments. The Democratic strategy is to raise taxes and then redistribute a small cut of the revenues to buy votes.

Last year New York Democrats raised the state’s top income-tax rate to 14.8% from 12.7%. Now armed with a $5 billion surplus, Gov. Kathy Hochul wants to raise annual property tax rebates by a couple hundred dollars for homeowners outside of New York City who earn up to $500,000. That won’t cover half of this winter’s increase in New Yorkers’ heating bills.

States and localities have received $900 billion in federal Covid relief. Soaring asset prices have lifted income and property tax revenue while inflation has pushed earners into higher tax brackets. While many governors are returning the money in tax cuts that benefit everyone, Democrats in high-tax states are sharing surpluses only with a politically favored few."

The Disaster That Is Venezuela (or The NY Times says price controls cause problems)

By Tim Padgett. He reviews "THINGS ARE NEVER SO BAD THAT THEY CAN’T GET WORSE: Inside the Collapse of Venezuela" by William Neuman (a New York Times correspondent). Tim Padgett is the Americas editor at the Miami NPR affiliate WLRN and Time’s former Latin America bureau chief. Excerpts:

"For Venezuela, 2012 was the eve of the worst national collapse in modern South American history.

Hugo Chávez, the red-bereted firebrand who’d brought his socialist Bolivarian Revolution to power in 1999, would win another presidential term that October. But by March 2013, he’d be dead of cancer, and you could feel something malignant about to lay waste to his country’s social and economic body.

Spiraling inflation, widespread corruption and ludicrous financial thinking were erasing Venezuela’s historic oil boom. Through the decade, gross domestic product would free-fall almost 80 percent and malnutrition would stalk the population. In 2015 the capital, Caracas, would suffer the highest homicide rate of any city in the world."

"A fifth of Venezuela’s 30 million people would flee abroad."

"In the 21st century, Chávez’s authoritarian regime, known as Chavismo, turned that Venezuelan delusion into demolition. It did steer petro-riches to the poor for once; many barrios saw their first schools, clinics and potable water pipes. But when the price of oil skyrocketed from less than $8 a barrel at Chávez’s inauguration to more than $100 a barrel shortly before he died, insane economic malpractice and malfeasance ensued."

"“Chávez’s socialism was all means and no production,” he writes. “It was showcialismo,” an endless bacchanal of multibillion-dollar projects — like a national electricity monopoly, Corpoelec — that were essentially left to rot after the ribbon-cuttings. As Venezuela gorged on imports and prices ballooned, Chávez and his handpicked successor, the witless ideologue Nicolás Maduro, kept forcing price controls that further discouraged domestic industry, spawning huge shortages and extortionate black markets."

"Using fraudulent contracts and invoices, Chavista mandarins and their business cronies gamed the chasm between the official and black-market bolívar-to-dollar exchange rates. They reaped Mafia-grade profits; they also bled the state-run oil monopoly, PDVSA, of cash and robbed Venezuelans of urgent necessities like food, housing and energy infrastructure."

Saturday, March 26, 2022

The economic problem of the 1.5°C climate ceiling

By Kenneth P. Green of The Fraser Institute.

"In a recent essay and blog post, we explored the origins of the “10-years-to-save-the-Earth” climate change narrative, and also explained the origins of the “temperature ceiling” narrative, which posits a maximum amount of atmospheric warming the planet could adjust to without ruinous harm to people and the planet. That ceiling now stands at 1.5 degrees Celsius, having worked its way down from the 2°C threshold that was accepted as recently as five years ago.

To briefly summarize, both the 1.5°C catastrophe threshold, and the countdown timer to avoid it, come from a very large ensemble of speculative computer models forecasting:

  • How greenhouse gas concentrations might increase in the future under various speculative “Storylines and Scenarios” created by a subdivision of the United Nations Intergovernmental Panel on Climate Change
  • How much warming those gases will trap in the atmosphere
  • How that warming will impact various human and non-human ecosystems and economic systems in the future

In this post, we turn to the question of what impact the goal of avoiding the largely speculative 1.5°C disaster threshold would have on Canada’s (and the world’s) overall economic health.

In short, economic analyses suggest that attempting to reduce greenhouse gas (GHG) emissions at the scale required to conform with the 1.5°C upper-limit models would cause considerably more harm to humans (and the ecosystems we are an integral part of) than doing nothing at all.*

In a recent study, Off Target: The economics literature does not support the 1.5°C climate ceiling, Fraser Institute senior fellows Robert P. Murphy and Ross McKitrick explore the history of this question, particularly the (often misrepresented) work of Nobel-winning economist William Nordhaus, who built much of his career conducting economic analysis of climate change policy proposals. As Murphy and McKitrick summarize:

"[W]e will reproduce two separate facets of Nordhaus’ results to show that they reject the 1.5°C target. First, we show that, as of the 2016 calibration of his Dynamic Integrated Climate-Economy (DICE) model, Nordhaus recommended an “optimal carbon tax” that would place the Earth on a trajectory to warm 3.5°C by the year 2100. Second, we show that Nordhaus’ 2016 results conclude that the effects of a 1.5°C ceiling on global warming would be so severe that it would be better for humanity to do nothing about climate change, rather than pursue such a stringent goal."

The table below shows the estimated benefits and costs of three different carbon tax scenarios modeled by Nordhaus—one where there are no (tax) controls implemented to limit GHG gases; one that strikes what Nordhaus considers an economically optimal result of reducing predicted climate change harms; and one where a hard climate change ceiling of 2.5°C is adopted, with the level of carbon taxation needed to keep global temperatures below that ceiling.

[Chart: Nordhaus’s estimated benefits and costs of the three carbon-tax scenarios]

Unpacking this a bit for the non-economist, Nordhaus’s DICE model (part of the work that earned him the Nobel) suggested the damages from climate change without any controls would equal about US$134 trillion by 2100 (see row 1 in the table, and note that these figures are expressed in present-value terms from today’s perspective).

According to the Nordhaus model, that “optimal tax,” which would reduce U.S. economic output by US$20.1 trillion (measured from today’s perspective, discounting future dollars at the going rate of interest), would be expected to offset US$134.2 - US$84.6 = US$49.6 trillion in climate change damage that would occur in the absence of controls over GHG emissions. Put simply, Nordhaus’ “optimal tax” would do more good than harm, to the tune of avoiding a net US$29.9 trillion in economic harms from predicted climate change over whatever costs (harms) the tax itself would impose on the economy (see the last column in row 2 in the table, showing how much on net the “optimal carbon tax” scenario improves upon the baseline “no controls” scenario). That would represent a net social gain, though under that model, the world would still see 3.5 °C of global warming (which is not shown in the table, but appears elsewhere in Nordhaus’ results).

But when Nordhaus modelled what it would take (via a carbon tax) to limit the Earth’s temperature increase to 2.5°C, the outcome was much different. The economic harm of imposing a tax needed to prevent warming above 2.5°C would reach US$134.6 trillion. It’s true that such an aggressive tax would limit climate change damages to US$43.1 trillion, but even so, on net the costs of the tax would exceed these climate benefits by US$43.2 trillion relative to the “no controls” baseline (shown in the right column in the last row).

As for capping potential climate change at 1.5°C, as Murphy and McKitrick observe, “Nordhaus considered the 1.5°C ceiling so far out of reach that he didn’t even bother to model it as a policy option.” However, a simple exercise in multiplication suggests that lowering the Nordhaus-modelled 2.5°C ceiling to 1.5°C (a 60 per cent reduction) would increase the net harms done to society proportionately, causing (43.2 × 1.6) or roughly US$69 trillion more in economic costs to society as a whole than would allowing totally uncontrolled GHG emissions and the consequent damages from climate change.
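The scenario arithmetic above can be reproduced in a few lines of Python using the damage and abatement-cost figures quoted from the table (present-value US$ trillions); the small gaps between these computed nets and the quoted US$29.9 and US$43.2 trillion figures reflect rounding in the source table:

```python
# Cost-benefit arithmetic for the three Nordhaus scenarios discussed above.
# Figures (present-value US$ trillions) are those quoted from the table;
# the scenario labels are descriptive, not Nordhaus's own.

BASELINE_DAMAGES = 134.2  # climate damages under "no controls"

def net_benefit(damages, abatement_cost):
    """Net gain of a policy relative to the no-controls baseline:
    damages avoided minus the economic cost of the tax itself."""
    avoided = BASELINE_DAMAGES - damages
    return avoided - abatement_cost

# "Optimal tax": damages fall to 84.6 at an abatement cost of 20.1.
optimal = net_benefit(damages=84.6, abatement_cost=20.1)

# Hard 2.5C ceiling: damages fall to 43.1, but the tax costs 134.6.
ceiling_2_5 = net_benefit(damages=43.1, abatement_cost=134.6)

print(f"Optimal tax net benefit:  {optimal:+.1f}")     # positive: beats no controls
print(f"2.5C ceiling net benefit: {ceiling_2_5:+.1f}")  # negative: tax costs exceed avoided damages
```

The sign of the result, not its precise magnitude, is what carries the argument: the "optimal tax" nets out positive, while the hard ceiling nets out worse than doing nothing.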

The maximum warming threshold that IPCC models suggest will fall short of catastrophic levels is currently 1.5°C. Other IPCC models of future GHG emission trajectories suggest warming could be “locked in” in as little as 10 years without radical GHG emission-reduction policies, such as the Trudeau government’s stated goal of achieving net-zero carbon emissions by 2050 or Prime Minister Trudeau’s more recent pledge to reach net-zero carbon emissions from Canada’s electrical generation sector. But economic models by Nobel prize-winning economist Nordhaus (and others) suggest that the costs of achieving the big “net-zero by 2050” goal would cause much more harm to society than imposing no GHG emission controls at all."

Doomsday predictions rely on flawed climate models

By Kenneth P. Green of The Fraser Institute

"Much of the debate around manmade climate change, and the timing and stringency of government policies meant to manage the risk of climate change, is based on the perception that the descent into catastrophic climate degradation seems to always be about 10 years away.

Over the last 30 years, the media has made this clear. “A senior U.N. environmental official says entire nations could be wiped off the face of the Earth if the global warming trend is not reversed by the year 2000,” wrote Peter James Spielmann of the Associated Press in 1989. “UN scientists warn time is running out to tackle global warming. Scientists say eight years left to avoid worst effects,” wrote David Adam in the Guardian in 2007. “We have 10 years left to save the world, says climate expert,” wrote HuffPost’s Laura Paddison in 2020.

It all sounds terribly worrying. But where does this idea originate—that the climate catastrophe (or an irretrievable tipping point into one) is 10 years away?

The answer may surprise you. It does not come from simply extrapolating the rates of temperature change and greenhouse gas emissions we have observed since 1950 (when humans started putting significant amounts of greenhouse gases into the atmosphere). Instead, the 10-years-to-catastrophe idea stems from a set of speculative forward-looking computer models generated across the research network of climate change modellers, as summarized by the United Nations Intergovernmental Panel on Climate Change (IPCC), considered by many to be the authoritative body on all things climate.

Three broad types of prospective models underpin the 10-year tipping point paradigm:

  • Models of greenhouse gas (GHG) emissions based on future estimates of population, economic growth, technology development, etc.
  • Models of atmospheric warming caused by those GHG emissions expected to manifest in the future
  • Ecological impact models that estimate what impact a warmer climate will have on a variety of ecosystems in the future.

The outputs of these three types of modelling have varied since they first took the stage in the Second Assessment Report of the IPCC published in 1995, but it was not until the Third Assessment Report of the IPCC in 2001 that the triad of models would be assembled to create the new “tipping point” paradigm.

In the Third Assessment Report, three figures show the evolution of the separate model components, presenting the 2001 estimated range of outputs for the three sets of models. One set of models (j) measures “radiative forcing” (a surrogate term for the heat-retaining impact of the GHGs). The second graph (k) shows how much the global atmosphere would be expected to warm under a variety of future scenarios. The spaghetti lines will be explained shortly. The chart on the right shows how impacts to the Earth’s various biologic, ecologic and social systems could increase as the climate warms.

Chart 1

The chart on the right is essentially the origin of the maximum allowable warming targets that have defined the goals of global policies to control GHG emissions since 2000. The fourth bar from the left shows the point at which impacts from climate warming become “Net negative in all metrics,” which in 2001 was not predicted to be reached until the increase in global average temperature hit about 3.5° Celsius. According to chart (k), that temperature was not predicted to be locked in by projected increases in greenhouse gas concentrations until around the year 2100, and even then only under extreme scenarios of future greenhouse gas emission levels. However, in chart (k) the IPCC has given us a drop-line at the intersection of the magic number of 2°C (the point where mid-range models of accumulated greenhouse gases assess that we’ll reach the tipping point into climate catastrophe). That point was scheduled to arrive in 2050.

This is how the “we have X years to avert Y disaster” scenarios evolved in subsequent IPCC reports. In the IPCC Fourth Assessment Report (2007), the net-harm threshold drops to a range of 1.5 to 2.5°C, locked in by greenhouse gas emission trends in 2020; in the IPCC Fifth Assessment Report (2014), the net-harm threshold drops to about 1.6°C, to become unavoidable in 2030; and in the IPCC Special Report Global Warming of 1.5C (2018), the net-harm threshold drops to 1.5°C, still estimated to become unavoidable by 2030. This carries through to the most recent draft report of the IPCC, the Sixth Assessment Report (2021).

But about those scenarios. As mentioned above, IPCC future scenarios are not based on actual real-world data for the last 20 years. They are speculative scenarios of the future, envisioned by special research groups within the IPCC research community. The full explication of this exercise in predicting the future can be found here. Critical caveats, though often left undiscussed, are important for understanding the scenarios and their utility. In the introduction to the IPCC Special Report on Emission Scenarios (2000) (SRES), for example, the IPCC explains that: “Future greenhouse gas (GHG) emissions are the product of very complex dynamic systems, determined by driving forces such as demographic development, socio-economic development, and technological change. Their future evolution is highly uncertain.”

The IPCC further explains that a “set of scenarios was developed to represent the range of driving forces and emissions in the scenario literature… No judgment is offered in this report as to the preference for any of the scenarios and they are not assigned probabilities of occurrence, neither must they be interpreted as policy recommendations.”

But there’s one glaring problem with these prognostications: they do not mesh with reality. In a research letter published in the journal Earth and Space Science, climate researchers Ross McKitrick and John Christy show that most of the models used to project climate warming from increasing GHG concentrations run hotter than observations of the actual climate response over the last 35 years. The charts below present different modelled estimates of the warming expected to occur from 1980 to 2015, with the heavy black line representing the average of the modelled warming. The blue line, however, shows the actual empirically measured temperature trend in the Earth’s troposphere. The discrepancy between predicted and observed temperatures after 2000 is plain to see.

Chart 2

Chart 3
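The comparison McKitrick and Christy perform reduces to fitting a linear trend to each temperature series and comparing slopes. A minimal Python sketch of that calculation, using made-up synthetic series rather than the actual model outputs or satellite record:

```python
def ols_slope(years, temps):
    """Ordinary least-squares slope (degrees C per year) of a temperature series."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

years = list(range(1980, 2016))

# Synthetic stand-ins: a "model" series warming at 0.028 C/yr and an
# "observed" series warming at 0.017 C/yr (illustrative numbers only).
modelled = [0.028 * (y - 1980) for y in years]
observed = [0.017 * (y - 1980) for y in years]

model_trend = ols_slope(years, modelled)
obs_trend = ols_slope(years, observed)
print(f"modelled trend: {model_trend:.3f} C/yr, observed trend: {obs_trend:.3f} C/yr")
# A modelled slope well above the observed slope is the kind of
# discrepancy the research letter documents across most models.
```

The research letter's claim is that when this exercise is run on the real series, the modelled slopes cluster above the observed one.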

The prevailing wisdom that underpins the sense of climate urgency in today’s policy debates—10 years to save the world!—stems from three sets of speculative models developed over the last 30 years by scientists working under the umbrella of the IPCC.

But empirical evidence taken from the real world suggests that the IPCC’s estimates of future warming are overstated. What scientists have seen from actual measurements of increased GHGs in the environment, and from the recent rise in global average temperatures, makes it clear that these “10 years to save the planet” invocations rest more on science-fiction models than on scientifically determined facts.

Overreliance on these flawed models results in policy recommendations and decisions that miss more effective and actionable solutions, particularly those related to adapting to a changing climate, whether that change is natural or manmade. Examples of such actions might include strengthening coastal protections against rising sea levels, strengthening the capabilities and protection of water distribution systems to account for potential drought- or flood-prone periods, improving flood control systems and snow-management capabilities in Canada’s cities, improving forest management techniques to adapt to potential changes in fire seasons or pest distributions, strengthening the power systems needed to deliver sufficient affordable energy for heating and cooling homes and businesses in the event of climate fluctuations, and more."

Friday, March 25, 2022

Climate policy—modelling vs. market-based measures

By Kenneth P. Green of The Fraser Institute

"In part one of this series, we observed that the current globally dominant climate policy framework of arbitrarily picking an upper bound of acceptable global warming (1.5 degrees Celsius), expected to develop over an equally arbitrary period of time (through 2100), is based on speculative computer modelling that exaggerates the climate’s sensitivity to greenhouse gases (GHG) and is contradicted by real-world evidence of climate sensitivity gathered by satellite measurements of the climate over the last 20 to 30 years.

In part two, we observed that economic analysis suggests continuing to chase today’s model-based targets, timelines and policy proposals (i.e. net-zero carbon by 2050) will do far more harm than good, in terms of ensuring human goals such as functioning efficient economies. In particular, we showed that using the climate-economic model of recent Nobel laureate William Nordhaus, the popular goal of limiting global warming to 1.5°C is so aggressive that it would actually be worse than if governments did nothing to restrict GHG emissions.

In this final entry in this blog series, we ask the question: If we aren’t to work backwards from models of the future, how should we address the risks of climate change?

It's a question I’ve studied since 1998. Surprisingly, despite the overwhelming expansion of climate change science and policy research, little has changed since then in the debate over whether a policy based on achieving speculative future targets, timelines and prescribed approaches is better or worse than a policy of continuous, measurable management of GHG emissions and damages via pricing-based policies.

To understand what those “continuous measurable improvement” policies would look like, it helps to examine how this is done, on a routine basis, for the provision of goods and services critical to present-day health, safety and environmental protection. Such goods and services would include medical technologies, chemicals in consumer goods, food sanitation testing and other similar risk-management activities we use to address thousands of potential life-and-death risks every day in developed economies. And for this, we need to introduce the useful idea of “Quality Control and Quality Assurance” as it’s used today.

This essay, by Toronto-based consultant Christopher Chapman, offers a good introduction to what are generally referred to as “quality control charts,” pioneered by Dr. W. Edwards Deming, who is considered the “father of quality management.” The figure below is an example (from Chapman) of what a basic quality control or “process behaviour chart” looks like.

Chart 1

In this chart, the green bar represents the desired control level of meeting a standard. In the case of climate change, this target might be restricting GHG emissions to achieve a stated goal of limiting global warming to 1.5°C. The two red bars depict the range of acceptable variation around that target over time. In this example, if the blue line were variations in the global average surface temperature, the upper red line would be the “OMG, we roast to death” line and the lower red line would be the “OMG, we freeze to death” line.

The value of this approach is that one can, in real time and using verifiable measured data, chart a course forward to adjust to unacceptable deviations from the mean. As Chapman then observes, one can set reaction plans (or rules) in place for deviations too far from the mean.

As this second chart shows, different trigger points are possible in such a system to indicate when some change is needed. One violation (rule 1) might trigger only moderate scrutiny, three violations (rule 2) still more scrutiny, and eight consecutive violations (rule 3) a signal that significant change is needed.

Chart 2
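The tiered reaction rules described above can be sketched as simple checks against a series of measurements. A minimal Python sketch, with illustrative stand-in rules matching the three tiers in the text (a single point beyond the 3-sigma control limits; three consecutive points in the outer warning zone; an eight-point run on one side of the mean, a classic run rule) rather than Chapman's exact definitions:

```python
def control_chart_signals(data, mean, sigma):
    """Flag illustrative control-chart rule violations for a measurement series."""
    signals = []
    # Rule 1: any single point beyond the 3-sigma control limits.
    if any(abs(x - mean) > 3 * sigma for x in data):
        signals.append("rule 1: point beyond control limits")
    # Rule 2: three consecutive points beyond 2 sigma on the same side.
    for i in range(len(data) - 2):
        window = data[i:i + 3]
        if all(x - mean > 2 * sigma for x in window) or all(mean - x > 2 * sigma for x in window):
            signals.append("rule 2: three consecutive points in the warning zone")
            break
    # Rule 3: eight consecutive points on the same side of the mean.
    for i in range(len(data) - 7):
        window = data[i:i + 8]
        if all(x > mean for x in window) or all(x < mean for x in window):
            signals.append("rule 3: eight-point run on one side of the mean")
            break
    return signals

# A series drifting steadily above the mean trips the run rule (rule 3)
# even though no single point is anywhere near the control limits.
drifting = [10.1, 10.2, 10.1, 10.3, 10.2, 10.4, 10.3, 10.5]
print(control_chart_signals(drifting, mean=10.0, sigma=0.5))
```

The point of the tiers is graduated response: a run rule catches a slow systematic drift long before any individual measurement looks alarming.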

With regard to climate policies, many economists and policy analysts have observed that governmental control systems based on speculative modelling of the future have consistently failed to achieve either GHG control targets or mandated timelines. Market-based measures, such as the oft-discussed (but too rarely implemented) revenue-neutral carbon tax or GHG-emission pricing more broadly, might be able to do so, and such measures have a history of success in addressing environmental problems comparable to climate change (stratospheric ozone depletion and acid rain are two examples).

The reason these latter approaches (taxes and pricing) are deemed superior to command-and-control alternatives is precisely that they would allow ongoing assessment of progress and refinement of control over time. Unfortunately, the way governments have generally chosen to set and administer carbon taxes (and other emission-pricing schemes) tends to disfavour any adjustment that does not move the price upward, negating the continuous-refinement value of real-time control systems.

What would this look like for climate policy in practice?

  1. Replacing (not simply augmenting) existing GHG control methods based on predictive modelling with market-based price signals based on real-time achievement of the optimal “climate balance,” measured through price signals. This would require adapting current versions of carbon tax and pricing schemes to allow for real-time setting of emission prices that are tied to actual manifestations of warming rather than modelled scenarios of abstract social costs of carbon.
  2. Relying more on empirical measurement systems which give real-time feedback about how climate balance is being achieved, such as satellite temperature measurements and ambient GHG measurements rather than modelled values.
  3. Establishing simple, understandable, predictable “behaviours” to be triggered by unacceptable deviations from the golden mean we are seeking to achieve with climate policy (i.e., raise or lower the carbon price, or raise or lower GHG emission caps/limits, in accordance with the deviation from the acceptable level of warming observed in real time).
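Point 3 could take the shape of a simple feedback rule. The sketch below is a hypothetical proportional adjustment; the function name, gain parameter, and prices are invented for illustration, not drawn from the post:

```python
def adjust_carbon_price(price, observed_warming, target_warming, gain=0.25):
    """Raise the carbon price when observed warming runs above the target,
    lower it when warming runs below, in proportion to the deviation.
    gain is a hypothetical sensitivity parameter (price change per degree C)."""
    deviation = observed_warming - target_warming  # degrees C
    new_price = price * (1.0 + gain * deviation)
    return max(0.0, new_price)  # a price cannot go negative

# Warming 0.4 C above a 1.5 C target nudges a $50/tonne price upward...
print(adjust_carbon_price(50.0, observed_warming=1.9, target_warming=1.5))
# ...while warming below target lets the price fall.
print(adjust_carbon_price(50.0, observed_warming=1.2, target_warming=1.5))
```

A symmetric rule like this is exactly the kind of two-way adjustment the post argues current carbon-tax administration disfavours, since prices in practice only ratchet upward.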

Using pragmatic methods of measuring climate variability and impacts in real time, and making small incremental adjustments to emissions incentives via market-based interventions, may not be as dramatic a policy approach as aggressively attempting to head off a speculative prediction of catastrophic climate change, scheduled to arrive some 10 years hence, by setting stringent concrete limits on GHG emissions today.

However, experience with society’s successful management of a myriad of risks, from small to large, suggests that using a market-driven, quality control-style system to manage climate risk would be much more likely to achieve a desired end, which is balancing the risks of climate change with human economic prosperity."

“Buy American” Rules Hurt More Than They Help

By Adam A. Millsap. Excerpts:

"A direct effect of “Buy American” rules is to insulate domestic companies from international competition. This results in higher prices now and lower quality products in the future due to less innovation.

The companies themselves know this. In the 1970s, the U.S. steel and auto industries were protected from international competition by quotas and tariffs. Both industries floundered rather than thrived: Product quality suffered, brands were harmed, and consumers shunned their products.

U.S. companies were ultimately forced to adapt to compete with higher quality foreign producers. The quote below from the 1980 annual report of the American Iron and Steel Institute—which represented many of the steel companies—sums up the cause of their failure:

“Inadequate capital formation in any industry produces meager gains in productivity, upward pressure on prices, sluggish job creation, and faltering economic growth. These effects have been magnified in the steel industry. Inadequate capital formation ... has prevented adequate replacement and modernization of steelmaking facilities, thus hobbling the industry’s productivity and efficiency.”

History will repeat itself if we once again try to prop up U.S. companies. They will become complacent since there will be no need to compete for captive domestic consumers. The interests of large, complacent companies will become intertwined with those of the government that protects them. This will make too-big-to-fail the norm and stifle the creation of new businesses. Economic growth and innovation will both slow, harming the entire country. States with major ports, such as California, New Jersey, Georgia, Washington, and Virginia, will suffer the most.

The alternative is robust competition among domestic and foreign producers via international trade. Trade enhances individual liberty, increases economic growth, stimulates innovation, and helps spread support for free markets and representative government. There is also evidence that free trade—meaning trade free of tariffs, quotas, and other government-imposed restrictions—reduces military conflict between nations. As economist Scott Sumner recently wrote:

“Our best hope for world peace is to enmesh every country so deeply in a web of interdependence with its neighbors that even our…leaders will be able to see the negative sum nature of war. Globalization may not prevent war, but it makes war less likely at the margin. And if war does break out, economic interdependence gives us a weapon to use in place of violence.”

And because international trade makes countries richer, they have more to lose from the disruption and destruction war causes.

While the benefits of globalization were oversold in some respects, it remains true that stronger economic relationships among countries make the world safer and less violent.

Rather than retreat from the global economy, now is the time for America to lead by fostering relationships with other free countries. The Economic Freedom of the World Index ranks 165 countries by the degree to which their policies and institutions support voluntary exchange, property rights, personal choice, and the ability to enter and compete in markets.

There are 82 countries in the first two quartiles of the index, and these are the countries America should prioritize trade with. Five of America’s top six trade partners are already in the first two quartiles. Only China is outside looking in. Other relatively free countries among America’s top 15 trade partners include Taiwan, the United Kingdom, and Ireland.

The United States also has an opportunity to pursue more economic integration with Ukraine and similar countries once the current violence subsides. Many of the cities, factories, and businesses in Ukraine will need to be rebuilt. More trade and private investment with Ukraine will allow American companies and investors to contribute to Ukraine’s future economic success. Investments in businesses, supply chains, and infrastructure will help Ukraine’s workers and consumers, while more trade will provide a larger market for the goods and services produced by Ukrainians.

We are currently helping Ukraine by providing weapons and supplies for its defense. But in the long run, an economic relationship with Ukraine that is built on trade will provide far more benefits to both of us.

Winding down trade with authoritarian countries like China and Russia and increasing trade with freer countries like Indonesia, Mexico, or Peru will not happen overnight. It will take time to reorient supply chains, find trading partners with the right comparative advantages, and reduce the tariffs, regulations, and other government barriers that currently impede trade with various countries. But with the right policies and private sector investments, America can strengthen trade relationships with countries that share our values.

It is important to remember that reducing trade with totalitarian countries like China and Russia does not mean we have to isolate ourselves. There are dozens of other countries that share America’s values and would welcome the opportunity to strengthen their economic relationships with us. Political leadership that recognizes the many benefits of free trade and a private sector that recognizes the uncertainty and dangers inherent in supply chains located in authoritarian countries can work together to set us on a more prosperous path: Not one of independence and isolation, but one of cooperation and community."