"From the latest American Economic Review:
A large body of evidence finds that relative mobility in the US has declined over the past 150 years. However, long-run mobility estimates are usually based on White samples and therefore do not account for the limited opportunities available for nonwhite families. Moreover, historical data measure the father’s status with error, which biases estimates toward greater mobility. Using linked census data from 1850 to 1940, I show that accounting for race and measurement error can double estimates of intergenerational persistence. Updated estimates imply that there is greater equality of opportunity today than in the past, mostly because opportunity was never that equal. (JEL J15, J62, N31, N32)
That is from Zachary Ward of Baylor University. If that is true, and it may be, how many popular economics books from the last twenty years need to be tossed out? How many “intergenerational mobility is declining” newspaper columns and magazine articles? Ouch. No single article settles a question, but for now this seems to be the best, most up to date word on the matter.
Here are earlier, less gated copies of the research."
Thursday, November 30, 2023
Perhaps intergenerational mobility has not declined in the United States after all
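The abstract's point about measurement error is the classic attenuation-bias result: noise in the father's measured status pushes the estimated persistence coefficient toward zero, which reads as more mobility than actually existed. Here is a minimal simulation of that mechanism (illustrative only; the parameters and variable names are mine, not Ward's):

```python
# Illustrative sketch (not Ward's code): classical measurement error in the
# father's status attenuates the estimated intergenerational elasticity.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_beta = 0.6                          # assumed "true" persistence

father = rng.normal(0.0, 1.0, n)         # father's true (log) status
son = true_beta * father + rng.normal(0.0, 0.8, n)

def ols_slope(x, y):
    """Bivariate OLS slope of y on x."""
    return np.cov(x, y)[0, 1] / np.var(x)

father_noisy = father + rng.normal(0.0, 1.0, n)   # status measured with error

print("beta using true father status:        %.2f" % ols_slope(father, son))
print("beta using mismeasured father status: %.2f" % ols_slope(father_noisy, son))
# With equal signal and noise variance the reliability ratio is 1/2, so the
# estimate falls from about 0.6 to about 0.3: persistence looks half as strong
# (mobility looks twice as high) purely because the input data are noisy.
```

Roughly speaking, correcting a reliability ratio of about one half is the sense in which the abstract says accounting for measurement error "can double estimates of intergenerational persistence."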
What Makes Capitalism Tick?
"Understanding the market process as a systematic, error-corrective sequence of profit-inspired entrepreneurial discoveries, continually reshuffled and redirected as a result of the ceaseless impact of exogenous changes, should drastically alter our appreciation of key features of capitalism.
—Israel M. Kirzner, Competition, Economic Planning, and the Knowledge Problem1 (page 301)
This volume of the collected works of Israel M. Kirzner, edited with a modestly brief introduction by Peter J. Boettke and Frederic Sautet, addresses deep and important questions that most economists would rather skip. These pertain to what distinguishes market activity from central planning, the economic role of entrepreneurs, and what is meant by competition.
I found the conceptual issues that Kirzner raises to be intellectually challenging, and so I imagine that many readers will as well. If you pick up the book, I recommend starting near the back with the essay “How Markets Work: Disequilibrium, Entrepreneurship, and Discovery,” in order to get a general overview before you tackle the essays from the beginning.
Here, I will focus primarily on the question of what distinguishes a market economy from a centrally planned economy. While my discussion is informed by Kirzner’s writing, I do not claim to completely understand or share his views.
In a market economy, decisions about what to produce and how to produce are made by individual entrepreneurs. In order for entrepreneurs to do this in a way that promotes more efficient economic outcomes:
- 1. They must be guided by a profit incentive.
- 2. They must compete in a never-ending process in which they correct mistakes and seize opportunities for improvement.
Many economists believe that the main weakness of socialism is the absence of a profit incentive. But Kirzner writes,
Our further exploration of the interface between the economics of socialist calculation and the economics of the process of entrepreneurial competition will permit us to argue, I believe, that there are analytical grounds for maintaining that the Misesian “problem of knowledge” is indeed anterior to [the] problem of motivation. (page 151)
The problem of knowledge is to discover what consumers want and how to efficiently provide for those wants. Entrepreneurial competition is a process for making such discoveries. In the absence of such competition, the central planner must rely on guesswork.
In a socialist economy, the planner lacks a means for obtaining information on what individuals want. Kirzner points out that, conversely, a market economy has no concept of what “society” wants.
A market economy is by definition made up of a multitude of independently-made individual decisions. In such a context to talk of decisions made “by society” is, at best, to engage in metaphor. “Society” does not, as a simple matter of fact, choose; it does not plan; it does not engage in the “allocation of resources”; it does not have ends; it does not have means; to talk of society facing “its” allocative, economizing problem is, strictly speaking, to talk nonsense. (pages 153-154)
Those of us who wish to defend both methodological individualism and markets are faced with a paradox. When we say that the economy works well, we are claiming to speak for the entire society. But as individualists, we would say that there is no such moral entity as “society.”
My way of dealing with the paradox is to say that I have my intuition about what constitutes a “good economic outcome for society,” and you have yours. If our intuitions have little or nothing in common, then we have no basis for further discussion. But if our intuitions are similar, then we can have a productive dialogue about what sort of institutional arrangements are likely to produce desirable outcomes relative to our respective intuitions.
During the “socialist calculation debate,” economists who advocated socialism conceded that the price mechanism performs an essential information-processing function. They suggested, however, that a government bureau (today we would say a powerful computer) could store a list of all of the economy’s inputs and outputs. Call this the WAC, for Walrasian-Auctioneer Computer. The WAC would then propose a set of prices for inputs and outputs. Consumers would decide on their demands, and firms would decide on outputs. The WAC would look at the results to see what shortages or surpluses emerged. For inputs or outputs that are in surplus, the WAC would adjust prices downward. For inputs and outputs that are in shortage, the WAC would adjust prices upward. Then it would allow consumers and firms to respond to this new set of prices, and look at those results. This process would continue until all surpluses and shortages were eliminated.
In fact, the process just described is problematic, because the economic activity that takes place at “false prices” in one iteration might alter the desired activity at a subsequent iteration. It by no means guarantees smooth convergence to the point where all markets are in balance.
An alternative is to have the WAC announce a set of prices but not allow trading to take place. Instead, the WAC asks everyone to report what they wish to trade at those prices. Based on these wishes, the WAC treats the resulting surpluses and shortages as hypothetical. It proposes a new set of prices to eliminate these hypothetical shortages and surpluses, and everyone reports what they wish to trade at these new prices. Assuming that this iterative process converges to a balanced solution, the WAC finally allows trading to take place at the market-clearing set of prices.
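A toy version of that plan-only iteration may make the mechanism concrete. The sketch below is my illustration, not anything from Kirzner or the market-socialist literature: a two-person, two-good exchange economy with Cobb-Douglas preferences, in which the "WAC" announces a relative price, collects desired trades, nudges the price toward whichever good is in excess demand, and stops only when reported plans balance.

```python
# Toy "WAC" (Walrasian-auctioneer) iteration for a two-good exchange economy.
# Purely illustrative: each agent has Cobb-Douglas preferences and reports the
# trade it would like at the announced prices; no trade occurs until plans balance.

# Each agent: (weight on good 1, endowment of good 1, endowment of good 2)
agents = [(0.3, 10.0, 2.0), (0.7, 2.0, 10.0)]

def excess_demand_good2(p2):
    """Total desired net purchases of good 2 when good 1 is the numeraire (p1 = 1)."""
    total = 0.0
    for alpha, e1, e2 in agents:
        wealth = e1 + p2 * e2
        demand2 = (1.0 - alpha) * wealth / p2   # Cobb-Douglas demand for good 2
        total += demand2 - e2
    return total

p2 = 0.5                                         # planner's initial guess
total_e2 = sum(e2 for _, _, e2 in agents)
for step in range(500):
    z = excess_demand_good2(p2)
    if abs(z) < 1e-8:                            # reported plans balance: stop
        break
    p2 *= 1.0 + 0.5 * z / total_e2               # raise the price of the scarce good

print(f"market-clearing relative price of good 2 ~ {p2:.4f} after {step} rounds")
```

With these endowments the loop settles near a relative price of 1. Kirzner's point, of course, is that no such auctioneer exists; in real markets both the price adjustment and the discovery of what needs adjusting are done by alert entrepreneurs.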
Some remarks about this hypothetical WAC mechanism:
1. Most mainstream economists, whether they favor socialism or not, do not worry about whether or not the WAC mechanism exists or is feasible. The standard approach is to construct economic models that assume that the economy works “as if” it used the WAC mechanism. In particular, it can be taken for granted that the economy will adjust to equilibrium states. Therefore, the task of the economist is to analyze the properties of equilibrium states and to compare one such state with another.
2. In contrast, Kirzner and other Austrian economists insist on the importance of the fact that the WAC mechanism does not exist in the real world. In the real world, central planners make their dictates using guesswork, not by using databases and trial-and-error prices. Kirzner points out that in a real-world market economy, entrepreneurs take on the task of adjusting prices and identifying opportunities to alter the mix of what is produced and how it is produced. A computer does not identify shortages, surpluses, and opportunities. Individual entrepreneurs find them.
What Kirzner calls “entrepreneurial alertness” is what grinds down inefficiencies and drives the economy in the direction of equilibrium, or market balance. Of course, the economy never actually reaches such a state, because new opportunities to improve efficiency always arise as events take place and new discoveries emerge.
3. Even if the WAC mechanism were technically feasible, I believe that it still would not be sufficient to facilitate a socialist economy. We would still be missing the element of “entrepreneurial alertness.” It is one thing to believe that a factory manager could decide how many compact cars and how many mid-size cars to produce, based on prices proposed by the WAC. But who has responsibility for coming up with the idea of a ride-sharing service? Or a self-driving car? That is the job of neither the WAC nor the car manufacturer. In addition to the WAC, would-be market socialists need a cadre of designated innovators, whose job it is to generate new products and processes.
4. I think this still leaves open the question of how to motivate firm managers and others in a socialist economy. You can tell a manager to adjust production to maximize a profit that is purely an accounting device, with no effect on remuneration. But what incentive will that provide to managers? And will designated innovators take the right risks if they are playing the game for tokens that are not real money?
5. While all of these arguments point to the difficulty of central planning, this leads to the question: how do firms manage to operate? Within a firm, activities are not guided by a price system or entrepreneurial alertness. Instead, like a central planner, the boss sets internal prices, notably the compensation rules for the firm’s workers. Like a central planner, the boss chooses projects based on informed hunches rather than leaving the selection to a market mechanism.
Skeptics of socialism like to point to North Korea or the former Soviet Union as proof that central planning fails. But can advocates for socialism point to Wal-Mart or Apple Computer as proof that central planning can work?
I would say that the difference between Wal-Mart or Apple on the one hand and North Korea or the former Soviet Union on the other is that when central planning breaks down at one of these entities, the ineffective firm will be weeded out and replaced much more quickly than the ineffective socialist government.
If we think of the firm as a locus of central planning, then a market economy consists of these planned enterprises, jostling with one another. We might use a metaphor of ships that are centrally managed, some large and some small, all trying to stay afloat in a sea of competition. Corrosion and natural disasters frequently sink some of the ships, but other ships arrive, and people’s lives generally get better because these ships are new and improved. A centrally planned economy is like a single structure sitting on dry land. It is less likely to experience rapid improvement, and when it corrodes or is hit by a natural disaster, its population suffers for a long time."
Wednesday, November 29, 2023
Life expectancy can increase by up to 10 years following sustained shifts towards healthier diets in the United Kingdom
"Abstract
Adherence to healthy dietary patterns can prevent the development of non-communicable diseases and affect life expectancy. Here, using a prospective population-based cohort data from the UK Biobank, we show that sustained dietary change from unhealthy dietary patterns to the Eatwell Guide dietary recommendations is associated with 8.9 and 8.6 years gain in life expectancy for 40-year-old males and females, respectively. In the same population, sustained dietary change from unhealthy to longevity-associated dietary patterns is associated with 10.8 and 10.4 years gain in life expectancy in males and females, respectively. The largest gains are obtained from consuming more whole grains, nuts and fruits and less sugar-sweetened beverages and processed meats. Understanding the contribution of sustained dietary changes to life expectancy can provide guidance for the development of health policies."
Related posts:
Even Short Runs Have Major Health Benefits (2023)
What if the Most Powerful Way to Live Longer Is Just Exercise? (2023)
Exercise Helps Blunt the Effects of Covid-19, Study Suggests (2023)
How lifestyle changes can reduce the risk of dementia (2019)
Good health begins with individual decisions (2018)
Nearly half of U.S. cancer deaths blamed on unhealthy behavior (2017)
Regular Exercise: Antidote for Deadly Diseases? (2016)
Elizabeth Warren Wants the Government To Investigate America's 'Sandwich Shop Monopoly'
The owner of Jimmy John's and Arby's has bought Subway, and a Massachusetts senator has concerns.
By Christian Britschgi of Reason.
"Subway might not be the only one that's freshly baked. Sen. Elizabeth Warren (D–Mass.) thinks the government should investigate America's alleged "sandwich shop monopoly."
"We don't need another private equity deal that could lead to higher food prices for consumers," Warren tweeted Sunday. She was responding to a Politico piece reporting that the Federal Trade Commission (FTC) is probing the private equity firm Roark Capital's $10 billion acquisition of Subway.
Roark already owns the sandwich-serving chains Arby's, Jimmy John's, McAlister's Deli, and Schlotzsky's. Warren said that adding Subway to that list could create a "sandwich shop monopoly."
The senator has made a career of crusading against such "monopolies," regardless of how monopolistic they actually are or how beneficial to consumers they might be. (Witness her war on Amazon-branded chargers.)
Her attack on America's alleged "sandwich shop monopoly" scores new points for pettiness. It also shows just how broad (and therefore meaningless) the word "monopoly" has become in modern political discourse—and at Lina Khan's FTC.
It's easy to assert that something is a monopoly if you narrow your focus on the market or product being discussed. There are, after all, only so many national fast-casual restaurant chains focused on serving deli sandwiches. If Roark snatches up Subway, then ownership of that particular ham slice of the market may in fact look pretty consolidated.
But from the casual consumer's perspective, competition remains robust. There are endless options for getting a sandwich without paying a Roark-owned enterprise. Grocery stores, convenience stores, coffee shops, non-chain delis, and more all sell some variety of sandwich. And yes, non-Roark-owned national sandwich chains still exist.
Sandwiches, not being the most elaborate meal in the world, can also be made by most Americans at home.
On top of all that robust competition within the sandwich market, sandwich shops are in heated competition with all manner of other restaurants selling hamburgers (technically also a sandwich), hot dogs (debatably a sandwich), burritos (not a sandwich), salads, soups, Asian rice bowls, Mediterranean rice bowls, and more.
Consumers can, and do, flit between all of these options with ease. Even if Roark's acquisition of Subway gives it a stranglehold over the sandwich market, its ability to raise prices on consumers will still be hemmed in by this dizzying array of additional lunchtime options.
The original purpose of antitrust laws was to prevent the Cornelius Vanderbilts of the world from using their ownership of the commanding heights of the economy to raise prices and gouge consumers. Libertarians have long criticized such statutes, arguing that an existing monopoly can't sustainably charge consumers above-market prices as long as new competitors armed with new technologies are allowed to undercut them. And indeed, even tech companies that once seemed invincible are now being laid low by competitive pressures.
If markets can work in that arena, we surely don't need the government to police who owns businesses that specialize in putting cold cuts between slices of bread."
Tuesday, November 28, 2023
It’s Time to Discard Piketty’s Inequality Statistics
By Phil Magness & Vincent Geloso.
"Thomas Piketty is well-known for his work on estimating income and wealth inequality. That work made him an “economics rockstar” in the eyes of the media, as he appeared to confirm a popular narrative about rising inequality. Piketty’s stats showed a consistent trend across the 20th-century United States. Top income and wealth concentrations followed a U-curve pattern, where the early 1900s were marked by high “Gildedq Age” levels of inequality. These levels fell rapidly during the 1940s, stayed low until the 1980s, and rapidly rebounded until the present day as the “top 1 percent” pulled away from the rest of the pack.
In fact, Piketty claims that US inequality today is higher than it was in 1929 — the highest point on the first half of the U-curve. The main culprit behind rising inequality, according to his story, is a series of tax cuts beginning with the Reagan administration. Just the same, Piketty points to the mid-20th century’s tax system, where top marginal rates peaked at over 90 percent, as the reason for the trough in his U-curve. The resulting academic articles — often co-authored with Gabriel Zucman and Emmanuel Saez — are deemed novel and important contributions to the scholarly literature on inequality.
The empirical work of Piketty and his coauthors has attained immense influence in American political life. The media often touts the U-curve and its depictions of skyrocketing inequality since the 1980s as a stylized fact. Politicians and pundits invoke his academic works to justify tax hikes and redistributive programs, all in the name of combating inequality.
What if Piketty and his team got the numbers wrong though? What if inequality wasn’t rising as fast as he claimed, or what if the effects of growing income concentrations were already offset by existing government programs? There would no longer be an empirical case for hiking taxes or expanding government redistribution. That’s the implication of a bevy of recent research articles, showing that Piketty’s statistics could (and should) be discarded in favor of more rigorous work.
The most recent of these is an article by David Splinter and Gerald Auten in the Journal of Political Economy. Auten and Splinter revisited many of the data construction assumptions made by Piketty and his acolytes in dealing with data from 1960 to 2020. Most notably, they made sure that income definitions were consistent over time, that the proper households were considered (as Piketty et al. used tax units that can be easily biased by demographic changes), and that better data were used. They ended up finding that Piketty’s mid-century trough was not as low as advertised. They also showed that the increase in income concentrations after 1980 was far more moderate than Piketty claims.
In the main article by Piketty and Saez, the top 1 percent earned 9 percent of all pre-tax incomes in 1980 versus 20 percent in 2020. In Auten and Splinter’s improvements, these proportions are 9 percent and 14 percent, respectively. After accounting for transfers and taxes (something that Piketty and Saez fail to do), Auten and Splinter find virtually no changes since 1960. Piketty and his defenders have thus far attributed the differences to differing assumptions about methodology and the calculation of imputed portions of their series. But Auten and Splinter’s work shows that these assumptions matter a great deal, meaning Piketty’s version is no longer an authoritative standard for evaluating levels of inequality.
But what if we set aside the methodological disagreements about imputed data and focus instead on simply getting the underlying statistics right? It turns out that Piketty and Saez’s original series had multiple accounting errors, data discrepancies, and even historical mistakes in how they dealt with changes to the tax code.
In a recent working paper, we set aside the discretionary disagreements over imputation and only looked at the ways that Piketty and his coauthors handled the underlying tax statistics. At multiple points over their century-long series, they switch out their approaches for estimating the total amount of income earned in the United States each year. This figure allows them to calculate the percentage of those earnings that went to the richest 1 percent, using income tax records.
Oddly enough, the most sweeping methodological changes by Piketty and his coauthors happen at crucial junctures in their depicted U-curve, such as the sharp decline in income inequality during World War II. It is no coincidence that these same years coincided with an overhaul of the tax code that standardized how the IRS collects and reports income data. In this instance, we found that Piketty and his coauthors failed to properly correct for the accounting changes, and used an inaccurate estimate of total personal income earnings. Similar errors pervade the entire Piketty-Saez series.
After correcting for these problems, we found that Piketty and his co-authors tend to underestimate total personal income earnings, thereby artificially pumping up the income shares of the richest earners. They do so inconsistently, though, as their largest underestimations come from the periods 1917-1943 and 1986-present. These errors correspond precisely with the two highest periods of inequality, the two tails of the U-shaped pattern. Shifting to a consistent methodology that does what Piketty and his co-authors aimed to do, but does so more rigorously (we carefully assembled year-by-year data on national accounts components to create a consistent definition rather than use a “rule of thumb” as they did), shows that 40 percent of the difference between Piketty’s series and the work of Auten and Splinter is due to the methodological inconsistencies of the former.
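The mechanics behind that claim are simple: the top share is top incomes divided by total income, so an understated denominator mechanically inflates the measured share. A tiny numerical illustration (the figures below are invented for exposition and come from neither paper):

```python
# Illustrative arithmetic only: an understated income denominator inflates the
# measured top share even when top incomes themselves are measured correctly.
top_1pct_income = 180.0       # hypothetical top-1% income from tax records ($bn)
true_total_income = 1000.0    # hypothetical total personal income ($bn)
understated_total = 900.0     # the same total, understated by 10 percent

print(f"share with the correct denominator:     {top_1pct_income / true_total_income:.1%}")
print(f"share with the understated denominator: {top_1pct_income / understated_total:.1%}")
# 18.0% versus 20.0%: a 10 percent understatement of total income adds two
# percentage points to the measured top share with no change in what top
# earners actually received.
```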
In earlier works published in The Economic Journal and Economic Inquiry, we also found other signs of carelessness by Piketty and his acolytes with data sources pre-1960. They used inconsistent definitions to link discontinuities in tax records. They omitted certain tax filing records after misreading their data sources. They made arbitrary decisions about how to impute gaps in their data, and used unreliable ratios to estimate the effects of accounting changes by the IRS. When we corrected all of these issues, we found that inequality was far lower in the 1920s than depicted. The decline did not start in the 1940s — it started in 1929, and close to two-thirds of it was completed by 1941. Again, the mid-century trough was not as deep as depicted. The combination of all of this work – the pre-1960 corrections and the century-long consistent methodology – can be seen in the graph below, where the U-curve is far less pronounced and sits at a lower level.
Other works have confirmed these points in different ways; a small list suffices to show this. Miller et al., in an article in the Review of Political Economy, showed that most of the increase from 1986 onward is due to tax-shifting behavior linked to the 1986 Tax Reform. Armour et al., in an article in the American Economic Review, showed that properly measuring capital gains eliminates all of the increase since 1989. In subsequent work in the Journal of Political Economy, Armour et al. confirmed this finding. Finally, a National Bureau of Economic Research working paper by Smith et al. confirmed that all of these findings also apply to wealth inequality. Moreover, work by Sylvain Catherine et al. from the University of Pennsylvania shows that Piketty and his team failed to properly consider the role of Social Security, which – when included – essentially levels the evolution of wealth inequality.
Normally, these findings would be cause to revisit the conventional wisdom around Piketty’s narrative. The problems with his underlying statistics are now well-documented, and newer and better estimates are available to take their place. Those estimates show a weaker U-curve with different timing and magnitudes for its evolution. Most of the decline to the trough is no longer tied to tax rate changes but rather to the effects of the Great Depression. Most of the increase post-1986 is an artifice of accounting and can be probably better attributed to changes in the returns to education during the 1970s, 1980s and 1990s which have since stabilized. Overall, the causal link between high taxes and low inequality (or the inverse scenario) is no longer apparent in the corrected data, which shows a much more nuanced evolution of top income levels over time. Indeed, one of Auten and Splinter’s main findings shows that if you look at top income levels after taxes are paid, the top 1 percent has hovered around a stable 8 percent income share for the last 60 years.
As the study and measurement of inequality progresses, Piketty’s (and his team’s) main estimates have become obsolete and might be properly consigned to the field of the history of economic thought. However, Piketty is now calling anyone who refuses to accept his stats an “inequality denier” and saying it is equivalent to climate denial.
Critics do not deny inequality. They merely want to measure it correctly. Piketty’s own data are deeply suspect and open to challenges that he simply does not want to answer. Labeling his critics as “deniers” is a way of sidestepping the many problems with his own work. That alone warrants not only discarding his estimates but also discounting any future research because of bad academic behavior."
Nikki Haley’s Medicare Advantage
A new study shows insurer competition reduces costs
WSJ editorial. Excerpts:
"Medicare Advantage plans are growing rapidly and cover about half of the entitlement’s beneficiaries. Private insurers administer the plans and are paid by Medicare per beneficiary. Insurers compete for patients by offering benefits, including vision and dental care that aren’t available in traditional fee-for-service Medicare.
Lower premiums have made Advantage plans popular in particular among low-income seniors. Plans are able to offer more benefits at lower cost in part by reducing unnecessary care and expensive hospital stays.
Avalere, a healthcare consulting firm, analyzed utilization rates in traditional Medicare versus Advantage plans. After adjusting for disease and demographics, Avalere found that fee-for-service utilization was 12% higher for skilled nursing homes and 37% higher for hospital inpatient care in 2019."
"private insurers have a financial incentive to keep patients out of the hospital by improving adherence to treatments and coordination of care."
"If fee-for-service utilization rates were similar to those in the Advantage program, Avalere projects that the hospital trust fund would remain solvent until 2048." [instead of the projected 2031)
"the Administration is resorting to brute government force to curb Medicare spending: restricting access to new Alzheimer’s treatments, imposing price controls on other medicines, and reducing reimbursements to doctors."
"Medicare’s low reimbursement rates are driving doctors to leave private practice for hospitals, which reduces provider competition and increases healthcare spending."
The Green Electric Power Grid Isn’t Coming
The International Energy Agency says it would require millions of miles of transmission lines
"The International Energy Agency said this week that 49.7 million miles of transmission lines—enough to wrap around the planet 2,000 times—will have to be built or replaced by 2040 to achieve the climate lobby’s net-zero emissions goal. This amounts to a plan for everyone to buy more metals from coal-fired plants in China.
Grid investment, the IEA report argues, is needed to carry additional renewable energy “as the world deploys more electric vehicles, installs more electric heating and cooling systems, and scales up hydrogen production using electrolysis.” By its estimate, the world needs to spend $600 billion annually on grid upgrades by 2030.
Unlike fossil fuel and nuclear power plants, solar and wind projects are typically many miles from population centers. That means long transmission lines, some under the sea to take electricity from off-shore wind installations. Tens of thousands of extra power transformers will be needed to step up and down voltage.
All of this would cost trillions of dollars and require enormous quantities of metals. “Copper and aluminium are the principal materials for the manufacture of cables and lines,” the IEA report says. Transmission lines also need insulators, such as cross-linked polyethylene and ethylene-propylene polymer—both derived from fossil fuels.
Transformers are made of the same specialized steel used in charging stations for electric vehicles. Smaller transformers require non-oriented electrical steel, used in EV motors. The green-energy gold rush has contributed to shortages of both types. Buyers of transformers “face a wait of over 18 months,” the report notes.
Meantime, advanced economies must replace aging equipment to prevent power outages and safety hazards. About half of the transmission and distribution lines in the U.S. are more than 20 years old, according to the IEA.
Where are the materials going to come from? The report doesn’t say, but the most likely answer is China, which dominates global copper, steel and aluminum production, owing to its lax environmental regulation and low labor costs. Over the past 20 years, primary aluminum production has increased ninefold in China while declining 68% in the U.S.
Metals manufacturing takes massive amounts of power, and coal accounts for 60% of China’s electric generation. In other words, the IEA’s path to a net-zero grid would involve emitting a lot more CO2, even assuming it wasn’t a political nonstarter, which it is."
Monday, November 27, 2023
Inside Ohio State’s DEI Factory
I obtained 800 pages of ‘Diversity Faculty Recruitment Reports.’ Here’s what I found
By John Sailer. Mr. Sailer is director of university policy at the National Association of Scholars. Excerpts:
"A search committee seeking a professor of military history rejected one applicant “because his diversity statement demonstrated poor understanding of diversity and inclusion issues.” Another committee noted that an applicant to be a professor of nuclear physics could understand the plight of minorities in academia because he was married to “an immigrant in Texas in the Age of Trump.”"
"In February 2021, then-president Kristina Johnson launched an initiative to hire 50 professors whose work focused on race and “social equity” and “100 underrepresented and BIPOC hires” (the acronym stands for black, indigenous and people of color)."
"Each report required search committees to describe how their proposed finalists “would amplify the values of diversity, inclusion and innovation.”"
"One report said a candidate would “greatly enhance our engagement with queer theory outside of the western epistemological approaches which would greatly support us both in recruitment and retention of diverse graduate populations.”"
"In a search for a professor of chemistry, the report notes that one candidate’s “experiences as a queer, neurodivergent Latinx woman in STEM has provided her with an important motivation to expand DEI efforts beyond simply representation and instead toward social justice.” Another report concedes that “as a white male” one proposed finalist “does not outwardly present as a diversity candidate.” In his defense it notes that he recently published on critical race theory."
"The committees cited those [diversity] statements as the sole reason for eliminating certain candidates in fields as varied as aquatic ecology, lighting design, military history and music theory."
"A committee searching for a professor of freshwater biology selected finalists “based upon a weighted rubric of 67% research and 33% contribution to DEI.” To evaluate the statements, the committee used a rubric that cited several “problematic approaches” for which a candidate can receive a zero score—for example, if he “solely acknowledges that racism, classism, etc. are issues in the academy.” It isn’t enough for a freshwater biologist to believe that racism pervades higher education."
"The rubric meanwhile gave a high score for DEI-focused activism outside academia, for demonstrating an understanding of “intersectionality” and for embracing a vision of “anti-racism” that “requires consistent and long-term growth, reflection, and engagement (and that they are prepared to put in this work).”"
"For a search in astrophysics, “the DEI statement was given equal weight to the research and teaching statements.”"
"Throughout the reports, references to the race and sex of candidates abound. Many of the job candidates’ diversity statements emphasized their own “intersectional” identities—“a person of color and a member of the LGBTQ+ community,” “a first generation, fat, queer scholar of color” and so on."
"For a role in communications, four of the 46 applicants were Hispanic—and so were two of the three finalists. One role in medical anthropology had 67 applicants. The four finalists include the only two black applicants and the only Native American applicant. “All four scholars on our shortlist are women of color,” the committee said."
"Some search committees at Ohio State were surprisingly forthcoming about their use of racial preferences. “Diversity and inclusion featured prominently in all our discussions,” wrote one committee in the division of geodetic sciences. “Naturally, most weight was given to candidates from URM”—underrepresented minority—“backgrounds, but we also gave considerable weight to the diversity statements that were provided by all candidates.”
One faculty position advertised last year was in French and francophone studies with a “specialization in Black France.” It yielded a more racially diverse but still majority-white applicant pool. The committee was adamant about its intended outcome. “In our deliberations to select finalists, the importance of bringing Black scholars to campus was deemed to be essential. We thus chose three Black candidates.”"
The US College Campus as a Long-term Strategic Threat to Israel, the US and Global Stability
By David Bernstein. David Bernstein is the founder of the Jewish Institute for Liberal Values (JILV.org) and author of Woke Antisemitism: How a Progressive Ideology Harms Jews.
"By now it’s clear to anyone paying attention that many American college campuses have since October 7 become hotbeds of anti-Zionism and antisemitic fervor. One Jewish professor at a small liberal arts college in the Pacific Northwest, an institution you’re not hearing about in the news, recently told me that “From the River to the Sea” is among the mildest chants he hears in the raucous daily campus protests beneath his office window. That same professor has been subject to ongoing, fierce harassment from radical students for expressing moderate pro-Israel positions on social media. Jewish students on his campus have faced death threats and intimidation. Some have been escorted to class by campus security to avoid angry mobs. And we are seeing similar anti-Israel activity on numerous other campuses across the country.
My intention in this article is not to recount the horrors of the current moment, but to examine the roots of the problem and to offer a series of recommended long-term interventions. I say long-term because much of the discussion in the mainstream Jewish community revolves around short-term actions that may temporarily ameliorate the mayhem but fail to address root causes and stem the tide of hate and erosion of support for Israel. The problem on campus has been a long time in the making and it will take a long time in the unmaking.
As challenging as it will be to effect such a shift, the stakes couldn’t be higher. If future generations of young elites continue to be educated into hostility toward Israel, we should expect to see a decline in US-Israel ties, with increasing pressure to end the special relationship. And if they continue to be educated into antipathy toward what America stands for and its role in the world, we can expect an America that will withdraw from the global scene, eschew the use of power, and abandon the field to hostile powers such as Iran, Russia and China. It’s hard to imagine that seemingly absurd ideological trends in the humanities departments at American universities could wreak such havoc. But quackery in American universities is a long-term threat to global stability.
The Roots of Campus Hate
Three trends converge in the emergence of today’s campus hate. The first factor is the Soviet anti-Zionist campaign of the late 1960s. Wilson Center scholar Izabella Tabarovsky describes the development of a field called “Zionology” in the late 1960s in the USSR that actively discredited Zionism. In the wake of the 1967 Six Day War, the Soviets were distressed that Israel had handily defeated their Arab allies, and that Soviet Jews, inspired by Israel’s victory, increasingly identified with the Jewish state. In 1969, a party official, Yuri Ivanov, wrote “Beware: Zionism!,” which sold upwards of 800,000 copies in the USSR alone. Tabarovsky explains that the Zionologists’ “most important contribution to global anti-Jewish discourse was to make antisemitic conspiracy theories, typically associated with the far right, not only palatable to the Western hard left but politically useful to it.” In other words, the Soviets successfully created the template for the anti-Zionist campaign we are seeing on American campuses today.
The second factor is the emergence of postmodern and postcolonial studies in American universities. Postmodernism holds that all of what we consider “knowledge” and attribute to science and free discourse is really the outgrowth of powerful interests encoding their preferred understanding of the world in social discourses so that they can continue to rule over the masses.
In the late 1960s, at the same time the Soviets were delegitimizing Zionism, postmodern scholars with an activist agenda forced their way into higher education and established ethnic studies and other “Studies” departments across the country, which did not adhere to usual standards of scholarly inquiry. Over time a more activist and less scholarly brand of postmodern scholarship emerged and became the basis of today’s radical leftist discourse, which gained further momentum through the writings of the Palestinian-American literary critic Edward Said, the founder of postcolonial thought. Said discredited the Western study of the Middle East and influenced scholars to see Zionism as a colonialist project. These popular academic theories today see the world through a stark oppressed/oppressor binary, and are predisposed to keeping alive anti-Zionism and other such canards about white, Jewish, and colonial power.
The third factor is the role of Middle Eastern money. In 2019, the Institute for the Study of Global Antisemitism and Policy (ISGAP) first presented research findings to the Department of Justice entitled “Follow the Money.” The research examines illicit funding of United States universities by foreign governments, foundations and corporations. The research revealed billions in Middle Eastern funding, primarily from Qatar, to US universities that had not been reported to the Department of Education. Such funding has had a substantial impact on fueling antisemitic discourse, identity politics and anti-democratic sentiment within these institutions of higher education.
In other words, the ideological trends described above have been fomented by Qatari financing of American universities. A report issued by the National Association of Scholars, “Hijacked,” describes the problem: “The same leftist hysteria which has consumed the humanities and social sciences since the 1960s has spread to MESCs (Middle East Studies Centers)…Academics have repurposed critical theory to galvanize activism on Middle East issues. For instance, they have recast the Israel–Palestine debate as a fight for “indigenous rights” against the supposed evils of colonialism.”
Formulating a Long-term Strategy
There is an abundance of short-term responses currently under consideration. Among them are some which might reduce tensions including: exhorting university presidents to actively oppose radical voices and to discipline perpetrators who intimidate or accost Jewish students; enforcing Title VI anti-harassment laws against those who generate a hostile environment; banning Students for Justice in Palestine (SJP) chapters that cross the line and bully Jewish students. These interventions can help, but none will likely permanently lower the level of animosity from students and professors. Some interventions, like trying to accommodate Jewish concerns in existing campus Diversity, Equity and Inclusion (DEI) efforts, may be downright counterproductive and merely reinforce the ill-bred ideological conditions that fomented the hostile sentiment in the first place.
Supporters of Israel and Jewish security in America and, indeed, all those concerned about the health of American democracy, need to mount a sustained effort to change the campus culture. Here’s what this involves:
- End or Transform DEI
Campus DEI bureaucracies function as an ideological authority, reinforcing political orthodoxies on campus. The National Association of Diversity Officers in Higher Education describes itself as “a leading voice in the fight for social justice” by “creating a framework for diversity officers to advance anti-racism strategies, particularly anti-Black racism, at their respective institutions of higher education.” Sprawling bureaucracies in major universities now typically have 45 paid staff members who reinforce the overall illiberal ideological environment. A 2021 study conducted by Jay Greene at the Heritage Foundation reviewed the social media output of campus DEI officers and found that a high percentage had hostile views toward Israel. One can only imagine what such a study would show today.
Bari Weiss, among others, argues that “it is time to end DEI for good.” “The answer,” she states, “is not for the Jewish community to plead its cause before the intersectional coalition, or beg for a higher ranking in the new ladder of victimhood. That is a losing strategy—not just for Jewish dignity, but for the values we hold as Jews and as Americans.” Another approach proposed by interfaith leader Eboo Patel is to replace DEI with a less ideological form of diversity built on the traditional American model of pluralism. Either way, as long as the current model of DEI reigns supreme, many universities will be hostile places for Jews and Israel.
- Recommit to the Liberal University
As stated above, university humanities departments have become riven with ideological academic programs that perpetuate notions of power and oppression that cast Jews and Israel as oppressors. It will not be easy to totally unseat these departments but over time we can weaken their influence. Major Jewish donors have begun to withdraw their philanthropy from elite universities often run by weak-kneed presidents, such as those at Harvard and University of Pennsylvania. One of the most important things these donors can do is to reinvest their philanthropy in new academic programs that specifically and explicitly elevate free inquiry and freedom of expression. Yale Law School, for example, recently established a new free speech and academic freedom center. Such centers can begin to compete with the politicized “Studies” programs and attract superior faculty and student talent.
Indeed, there seems to be a strong correlation between campuses that stifle free inquiry and those that promote anti-Israel climates. The free speech organization FIRE, which conducts annual College Free Speech Rankings, ranked Harvard and the University of Pennsylvania last and second to last, respectively. Not coincidentally, these schools are among the most hostile environments for Jewish students who support Israel. Restoring freedom of inquiry on college campuses is a long-term, generational challenge, and a necessary condition for improving attitudes toward Jews and Israel.
- Cut Middle Eastern Sources of Funding
There is no reason that the US must continue to allow foreign funding of American university programs. In the aftermath of October 7, efforts to expose Qatari funding of American university programs have picked up steam. Hearings have been held on Capitol Hill detailing the failure of universities to disclose sources of funding. Now is the time to redouble such efforts. We should not forget that Saudi Arabia was once the major funder of such anti-American academic programs but, under the scrutiny of the post-9/11 atmosphere, largely pulled back. Qatar filled the vacuum. Like Saudi Arabia before it, Qatar has much at stake in its relationship with the US. Last year, the US designated Qatar a major non-NATO ally, undoubtedly owing in large part to the role the Gulf state played as an intermediary with Iran. Until recently, however, the Biden Administration has shielded Qatar from scrutiny over its funding of universities. Turning up the heat on the Biden Administration to hold Qatar accountable will be critical.
Such a long-term, strategic approach to changing university cultures will not be easy. But unless we are successful in effecting such a change, the environment toward Jews and Israel will only worsen."
Sunday, November 26, 2023
Hamas’s Barbarity Heightens the Crisis in Higher Education
Jewish students bear the brunt of colleges’ culture of intolerance, conformity and ‘safe spaces.’
By Michael R. Bloomberg. Excerpts:
"Intentionally targeting civilians for slaughter is inexcusable no matter the political circumstances.
For Americans, this isn’t a matter of defending Israel but of defending our nation’s most sacred values."
"In a 2014 commencement speech at Harvard, I warned that many of America’s top colleges had become Soviet-like in their lack of viewpoint diversity."
"It is no surprise that support for terrorism, dressed in the language of social justice, has emerged from this environment."
"no student should ever feel physically intimidated or unsafe going to or speaking in class, as many Jewish students have lately."
"presidents and deans should make a priority of hiring faculty with greater viewpoint diversity"
The Startling Evidence on Learning Loss Is In
NY Times editorial. Excerpts:
"The school closures that took 50 million children out of classrooms at the start of the pandemic may prove to be the most damaging disruption in the history of American education. It also set student progress in math and reading back by two decades and widened the achievement gap that separates poor and wealthy children."
"Economists are predicting that this generation, with such a significant educational gap, will experience diminished lifetime earnings and become a significant drag on the economy."
"Millions of young people have joined the ranks of the chronically absent — those who miss 10 percent or more of the days in the school year — and for whom absenteeism will translate into gaps in learning."
"More than a quarter of students were chronically absent in the 2021-22 school year, up from 15 percent before the pandemic. That means an additional 6.5 million students joined the ranks of the chronically absent."
"Based on survey data collected in 2021, the Centers for Disease Control and Prevention reported this year that more than 40 percent of high school students had persistent feelings of sadness and hopelessness; 22 percent had seriously considered suicide; 10 percent reported that they had attempted suicide."
"In some communities, children have fallen behind by more than a year and a half in math."
Saturday, November 25, 2023
Cold truth about government pushing electric: Natural gas is much cheaper
"The Department of Energy (DOE) has repeatedly documented that using natural gas in homes is far cheaper than using electricity.
This hasn’t stopped the Biden administration from trying to limit the use of natural gas in homes or some state and local governments from proposing and passing bans on natural gas hook-ups for new residential construction.
With winter, and the home heating challenges that come with it, fast approaching, these cold truths should be of serious concern for homeowners and policymakers.
Now there’s yet another federal government report highlighting these differences in costs. The Energy Information Administration (EIA) recently released its Winter Fuels Outlook. This document addresses possible scenarios for the cost of various sources of heating this winter.
EIA’s analysis sheds light on what we should be watching for winters ahead. Energy expert Robert Bryce points out that for the coming winter, electric heating for households is projected to cost 77 percent more on average than natural gas. The discrepancy is even steeper in the Northeast, where electric heat is projected to be 92 percent more expensive than gas.
Home heating is a major budget item for a lot of people in cold parts of the country. Policies at all levels of government that limit Americans to more expensive heating options are harmful. This includes policies like the one in New York state that would ban new natural gas furnaces in most new buildings by 2029. It also includes DOE’s final rule that would make it more difficult for people to purchase new gas furnaces.
Cost concerns are not the only problem with government policies that are trying to stop people from using natural gas. Natural gas heating and cooking, and the electric grid, currently exist in parallel. For homes with natural gas heating, if the power goes out, there’s still heat and usually a water heater and maybe even a stove capable of operating on natural gas. In an all-electric house, if a winter storm takes the power out for a few days, there’s nothing much to fall back on.
If grid capacity doesn’t expand at the same time as more homes and businesses move to electric heating, the mismatch in demand and supply may only become apparent during a time of grid stress—namely very cold days. Government meddling in this only serves to make blackouts more likely.
When it comes to personal energy decisions, people know best what will work for their homes and families. Policymakers should respect the individual personal energy choices of Americans and block this dangerous and costly electrification agenda."
The CFPB’s Digital Wallet Rule Proposal Reveals What’s Wrong with the CFPB
"When you use a product that’s closely supervised by the government, you might be tempted to assume the bureaucratic babysitting is somehow necessary for the product or its industry to run smoothly. Yet when regulators first propose special supervision years after you’ve already seen the product work as intended, you may be tempted to ask, “What gives?”
When it comes to the Consumer Financial Protection Bureau’s (CFPB) proposal to bring popular payment apps (like Apple Pay, Google Pay, PayPal, Venmo, and Cash App) under a supervisory regime, the answer is that the agency sees the apps have become quite popular, and the CFPB treats success alone as a reason for more invasive oversight.
The digital payment app market is hardly crying out for a regulator to ride to consumers’ rescue, and the CFPB’s proposed rule provides a real‐time demonstration of how regulators won’t hesitate to “fix” something even when—and perhaps, especially when—“it ain’t broken,” as the old saw goes.
This month, the CFPB proposed subjecting major digital consumer payment applications to agency supervision by designating the apps as “larger participants” in a market for consumer financial services. The Dodd‐Frank Act gives the CFPB the authority to supervise these larger participants, meaning that in addition to the ability to conduct enforcement actions for violations of consumer financial protection law, the CFPB also may proactively monitor and examine these specially designated businesses.
Under the proposed rule, covered digital payment apps would find themselves facing a host of potential CFPB supervisory activities, including on‐site exams involving requests for records, regulatory meetings, record reviews, as well as compliance evaluations, reports, and ratings. The Bureau estimates such exams would take approximately eight to ten weeks on average.
All this mucking about while a business is trying to get work done conjures images of Homer Simpson’s brief stint supervising a team of engineers:
Homer: “Are you guys working?”
Team: “Yes, sir, Mr. Simpson.”
Homer: “Could you, um, work any harder than this?”
Who exactly would become subject to CFPB supervision under the proposal? The proposed rule would cover providers of “general‐use digital consumer payment” apps—including both fund transfer and digital wallet apps—that meet requirements around transaction volume (five million transactions annually) and firm size (not being a small business as defined by law). The proposal contains some notable exclusions, including exemptions for apps that only facilitate payments for specific goods or services (i.e., are not general use), as well as for transactions with marketplaces through those marketplaces’ own platforms.
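To make the stated thresholds concrete, here is a rough sketch of the coverage test as described above. It is a simplification for illustration only; the actual proposal contains further definitions, exclusions, and edge cases, and the function name and examples below are invented:

```python
# Rough sketch of the proposed "larger participant" test as described above.
# Simplified for illustration; the proposal itself has many more definitions
# and exclusions (e.g., single-merchant apps, marketplace transactions).

ANNUAL_TRANSACTION_THRESHOLD = 5_000_000

def is_larger_participant(general_use_payment_app: bool,
                          annual_transactions: int,
                          is_small_business: bool) -> bool:
    """Return True if a provider would plausibly fall under the proposed rule."""
    return (general_use_payment_app
            and annual_transactions >= ANNUAL_TRANSACTION_THRESHOLD
            and not is_small_business)

# Hypothetical examples (figures are invented):
print(is_larger_participant(True, 250_000_000, False))   # large wallet app -> True
print(is_larger_participant(True, 1_000_000, False))     # below threshold  -> False
print(is_larger_participant(False, 900_000_000, False))  # not general use  -> False
```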
One question raised by the proposal, particularly its reference to digital wallets, is whether cryptocurrency transfers and wallets are in scope. The answer, in short, is sometimes.
According to the CFPB, covered fund transfers include crypto transfers, so the rule likely would cover hosted crypto wallets (where an intermediary controls the private keys for accessing users’ funds) used for those purposes. However, the proposed rule does not cover purchasing or trading cryptocurrencies, as it excludes exchanges of one form of funds for another, as well as purchases of securities and commodities regulated by the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC). (Add this to the list of ways in which lingering questions about SEC and CFTC jurisdiction over crypto create unhelpful regulatory ambiguity.)
The proposed rule’s application to self‐hosted crypto wallets (where users control their own private keys) likely will hinge on interpretive questions (including those related to the definition of “wallet functionality”), and these could leave the agency room to find some self‐hosted wallets in‐scope. (If the CFPB were to go this route, it would be yet another example of subjecting a core crypto technology to poorly conceived regulation.)
When it comes to the CFPB’s reasons for the proposal, perversely, the very data indicating that the market for digital payment applications is anything but broken is the data the CFPB cites as the basis for subjecting the market to special supervision. According to the agency, “The CFPB is proposing to establish supervisory authority over nonbank covered persons who are larger participants in this market because this market has large and increasing significance to the everyday financial lives of consumers.” Another way to put this is that fulfilling consumer demand alone calls for greater scrutiny.
How popular have these apps become? According to the CFPB itself, 76 percent of Americans have used one of four major payment apps; 61 percent of low-income consumers report using payment apps; merchant acceptance of payment apps “has rapidly expanded as businesses seek to make it as easy as possible for consumers to make purchases through whatever is their preferred payment method;” and adoption by younger users may drive even further growth.
Separate survey data tend to support the idea that consumers’ positive assessments of these apps line up with their revealed preferences. According to survey data compiled by Morning Consult in 2017, a sizable majority of American adults were either very satisfied or somewhat satisfied with a variety of digital payment apps, including Venmo (71 percent), Apple Pay (82 percent), Google Wallet (79 percent), and PayPal (91 percent). Recently, some even tried to frame Apple Pay as making payments “too easy” for consumers’ own good.
The CFPB’s proposal is not an example of a regulator seeking to impose sorely needed order in a broken and lawless sector, but rather an agency ratcheting up compliance requirements in an already regulated space. For instance, consumer financial products and services—which include consumer payment services via any technology—already are subject to the CFPB’s authority to enforce prohibitions against unfair, deceptive, or abusive acts or practices. Moreover, the CFPB already has the power to supervise relevant financial service providers where it issues orders determining, with reasonable cause, that the providers pose risks to consumers, something that the agency fails to do in any convincing manner in the proposal.
That the CFPB is seeking to assert supervisory authority over the digital payment app market without having to identify specific risks to consumers is emblematic of a fundamentally flawed approach to regulation.
In the case of digital payment apps, the proposed supervisory regime is not targeting a consumer financial service market failure but rather a market success. Witnessing this, it’s reasonable to ask what other supervisory regimes that consumers take for granted began as solutions in search of problems."
Friday, November 24, 2023
Things that didn't cause the Great Depression
"The following tweet caught my eye:
I once wrote an entire book on the causes of the Great Depression, focusing on the role of the interwar gold standard and FDR’s labor market policies. In doing this research, I discovered that the question of causation is quite tricky. One can look for proximate causes, such as bad macroeconomic policy, or deeper causes, such as institutional failures. (In theory, a depression might also be caused by a natural phenomenon such as a plague or drought, but that was not the case with the Great Depression. It was clearly a human-created problem.) Although we do not precisely know all of the factors that caused the Great Depression, we have a pretty good idea as to which hypotheses are not helpful.

Many people associated the stock market crash with the Depression due to the fact that it occurred at about the time it became apparent we were sliding into a deep slump. Note that I said “became apparent”; the Depression actually began a few months before the crash. In October 1987, we had a nice test of the theory that the stock crash was a causal element in the Depression. A crash of almost equal size occurred at almost exactly the same time of year, after a long economic expansion. Many pundits expected a depression, or at least a recession. Instead, the 1987 stock market crash was followed by a booming economy in 1988 and 1989.
Of course it’s possible to attribute some of the difference in outcome to other factors at play, but when the difference is this dramatic (a booming economy vs. the greatest depression in modern history), one has to wonder whether the hypothesis is of any value at all.
The same is true of the inequality/underconsumption hypothesis. Over the last 45 years, we’ve seen an interesting test of this theory. China has experienced a huge increase in economic inequality. More importantly, it has seen some of the lowest levels of consumption (as a share of GDP) ever observed, lower even than in other fast-growing East Asian economies such as South Korea. Pundits have claimed that China’s consumption levels are too low, and that too many resources are being devoted to investment in areas of dubious merit.
That may all be true. Perhaps China should invest less and consume more. But it’s also clear that low levels of consumption in China have not caused a Great Depression. Indeed, China has had one of the fastest-growing economies in the world since 1978.
Again, what impresses me about these two counterexamples (the US in 1987-89 and China since 1978) is not that things didn’t play out exactly as the historians might have expected based on their theory of the Great Depression. Rather what impresses me is that the results were almost 180 degrees removed from what might have been expected. That tells me that theories that stock market crashes and underconsumption cause depressions are essentially useless. They are ad hoc explanations with no real supporting economic theory and no predictive power. Why should a stock market crash cause 25% of workers to stop working? What is the mechanism? Why should high levels of investment cause real GDP to decline by 30% over 4 years? What is the mechanism? If they have no theoretical support and no predictive power, then why should we care what historians believe?
If you get creative enough you could find a causal mechanism running through aggregate demand. But then why not argue that a decline in aggregate demand caused the Great Depression? After all, that’s what actually did happen.
You might say that it’s important to know the cause of the Great Depression. But why? If the theories offered by historians provide no help in understanding the modern world, then how are they of any use?
More broadly, I distrust all theories of economic causation developed by non-economists (not just historians). These theories tend to rely on “common sense”. Thus many average people think that countries are rich because they are big, or because they have lots of natural resources. (Perhaps because that theory sort of fits the US.) But looking more broadly, rich countries don’t tend to be places with large populations or high levels of natural resources. They tend to be smaller countries in East Asia and Western Europe. The actual (institutional) factors that explain the varying wealth of nations are much harder to see, and hence tend to be ignored by non-economists."
Related post:
Monetary Policy and the Great Crash of 1929: A Bursting Bubble or Collapsing Fundamentals?
Conclusion
"In retrospect, it seems that the lesson of the Great Crash is more about the difficulty of identifying speculative bubbles and the risks associated with aggressive actions conditioned on noisy observations. In the critical years 1928 to 1930, the Fed did not stand on the sidelines and allow asset prices to soar unabated. On the contrary, its policy represented a striking example of The Economist’s recommendation: a deliberate, preemptive strike against an (apparent) bubble. The Fed succeeded in putting a halt to the rapid increase in share prices, but in doing so it may have contributed one of the main impulses for the Great Depression."
Can Metal Mining Match the Speed of the Planned Electric Vehicle Transition?
By Kenneth P. Green of The Fraser Institute.
"The governments of Canada, the United States, and many other nations are mandating a shift in vehicle technology: away from vehicles powered primarily by internal combustion engines, and toward vehicles powered primarily with electricity stored on board in batteries.
Canada’s government has established policies designed to push automakers to achieve the government’s goal of having 35 percent of all new medium- and heavy-duty vehicle sales be electric by 2030, rising to 100 percent by 2040.
The US has set a target requiring that 50 percent of all new passenger cars and light trucks sold in 2030 be electric or largely electric hybrid vehicles. These timelines are ambitious, calling for a major expansion of the prevalence of electric vehicles (EVs) in the major vehicle classes in a very short time—only 7 to 10 years.
Barring breakthrough developments in battery technology, this massive and rapid expansion of battery-electric vehicle production will require a correspondingly massive and rapid expansion of the mining and refining of the metals and rare earth elements critical to battery-electric vehicle technology.
The International Energy Agency (IEA) suggests that to meet international EV adoption pledges, the world will need 50 new lithium mines by 2030, along with 60 new nickel mines, and 17 new cobalt mines. The materials needed for cathode production will require 50 more new mines, and anode materials another 40. The battery cells will require 90 new mines, and EVs themselves another 81. In total, this adds up to 388 new mines. For context, as of 2021, there were only 270 metal mines operating across the US, and only 70 in Canada. If Canada and the US wish to have internal supply chains for these vital EV metals, they have a lot of mines to establish in a very short period.
Historically, however, mining and refining facilities are slow to develop and are highly uncertain endeavors, plagued by environmental and regulatory barriers. Lithium production timelines, for example, are approximately 6 to 9 years, while timelines from application to production for nickel are approximately 13 to 18 years, according to the IEA.
The establishment of aggressive and short-term EV adoption goals sets up a potential conflict with metal and mineral production, which is historically characterized by long lead-times and long production timelines. The risk that mineral and mining production will fall short of projected demand is significant, and could greatly affect the success of various governments’ plans for EV transition."
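As a quick arithmetic check on the IEA figures quoted above, the category counts do sum to the stated 388. Here is a minimal Python sketch; every number comes from the excerpt, while the dictionary grouping and the lightly paraphrased category labels are mine, added only for illustration.

```python
# Tally the new-facility counts attributed to the IEA in the excerpt above.
# All figures are taken from the quoted text; nothing here is additional data.
new_facilities_by_2030 = {
    "lithium mines": 50,
    "nickel mines": 60,
    "cobalt mines": 17,
    "cathode material sources": 50,
    "anode material sources": 40,
    "battery cell facilities": 90,
    "EV-related facilities": 81,
}

total_needed = sum(new_facilities_by_2030.values())
existing_us_and_canada_2021 = 270 + 70  # metal mines operating in the US and Canada, per the excerpt

print(f"New facilities needed by 2030: {total_needed}")  # 388, matching the excerpt
print(f"Existing US + Canadian metal mines (2021): {existing_us_and_canada_2021}")  # 340
```

The comparison at the end simply restates the excerpt's point: the required number of new facilities exceeds the entire existing stock of metal mines in the US and Canada combined.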
Wednesday, November 22, 2023
Evaluating the Success of President Johnson’s War on Poverty: Revisiting the Historical Record Using an Absolute Full-Income Poverty Measure
NBER working paper by Richard V. Burkhauser, Kevin Corinth, James Elwell and Jeff Larrimore. Excerpts:
"Abstract
We evaluate progress in President Johnson's War on Poverty relative to the 20 percent baseline poverty rate he established for 1963. No existing poverty measure fully captures poverty reductions based on these standards. We fill this gap by developing an absolute Full-income Poverty Measure (FPM) whose thresholds are established to obtain this same 20 percent official poverty rate in 1963 while using a fuller measure of income and updating thresholds each year only for inflation. While the official poverty rate fell from 19.5 percent in 1963 to 10.5 percent in 2019, our absolute FPM rate fell from 19.5 to 1.6 percent. This reflects increases in full income throughout the distribution, with real median income more than doubling between 1963 and 2019, together with the expansion of government transfers and tax benefits not fully captured by the official measure. It is also broadly consistent with the expectations of President Johnson and his Council of Economic Advisers, including Robert Lampman who predicted in 1971 that poverty based on these absolute standards would be eliminated by 1980. However, we also show that reductions in relative poverty since 1963 have been far more modest, falling from 19.5 to 16.0 percent in 2019."

"we create a poverty measure, which we refer to as the absolute Full-income Poverty Measure (FPM). This measure maintains the same 1963 poverty rate as the Official Poverty Measure, matching Johnson’s baseline poverty rate (Johnson 1965). We hold poverty thresholds constant in inflation-adjusted terms using the Personal Consumption Expenditure (PCE) price index, which more accurately reflects price changes than the CPI-U inflation measure used for the Official Poverty Measure. Additionally, unlike the Official Poverty Measure, we include both cash and in-kind programs designed to fight poverty, including the market value of food stamps (now the Supplemental Nutrition Assistance Program, or SNAP), the school lunch program, housing assistance, and health insurance. Finally, we incorporate in a consistent way the technical improvements in how income is measured since the 1960s in both our measure of full income and in the new thresholds we create to anchor our new FPM poverty rate to the official poverty rate of 19.5 percent in 1963."
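To make the anchoring idea concrete, here is a minimal Python sketch of how an "anchored" absolute poverty rate of this general kind can be computed: fix the 1963 threshold so that 19.5 percent of households fall below it, update that threshold for inflation only, and recompute the share below it in a later year. The income samples, the price-index factor, and the function names are all invented for illustration; this is not the paper's data or code.

```python
# Minimal sketch of an "anchored" absolute poverty measure in the spirit of the
# FPM described above. Incomes and the price-index factor below are toy, invented
# numbers; only the anchoring logic follows the excerpt: set the 1963 threshold
# at the 19.5 percent point, then update it for inflation alone.
import numpy as np

def anchored_threshold(full_incomes_1963, target_rate=0.195):
    # Threshold chosen so that target_rate of households fall below it in 1963.
    return float(np.quantile(full_incomes_1963, target_rate))

def poverty_rate(full_incomes, threshold):
    # Share of households whose full income falls below the absolute threshold.
    return float(np.mean(np.asarray(full_incomes) < threshold))

# Hypothetical full-income samples (nominal dollars per household) -- not real data.
rng = np.random.default_rng(0)
incomes_1963 = rng.lognormal(mean=8.0, sigma=0.7, size=10_000)
incomes_2019 = rng.lognormal(mean=10.5, sigma=0.7, size=10_000)

# Hypothetical cumulative PCE price growth from 1963 to 2019 (illustrative only).
pce_factor_1963_to_2019 = 7.0

threshold_1963 = anchored_threshold(incomes_1963)          # anchored: 1963 rate is ~19.5%
threshold_2019 = threshold_1963 * pce_factor_1963_to_2019  # inflation-only update

print(poverty_rate(incomes_1963, threshold_1963))  # ~0.195 by construction
print(poverty_rate(incomes_2019, threshold_2019))  # falls if real full incomes have grown
```

The key property is that the threshold is fixed in real terms, so the measured rate falls whenever real full incomes rise in the lower part of the distribution; the actual FPM additionally broadens the income concept (cash plus in-kind transfers and tax benefits) before applying the threshold, which in a sketch like this would simply mean constructing the income arrays more carefully.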
Mistaken About Poverty (Matthew Desmond's book Poverty, by America)
By Samuel Gregg. Excerpt:
"Let’s begin with Desmond’s core claims about extreme poverty. According to Desmond, America is characterized by “a kind of extreme poverty” of the “bare feet and swollen bellies” variety. This claim flies in the face of extensive evidence that the real poverty upon which his book focuses attention has—far from growing—been radically diminished.
Take, for instance, a recently released 2023 Journal of Political Economy study. Employing what the authors call “an absolute Full-income Poverty Measure (FPM),” which “uses a fuller income measure” rather than the official poverty rate, “and updates thresholds only for inflation,” this paper showed that since the beginning of President Johnson’s War on Poverty, the “absolute FPM rate fell from 19.5 to 1.6 percent.”
That is an amazing achievement. It indicates that, statistically speaking, the war against serious poverty has effectively been won. Moreover, when we add to this mix the fact that the poor in America generally have cellphones, air conditioning, cars, are not even close to starving, etc., we see that, in terms of consumption patterns, the realities about poverty in America simply do not match Desmond’s very bleak portrayal.
In fairness, it should be pointed out that the authors of the JPE study note that, unfortunately, “relative poverty reductions have been modest.” That is certainly something to be concerned about. But they also stress another trend: that “government dependence increased over this time, with the share of working-age adults receiving under half their income from market sources more than doubling.” The economic and social implications of this unfolding development, which appear to be disproportionately affecting working-age males, are just as much a cause for worry.
What’s curious about this particular trend is that the FPM fell in the 1990s along with a fall in welfare dependency among black children, black working-age adults, and working-age adults in general. That period coincides with the welfare reforms passed by a Republican Congress and a Democratic president in the middle of the decade. This suggests, as the JPE authors observe, that “a rise in dependence is not a necessary condition for a reduction in poverty.” That is very good news insofar as it indicates, at a minimum, that you can reduce poverty and diminish welfare dependency at the same time. Poverty alleviation, in other words, need not facilitate soft despotism.
One can also question Desmond’s claims about poverty in America compared to other wealthy nations. America, Desmond states, is “the richest country on earth, with more poverty than any other advanced democracy.” Again, the numbers don’t indicate this.
In 2019, for instance, the National Academy of Sciences published A Roadmap to Reducing Child Poverty. Among other things, it included an analysis of child poverty rates across major Anglophone countries. According to its absolute measure of deprivation, the child poverty rate in Canada (10.3 percent) and Ireland (11.3 percent) is only slightly lower than that of the United States (12.5 percent), while Britain’s (13.5 percent) is slightly higher than America’s. The Anglophone country that does the best in this category is Australia (8.1 percent).
Desmond might counter that the measurement he is using identifies the poverty level at half of the median income of the advanced democracies. But it is precisely because America has some of the highest median incomes in the world that relative poverty measurements make it seem poorer. That’s why an absolute measure of deprivation is a far more meaningful point of comparison between America and other advanced economies.
Putting aside the questionable statistical foundations for his claims, another dimension of Desmond’s argument merits considerable scrutiny. This concerns his contention that the wealthy actually benefit from the poverty endured by their fellow Americans. Put another way, the poor are poor because not-poor Americans and policymakers will it to be so. That is quite an assertion, but it turns out to be as doubtful as Desmond’s use of poverty measurements.
An example of this concerns the minimum wage. “Corporate profits rise,” Desmond says, “when labor costs fall.” According to Desmond, it benefits American businesses to keep the minimum wage as low as possible because it boosts their profits. That, he believes, translates into effectively locking particular categories of people into subsistence wages. It follows that the minimum wage must be raised.
Increasing minimum wage rates, however, will not likely pull significant numbers of Americans out of poverty. Moreover, Desmond himself acknowledges that going down that path will probably cost jobs. Many employers will respond to minimum wage increases by reducing their number of employees either by consolidating positions or turning to automation to replace people. Minimum wage increases also tend to price entire categories of people out of, say, entry level jobs. (Think unskilled workers, young people less interested in an income than they are in acquiring basic work skills, etc.) In any case, Desmond doesn’t account for the fact that, in developed nations like the United States, a higher degree of average labor productivity generally translates into higher average wages, and minimum wages have little to do with productivity.
A similar observation may be made about Desmond’s belief that America needs bigger and stronger trade unions (a claim, incidentally, also being made by interventionists on the conservative side of American politics today). That, Desmond believes, is one way to reverse what he believes to be the anemic growth in wages that helps account for considerable poverty in the United States.
That claim, too, runs into a basic objection: wages and incomes for average workers have not, in fact, been stuck for the past 30 years. As the economist Michael R. Strain observes in a Project Syndicate article entitled “The Myth of Income Stagnation”:
According to the CBO, median household income from market activities—labor, business, and capital income, as well as retirement income from past services—was not stagnant from 1990 to 2019. Instead, after adjusting for inflation, it grew by 26%. This is in line with wage growth. By my calculations using Bureau of Labor Statistics (BLS) data, inflation-adjusted average wages for nonsupervisory workers grew by around one-third over this period.
Moreover, a more comprehensive measure of the flow of financial resources available to households for consumption and savings helps to account for the non-market income they received and for the taxes they paid. After factoring in social insurance benefits (from Social Security and unemployment insurance, for example), government safety-net benefits (such as food stamps), and federal taxes, the CBO finds that median household income increased by 55% from 1990 to 2019, which is significantly faster than wage growth and certainly not stagnant. The bottom 20% of households enjoyed even greater gains, with market income growth of 51% and after-tax-and-transfer income growth of 74%.
None of this is to suggest that everything stated by Desmond in this book is wrong. In fact, there are some important points that he makes that should be highlighted. Desmond notes, for example, that a good deal of welfare spending goes to people who are not its intended recipients. That includes lawyers who make money out of suing the government, as well as middle-class families with bright accountants skilled at extracting considerable amounts of largesse from the government.
Another instance where Desmond is correct concerns his attention to the ways in which regulations and ordinances severely limit opportunities for housing construction in many parts of the country. The effect is to put home ownership—and the many positive cultural, social, and economic effects of owning property—out of reach of a considerable number of Americans. This also makes it more difficult and more expensive for people to leave their suburbs, towns, or even states to pursue work opportunities. Those who consequently find themselves least able to make such major changes in their lives are those on the lower end of the income scale. The solution is to reduce the scope of regulations applying to housing construction: in other words, to liberalize some of the conditions surrounding the housing market. It is not clear to me, however, that Desmond would be willing to accept this.
In the end, curiously enough, Desmond’s primary preferred approach for addressing poverty is less about policies than it is about changes in attitude. Economically well-off Americans, he argues, need to take off their blinders about those in need around them and alter their choices and actions accordingly.
That means rethinking things ranging from where we shop and how we invest our capital to whom we employ and where we choose to live. “We must ask ourselves,” Desmond writes, “and then ask our community organizations, our employers, our places of worship, our schools, our political parties, our courts, our towns, our families: What are we doing to divest from poverty?” It is more than a whole-of-government approach to poverty that Desmond is calling for; he wants a whole-of-society approach to “finally put an end to it.”
The difficulty with all this is that America has already put an end to the type of poverty that certainly should seriously bother Americans. But the broader problem with his concluding recommendation is that the key to poverty reduction is long-term economic growth. And economic growth is delivered when people are allowed to pursue their self-interest peacefully in a context of rule of law, constitutionally limited government, private property rights, and dynamic entrepreneurship.
The fact that these conditions have been the exception rather than the norm for most of human history is why poverty was, until relatively recently, the everyday economic reality experienced by most humans. Understanding this and then acting accordingly is the attitudinal and behavioral shift that will give us an America that lives up to its promise."
Why Has the Left Sided With Hamas? Antisemites Are Recasting Themselves as Victims
By Ralph Schoellhammer. He is an assistant professor in economics and political science at Webster University, Vienna.
"The 19th century Austrian socialist Ferdinand Kronawetter once remarked, "Antisemitism is the socialism of idiots." He was not wrong, but I believe even he would have been surprised that most of these idiots are either celebrities or attend the West's most prestigious educational institutions.
From Greta Thunberg to Harvard Yard, you can barely throw a stone without hitting someone holding an anti-Israel rally. Apparently, we also have normalized Jewish students needing to be locked up in university libraries for their own "safety," or the BBC regularly publishing blood libels, first about a supposed bombardment of a hospital in Gaza by Israeli forces, and a few weeks later with the claim that Israeli Defence Forces are intentionally targeting medical personnel and Arabic speakers.
But even that is, in fact, old news: In January 2009, the French public broadcaster France-2 aired footage of Palestinians killed in an Israeli air raid on New Year's Day. As it turned out, however, the recording was from 2005 and not 2009, and the victims were not killed by the IDF but by an "accident" of Hamas explosives detonating prematurely. And on it goes: Recently, pictures showing the carnage inflicted on the Syrian people by the Assad regime made the rounds claiming to show destruction in Gaza.
Who needs "The Protocols of the Elders of Zion" when you have the mainstream media?
Sometimes it seems as if no ideology evolves as quickly as antisemitism: Before Israel, the Jews were hated for being rootless and without national allegiances. Now that they have a state, they are hated for that. And if the last Jew were to leave the Middle East tomorrow, some other reason would be found to justify Jew-hatred.
In fact, after 1948, Jews had to leave most of the Middle East and North Africa: Once thriving communities from Morocco to Iraq ceased to exist after the regimes of these countries drove out their Jewish populations. Many of them moved to Israel, the only state in the region that actually allows Muslim Arabs and Jews to live side-by-side.
Alas, if you are Jewish, it is never enough: Unless all of the Middle East is "Judenfrei," the Jews will always be painted as oppressors and everyone else as oppressed.
So in some ways, it's not surprising to see antisemitism thrive in the ranks of the global leftist elite. When global climate celebrity Greta Thunberg blabbers on about "no climate justice on occupied land," she is only revealing the next iteration of Jew hate. It turns out that for the "environmental movement," the solution to the "Jewish Question" seems to be more important than addressing climate change.
Similarly, in the famously tolerant Netherlands where government advisors are openly pushing for the normalization of paedophilia, the line needs to be drawn somewhere: Several filmmakers pulled out of the International Documentary Festival in Amsterdam after the organizers refused to allow their stage to be used for promoting the eradication of Israel.
Pot, porn, and paedophilia are all a go for the Dutch, but being pro-Israel? One should be careful not to go too far!
Another turn-of-the-century Austrian who would have a lot to say about this would probably be Sigmund Freud: If the West is suffering from the pathologies of historical guilt for all the alleged sins of its past, Israel and the Jews are the favourite object of projection. If the Jews are as bad as the Nazis and European colonialists, then opposing them is like re-running history, but this time those Westerners on college campuses can finally be on the side of the oppressed.
Both Israel and the Palestinians are just extras in an exercise of excessive narcissism that allows one to stand "on the right side of history" and virtue signal at no cost. Of course, this doesn't apply when Arabs are slaughtering Arabs. There were no protests for the peoples of Syria or Yemen or when ISIS was committing an actual genocide against the Yazidis.
That's what this all comes down to: a rewriting of history so that the Jews' oppressors can absolve themselves of guilt and claim to be the oppressed. During the Holocaust, the Jews experienced the worst that men can do to their fellow man, and many in the West are itching for the opportunity to find historic salvation in creating a moral equivalence between the Nazis and the state of Israel.
Neither Greta Thunberg nor her acolytes have any clue about Middle Eastern or Jewish history, because it is not about that—just as her climate activism was never about climate, but the usual Leftist tropes from colonialism to social justice.
In 1968, Eric Hoffer wrote about a premonition that would not leave him: "As it goes with Israel, so will it go with all of us. Should Israel perish, the Holocaust will be upon us." After all we have seen in recent weeks in Western capitals and on college campuses, I am afraid that he was right."