Monday, April 30, 2018

The Job Corps Failure: Taxpayers spend billions on a training program that doesn’t deliver

WSJ editorial.
"Nearly 50,000 people enrolled in 2017, and 87% lived in Job Corps dorms."

"Such comprehensive support doesn’t come cheap—the taxpayer cost per student last year was $33,990—and the IG suggests that the investment often doesn’t pay off."

"in 27 of 50 cases where full employment data existed, graduates were working the same sort of low-wage, low-skill jobs they held before training. One participant completed 347 days of Job Corps carpentry training but five years later worked as a convenience-store clerk for $11,000 a year."

"the program matched more than 1,500 students with “jobs that required little or no previous work-related skills, knowledge, or experience, such as fast food cooks and dishwashers that potentially could have been obtained without Job Corps training.” The audit also found Job Corps had placed nearly one in five graduates in jobs that “did not relate or poorly related to the students’ training.”"

"The new report suggests that Job Corps’ biggest beneficiaries may be government contractors, not rookie job seekers. Job Corps spent more than $100 million between 2010 and 2011 on transition-service specialists to place students in a job after training.

But among 324 sampled Job Corps alumni, the IG found evidence that contractors had helped a mere 18 find work. The contractors often claimed credit for success even though they provided no referrals or résumé and interview help. Overall, the IG estimates that Job Corps paid contractors some $70.7 million for transition services they failed to adequately perform."

Due to the tax overhaul, top earners will pay a higher share of income taxes

Top 20% of Americans Will Pay 87% of Income Tax: Households with $150,000 or more in income make up 52% of total income nationally but pay a large portion of total taxes, by Laura Saunders of The WSJ.
"For 2018, households in the top 20% will have income of about $150,000 or more and 52% of total income, about the same as in 2017. But they will pay about 87% of income taxes, up from about 84% last year.

By contrast, the lower 60% of households, who have income up to about $86,000, receive about 27% of income. As a group, this tier will pay no net federal income tax in 2018 vs. 2% of it last year."

"Roughly one million households in the top 1% will pay for 43% of income tax, up from 38% in 2017. These filers earn above about $730,000."

"the share of taxes paid by the top 5% will rise despite the fact that people in it were the largest beneficiaries of the overhaul’s tax cut, both in dollars and percentages."

"The share of tax paid by the top 20% of Americans also changes when social-insurance levies are included. It drops to about 67% of federal taxes from roughly 87% of income taxes."

Sunday, April 29, 2018

Government aid for firms with few employees is bad economics and bad policy

Stop Propping Up Small Business by Robert D. Atkinson and Michael Lind. Mr. Atkinson is the president of the Information Technology and Innovation Foundation. Mr. Lind is a visiting professor at the University of Texas's Lyndon B. Johnson School of Public Affairs.
"Economist David Birch of the Massachusetts Institute of Technology claimed in the late 1970s—inaccurately, as it turned out—that small businesses were the jobs engine of the economy, which allowed advocates to argue that aid to small businesses was a driver of economic growth."

"A 2010 study published by the National Bureau of Economic Research showed, however, that it is the age of a firm, not its size, that matters for job creation. Just as children grow faster than adults, young firms grow faster than mature ones.

Over the decades, lawmakers have conferred an array of valuable benefits and exemptions on small firms. In the years since the founding of the Small Business Administration (SBA), Congress has passed at least 68 pieces of legislation explicitly favoring small business, including the Small Business Prepayment Penalty Relief Act of 1994, the Small Business Job Protection Act of 1996 and the SEC Small Business Advocate Act of 2016."

"firms with fewer than 11 employees are exempt from most workplace safety requirements. The Family and Medical Leave Act does not apply to small firms"

"A firm with fewer than 20 employees can legally discriminate against workers on the basis of age; if it has fewer than 15 employees, it can discriminate against qualified individuals with disabilities. Only federal contractors with 50 or more workers are required to use affirmative action plans when hiring."

"Profits from publicly traded companies (most of which are large) are typically taxed twice, once at the corporate level and again when shareholders accrue capital gains or dividends. By contrast, pass-through firms such as sole proprietorships, partnerships and LLCs—the lion’s share of which are small—are taxed only once, on the owners’ incomes."

"many tax incentives either apply only to small firms or are more generous for small firms, including (at least before recent tax-reform legislation) the ability to expense investments in new equipment, exemptions from imputed interest obligations, completed-contract rules, expensing of agricultural costs and a host of others."

"To give small businesses a leg up, federal agencies are required to buy goods and services from them even when their prices are higher. Federal agencies also provide a variety of special subsidies to small firms. Small firms get discounts when buying rights to use the radio spectrum and pay lower patent fees."

"most special favors for small businesses . . . have nothing to do with their difficulty in coping with regulation."

"Don’t all the breaks for small businesses at least help “the little guy”? No, in fact, they go mostly to the wealthy. In 2016, according to the nonpartisan Tax Policy Center, the top 1% of pass-through businesses earned 50.8% of the income for such firms. A mere 13.4% of all pass-through income went to the bottom 60%."

"“More wealthy individuals are small-business owners than poor individuals. The subsidy on small-business ownership just transfers resources to the wealthy from the poor.”"

"subsidies and other size-based industrial policies slow productivity growth by enabling less efficient small firms to gain more market share than would otherwise be the case. Second, discriminatory policies provide an incentive for small firms to remain small."

Federal programs created to support homeownership dating to the 19th century “largely, and in some cases, exclusively benefited whites”

See Blacks Still Face a Red Line on Housing. NY Times editorial. Excerpts:
"In the 2017 volume “The Fight for Fair Housing: Causes, Consequences and Future Implications of the 1968 Federal Fair Housing Act,” the housing expert Lisa Rice notes that federal programs created to support homeownership dating to the 19th century “largely, and in some cases, exclusively benefited whites” while making it difficult for black citizens to achieve the dream of owning homes and land.

As enslaved people, she writes, black Americans were initially unable to take advantage of the Homestead Act, under which the government encouraged westward migration by giving away tens of millions of acres to settler citizens. Former slaves gained full citizenship with the 14th Amendment and became eligible for land grants, but that right became irrelevant with the collapse of Reconstruction, the rise of Jim Crow and the limitations on the rights of black people that the Southern states placed in their constitutions.

The pattern of exclusion continued into the Great Depression, when programs aimed at rescuing homes from foreclosure were carried out in a patently racist fashion. The Home Owners’ Loan Corporation, established in the 1930s to refinance mortgages, set a discriminatory pattern when it drew lines around black communities — a system known as “redlining” — and decreed them unsafe for federal investment.

That system of exclusion was picked up with disastrous effect by the Federal Housing Administration, created in 1934 to encourage homeownership with federally backed mortgage insurance. A 2017 study by the Federal Reserve Bank of Chicago found “evidence of a long-run decline in homeownership, house values and credit scores” that persists to this day in the formerly redlined neighborhoods."

Saturday, April 28, 2018

Oil Companies Pay More In Taxes Than They Make In Profits

Christian Science Monitor Letter on Greedy Oil Companies by Don Boudreaux.
"In this November 16th, 2005, letter in the Christian Science Monitor I pointed out that, however greedy U.S. oil companies are, U.S. politicians are greedier:

In his Nov. 10 editorial cartoon, Clay Bennett shows an oil-company executive sitting smugly in front of charts showing that “audacity” and “temerity” are rising along with profits.

Is it audacious to risk billions of dollars annually to explore for oil? Is it temerity to enjoy high profits in some years, knowing that other years will bring losses?

The truly audacious (and greedy) ones are the politicians who demagogue this issue. After all, since 1980 oil companies have paid taxes of $2.2 trillion (in 2004 dollars) – an amount more than three times higher than the profits these companies earned during the same period."

The burden for allowing private interests to flourish should be placed on the shoulders of the law

By Alberto Mingardi of EconLog.
If...we want businesses to be "responsible" for interests other than their shareholders', we are, it seems to me, "privatizing" public concerns.

"A friend pointed me to this piece published by Business Insider. Apparently, here Nobel Laureate Joe Stiglitz* explains that fellow Nobel Laureate Milton Friedman has to be blamed for high inequality and weak growth. The piece reports a conversation with Stiglitz, but I would maintain it is by and large the result of journalistic simplification, as I'm sure Stiglitz would have more rigorous, and persuasive, arguments. The article reads as quite a milkshake of far wider debate on free markets vs interventionism.

I am, however, quite impressed by the fact that Stiglitz focuses his own criticism on Friedman's rebuttal of corporate social responsibility:

In his highly influential 1962 collection of essays [sic], "Capitalism and Freedom," Friedman proclaimed that in a free economy, "there is one and only one social responsibility of business -- to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition, without deception or fraud."

The idea that "the business of business is business" implies, for Stiglitz and/or Mr Feloni, who summarises Stiglitz's thoughts, a belief in the "invisible hand", meaning by it both a tendency towards equilibrium and the happy coincidence of private and public interests.

Leaving aside long-standing controversies on the real importance of the metaphor in Adam Smith's thought (for a summary of the previous instalments, see this paper by Gavin Kennedy), is it really so? Does thinking that companies should focus on shareholder interest require a faith in unfettered competition, and vice versa?

I am not persuaded.

Friedman himself indeed wrote that in a free economy "there is one and only one social responsibility of business―to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game."

Also, again in Capitalism and Freedom, Friedman indeed argued that:

It is the responsibility of the rest of us to establish a framework of law such that an individual in pursuing his own interest is, to quote Adam Smith again, 'led by an invisible hand to promote an end which was no part of his intention.'

He quoted Smith's invisible hand, but he placed the burden for allowing private interests to flourish, thereby benefiting collective interests too, on the shoulders of the law. It is, in other words, institutions that make the pursuit of profit beneficial to society at large or, on the contrary, detrimental to it. Heavy regulations and omnipotent functionaries are likely to give us crony capitalism, in which a handful of beneficiaries clearly tend to have liberal access to the public purse.

We know that some influential people on the free market side are uneasy with the way in which Friedman phrased his own argument - for example, John Mackey, who has long argued for what he calls conscious capitalism. Personally, I regard these arguments more as a plea to understand that the entrepreneurial experience has a richer texture than profit maximisation than as a debunking of Friedman's reasoning.

But on the other hand, I think you can believe that the business of business is business without being particularly libertarian. In a way, the question is: if CEOs and entrepreneurs have a social responsibility wider than making profits for stockholders, how shall they know what it is? Do they have all the relevant information? They certainly make frequent mistakes even when they focus on one goal (creating value for shareholders). God forbid what may happen when they need to take care of many.

If you go beyond platitudes (be nice to your employees, keep good relationships with your suppliers), answering this question is problematic and involves an unmistakable exercise of discretion. For one thing, if the CEO were prioritising other goals over increasing shareholder value, he would de facto be imposing his own worldview and worries (whatever they are) on shareholders. He would be playing a political, not a managerial, role. In this sense, it doesn't change much if he does so of his own will or because he is so "nudged" by regulators. What matters is that you are using resources that were supplied to you for a certain end (generating a profit) for another.

Widening the scope of business responsibility seems to me to be largely different from applying general norms to a company's activity. In the latter case, we are fully consistent with Friedman's words: we may differ on what it takes to make private and collective interests compatible, but that's the game.
If, on the other hand, we want businesses to be "responsible" for interests other than their shareholders', we are, it seems to me, "privatizing" public concerns.

I've always thought that over-emphasising corporate social responsibility was an unwilling admission, on the part of the left, of the government's inefficiency in fostering the goals dearest to it. So the burden gets shifted to private business.

In a sense, Friedman was simply stating the need for a separation of business and politics, not unlike the separation of state and church. Isn't that something which should be welcomed by people who care about transparency and fear lobbying and cronyism, whatever the source?

In the Feloni piece, Stiglitz does much more than this. He basically offers his own view of one century of economic (and, indeed, political) debate. I don't think the issue of business responsibility is a good focal point for that long a history.

* Note that Stiglitz discussed inequality with Russ Roberts in this 2012 EconTalk episode."

Friday, April 27, 2018

A 10 percent increase in the effective federal regulatory burden upon a state is associated with about a 2.5 percent increase in the poverty rate

See Regulation and Poverty: An Empirical Examination of the Relationship between the Incidence of Federal Regulation and the Occurrence of Poverty across the States by Dustin Chambers, Patrick McLaughlin and Laura Stanley of Mercatus. 
"“They should regulate that!”

This is an instinctive response to any perceived problem. Regulation, however, can have unintended consequences. Before opting for regulation, policymakers should carefully assess the consequences of such an action, particularly how it will affect low-income Americans.

Regulation can reduce real incomes, diminish entrepreneurship, increase income inequality, and raise the price of consumer goods. Regulation is very often regressive—low-income individuals bear a disproportionate share of the cost. It is not unreasonable to expect that regulation also increases the number of people living in poverty. Until recently, however, the lack of state-level data has made it impossible to quantify the impact of regulation on poverty rates in the United States.

Dustin Chambers, Patrick A. McLaughlin, and Laura Stanley seek to answer this question in “Regulation and Poverty: An Empirical Examination of the Relationship between the Incidence of Federal Regulation and the Occurrence of Poverty across the States.” This paper is the first to estimate the relationship between poverty and regulation.

The three key elements of the paper are the innovative data used, the main finding, and the practical recommendation to policymakers and regulators.
  • New data source—the FRASE index. For their study, the authors employ the Mercatus Center’s recently created Federal Regulation and State Enterprise (FRASE) index. This is an industry-weighted measure of the burden of federal regulations at the state level. 
  • Relationship between regulation and poverty. The authors were careful to control for the other factors known to influence poverty rates. Having done this, they find a robust, positive, and statistically significant relationship between the FRASE index and poverty rates across states. A 10 percent increase in the effective federal regulatory burden upon a state is associated with about a 2.5 percent increase in the poverty rate.
  • Policymaker recommendation. When weighing costs and benefits, policymakers should always be mindful of the unintended rise in poverty that is associated with additional regulations. This relationship between regulation and poverty should give policymakers reason to pause the next time they hear the demand, “They should regulate that!”"
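
To unpack the headline number: "a 10 percent increase in burden is associated with about a 2.5 percent increase in poverty" is how one reads an elasticity of roughly 0.25 off a log-log regression. A sketch of the implied arithmetic (the log-log form is the standard way such elasticities are estimated; the paper's exact specification may differ):

```python
import numpy as np

# In a log-log model, ln(poverty_rate) = a + b*ln(frase_index) + ...,
# the coefficient b is the elasticity of poverty with respect to the
# regulatory burden. The "10% -> 2.5%" summary implies b of about 0.25:
implied_b = np.log(1.025) / np.log(1.10)
print(f"implied elasticity: {implied_b:.2f}")  # ~0.26

# Reading it forward: with b = 0.25, a 10% rise in the FRASE index
# predicts a poverty-rate increase of roughly 2.4%.
b = 0.25
print(f"predicted increase: {(1.10 ** b - 1) * 100:.1f}%")
```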

5 Reasons Why a 5-Point Takedown of an AEI Report on School Choice, Tests, and Long-Term Outcomes Misses the Mark

By Corey A. DeAngelis of Cato.

"They say you shouldn’t throw stones if you live in a glass house. Fordham Institute’s president, Michael J. Petrilli, recently threw five of them. He released a five-part series that supposedly showed why an American Enterprise Institute report incorrectly concluded that “a school choice program’s impact on test scores is a weak predictor of its impacts on longer-term outcomes.” But Petrilli’s flawed dissection of the AEI analysis failed to invalidate its conclusions or to establish that “impacts on test scores matter.” Here’s what he got wrong.

The most important weaknesses of the five-part series had to do with methodology. First, all of Petrilli’s re-analyses were based on a highly truncated sample. In fact, he dropped over a third of the original review’s studies linking test scores to high school graduation. Petrilli believed this was justified because they were not what he considered “bona fide” school choice programs. For example, he argued that career and technical education school evaluations should be dropped because “high-quality CTE could still boost high school graduation, postsecondary enrollment, and postsecondary completion” even though those schools spend less time shaping skills that are captured by standardized tests.

But that is precisely the point. A disconnect between programs’ effects on standardized test scores and on long-term outcomes suggests that test scores are not good proxies for long-term success. And if we regulate teachers and schools based on them, educators may have a perverse incentive to focus less on the character skills that are necessary for true lifelong success.

And, in every single case, the dropped studies included schools that students chose to attend. But it shouldn’t even matter if the studies were of schools of choice. Finding a divergence between short- and long-term outcomes — from any type of educational evaluation — should cause us to question the validity of test scores. Put simply, Petrilli should not have dropped over a third of the original report’s observations.

But assume that dropping observations was a good call. The much more astonishing methodological error was in counting null results as positive or negative. Petrilli argued that it would be “reasonable” to look for matches by “seeing whether a given study’s findings point in the same direction for both achievement and attainment, regardless of statistical significance. In other words, treat findings as positive regardless of whether they are statistically significant, and treat findings as negative regardless of whether they are statistically significant.” No serious social scientist would call that approach “reasonable.” This is because null results are statistically indistinguishable from zero.
But even when treating zeros as positive or negative, Petrilli still found disconnects between test scores and high school graduation 35 percent of the time for math and 27 percent of the time for reading. However, the original report found that 61 percent of the effects on math test scores — and 50 percent of the effects on reading test scores — did not predict effects on high school graduation. In either case, effects on test scores are unreliable predictors of effects on attainment.
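
The objection to counting nulls is easy to demonstrate by simulation: if a program truly moved neither test scores nor graduation, noisy estimates would still "point in the same direction" about half the time. A minimal sketch (the noise model is invented purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Suppose a program has ZERO true effect on both test scores and
# graduation. Each study still reports a noisy point estimate.
n_studies = 10_000
score_est = rng.normal(0.0, 1.0, n_studies)  # true effect is zero
grad_est = rng.normal(0.0, 1.0, n_studies)   # independent noise

# The sign-matching rule: call it a "match" whenever the two point
# estimates share a sign, ignoring statistical significance.
matches = np.mean(np.sign(score_est) == np.sign(grad_est))
print(f"sign matches from pure noise: {matches:.0%}")  # ~50%
```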

But that’s not all. The literature finding divergences isn’t limited to high school graduation and college enrollment. I have started to compile more evidence of these divergences that exist in the most rigorous private school choice literature. I’ve already found 11 disconnects between private schools’ effects on test scores and their effects on other arguably more important educational outcomes, such as tolerance of others, political participation, effort, happiness in school, and adult crime. For example, an experimental evaluation of a private school voucher program in Ohio found that winning the lottery to attend a private school had no effects on test scores but a 23 percent increase in students’ charitable donations in a lab setting.

And methods aren’t the only problem. There are some important logical errors to note as well.
Petrilli correctly points out that higher graduation rates could simply mean that individual schools have lowered their standards. In other words, high school graduation rates can be gamed. We have recent evidence of this in D.C. public schools. But Petrilli fails to point out that the same problem of gaming also applies to standardized tests. We also have lots of evidence of this from places like Atlanta. In fact, the corruption involved with using top-down metrics — of any kind — for accountability is so widespread that social scientists have given the principle its own name: Campbell’s Law. This is just another reason we should not regulate schools based on top-down metrics like test scores or even graduation rates.

But assume that no disconnects existed in the literature. Let’s also assume that test scores were indeed valuable predictors of all long-run outcomes we actually cared about. And let’s further assume that it was impossible to game the metric.

Regulators would still have a severe knowledge problem. How would they know which schools were the best at shaping test scores? Average test score levels would tell us nothing about how well the schools improved them. However, we could look at test score growth instead. And if the regulators were highly informed, they could use one of the most rigorous econometric methods social scientists currently have to determine schools’ effects on test scores: value-added methodology. The problem is that value-added methodology relies on the assumption that children are randomly assigned to schools. By definition, schools of choice fail that assumption. Because, you know, kids don’t choose their schools at random.
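
For readers unfamiliar with the method, a bare-bones value-added model is sketched below: regress current scores on prior scores plus school indicators, and read the school coefficients as "value added." The toy data, variable names, and choice of statsmodels are all illustrative assumptions; the caveat in the final comment is exactly the point made above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: one row per student, with a prior-year score, a current
# score, and the school attended. Invented for illustration.
rng = np.random.default_rng(1)
n = 1_000
df = pd.DataFrame({
    "school": rng.choice(["A", "B", "C"], n),
    "prior_score": rng.normal(0, 1, n),
})
df["score"] = 0.7 * df["prior_score"] + rng.normal(0, 1, n)

# Basic value-added model: current score on prior score plus school
# fixed effects. The school coefficients are the "value added."
vam = smf.ols("score ~ prior_score + C(school)", data=df).fit()
print(vam.params.filter(like="school"))

# Caveat: if families sort into schools on unobserved traits
# (motivation, resources), the school coefficients absorb that
# selection and no longer measure the school's causal effect --
# which is why the method assumes something like random assignment.
```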

In other words, even if we all believed test scores were valuable, and even if regulators used the best methodology available, they could still close down schools that were doing good things for their students.

But what happens when regulators close schools that are actually low-quality? Obviously, this causes disadvantaged children to switch schools, which itself has been found to reduce student achievement by over two months of learning. But that isn’t the only problem. Closing an objectively low-quality school could mean that children are displaced into an even worse institution. And there is absolutely no guarantee that a better institution will magically pop up.

The fact is, several studies show that test scores are weak proxies for the outcomes we actually care about. The weak predictive power of test scores suggests that policies incentivizing teachers and schools to improve these crude metrics could actually harm students in the long run.

But families already know this. When given the chance to choose their children’s schools, families consistently prioritize things like school culture and safety over standardized test scores. Maybe families know a little something about their own kids that the experts don’t know. And maybe the experts should learn to leave them alone."

Thursday, April 26, 2018

Did You Know the Greatest Two-Year Global Cooling Event This Century Just Took Place?

From The Global Warming Policy Forum

"Would it surprise you to learn the greatest global two-year cooling event of the last century just occurred? From February 2016 to February 2018 (the latest month available) global average temperatures dropped 0.56°C.

You have to go back to 1982-84 for the next biggest two-year drop, 0.47°C—also during the global warming era. All the data in this essay come from GISTEMP Team, 2018: GISS Surface Temperature Analysis (GISTEMP). NASA Goddard Institute for Space Studies (dataset accessed 2018-04-11 at https://data.giss.nasa.gov/gistemp/). This is the standard source used in most journalistic reporting of global average temperatures.

The 2016-18 Big Chill was composed of two Little Chills, the biggest five-month drop ever (February to June 2016) and the fourth biggest (February to June 2017). A similar event from February to June 2018 would bring global average temperatures below the 1980s average. February 2018 was colder than February 1998. If someone is tempted to argue that the reason for recent record cooling periods is that global temperatures are getting more volatile, it’s not true. The volatility of monthly global average temperatures since 2000 is only two-thirds what it was from 1880 to 1999.
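
These claims are straightforward to check against the monthly series the essay cites. A hedged sketch: it assumes you have already reshaped the GISTEMP download into a simple one-row-per-month CSV (the raw file at the GISS site is a year-by-month table, so it needs light parsing first), and the file name is mine.

```python
import pandas as pd

# Assumes a pre-cleaned CSV of monthly GISTEMP global-mean anomalies
# with columns: year, month, anomaly (deg C).
df = pd.read_csv("gistemp_monthly.csv").sort_values(["year", "month"])
t = df["anomaly"].reset_index(drop=True)

# Largest change between any month and the month 24 months later
# (Feb 2016 -> Feb 2018 is such a pair; the essay reports -0.56 C).
print("biggest 2-year drop:", t.diff(24).min())

# Volatility comparison: std dev of month-to-month changes since 2000
# vs 1880-1999 (the essay puts the ratio at about two-thirds).
monthly_change = t.diff(1)
pre = monthly_change[df["year"].values < 2000].std()
post = monthly_change[df["year"].values >= 2000].std()
print("volatility ratio (post/pre):", post / pre)
```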

None of this argues against global warming. The 1950s was the last decade cooler than the previous decade; the next five decades were all warmer on average than the decade before. Two-year cooling cycles, even if they set records, are statistical noise compared to the long-term trend. Moreover, the case for global warming does not rely primarily on observed warming; it has models, historical studies and other science behind it. Another point is that both February 1998 and February 2016 were peak El Niño months, so the record declines are starting from high peaks—but it’s also true that there have been many other peak El Niño months in the past century and none were followed by such dramatic cooling.

My point is that statistical cooling outliers garner no media attention. The global average temperature numbers come out monthly. If they show a new hottest year on record, that’s a big story. If they show a big increase over the previous month, or the same month in the previous year, that’s a story. If they represent a sequence of warming months or years, that’s a story. When they show cooling of any sort—and there have been more cooling months than warming months since anthropogenic warming began—there’s no story.

The public and media case for global warming, unlike the scientific case, depends heavily on short-term observation of actual temperatures. Biased reporting suggests warming is much steadier than it is. If the global temperature really showed half a century of uninterrupted warming—with only warming records, no cooling records—then people with nuanced views of plausible future temperatures could be dismissed as deniers. Annual atmospheric CO2 levels have gone up in pretty much a straight line since 1960; if temperatures did the same thing, the link to CO2 would be direct and obvious. In fact, it is real but complex, and those complexities are important for analyzing policy choices. […]

How should the global average temperature data be reported? There should be equal attention on warming and cooling records. Coverage should be based on how unusual the event is, not whether or not it increases support for favored policies. Ordinary events that have happened many times in the past, before global warming, should not be treated as evidence supporting global warming. Mildly unusual events, in the 2 to 3 standard deviation range, should be reported with strong qualifiers that they have little meaning compared to the overall weight of evidence for global warming. Very unusual events, 3 standard deviations and more, deserve investigation, even if that means burdening the reader with somewhat technical material.

Temperatures may climb from here, so these unusual cooling events need not make mainstream news. But unless that happens soon—and remember that would be bad news—climate reporters will have to discuss cooling, which will mean presenting a more complex story than has been typical in the past. I hope they are up to that task."

Temperatures Might Be Rising Only Half As Much As The Climate Models Predict

See Some More Insensitivity about Global Warming by Patrick J. Michaels of Cato.
"Hot off the press, in yesterday’s Journal of Climate, Nic Lewis and Judith Curry have re-calculated the equilibrium climate sensitivity (ECS) based upon the historical uptake of heat into the ocean and human emissions of greenhouse gases and aerosols. ECS is the net warming one expects for doubled atmospheric carbon dioxide. Their ECS ranges from 1.50 to 1.56 degrees Celsius.

Nic has kindly made the manuscript available here, so you don’t have to shell out $35 to the American Meteorological Society for a one-day view.

The paper is a follow-on to their 2015 publication that had a median ECS of 1.65⁰C. It was criticized for not using the latest-greatest “infilled” temperature history (in which less-than-global coverage becomes global using the same data) in order to derive the sensitivity. According to Lewis, writing yesterday on Curry’s blog, the new paper “addresses a range of concerns that have been raised about climate sensitivity estimates” like those in their 2015 paper.

The average ECS from the UN’s Intergovernmental Panel on Climate Change (IPCC) is 3.4⁰C, roughly twice the Lewis and Curry values. It somehow doesn’t seem surprising that the observed rate of warming is now running at about half of the rate in the UN’s models, does it?

Lewis and Curry’s paper appeared seven days after Andrew Dessler and colleagues showed that the mid-atmospheric temperature in the tropics is the best indicator of the earth’s energy balance. This means that any differences between observed and forecast mid-atmospheric temperatures there can be used to adjust the ECS.

Late last year, University of Alabama’s John Christy and Richard McNider showed that the observed rate of warming in the tropical mid-atmosphere is around 0.13⁰C/decade since 1979, while the model average forecast is 0.30⁰C/decade. This adjusts down the IPCC’s average ECS to the range of 1.5⁰C (actually 1.46⁰).
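
The adjustment described here is a simple ratio scaling, which is worth making explicit (a sketch of the implied arithmetic, not the authors' published calculation):

```python
# Scale the IPCC model-average ECS by the ratio of observed to
# model-forecast warming in the tropical mid-atmosphere.
ipcc_ecs = 3.4    # deg C, IPCC average ECS
observed = 0.13   # deg C/decade, observed trend since 1979
modeled = 0.30    # deg C/decade, model-average forecast

adjusted = ipcc_ecs * observed / modeled
print(f"adjusted ECS ~ {adjusted:.2f} C")
# ~1.47 with these rounded trends; unrounded inputs presumably give
# the 1.46 quoted in the text.
```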

That’s three estimates of ECS all in the same range, and all approximately half of the UN’s average.
It seems the long-range temperature forecast most consistent with these findings would be about half of what the IPCC is forecasting. That would put total human warming to 2100 right around the top goal of the Paris Accord, or 2.0⁰C.

Stay tuned on this one, because that might be in the net benefit zone."

Wednesday, April 25, 2018

Decarbonization: It Ain’t That Easy

By Daniel Raimi and Alan J. Krupnick of Resources for the Future.

"This week, the New York Times published an editorial entitled “Earth, Wind, and Liars” by economist Paul Krugman. In it, Krugman argues that the costs of renewable energy—wind in particular—have fallen so dramatically that “…there is no longer any reason to believe that it would be hard to drastically “decarbonize” the economy. Indeed, there is no reason to believe that doing so would impose any significant economic cost.”

While we share Krugman’s enthusiasm for the rapidly declining costs of wind, solar, energy storage, and other low- or zero-carbon technologies, the op-ed leaves readers with the impression that decarbonization would be cheap and easy if it weren’t for entrenched fossil fuel interests impeding government policy. We disagree.

While it’s certainly true that some in Washington, not least the Trump administration, have pursued policies aimed at slowing or reversing the recent reductions in US greenhouse gas emissions, there are still numerous economic and societal barriers to rapid decarbonization.

Economic Barriers

The energy system is enormous, and it changes slowly. Fossil fuels currently provide 81 percent of global primary energy; in the United States, the number is 80 percent. While wind and solar have grown rapidly in recent years, together they account for just 1 percent of the global energy supply, and in the United States just 2 percent. Even if they keep growing rapidly, the sheer scale of the energy system means that even the fastest transition would take many decades.
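
The "many decades" point can be made concrete with compound-growth arithmetic. A sketch in which the growth rate is an explicitly invented assumption, not a forecast, and which optimistically ignores the grid-integration constraints discussed later in this post:

```python
import math

# Wind and solar supply ~1% of global primary energy (per the text).
# Even at a sustained, generous growth rate, matching fossil fuels'
# current ~81% share takes decades. The 15%/yr growth in share is an
# assumption for illustration only.
current_share = 0.01
target_share = 0.81
growth_rate = 0.15

years = math.log(target_share / current_share) / math.log(1 + growth_rate)
print(f"~{years:.0f} years at {growth_rate:.0%}/yr growth")  # ~31 years
```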

Figure 1. Shares of Primary Energy Consumption. Sources: US Energy Information Administration (US) and the International Energy Agency (world).

Second, the growth in renewables we’ve seen to date has been supported by government subsidies, both in the United States and internationally. That’s not to say that renewables aren’t becoming more cost competitive, but in many parts of the United States and the world, fossil fuels continue to offer the lowest cost option for electricity generation, even with subsidies. This is particularly true in the United States, where the shale revolution will likely provide a low-cost supply of natural gas for decades to come.

And it is not like wind and solar come free of environmental concerns. The sheer size of wind and solar installations needed to underpin our electricity system is significant. According to MIT’s Future of Solar Energy study, powering one-third of US electricity demand in 2050 with solar would require 4,000 to 11,000 square kilometers (for context, Massachusetts’s area is 27,000 square kilometers). Wind farms take more land for the same power—66,000 square kilometers, although only a small portion of that is actually disturbed by installations (TheEnergyCollective has an insightful discussion on this topic). Even for relatively modest (from a national perspective) proposals—such as Texas’s goal for 14 to 28 gigawatts of new solar by 2030—there are concerns about habitat fragmentation, loss of endangered species and other impacts on the environment.

Krugman also forgets to mention nuclear power, which is responsible for about 20 percent of electricity generation in the United States. Nuclear plants are aging fast, with many retirements and few new reactors planned. The more that retire, the more other sources will have to fill in, upping CO2 emissions or creating a greater burden on renewables.

Keep in mind, decarbonization isn’t just about electricity. Achieving steep cuts in greenhouse gas emissions will require large reductions from the transportation, industrial, and heating sectors, which, in 2017, accounted for 62 percent of US primary energy consumption. While some of these energy services can be electrified via passenger vehicles, electric home heating, and other means, wind and solar are no replacement for fossil fuels in certain industrial and transportation applications (to his credit, Krugman acknowledges the impracticality of electrifying air travel). And despite years of subsidies, the percentage of electric vehicles in the fleet remains minuscule.

Indeed, consumption of petroleum products internationally is galloping ahead. This year alone, global demand for oil is set to grow by about 1.5 million barrels per day. This growth isn’t driven by lobbyists on Capitol Hill, but instead by strong economic growth, spurred by developing countries in Asia.

Societal Barriers—Distributional Effects

Setting aside the technological hurdles of decarbonization, it is important to remember that reducing GHG emissions will have winners and losers. While the aggregate economic effects may be relatively small (as RFF researchers have shown), the distributional effects of such a massive shift have political and social impacts that can’t be wished away.

First, lower income households will bear the largest relative burdens of the higher energy costs that are likely as a result of climate policies. While there are ways of mitigating these unequal impacts, they require difficult trade-offs.

Second, consider the effects of the downturn in Appalachian coal mining, where an entire region has struggled to cope with an energy transition. Now apply a similar logic to the hundreds of communities around the country that are, or have become, heavily reliant on oil and gas extraction as their economic base. Cities like Midland, Texas, or Williston, North Dakota, recently bursting at the seams because of the shale boom, would face fundamental challenges in a world devoid of fossil fuels. Is it any wonder that politicians representing these regions fight for the economic engine that underlies the wellbeing of their regions?

Providing assistance to the individuals and communities negatively affected by climate policies has been an important component of past legislative efforts, and must be acknowledged as a complex and daunting challenge in and of itself.

What to Do

This post has argued that deep decarbonization won’t be easy, and that fossil fuel companies and the policymakers who support them are far from the only impediment to achieving long-term climate goals. In the face of these myriad challenges, there are a variety of technological and policy measures that can ease the transition towards a low-GHG future.

On the technological side, entrepreneurs are pursuing a variety of strategies with large-scale potential. This includes new nuclear technologies, which can provide reliable electricity while substantially reducing the risks of older generation light-water reactors. It includes carbon capture, utilization, and sequestration (CCUS), which has the potential to reduce GHG emissions while continuing to enable fossil fuel consumption. It includes carbon dioxide removal (CDR), which can remove CO2 directly from the atmosphere, reducing the harm caused by emissions from decades past. It includes pursuing ever greater energy density of batteries at lower costs to make electric cars and energy storage more attractive. And, yes, it absolutely includes continued investment in renewables. Solar power, in particular, offers enormous potential to scale and provide electricity, and also perhaps liquid fuels.

To lay the path for decarbonization, policymakers can provide a variety of incentives. While subsidies to renewables and other technologies have been the instrument of choice in the United States in recent years, a more efficient strategy would put a price on greenhouse gas emissions and, possibly, subsidize stages of the development process that are resistant to such incentives. Such an approach could provide a roadmap for the investors of today, while laying the groundwork for the future technologies we can only dream about. In sum, we can see the path forward, but in the words of D'Angelo “it ain't that easy.”"

If Solar And Wind Are So Cheap, Why Are They Making Electricity So Expensive?

By Michael Shellenberger. He is President of Environmental Progress, a research and policy organization. 
"Over the last year, the media have published story after story after story about the declining price of solar panels and wind turbines.

People who read these stories are understandably left with the impression that the more solar and wind energy we produce, the lower electricity prices will become.

And yet that’s not what’s happening. In fact, it’s the opposite.

Between 2009 and 2017, the price of solar panels per watt declined by 75 percent while the price of wind turbines per watt declined by 50 percent.

And yet — during the same period — the price of electricity in places that deployed significant quantities of renewables increased dramatically.

What gives? If solar panels and wind turbines became so much cheaper, why did the price of electricity rise instead of decline?

Electricity prices increased by 51 percent in Germany during its expansion of solar and wind energy.
One hypothesis might be that while electricity from solar and wind became cheaper, other energy sources like coal, nuclear, and natural gas became more expensive, eliminating any savings, and raising the overall price of electricity.

But, again, that’s not what happened.

The price of natural gas declined by 72 percent in the U.S. between 2009 and 2016 due to the fracking revolution. In Europe, natural gas prices dropped by a little less than half over the same period.

The prices of nuclear and coal in those places were mostly flat during the same period.

Electricity prices increased 24 percent in California during its solar energy build-out from 2011 to 2017.

Another hypothesis might be that the closure of nuclear plants resulted in higher energy prices.
Evidence for this hypothesis comes from the fact that nuclear energy leaders Illinois, France, Sweden and South Korea enjoy some of the cheapest electricity in the world.

Since 2010, California closed one nuclear plant (2,140 MW installed capacity) while Germany closed 5 nuclear plants and 4 other reactors at currently-operating plants (10,980 MW in total).

Electricity in Illinois is 42 percent cheaper than electricity in California while electricity in France is 45 percent cheaper than electricity in Germany.

But this hypothesis is undermined by the fact that the price of the main replacement fuels, natural gas and coal, remained low, despite increased demand for those two fuels in California and Germany.

That leaves us with solar and wind as the key suspects behind higher electricity prices. But why would cheaper solar panels and wind turbines make electricity more expensive?

The main reason appears to have been predicted by a young German economist in 2013. 

In a paper for Energy Policy, Lion Hirth estimated that the economic value of wind and solar would decline significantly as they become a larger part of electricity supply. 

The reason? Their fundamentally unreliable nature. Both solar and wind produce too much energy when societies don’t need it, and not enough when they do. 

Solar and wind thus require that natural gas plants, hydro-electric dams, batteries or some other form of reliable power be ready at a moment’s notice to start churning out electricity when the wind stops blowing and the sun stops shining.

And unreliability requires solar- and/or wind-heavy places like Germany, California and Denmark to pay neighboring nations or states to take their solar and wind energy when they are producing too much of it.

Hirth predicted that the economic value of wind on the European grid would decline 40 percent once it becomes 30 percent of electricity while the value of solar would drop by 50 percent when it got to just 15 percent.
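
Hirth's finding is usually expressed as a "value factor" (the market value of a variable source relative to a constant one) that falls as penetration rises. A crude sketch that linearly interpolates through the two predictions just quoted; the linear shape is a simplification of mine, not Hirth's model:

```python
# Hirth's quoted points: wind worth ~40% less at a 30% share of
# electricity; solar ~50% less at a 15% share.
def value_factor(share, share_at_drop, drop):
    """Linear decline from 1.0 at zero penetration; a crude
    simplification of Hirth's modeled curves."""
    return max(0.0, 1.0 - drop * share / share_at_drop)

for share in (0.05, 0.15, 0.30):
    w = value_factor(share, 0.30, 0.40)
    s = value_factor(share, 0.15, 0.50)
    print(f"share {share:.0%}: wind value {w:.2f}, solar value {s:.2f}")
```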

In 2017, the share of electricity coming from wind and solar was 53 percent in Denmark, 26 percent in Germany, and 23 percent in California. Denmark and Germany have the first and second most expensive electricity in Europe.

By reporting on the declining costs of solar panels and wind turbines but not on how they increase electricity prices, journalists are — intentionally or unintentionally — misleading policymakers and the public about those two technologies.  

The Los Angeles Times last year reported that California’s electricity prices were rising, but failed to connect the price rise to renewables, provoking a sharp rebuttal from UC Berkeley economist James Bushnell.  

“The story of how California’s electric system got to its current state is a long and gory one,” Bushnell wrote, but “the dominant policy driver in the electricity sector has unquestionably been a focus on developing renewable sources of electricity generation.”

Part of the problem is that many reporters don’t understand electricity. They think of electricity as a commodity when it is, in fact, a service — like eating at a restaurant.

The price we pay for the luxury of eating out isn’t just the cost of the ingredients, most of which, like solar panels and wind turbines, have declined in price for decades.

Rather, the prices of services like eating out and electricity reflect the cost not only of a few ingredients but also of their preparation and delivery.

This is a problem of bias, not just energy illiteracy. Normally skeptical journalists routinely give renewables a pass. The reason isn’t because they don’t know how to report critically on energy — they do regularly when it comes to non-renewable energy sources — but rather because they don’t want to.

That could — and should — change. Reporters have an obligation to report accurately and fairly on all issues they cover, especially ones as important as energy and the environment. 

A good start would be for them to investigate why, if solar and wind are so cheap, they are making electricity so expensive."

Tuesday, April 24, 2018

Protecting U.S. Dredgers Kills Jobs: The Foreign Dredge Act of 1906 has stifled competition in the seaport industry

By Nancy McLernon in The WSJ. She is president and CEO of the Organization for International Investment. Excerpts:
"The Foreign Dredge Act was an effort to protect America’s fledgling shipbuilding and seaport industries so they could compete with old-world rivals. The law survives, mostly unchanged, and international companies remain in Washington’s regulatory drydock."

"The problem is that U.S. dredging companies simply aren’t capable of meeting demand. The four largest free-market dredging companies, all based in Belgium or the Netherlands, could complete the U.S. projects for half the estimated cost and a third of the time—if Washington allowed them to compete. In the past decade, these companies have invested $15 billion in new dredgers, while the entire U.S. market invested only $1 billion. The European equipment is larger than any American company’s and handles more than 90% of the world’s open-bid dredging projects.

These international companies have succeeded in their home markets and elsewhere. If they could set up operations in the U.S., they would bring world-class training and techniques and other benefits in addition to the needed capital. Opening the dredging market would spark an infrastructure boom that would result in thousands of new, unionized dredging jobs for Americans.

The real jobs jackpot, however, would come from having the ports deepened in the next five years. This accelerated modernization would create more than 1.5 million new American jobs in port construction, services, manufacturing, warehousing, trucking, logistics and more. Radically slicing export costs would spur manufacturing."

How Bad Is the Government’s Science?

Policy makers often cite research to justify their rules, but many of those studies wouldn’t replicate

By Peter Wood and David Randall in The WSJ. Mr. Wood is president of the National Association of Scholars. Mr. Randall is the NAS’s director of research. Excerpts:

"The biggest newsmakers in the crisis have involved psychology. Consider three findings: Striking a “power pose” can improve a person’s hormone balance and increase tolerance for risk. Invoking a negative stereotype, such as by telling black test-takers that an exam measures intelligence, can measurably degrade performance. Playing a sorting game that involves quickly pairing faces (black or white) with bad and good words (“happy” or “death”) can reveal “implicit bias” and predict discrimination.

All three of these results received massive media attention, but independent researchers haven’t been able to reproduce any of them properly. It seems as if there’s no end of “scientific truths” that just aren’t so. For a 2015 article in Science, independent researchers tried to replicate 100 prominent psychology studies and succeeded with only 39% of them.

Further from the spotlight is a lot of equally flawed research that is often more consequential. In 2012 the biotechnology firm Amgen tried to reproduce 53 “landmark” studies in hematology and oncology. The company could only replicate six. Are doctors basing serious decisions about medical treatment on the rest? Consider the financial costs, too. A 2015 study estimated that American researchers spend $28 billion a year on irreproducible preclinical research.

The chief cause of irreproducibility may be that scientists, whether wittingly or not, are fishing fake statistical significance out of noisy data. If a researcher looks long enough, he can turn any fluke correlation into a seemingly positive result. But other factors compound the problem: Scientists can make arbitrary decisions about research techniques, even changing procedures partway through an experiment. They are susceptible to groupthink and aren’t as skeptical of results that fit their biases. Negative results typically go into the file drawer. Exciting new findings are a route to tenure and fame, and there’s little reward for replication studies."
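
The "fishing" point is standard multiple-comparisons arithmetic: test enough noise at the usual 5 percent threshold and spurious "findings" are nearly guaranteed. A minimal simulation sketch:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Twenty "hypotheses" tested on pure noise at the usual 5% threshold.
n_tests, n_obs = 20, 100
false_positives = 0
for _ in range(n_tests):
    x = rng.normal(size=n_obs)
    y = rng.normal(size=n_obs)   # y is unrelated to x by construction
    r, p = stats.pearsonr(x, y)
    if p < 0.05:
        false_positives += 1

# Expected: ~1 in 20 tests comes up "significant" by chance, and the
# chance of at least one hit is 1 - 0.95**20, about 64%.
print(f"spurious 'findings': {false_positives} of {n_tests}")
```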

"A deeper issue is that the irreproducibility crisis has remained largely invisible to the general public and policy makers. That’s a problem given how often the government relies on supposed scientific findings to inform its decisions. Every year the U.S. adds more laws and regulations that could be based on nothing more than statistical manipulations.

All government agencies should review the scientific justifications for their policies and regulations to ensure they meet strict reproducibility standards. The economics research that steers decisions at the Federal Reserve and the Treasury Department needs to be rechecked. The social psychology that informs education policy could be entirely irreproducible. The whole discipline of climate science is a farrago of unreliable statistics, arbitrary research techniques and politicized groupthink."

Monday, April 23, 2018

Are Private Companies More Efficient Than NASA?

See Stargazers See a Business Plan by Randall Stross, a professor of business at San Jose State University. He reviewed two books for The WSJ. Excerpts:
"Both books show how SpaceX and Blue Origin have been impressively creative in reducing design and production costs far below what NASA and defense contractors are accustomed to. SpaceX builds its rockets horizontally, for example, so that it can use ordinary warehouse space instead of building expensive “high bay” space. SpaceX and Blue Origin are similar in their approach to reusability as well, constructing rockets, not planes, that rely on retropropulsion for landing. By flipping the descending rocket so that the nose points upward, and by switching on the rocket engine in the final stage, a cushion of hot gas provides a gentle landing.

Practical reusability also entails reducing the need for extensive refurbishing after each flight, something that SpaceX seems to have achieved. The space shuttle did not. After each flight, it required 1.2 million procedures and many months before it was ready to fly again."
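
The retropropulsion description above translates into simple kinematics. A toy "landing burn" sketch with invented numbers (real landings throttle continuously and contend with drag and wind; nothing here is from either book):

```python
# How hard must the engine brake a rocket falling at v0 m/s if the
# burn starts at height h, reaching zero velocity at the ground?
v0 = 250.0   # m/s, descent speed at burn start (assumed)
h = 2000.0   # m, altitude at burn start (assumed)
g = 9.81     # m/s^2, gravity still acts during the burn

# Constant net deceleration from v0^2 = 2 * a_net * h; the engine
# must supply this plus enough thrust to cancel gravity.
a_net = v0 ** 2 / (2 * h)
a_thrust = a_net + g
burn_time = v0 / a_net

print(f"net deceleration: {a_net:.1f} m/s^2")
print(f"thrust acceleration: {a_thrust:.1f} m/s^2 (~{a_thrust / g:.1f} g)")
print(f"burn time: ~{burn_time:.0f} s")
```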

"Virgin Galactic and Blue Origin have trained their sights on space tourism, preparing to offer wealthy passengers the chance to experience microgravity. “Entertainment turns out to be the driver of technologies,” Mr. Bezos says, noting the barnstormers in the early days of aviation who would land in farmers’ fields and sell tickets for short rides. SpaceX need not tarry with such trifles. It seems tantalizingly close to being the first startup to supply reliable, reusable rocket technology to take U.S. astronauts up to their orbiting workplace. Most gratifying, it will mean an end to NASA’s generous payouts to hitch rides with the Russians."

The Interstate Tax Grab

WSJ editorial.
"Online commerce makes up less than 10% of retail sales, and a 2017 report by the Government Accountability Office said 87% to 96% of sales by the top 100 online retailers are taxed. Amazon collects sales tax on all customer purchases, as do Target , Walmart , Costco and Sears. The major exceptions are small businesses that sell on eBay and Etsy.
 
GAO estimates that untaxed online sales make up between 2% and 4% of state and local sales tax revenues. Sales tax growth has been robust in states with healthy economies. South Dakota’s sales tax revenues have grown more than 5% annually over the last five years. Between 2012 and 2017, state and local sales tax revenues grew by a quarter."

"Some 12,000 jurisdictions in the U.S. impose sales tax, twice as many as in 1992, often with disparate rules and rates. Illinois taxes Twix and Snickers at different rates. Twix is taxed at a lower rate because it includes flour and thus qualifies as “food.” Snickers is considered candy. In New Jersey, yarn is tax-exempt only if used for knitting. How are retailers supposed to divine a buyer’s purpose?

Installing and maintaining software to comply with 12,000 tax regimes could break small businesses. One business told GAO “they had just dealt with an expensive audit that lasted 3 years” and “do not have the resources to comply with similar audits from other jurisdictions.” Businesses that collect too little tax can face stiff penalties including jail time. If they collect too much, they get slapped with class-action lawsuits.

The Justice Department has filed a brief supporting South Dakota, taking the odd position that Quill [the 1992 precedent that limits state tax-collection duties to retailers with an in-state physical presence] should be overturned because online retailers benefit from government-built broadband. Seriously? According to Justice, businesses that operate a website have a “virtual” presence everywhere. The European Commission has invoked the same argument to impose a digital tax on Silicon Valley tech giants, which the Trump Administration has denounced as an extraterritorial tax grab.

If the Court were to adopt Justice’s virtual standard, there would be nothing to stop California from requiring remote retailers to post cancer warnings on coffee or potato chips advertised on their websites. This would vitiate the Commerce Clause."

Sunday, April 22, 2018

Medicaid Crowds Out K-12 Education

See Crowding Out K-12 Education: The real budget story behind those teachers strikes: Medicaid and public pensions. WSJ editorial: Excerpts:
"Medicaid has taken a growing toll on Oklahoma’s budget. In 2017 the health-care program that is supposedly for the poor consumed nearly 25% of the state’s general fund, up from 14% in 2008"

"Per-student funding declined by nearly 16% between 2008 and 2017. Class sizes have grown, particularly in rural districts. Ninety-six of the state’s 513 school districts hold class only four days a week.

Oklahoma teachers went a decade without a significant raise, and only three states pay less on average, according to the National Education Association. Depending on which grade they teach, Oklahoma educators’ mean annual pay lags around $1,000 to $3,000 behind the overall state mean of $43,340."

"In Kentucky the protests have been about pensions, not pay, but the same Medicaid crowding out is taking place. The Bluegrass State was one of the first Medicaid expansion states under ObamaCare. Some 22% of residents—more than two million people—are enrolled. In 2008 Medicaid spending in Kentucky was $4.9 billion, but by 2017 it was $9.9 billion. The federal government paid $7.7 billion of that sum last year, but the burden has already begun shifting to states."

"Kentucky’s public pension woes place it on par with New Jersey and Illinois, and teachers’ pensions are only 56% funded. Participants can draw full benefits as early as age 49, and some collect longer for more years than they’ve worked."

Cuba's leaders fear that more economic freedom would lead to political liberalization

See Cuba Leaves Castro Era With Slim Prospects for Change by José de Córdoba of The WSJ. Milton Friedman thought that economic freedom was necessary for political freedom to exist. It is a necessary but not a sufficient condition. Excerpts:
"Cuba’s communist leaders are dedicated to maintaining their one-party system and reasserting state control of the economy after a brief surge in private enterprise fueled by the restart of diplomatic relations with the U.S. in 2015.

Mr. Castro’s likely successor, Miguel Díaz-Canel, a 57-year-old party apparatchik, has excoriated small-business owners as enemies of the state and said the U.S. opening to Cuba under President Barack Obama was meant to “destroy the Cuban revolution.”"

"But the Obama detente spurred a backlash from alarmed regime stalwarts scared that more economic freedom would lead to political liberalization. Last year, the government stopped issuing new licenses for restaurants and other businesses as officials railed against the new entrepreneurial class.

Mr. Castro himself, despite being the architect of the modest economic reforms that enabled private jobs, lashed out at the new entrepreneurs.

“There are reports of cases where the same person has two, three, four and as many as five restaurants,” he said in a speech to the National Assembly last year. “Someone who has traveled abroad as many as 30 times. Where did he get the money?”"

Saturday, April 21, 2018

Do Not Fear the Population Explosion: Malthus' fearmongering doesn't stand a chance against our innovation

By Chelsea Follett and Tirzah Duren of HumanProgress.org.
"Tom Hanks – of all people  – was recently discussing overpopulation on NBC’s Today show. He was doing it to promote his upcoming movie, Inferno, which is all about an overpopulation crisis. The actor claimed that we will have too many people “in an instant” and that the planet will be unable to support them. This is not a new idea. It dates back to the late 1700s, when Thomas Robert Malthus feared that large population would exhaust Earth’s resources and result in mass poverty and starvation.

Mr Hanks is not the first to echo Malthus’ concerns. Hollywood has a long history of making dystopian movies that paint a gloomy portrait of humanity’s future, and Malthusianism even remains popular among some university professors. Economist Jeffrey Sachs, who directs the Earth Institute at Columbia University, fears that “we might yet confirm the Malthusian curse”.

Yet in over 200 years, Malthus’ fears have not come to pass. We are not facing species-wide starvation: human innovation has brought hunger and poverty to record lows, and food production has climbed to new highs, as farmers have found new ways to produce ever more food per hectare of land.

[Figure: global crop yields]

According to the UN Food and Agriculture Organisation, the amount of land dedicated to agriculture globally has remained roughly stable since the early 1990s – approximately when the previous trend of expansion of agricultural land came to an end. In fact, since the turn of the new millennium, use of land for agriculture has fallen slightly. Around 26 million fewer hectares of land were farmed in 2013 than in 2000. Even so, this reduction occurred alongside a dramatic decrease in world hunger. We were able to reduce hunger because agricultural productivity increased.

[Figure: global agricultural land area]

Agricultural productivity has rapidly improved through the efforts of ordinary people engaged in innovation and exchange. It was an Iowan called Norman Borlaug who pioneered the development of hybrid crops through selective cross-breeding of plants, which enhanced certain desired traits. In the case of wheat, for example, he was able to create plants with shorter stalks. Less energy wasted on growing tall stalks meant more energy for growing the edible portion of the wheat. This technology helped to increase global grain output by an incredible 170 per cent between 1950 and 1992.

Even Professor Sachs acknowledges that if “technology enables us to economize on natural capital”, then we can avert a Malthusian disaster. Hearteningly, technology is helping us to do precisely that.

Not all academics are as pessimistic as Professor Sachs. There is a growing movement of “ecomodernists”, who believe that human ingenuity can help the planet. The New York Times admitted last year that the Earth is not facing a problem of overpopulation.

Environmental scientist and HumanProgress.org advisory board member Jesse Ausubel, who helped set up the world’s first climate change conference in Geneva in 1979, believes that agricultural land use will start to radically shrink. He has argued that if we “keep lifting average yields, stop feeding corn to cars, restrain our diets lightly, and reduce waste, then an area the size of India or the USA east of the Mississippi could be released from agriculture over the next 50 years or so”. And by freeing up land, we can allow nature to rebound.

Innovation doesn’t just reduce land use; it can also prevent other kinds of environmental depletion. As shown below, agricultural water use in OECD countries, global greenhouse gas emissions from agriculture, and American energy use for agriculture have all either decreased or remained stable while food production has increased.

[Figures: agricultural water use in OECD countries; greenhouse gas emissions from agriculture; U.S. energy use in agriculture]

Modern agriculture has married farming and technology to meet the nutritional demands of a growing population while limiting the environmental impact of doing so.

Even with a global population projected by some to reach over 11 billion by 2100 (which may be too high, given that population growth is now falling even in developing areas), there is still no need for alarm. Recent trends in agricultural development show that humanity can find ways of eliminating hunger, while limiting negative environmental impacts.

So the next time you hear a celebrity bemoaning overpopulation or watch a dystopian movie like Inferno, just remember: Malthus’ fearmongering doesn’t stand a chance against human ingenuity."
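
A back-of-envelope check on the 170 per cent grain figure quoted above (the compounding arithmetic is ours, not the authors’):

    # A 170% rise in global grain output between 1950 and 1992 means
    # output multiplied by about 2.7 over 42 years.
    growth_factor, years = 2.7, 42
    annual = growth_factor ** (1 / years) - 1
    print(f"Implied annual output growth: {annual:.2%}")
    # Roughly 2.4% a year, a modest-sounding rate that compounds
    # into a near-tripling over four decades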

The only thing you need to know about Earth Day

By Tim Worstall.
"Whale oil provided the lighting to read the breakthrough novel of 1870, the story of Captain Nemo in 20,000 Leagues Under The Sea. That was also the year of the foundation of Standard Oil. The result of that foundation is that we didn’t hunt the whales to extinction, but instead turned to kerosene to light the latter part of the 19th century, moving to electricity only in the 20th.

It really isn’t hyperbole to insist that John D. Rockefeller saved the whales by his making mineral oil products so much cheaper than the cetacean-derived equivalent. And that’s really all you need to know to understand Earth Day and what to do about it.

We need to be as viciously capitalist and free market as we can to save the planet.

This is not, as you will note, what is generally said about this day of celebration of all things environmental. Yet it's still the truth.

One thing noted by Simon Kuznets was that poor people, truly poor, don’t give a damn about the environment. Forests are burnt down, rivers clogged with sewage, animals hunted to extinction (the megafauna of every continent and island group underwent catastrophic extinctions just after man first arrived; this is not a coincidence), just because everyone’s too busy finding dinner for that day – or avoiding being something else’s dinner. Kuznets went on to note that richer people care and do more about the preservation of the natural world -- partly because we all just like looking at it knowing that it’s there, partly because richer people can have a longer time horizon.

This observation has been codified as the environmental Kuznets curve. There is some argument about exactly where the switch happens, but the environment generally starts getting better once incomes reach the $8,000 to $10,000 per person per year range of GDP. That’s about the average income in the world today. Back in the 1960s, the U.S. was at that level (when expressed in today’s money). That was around when we decided that, after a century of the Cuyahoga River catching fire, maybe we should stop doing that? We became rich enough to do so.

So, if we want the environment to get better, Mother Earth to heal her wounds, Gaia to recover, etc., we need to go hell for leather in making the world’s poorest as rich as possible as fast as possible. That’s the pre-condition for people to care enough to do all of those things which we know will benefit the environment.

The excellent news is we know how to do that.

We know there’s only one economic system which does do that. Every country and populace that has got rich has done so through some variant of capitalist free-marketry. No socialist system has managed it, no feudal, no planned, no fascist, certainly no communist. Every place and time which has engaged in what we today call neoliberal globalization has become rich. Thus, that's the system we need to use to save the environment, isn’t it?

Fortunately we have been using it these recent decades. The effects upon poverty have been startling. Absolute poverty, living on less than $1.90 a day of modern money, has fallen since Ronald Reagan moved into the White House. At that time, 45 percent of all humans lived in absolute poverty, but today fewer than 10 percent do. That’s another 35 percentage points of all of us getting rich enough to care about Gaia at least a little bit. The effects of that upon air (and other forms of) pollution have been equally startling. Sure, CO2 is still going up, but we can deal with that the same way: getting rich enough to both have the resources to do it and also rich enough to care to do so.

It’s wealth, human wealth, which saves the planet. Given that we’ve only one known way of creating wealth, neoliberal globalization, then that’s what we should be doing.

If you’re not a capitalist free-marketeer for Earth Day, then you’re not being serious about the environment, are you? And that would make Gaia sad and Mother Earth cry."
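
Worstall’s environmental Kuznets curve can be pictured with a stylized model (a minimal sketch; the quadratic-in-log-income form and the $9,000 turning point are assumptions chosen only to land inside the $8,000 to $10,000 range he cites, not estimates from data):

    import math

    # Stylized environmental Kuznets curve: environmental damage rises
    # with income at first, peaks, then falls as richer populations start
    # paying for a cleaner environment. Illustrative functional form only.
    def damage_index(income_per_head: float) -> float:
        x = math.log(income_per_head)
        peak = math.log(9_000)  # assumed turning point, in GDP per person
        return 10 - (x - peak) ** 2

    for income in (500, 2_000, 9_000, 40_000, 150_000):
        print(f"${income:>7,}: damage index {damage_index(income):5.2f}")
    # The index rises toward $9,000 and falls beyond it: an inverted U.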

Friday, April 20, 2018

How public and private bureaucracies stifle innovation

See The coagulated economy by Matt Ridley.


"While the world economy continues to grow at more than 3 per cent a year, mature economies, from Europe to Japan, are coagulating, unable to push economic growth above sluggish. The reason is that we have more and more vested interests against innovation in the private as well as the public sector.
Continuing prosperity depends on enough people putting money and effort into what the economist Joseph Schumpeter called creative destruction. The normal state of human affairs is what the jurist Sir Henry Maine called a “status” society, in which income is assigned to individuals by authority. The shift to a “contract” society, in which people negotiate their own rewards, was an aberration and it’s fading. I am writing this from Amsterdam and am reminded we caught the idea off the Dutch, whose impudent prosperity so annoyed the ultimate status king, Louis XIV.

In most western economies, it is once again more rewarding to invest your time and effort in extracting nuggets of status wealth, rather than creating new contract wealth, and it has got worse since the great recession, as zombie firms kept alive by low interest rates prevent the recycling of capital into new ideas. A new book by two economists, Brink Lindsey and Steven Teles, called The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth, and Increase Inequality, argues that “rent-seeking” behaviour — the technical term for extracting nuggets — explains the slow growth and rising inequality in the US.

They make the case that, in four areas, there is ever more opportunity to live off “rents” from artificial scarcity created by government regulation: financial services, intellectual property, occupational licensing and land use planning: “The rents enjoyed through government favouritism not only misallocate resources in the short term but they also discourage dynamism and growth over the long term.”

Here, too, hidden subsidies ensure that financial services are a lucrative closed shop; patents and copyrights reward the entertainment and pharmaceutical industries with monopolies known as blockbusters; occupational licensing gives those with requisite letters after their name ever more monopoly earning power; and planning laws drive up the prices of properties. Such rent seeking redistributes wealth regressively — that is to say, upwards — by creating barriers to entry and rewarding the haves at the expense of the have-nots. True, the tax and benefit system then redistributes income back downwards just enough to prevent post-tax income inequality from rising. But government is taking back from the rich in tax that which it has given to them in monopoly.
[Thomas Babington Macaulay MP summarised an early attempt to extend copyright in a debate thus: "The principle of copyright is this. It is a tax on readers for the purpose of giving a bounty to writers. The tax is an exceedingly bad one; it is a tax on one of the most innocent and most salutary of human pleasures; and never let us forget, that a tax on innocent pleasures is a premium on vicious pleasures." A correspondent sends me the following details of this appalling saga: "Someone noted that there is a divergence in copyright term in the European Union. All the then member states protect works for the life of the author plus fifty years while West Germany alone protects works for the life of the author plus seventy years. Immediately the copyright publishers suggested this as something in need of harmonisation. But instead of harmonising down to the norm, all the member states were lobbied to harmonise up to the unique German standard. As a result, Adolf Hitler's "Mein Kampf", which was going out of copyright in 1995, was suddenly revived and protected as a copyrighted work throughout the European Union. Gilbert and Sullivan operettas whose copyright had been controlled by the stultifying hand of the D'Oyly Carte Opera Company found themselves in a position to once again stop anyone else performing Gilbert and Sullivan works or creating anything based upon them. It is not surprising that, following a brief flowering of new creativity when the Gilbert and Sullivan copyrights initially expired (e.g. Joseph Papp's production of Pirates on Broadway and the West End stage), since their revival by the European Union harmonisation legislation their use has become effectively moribund. A generation of young people are growing up without knowing anything about Gilbert and Sullivan - an art form which, it can be argued, gave birth to the modern American and British musical theatre."]
 
As for occupational licensing, Professor Len Shackleton of the University of Buckingham argues that it is mostly a racket to exploit consumers. After centuries of farriers shoeing horses, uniquely in Europe in 1975 a private member’s bill gave the Farriers Registration Council the right to prosecute those who shod horses without its qualification.

Then there are energy prices. Lobbying by renewable energy interests has resulted in a system in which hefty additions are made to people’s energy bills to reward investors in wind, solar and even carbon dioxide-belching biomass plants. The rewards go mostly to the rich; the costs fall disproportionately on the poor, for whom energy bills are a big part of their budgets.

An example of how crony capitalism stifles innovation: Dyson found that the EU energy-labelling standards for vacuum cleaners were rigged in favour of German manufacturers. The European courts rebuffed Dyson’s attempts to challenge the rules, but Dyson won on appeal and then used freedom of information requests to uncover correspondence between a group of German manufacturers and the EU, while representations by European consumer groups were ignored.

So deeply have most businesses become embedded in government cronyism that it is hard to draw the line between private, public and charitable entities these days. Is BAE Systems or Carillion really a private enterprise any more than Oxford University, Oxfam, Oxfordshire county council or the NHS? All are heavily dependent on government contracts, favours or subsidies; all are closely regulated; all have well-paid senior managers extracting rent with little risk, and thickets of middle-ranking bureaucrats incentivised to resist change. Disruptive start-ups are rare as pandas; the vast majority of people work for corporate brontosaurs.

Capitalism and the free market are opposites, not synonyms. Some in the Tory party grasp this. Launching Freer, a new initiative to remind the party of the importance of freedom, two new MPs, Luke Graham and Lee Rowley, not only lambast fossilised socialism and anachronistic unions, but also boardrooms “peppered with oligarchical and monopolist cartels”.

One of the most insightful books of recent years was The Innovation Illusion by Fredrik Erixon and Björn Weigel, which argues that big companies increasingly spend their profits not on innovation but on share buybacks and other “rents”. Far from swashbuckling enterprise, much big business is “increasingly hesitant to invest and innovate”. Like Kodak and Nokia they resist having to reinvent themselves even unto death. Microsoft “was too afraid of destroying the value of Windows” to go where software was heading.

As a result, globalisation, far from being a spur to change, is an increasingly conservative force. “In several sectors, the growing influence of large and global firms has increasingly had the effect of slowing down market dynamism and reducing the spirit of corporate experimentation”.

The real cause of Trump-Brexit disaffection is not too much change, but too little. We need to “radically reduce the restrictive effect of precautionary regulation” and promote a new regulatory culture based on permissionless innovation, Erixon and Weigel say. “Western economies have developed a near obsession with precautions that simply cannot be married to a culture of experimentation”. Amen."