Friday, September 30, 2016

What President Obama Will Not Tell Leonardo DiCaprio About Climate Policy

By Marlo Lewis, Jr. of CEI.
"“President Obama will meet with actor Leonardo DiCaprio at an upcoming White House-sponsored arts festival to discuss the dangers posed by climate change,” the Washington Examiner reports. They will be joined by climate scientist Dr. Kathryn Hayhoe to examine “the importance of protecting the one planet we’ve got for future generations,” according to the White House website. Following the conversation, attendees will watch the U.S. premiere of DiCaprio’s climate change documentary film Before the Flood.

Mr. DiCaprio is one of my favorite actors, and I do not question his passion to protect humanity and nature. But as the saying goes, the road to hell is sometimes paved with good intentions. There are risks not only of climate change but also of climate change policy. However, the Titanic star will never learn that from President Obama or Dr. Hayhoe.

In his acceptance speech at the Oscars, Mr. DiCaprio said:

And lastly, I just want to say this: Making The Revenant was about man's relationship to the natural world. A world that we collectively felt in 2015 as the hottest year in recorded history. Our production needed to move to the southern tip of this planet just to be able to find snow. Climate change is real, it is happening right now. It is the most urgent threat facing our entire species, and we need to work collectively together and stop procrastinating. We need to support leaders around the world who do not speak for the big polluters, but who speak for all of humanity, for the indigenous people of the world, for the billions and billions of underprivileged people out there who would be most affected by this. For our children’s children, and for those people out there whose voices have been drowned out by the politics of greed. I thank you all for this amazing award tonight. Let us not take this planet for granted. I do not take tonight for granted. Thank you so very much. 

I encourage Mr. DiCaprio and others who share his convictions to open their hearts and minds to competing concerns and ideas. Climate change is not the most urgent threat facing humanity. Globally, poverty always has been and remains by far the number one cause of preventable illness and premature death.

The billions of underprivileged people who are the most vulnerable to climate change are also the most vulnerable to the vicissitudes of nature and climate generally. Why? In large part because they lack access to commercial energy. An estimated 1.3 billion people have no electricity, another 2.3 billion have too little electricity to support development, and many of those same people cannot afford automobiles and may never experience the personal mobility we take for granted.



By the same token, automobiles and other largely fossil-fueled technologies have dramatically reduced humanity’s vulnerability to climate-related risks. As energy scholar Alex Epstein puts it, human beings using fossil fuels did not take a safe climate and make it dangerous; they took a dangerous climate and made it much safer.

For example, historically drought has been the most lethal form of extreme weather, because it threatens the availability of food and water. In the 1920s an estimated 472,000 people worldwide died from drought. Since then, roughly 90 percent of all the carbon dioxide (CO2) emitted since the dawn of the Industrial Revolution has entered the atmosphere, CO2 concentrations increased from about 303 parts per million (ppm) to 400 ppm, and the planet warmed by about 0.8°C. If fossil-fueled development were “unsustainable,” the death toll from drought should be even larger today. Instead, total deaths and death rates related to drought declined by a spectacular 99.8 percent and 99.9 percent, respectively.
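To make the scale of that decline concrete, here is a quick back-of-the-envelope check using only the figures quoted above:

```python
# Back-of-the-envelope check of the drought-mortality decline cited above.
# Both inputs are the figures quoted in the text.
deaths_1920s = 472_000        # estimated worldwide drought deaths in the 1920s
decline = 0.998               # the cited 99.8 percent decline in total deaths

implied_deaths_today = deaths_1920s * (1 - decline)
print(f"Implied drought deaths per decade today: ~{implied_deaths_today:,.0f}")
# ~944 -- a toll in the hundreds rather than the hundreds of thousands,
# even though world population has roughly tripled since the 1920s.
```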



Drought is far less dangerous today thanks largely to fossil fuel-supported technologies (mechanized agriculture, synthetic fertilizers, refrigeration, plastic packaging) and capabilities (motorized transport, modern communications, emergency relief programs). Deaths and death rates related to other forms of extreme weather have also declined substantially since the good old days of ~300 ppm CO2. In short, the climate has become more livable.



In the process, by making agriculture fantastically more productive, fossil fuels also rescued nature from humanity. Cato Institute scholar Indur Goklany estimates that to maintain the current level of global food production without fossil fuels, “at least another 2.3 billion hectares of habitat would have to be converted to cropland”—an area equivalent to the territories of the United States, Canada, and India combined. Thus, he observes:

Not only have these fossil fuel-dependent technologies ensured that humanity’s progress and well-being are no longer hostage to nature’s whims, but they saved nature herself from being devastated by the demands of a rapidly expanding and increasingly voracious human population.

The politics of greed knows no boundaries of party or ideology. Indeed, politics everywhere from time immemorial is chiefly the organized pursuit of plunder. Greed is no stranger to global warming advocacy, which seeks to redistribute trillions of dollars in wealth from fossil-energy interests to alternative-energy interests via carbon taxes, cap-and-trade, renewable energy quotas, and other interventions designed to pick energy market winners and losers. Moreover, a major objective of the Paris Agreement is “mobilizing climate finance,” more commonly known as foreign aid, i.e., taxing poor people in rich countries for the benefit of rich people in poor countries.

A key point for Mr. DiCaprio to ponder is this: the actual climate influenced by fossil fuel emissions is already on track to meet the Paris Agreement’s 2°C climate “stabilization” target. But if we assume the validity of the modeled climate based on “consensus” science, the 2°C limit on global warming cannot be achieved without imposing painful sacrifices on developing countries.

Stephen Eule of the U.S. Chamber of Commerce’s Institute for 21st Century Energy has done the math. If industrial countries like the U.S. magically reduce their emissions to zero by 2050, the 2°C target is still unattainable unless developing countries cut their current CO2 emissions by 35 percent. If, less unrealistically, industrial countries reduce their emissions by 80 percent, developing countries would have to cut their current CO2 emissions almost in half—by 48 percent.
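The flavor of that arithmetic can be sketched with a toy carbon-budget calculation. The emission shares and cap below are hypothetical placeholders, not Eule's actual inputs; the point is only to show how a fixed global cap shifts the burden onto developing countries as industrial-country cuts shrink:

```python
# Toy version of the carbon-budget arithmetic described above -- NOT Eule's
# actual model. All inputs are hypothetical placeholders.
industrial_share = 0.40   # hypothetical share of current global CO2 emissions
developing_share = 0.60   # hypothetical share (the two must sum to 1)
global_cap = 0.40         # hypothetical 2C-compatible cap, as a fraction of
                          # today's total global emissions

# Case 1: industrial emissions fall to zero; developing countries get the cap.
cut_case1 = 1 - global_cap / developing_share

# Case 2: industrial countries cut only 80 percent, leaving less room.
industrial_remaining = industrial_share * (1 - 0.80)
cut_case2 = 1 - (global_cap - industrial_remaining) / developing_share

print(f"Case 1: developing countries must cut ~{cut_case1:.0%}")  # ~33%
print(f"Case 2: developing countries must cut ~{cut_case2:.0%}")  # ~47%
```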



Nobody knows how developing countries can simultaneously eradicate energy poverty over the next few decades and reduce their consumption of the world’s most abundant, affordable, and reliable energy sources by 35 percent or more. Putting an energy-starved world on an energy diet obviously has the potential to be a cure worse than the alleged disease. Those who care about the world’s underprivileged people, as I believe Mr. DiCaprio sincerely does, should carefully consider the risks of climate policy as well as those of climate change."

Thursday, September 29, 2016

More Regulation Won’t Make Housing Affordable

By Randal O'Toole of Cato.
"A new Housing Policy Toolkit from the White House admits that “local barriers to housing development have intensified,” which “has reduced the ability of many housing markets to respond to growing demand.” The toolkit, however, advocates tearing down only some of the barriers, and not necessarily the ones that will work to make housing more affordable.

“Sunbelt cities with more permeable boundaries have enjoyed outsized growth by allowing sprawl to meet their need for adequate housing supply,” says the toolkit. “Space constrained cities can achieve similar gains, however, by building up with infill.” Yet this ignores the fact that there are no cities in America that are “space constrained” except as a result of government constraints. Even cities in Hawaii and tiny Rhode Island have plenty of space around them–except that government planners and regulators won’t let that space be developed.

Instead of relaxing artificial constraints on horizontal development, the toolkit advocates imposing even tighter constraints on existing development in order to force denser housing. The tools the paper supports include taxing vacant land at high rates in order to force development; “enacting high-density and multifamily zoning,” meaning minimum density zoning; using density bonuses; and allowing accessory dwelling units. All of these things serve to increase the density of existing neighborhoods, which increases congestion and–if new infrastructure must be built to serve the increased density–urban-service costs.

Urban areas with regional growth constraints suffered a housing bubble in the mid-2000s and are seeing housing prices rise again, making housing unaffordable. Source: Federal Housing Finance Agency home price index, all transactions.

Developers learned more than a century ago that people will pay a premium to know that the neighborhood they live in will not get denser. Even before zoning, developers used restrictive covenants to limit density because they knew people would pay higher prices for lots with such covenants. When zoning was introduced to do the same thing, many neighborhoods were built without such covenants, but that doesn’t mean the people in those neighborhoods will be happy to see four- and five-story buildings pop up among their single-family homes.


Urban areas with few regional growth constraints see only moderate changes in housing prices over time and still have plenty of affordable housing.

Planners argue the market has changed and more people want denser development. This is belied by the toolkit, which also supports the use of property tax abatements and value capture incentives (i.e., tax-increment financing) to promote higher densities. If there really were a market for higher densities, such subsidies would not be necessary.

If there really is a market for higher densities, then developers should be allowed to build such densities in areas that are not already established low-density neighborhoods. But developers should also be allowed to build low-density neighborhoods at the urban fringe to meet the demand for that kind of development. Instead, state and local planning rules in California, Florida, Hawaii, Oregon, Washington, and most New England states have essentially made such low-density developments illegal.

Moreover, there is little reason to believe that “building up with infill” will make cities more affordable. Artificial constraints on urban growth make land many times more expensive than in unconstrained areas, and mid-rise and high-rise housing costs more to build per square foot than low-rise housing. Combining expensive land with expensive construction is no recipe for affordability.

Increasing density generally correlates with decreasing housing affordability. Source: 2010 census.

No matter how often urban planners chant, “grow up, not out,” the fact is that no urban area in the nation has ever made housing more affordable by increasing its density. In fact, as the chart above shows, there is a clear correlation between density and housing unaffordability.

The urban areas that have been increasing their densities through artificial growth constraints are precisely the ones that are having affordability problems. For example, from 1970 to 2010 the density of the San Francisco-Oakland urban area grew by 43 percent while its median home value-to-median family income ratio (a standard measure of housing affordability) grew from 2.2 to 7.1. Portland’s density grew by 14 percent and its value-to-income ratio grew from 1.6 to 3.9. Honolulu’s density grew by 23 percent and its value-to-income ratio grew from 3.2 to 6.6. Growing up has made these regions less affordable, not more.

Ultimately, what is wrong with the White House toolkit is that it is focused on local zoning when it should be focused on regional growth management. If there are no regional growth constraints, local zoning won’t make housing more expensive because developers can always build in unrestricted areas. Dallas has zoning; Houston doesn’t, yet in 2014 both had value-to-income ratios of 2.4. Only regional growth constraints make housing expensive. Every major city in America except Houston has local zoning, yet only those cities that have growth constraints have become unaffordable.

The real danger is that the White House’s policies will be imposed, via the Department of Housing and Urban Development, on areas that have few regional growth constraints today. The increased regulation advocated by the White House will make those areas less affordable, not more, while it won’t do anything at all for areas that already have lots of growth constraints.

The White House toolkit calls its proposals “smart housing regulation.” Truly smart regulation would rely on policies that work, not policies that only work in the fantasies of urban planners. The policies that do work would better be described as “smart land-use deregulation,” as they involve dramatically reducing constraints in unincorporated areas. Until that happens, housing will continue to become less affordable in constrained areas."

Ride-sharing services such as Uber significantly decrease traffic congestion after entering an urban area

See Do Ride-Sharing Services Affect Traffic Congestion? An Empirical Study of Uber Entry by Ziru Li, Yili Hong and Zhongju Zhang of Arizona State University. Abstract:
"Sharing economy platform, which leverages information technology (IT) to re-distribute unused or underutilized assets to people who are willing to pay for the services, has received tremendous attention in the last few years. Its creative business models have disrupted many traditional industries (e.g., transportation, hotel) by fundamentally changing the mechanism to match demand with supply in real time. In this research, we investigate how Uber, a peer-to-peer mobile ride-sharing platform, affects traffic congestion and environment (carbon emissions) in the urban areas of the United States. Leveraging a unique data set combining data from Uber and the Urban Mobility Report, we examine whether the entry of Uber car services affects traffic congestion using a difference-in-difference framework. Our findings provide empirical evidence that ride-sharing services such as Uber significantly decrease the traffic congestion after entering an urban area. We perform further analysis including the use of instrumental variables, alternative measures, a relative time model using more granular data to assess the robustness of the results. A few plausible underlining mechanisms are discussed to help explain our findings."

Wednesday, September 28, 2016

President Obama's CEA chair, on the effects of housing restrictions

See Furman on zoning by John Cochrane.
"On this day (Clinton vs. Trump debate) of likely partisan political bloviation, I am delighted to highlight a very nice editorial by Jason Furman, President Obama's CEA chair, on the effects of housing restrictions. A longer speech here. The editorial is in the San Francisco Chronicle, ground zero for housing restriction induced astronomical prices. Furman:

 When certain government policies — like minimum lot sizes, off-street parking requirements, height limits, prohibitions on multifamily housing, or unnecessarily lengthy permitting processes — restrict the supply of housing, fewer units are available and the price rises. On the other hand, more efficient policies can promote availability and affordability of housing, regional economic development, transportation options and socioeconomic diversity...
...barriers to housing development can allow a small number of individuals to enjoy the benefits of living in a community while excluding many others, limiting diversity and economic mobility. 
This upward pressure on house prices may also undermine the market forces that typically determine patterns of housing construction, leading to mismatches between household needs and available housing. 
What's even more praiseworthy is what Furman does not say: "Affordable" housing constructed by taxpayers, or by forcing developers to provide it; rent controls; housing subsidies; bans on the construction of market-rate housing (yes, SF does that); bans on new businesses (yes, Palo Alto does that), and the rest of the standard bay-area responses to our housing problems.  The first few may allow a few lucky low-income people to stay where they are, as long as they remain low-income, but does not allow new people to come chase opportunities. Subsidies that raise demand without raising supply just raise prices more. As in child care or medicine.



When President Obama's CEA chair writes an oped, most of which could easily have come from Hoover or CATO, it's a hopeful day, no matter what happens tonight.

Moreover, Furman recognizes that a thousand-point federal program imposed on states and local governments by regulation is not the answer:
While most land use policies are appropriately made at the state and local level, the federal government can also play a role in encouraging smart land use regulations
We have found the enemy, as Pogo said, and it is us.

The real political economy is tough, of course. Current residents vote for restrictions, and not just out of misunderstanding. Current residents (people like me), who buy expensive houses in this beautiful area, vote to keep things just as they are. Restrictions mean they can’t sell houses for $10 million to a developer who wants to put up a 100-story office building and turn it into Manhattan. But restrictions mean they can sell for $2 million and retire comfortably to Mendocino. Or stay right where they are, paying property taxes based on the 1965 value of their house (another big impediment to housing mobility and affordability) and making sure the neighbors don’t sell and ruin their view. Renters vote for rent control, affordable housing mandates, and so on that apply to current residents but not to newcomers. This behavior has a negative externality on low-income ("low" means out of the top 0.5% in SF!) people who want to move here. But a Trumpian mini-wall of regulation keeps them out. The most local government is not always the best. The most liberal government often acts with effects that are surprisingly conservative."

Certificate of Need Laws Show Entry Barriers Can Raise Spending

By James Bailey of Mercatus. Excerpt:

"In most states, health care providers who seek to open a new hospital must obtain a “certificate of need” (CON) from a state board certifying that there is an “economic necessity” for their services. A new working paper published by the Mercatus Center finds that states with CON laws spend 3 percent more on health care than other states.

What does it take to open a new hospital or health facility? Certainly a building, equipment, supplies, and a staff of trained medical professionals. But in most states, this is not enough. Health care providers must also obtain a “certificate of need” from a state board certifying that there is an “economic necessity” for their services. 

Certificate of need (CON) laws were passed rapidly between 1964 and 1980 in the hope of restraining the growth of health spending. By 1980, every state but Louisiana had a CON program, and the federal government was pushing states to adopt CON by threatening to withhold Medicare funds from states without it.

But by 1986, the feds were no longer convinced that this approach to keeping costs down was working, and they stopped pushing CON onto states. Since then 15 states have repealed their CON laws, allowing healthcare providers to make their own decisions about how to expand.

In a new working paper published by the Mercatus Center at George Mason University, I investigate how health care spending has changed in the 15 states that repealed CON compared to the 35 states that have not. I find that CON laws have not led to lower spending. In fact, states with CON laws actually spend 3 percent more on health care than other states.

In other words, not only have CON laws failed in their goal of reducing health care spending, they have backfired and driven spending upward.

A bit of simple economic theory explains why. High spending can be driven by two factors: a high quantity of sales, or high prices. By restraining the supply of new health care, CON laws try to push quantities down. But with fewer competitors entering the market, existing providers find themselves able to raise prices without jeopardizing their market. Because the demand for health care is relatively inelastic, supply restrictions like CON increase prices more than they decrease quantities, leading to an increase in total spending. While data on health care prices are notoriously difficult to acquire, my paper shows that hospital charges fall by 1 percent per year over 5 years in CON-repealing states relative to CON-maintaining states."
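That price-versus-quantity logic is easy to make concrete. The sketch below is standard textbook arithmetic with hypothetical numbers, not the paper's estimates:

```python
# Illustration of why supply restrictions raise total spending when demand is
# inelastic. Standard textbook logic, not the paper's empirical model; the
# elasticity and price change below are hypothetical.
elasticity = 0.5       # hypothetical price elasticity of demand (inelastic: < 1)
price_increase = 0.10  # suppose entry barriers let incumbents raise prices 10%

quantity_change = -elasticity * price_increase          # ~ -5% quantity
spending_change = (1 + price_increase) * (1 + quantity_change) - 1

print(f"Quantity falls ~{-quantity_change:.0%}, "
      f"but total spending rises ~{spending_change:.1%}")
# Because the percentage price rise exceeds the percentage quantity fall,
# spending (price x quantity) goes up -- CON laws can backfire on cost control.
```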

Tuesday, September 27, 2016

The dangers of Clinton’s prescription drug plan

By Joel M. Zinberg of CEI.
"“Hillary Clinton’s Plan for Lowering Prescription Drug Costs” — a campaign briefing — is a blueprint for destroying our pharmaceutical industry. And it isn’t just drug companies that should be scared. The plan sets a precedent for destructive command and control in every part of the economy.
Citing ACA provisions that limit health insurance companies’ margins as a model, Clinton claims that because drug companies benefit from federally funded research and R&D incentives, they should not be allowed to reap “excessive profits” or spend “unreasonable amounts” on marketing, and should be required “to invest a sufficient amount” in R&D. I have no idea what level of drug company profits is “excessive,” what is “reasonable” to spend on marketing, or what is “sufficient” R&D investment. But I doubt government bureaucrats do either.

Two Clinton proposals — the importation of drugs from overseas and Medicare negotiation of drug prices — amount to price controls that will lower prices in the short-run but destroy drug development in the long-run, resulting in fewer choices for physicians and patients. Many countries’ governments set drug prices and ration coverage either by being the sole large drug purchaser or through regulation. Importing these countries’ cheaper drugs imports the shortcomings of their price control regimes.
Allowing Medicare, as Clinton describes it, to “use its leverage [to] drive down drug and biologic prices” would be more akin to the government price setting that occurs overseas, rather than a negotiation. The over 55 million Medicare beneficiaries are older and sicker than the rest of the population and consume a disproportionate amount of all drugs used. As a result, Medicare would effectively dictate drug prices nationwide and decree which drugs would be available.

While companies benefit from federally financed discoveries, turning those findings into usable drugs is an expensive and risky private enterprise. Only one out of thousands of investigational compounds advances through the stages of R&D, preclinical animal testing and three phases of human clinical trials to become a marketed drug. The average cost to market per new drug is $2.6 billion. Only one in three is profitable and only one in ten becomes a blockbuster. Without temporary high prices in the U.S. market before generic competition, there will be less R&D, fewer new breakthrough drugs, fewer competitor drugs developed, and ultimately no lower priced generics to follow. European countries’ price controls imposed in the 1980s prove the point. In the mid-80s, European drug R&D was 24% higher than in the U.S. After price controls, European pharmaceutical R&D grew at half the U.S. rate and today substantially trails American R&D.

Clinton shortsightedly claims we can increase competition for biologic drugs by lowering the exclusivity period before follow-on biologics can be approved from 12 to 7 years. Biologics — complex molecules manufactured in living systems — are particularly difficult to develop and manufacture. The 12-year period was a compromise crafted by the Senate Health, Education, Labor and Pensions (HELP) Committee to preserve innovation. The committee’s 6/27/2007 press release quoted Clinton, who was a HELP member at the time, as saying, “With this committee’s action today, I am proud that we will both continue the creativity and innovation that is absolutely essential to our pharmaceutical industry and the lifesaving treatments and interventions they are able to provide for us….” Reducing the exclusivity period she previously lauded will diminish the willingness of companies to risk development costs and hamstring innovation.

Clinton also proposes eliminating “corporate write-offs for direct-to-consumer advertising [DTCA].” She claims the tax deductibility of DTCA subsidizes commercial speech that increases costs by encouraging patients to ask for unnecessary, expensive medicines. But there is little evidence DTCA is harmful or drives up costs. Physicians do not write prescriptions simply because a patient requests them. The GAO found that only 2-7% of patients who requested a prescription in response to DTCA received a prescription for it. And the FDA and others have reported that DTCA increases patients’ awareness of disease conditions and new treatments and promotes dialogue with physicians. Clinton’s plan to establish mandatory FDA pre-clearance of DTC ads for accuracy and clarity, to be funded by pharmaceutical company user fees, is not objectionable. But it is hard to imagine how the overstretched FDA could do it.

Finally, Clinton promises to convene an expert panel to determine how much drug companies should invest in R&D “and if they do not meet their targets, boost their investment or pay rebates to support basic research.” This, along with her determination to set prices and limit profits, should alarm technology executives. Their businesses rely on the internet which was built by government. Will bureaucrats have license to label Amazon or Facebook profits excessive? Will executives learn they spent “unreasonable amounts” on marketing and have to pay back taxes? Will government experts determine what is “sufficient” R&D spending? This won’t be limited to drug companies or internet beneficiaries. The Elizabeth Warren — “you didn’t build that” — crowd thinks government should dictate profits and spending for all private companies.

First they came for the medical insurers. Then they came for the pharmaceutical companies. American innovators and tech executives should worry who’s next."

The U.S. Postwar Miracle

By David Henderson of EconLog.
"Co-blogger Scott Sumner posted yesterday some interesting facts and figures about the post World War II economic boom in the United States that came after the U.S. government cut government spending massively.

I wrote a study on this for Mercatus in 2010 and highlighted some of the findings in this post in November 2010.

Two things I want to highlight.

First, as I wrote then, "federal government spending on goods and services fell, in a period of two to three years, by over one third of GDP." That puts in perspective Gary Johnson's proposal to cut government spending by 20 percent. A 20 percent cut in federal government spending, when government spending is about 20 percent of GDP, is a 4-percentage-point cut in government spending as a percent of GDP. In other words, Johnson proposes to cut government spending, as a percent of GDP, by about one eighth of the amount by which Truman and Congress cut it in two to three years.
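A trivial check of that comparison, using only the figures in the paragraph above:

```python
# Checking the Johnson-vs-Truman comparison using only the figures above.
johnson_cut_share_of_gdp = 0.20 * 0.20   # a 20% cut of spending that is ~20% of GDP
truman_cut_share_of_gdp = 1 / 3          # postwar cut: over one third of GDP

print(f"Johnson: ~{johnson_cut_share_of_gdp:.0%} of GDP")   # ~4 percentage points
print(f"Relative size: ~1/{truman_cut_share_of_gdp / johnson_cut_share_of_gdp:.0f}")  # ~1/8
```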
Second, I want to answer the question raised by one of the commenters who said:

Real GDP declined an unprecedented 12.7% from 1944 to 1946. Hardly "fine".

I answered this point at length in my Mercatus study. Here's an excerpt from my answer:
According to official government data, the U.S. economy suffered its worst one-year recession in history in 1946. The official data show a 12-percent decline in real GNP after the war. A 12-percent decline in one year would fit anyone's idea of not just a recession, but an outright depression. So, is the story about a postwar boom pure myth? If you ask most people who were young adults in those years (a steadily diminishing number of people, so talk to them soon) about economic conditions after the war, they will talk about "the postwar boom." They saw it as a time of prosperity.

The same commenter, Ben, wrote:
Sure, unemployment didn't increase substantially at all which was likely due to a high demand for workers as women returned home and left the workforce again.

I answered this in some detail in my Mercatus study also. I called this the "Rosie the Riveter Goes Home" explanation. It turns out that half of the additional women who entered the work force during WWII did not go home.

Here's what I wrote in the study:

"There was no surge in unemployment," goes the first explanation, "because women left the defense plants and went back to being housewives and raising families." This explanation is half true and totally misleading. First, approximately half of the women who entered the labor force in the early 1940s stayed. The number of women in the labor force rose from 14.5 million in 1941 to a peak of 19.4 million in 1944, declining to 16.9 million in 1947. In other words, of the 4.9 million women who entered the labor force between 1941 and 1944, 2.4 million stayed in the labor force. Thus, there was still a need for millions of jobs to open up for newly demobilized male soldiers. The fact that the unemployment rate stayed in the low single digits is an outstanding success story.
Second, what defense plants? Almost all of them shut down or reconverted to peacetime uses after the war. Women who wanted to stay employed had to find other private work. As [Robert] Higgs points out, "[T]he real miracle was to reallocate a third of the total labor force to serving private consumers and investors in just two years.""

Monday, September 26, 2016

Problems with the “co-benefits” argument behind EPA’s Clean Power Plan (CPP)

See Comment on EPA Power Plan's Alleged Air Pollution “Co-Benefits” by Marlo Lewis, Jr. of CEI.
"Climate activists assure us that even if we don’t consider global warming a big problem, we should still support carbon taxes, renewable energy quota, and EPA’s so-called Clean Power Plan (CPP). Such policies, we are told, will save thousands of lives, delivering billions of dollars in net benefits, by coincidentally reducing airborne concentrations of fine particulate matter (PM2.5).
There are three main problems with this “co-benefits” argument. First, EPA’s own data show that total emissions of six principal air pollutants declined 62 percent since 1980 even though carbon dioxide (CO2) emissions increased by 14 percent. What’s more, PM2.5 concentrations declined by 34 percent just since 2000 (the earliest year for which national data are available). History refutes the claim that we need carbon taxes or climate regulations to clean the air.

Second, in the U.S., today’s historically low PM2.5 levels likely pose no threat to human life, as UCLA Prof. James Enstrom and nine other experts argue in a letter summarizing their work in the field. Among other points, the Enstrom team explains:

No plausible etiologic mechanism by which PM2.5 causes premature death is established. It is implausible that a never-smoker’s death could be caused by inhalation over an 80 year lifespan of about one teaspoon (~5 grams) of invisible fine particles as a result of daily exposure to 15 µg/m³ [15 micrograms per cubic meter]. This level of exposure is equivalent to smoking about 100 cigarettes over a lifetime or 0.004 cigarettes per day, which is the level often used to define a never-smoker. The notion that PM2.5 causes premature death becomes even more implausible when one realizes that a person who smokes 0.2 cigarettes/day has a daily exposure of about 750 µg/m³. If a 10 µg/m³ increase in PM2.5 actually caused a 0.61 year reduction in life expectancy, equivalent to the claim of Pope [one of the chief studies on which EPA relies], then a 0.2 cigarettes/day smoker would experience about a 45-year reduction in life expectancy, assuming a linear relationship between changes in PM2.5 and life expectancy. In actuality, never-smokers and smokers of 0.2 cigarettes/day do not experience any increase in total death rate or decrease in life expectancy, in spite of a 50-fold greater exposure to PM2.5. Furthermore, hundreds of toxicology experiments on both animals and humans have not proven that PM2.5 at levels up to 750 µg/m³ causes death.
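The letter's linear extrapolation is simple to reproduce using only the two figures it cites:

```python
# Reproducing the linear-extrapolation arithmetic in the Enstrom letter quoted
# above, using only the figures it cites.
life_lost_per_10ug = 0.61   # years of life expectancy lost per 10 ug/m3 (Pope)
smoker_exposure = 750       # ug/m3 daily PM2.5 exposure of a 0.2-cig/day smoker

implied_loss = (smoker_exposure / 10) * life_lost_per_10ug
print(f"Implied life-expectancy loss: ~{implied_loss:.0f} years")   # ~46 years
# The letter rounds this to "about a 45-year reduction" -- an absurd implication,
# which is the authors' point: a linear PM2.5 mortality coefficient cannot be
# right at these exposure levels.
```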

Third, even if we assume PM2.5 pollution in the U.S. poses mortality risks in some locales, EPA’s huge PM2.5 co-benefit estimates are implausible. As Anne Smith of NERA Economic Consulting explains, 99 percent of EPA’s estimated PM2.5 co-benefits occur in areas already projected to be in attainment with the national ambient air quality standard (NAAQS) for PM2.5. EPA illegitimately assumes the health benefits of PM2.5 reductions from concentrations already below the NAAQS are as certain as the benefits of reductions from concentrations above it. That is inconsistent with the basic concept of the NAAQS program, which is to set concentration standards at a level “requisite to protect public health . . . allowing an adequate margin of safety.”

Once we factor in the lower probability of PM2.5 health benefits in areas where exposures are already below the NAAQS, the lion’s share of the Power Plan’s purported health benefits disappears. For further discussion, see my blog post “EPA’s PM2.5 Co-Benefits PR Trick Exposed.”"

Milton Friedman’s Morals

As Trump and Clinton bang the drums for tariffs and renegotiated deals, where’s the popular voice for trade? 

By William McGurn of the WSJ. Excerpts:
"the Resolution Foundation study reports average real income growth for lower- and middle-class workers in the U.K. was much higher than for their American counterparts, even though the U.K. has an economy that is more, not less, dependent on trade."

"the following was Friedman’s response on “Free to Choose” when a union official challenged him on his bid to eliminate all tariffs over five years:

“The social and moral issues are all on the side of free trade. And it is you, and people like you, who introduce protection who are the ones who are violating fundamental moral and social issues.
“Tell me, what trade union represents the workers who are displaced because high tariffs reduce exports from this country, because high tariffs make steel and other goods more expensive, and as a result, those industries that use steel have to charge higher prices, they have fewer employees, the export industries that would grow up to balance the imports, tell me what union represents them? What moral and ethical view do you have about their interests?”"

Sunday, September 25, 2016

The Reasons Behind the Obama Non-Recovery

By Robert J. Barro, in the WSJ. Excerpts:
"The Obama administration and some economists argue that the recovery since the Great Recession ended in 2009 has been unusually weak because of the recession’s severity and the fact that it was accompanied by a major financial crisis. Yet in a recent study of economic downturns in the U.S. and elsewhere since 1870, economist Tao Jin and I found that historically the opposite has been true. Empirically, the growth rate during a recovery relates positively to the magnitude of decline during the downturn.

In our paper, “Rare Events and Long-Run Risks,” we examined macroeconomic disasters in 42 countries, featuring 185 contractions in GDP per capita of 10% or more. These contractions are dominated by wartime devastation such as World War I (1914-18) and World War II (1939-45) and financial crises such as the Great Depression of the 1930s. Many are global events; some affect only one or a few countries.

On average, during a recovery, an economy recoups about half the GDP lost during the downturn. The recovery is typically quick, with an average duration around two years. For example, a 4% decline in per capita GDP during a contraction predicts subsequent recovery of 2%, implying 1% per year higher growth than normal during the recovery. Hence, the growth rate of U.S. per capita GDP from 2009 to 2011 should have been around 3% per year, rather than the 1.5% that materialized.
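The rule of thumb in that paragraph can be written out directly, using only the figures Barro gives:

```python
# The recovery rule of thumb described above, applied to the quoted example.
contraction = 0.04       # 4% decline in per capita GDP during the downturn
recoup_fraction = 0.5    # on average, recoveries recoup about half the loss
duration_years = 2       # typical recovery length

extra_growth_per_year = contraction * recoup_fraction / duration_years
print(f"Above-normal growth during recovery: ~{extra_growth_per_year:.0%}/yr")  # ~1%
# The article's example implies normal growth of ~2%/yr, so the prediction for
# 2009-2011 is ~3%/yr, versus the 1.5%/yr that actually materialized.
```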

Arguing that the recovery has been weak because the downturn was severe or coincided with a major financial crisis conflicts with the evidence"

"many of the biggest downturns featured financial crises. For example, the U.S. per capita GDP growth rate from 1933-40 was 6.5% per year, the highest of any peacetime interval of several years, despite the 1937 recession. This strong recovery followed the cumulative decline in the level of per capita GDP by around 29% from 1929-33 during the Great Depression."

"The growth rate of GDP per worker from 2010-15 was 0.5% per year, compared with 1.5% from 1949 to 2009."

"What could have promoted a faster recovery by enhancing productivity growth? Variables that encourage economic growth include strong rule of law and property rights, free trade, rolling back inefficient regulations and other constraints on market activity, public infrastructure such as highways and airports, strong institutions for education and health, fiscal discipline (including a moderate ratio of public debt to GDP), efficient taxation, and sound monetary policy as reflected in low and stable inflation.

The main U.S. policy used to counter the Great Recession was increased government transfer payments."

"The absence of inflation is surprising but may have occurred because weak opportunities for private investment motivated banks and other institutions to hold the Fed’s added obligations despite the negative real interest rates paid."

Reagan ended oil price controls earlier than scheduled and the result was a semi-boom in U.S. oil production

See Notable & Quotable: Presidential Economics from the WSJ.
"From “Can U.S. Presidents Much Affect the U.S. Economy?” by David Henderson, writing Sept. 19 at the Library of Economics and Liberty:
 
We live in a regulatory state, with hundreds of thousands of regulations and tens of thousands of new regulations (both large ones and tweaks to current ones) annually.

Regulation has a big effect on the economy and a determined president can either increase regulation and make it much more punitive or decrease regulation and make it less punitive. Most regulations . . . are like a boulder in a strong-flowing river. Throw in one boulder and the river finds ways around it. But throw in a thousand boulders and the river’s flow slows considerably. . . .

When Ronald Reagan came into office on January 20, 1981, he inherited price controls on oil and gasoline that were originally imposed by Richard Nixon and extended by Gerald Ford. Jimmy Carter, even though he was a regulator at heart, saw some of the damage done by price controls and signed a bill in 1980 that phased out price controls so that they would end in October 1981. The bill, however, gave the president discretion to end the controls earlier. Reagan, who understood the effects of price controls and had spoken out against them during the campaign, used that discretion to end the controls on January 28, 1981, 8 days after getting inaugurated. . . .

The result was a semi-boom in U.S. oil production (from 8.6 million barrels per day (mbd) in 1980 to a 1980s peak of 9.0 mbd in 1985) and a body blow to the OPEC cartel. Oil prices fell and that helped the 1983-1984 economic boom. . . .

Now to President Obama. He has some of the most hostile regulators in recent U.S. history. One regulatory agency can hold up a pipeline, another can cause people to line up at airports (although George W. Bush did most of the damage on that front), another can reset the threshold pay after which employers have to pay overtime and can change the rules for unionization to make it easier for unions to monopolize the supply of labor to particular firms or industries, etc. Those boulders add up."

Saturday, September 24, 2016

Obama's Overtime Rules Hurt the Economy and American Workers

By Diana Furchtgott-Roth. Diana Furchtgott-Roth is a senior fellow and director of Economics21 at the Manhattan Institute. Excerpts:
"Twenty-one U.S. states filed suit on Sept. 20 to overturn the Obama administration’s new overtime rule, which requires employers to pay white-collar workers overtime if they earn less than $47,476 annually, instead of less than the current level of $23,660. (Manual workers generally have to be paid overtime at all earnings levels.) The rule is set to take effect Dec. 1."

"Consider what could be a real-life example: Peter, a fellow at a think tank who earns a salary of $45,000 a year. Now if he works late one night, he can come in later the following day, or take extra time off. He can duck out of the office to get a haircut without reporting to his boss. If he feels sick, he can ask to work from home. He can come home for dinner and catch up with his work in the evenings. His employer is free to say, “Peter, you worked a lot of evenings this week. Take some extra days off with your family over Thanksgiving.”

On Dec. 1, Peter and his employer will no longer be able to have such an arrangement. Along with others who make under $47,476 annually, Peter will have to keep track of his hours by clocking in and out. Because of his employer’s requirement to track his hours, telecommuting will be difficult. If he works longer in one particular week, his employer will not be legally allowed to give him “comp time” (time off instead of the extra hours), but will have to pay him overtime instead.

And for all that paperwork, Peter won’t necessarily earn more than what he is making now. His employer might tell him to make sure he never works more than 40 hours in a week. Or, since he makes more than minimum wage, his boss could lower his hourly pay rate to make up for the extra hours worked.

Even the Labor Department admits that most workers affected by the rule will never get the chance to work over 40 hours per week. The administration estimates that about 4.2 million workers would qualify for overtime in 2017, and they would earn $1.2 billion more in overtime payments.

In contrast, setting up the system for monitoring the employees could cost almost $20 billion in the first year because of the additional administrative costs.

One cost is familiarization, the time and effort that each employer must expend to understand the requirements and assess what needs to be done. Most employers reading this now have no idea that they will have to put in place different systems to track employees on Dec. 1.

Another cost is identifying each employee affected by the higher salary test, to decide for each case whether to raise their salary to the new threshold or to convert the status to non-exempt hourly. Converting salaried employees to hourly employees requires deciding what base hourly rate the employee earns. Plus, employers have to decide on a weekly hours requirement and policies to set for assignment and approval of overtime hours.

A third cost is management. Someone has to supervise the employees to make sure they fill in the time sheets and don’t work more hours than they are supposed to work — and pay them for extra hours worked.

The costs of the new rule could total $18.9 billion the first year — more than 15 times the $1.2 billion of increased wages that the administration estimates will be received by workers. In subsequent years, the ongoing management supervision costs imposed by the rule could total around $3.4 billion each year.

President Obama’s overtime rule would hurt those whom it is trying to help, by reducing flexibility in the workplace and discouraging job creation. The lawsuits are a common-sense reaction to harmful federal overreach."

Estate taxes are highly distortive of economic activity

See Stephen J. Entin on raising estate taxes from Marginal Revolution.
"The transfer [estate] taxes are highly distortive of economic activity. In fact, they probably do the most damage to output and income per dollar of revenue raised of all the taxes in the U.S. tax system. There are two reasons. First, they are an additional layer of tax on saving and investment, activities that are highly sensitive to taxation and very likely to shrink in response to the tax. Second, the transfer taxes are levied at very high, steeply graduated marginal tax rates on a very narrow tax base. The high rates discourage saving and investment at the margin, while the average tax rate and tax revenues are held down by the credit. A tax that has a large differential between its average and marginal tax rates does far more damage per dollar of revenue raised than a flatter rate tax on a broader base.
Here is the full study and pdf."

Friday, September 23, 2016

Can the economy handle a 20% fiscal contraction (at near-zero interest rates)?

By Scott Sumner.
"Here's Matt O'Brien:
Libertarian presidential candidate Gary Johnson is a friendly guy, seems pretty moderate. But he could tank the economy. That's what trying to balance the budget all at once would do. Which, of course, is what Johnson says he would do. He wants to cut spending by 20 percent next year to get the government back in the black, and then veto any legislation that would make the red ink return.
This probably wouldn't end well. The problem is the Federal Reserve might not be willing or able to really counteract this. In normal times, you see, the Fed cuts interest rates when the government cuts the deficit so that the private sector can pick up the slack for the public sector. But even eight years after the Great Recession, these are still not normal times. The Fed can't cut interest rates right now, because they're barely above zero. Now, it's true that the Fed could print money instead -- that's how it stopped austerity from starting a recession in 2013 -- but Johnson doesn't want the Fed to do that. He's said that quantitative easing, which is when the Fed buys bonds with newly created dollars, is just an attempt to "override the free market" that will only lead to "malinvestment, inflation, and prolonged unemployment." And since he would not only get to pick two Fed members in 2017, but also a new Fed chair in 2018, what he thinks matters.

I mostly agree with this, but not entirely. I certainly agree that the Fed could offset fiscal austerity with a more expansionary monetary policy, just as they did in 2013. But it's not quite right to say that the Fed cannot cut interest rates. The current rate of interest on reserves is 0.5%, and that rate could be cut by 100 basis points, to minus 0.5%. Nonetheless O'Brien is right that interest rate cuts might not be enough. 
But I also think he slightly exaggerates how much we know about this issue. Let's take the policy environment at the end of WWII. Here are some relevant facts:
1. There was massive austerity, as government spending fell by far more than 20%. We suddenly went from a deficit of 20% of GDP to a surplus.
2. Several top Keynesian economists warned that this austerity would lead to another post-war depression.
3. Short term interest rates were very low, but slightly above zero (0.38% on T-bills, for instance.)
Sound familiar? Here's what happened next. After WWII, the Fed did not cut rates at all to offset the fiscal austerity. Indeed after holding them at 0.38% for about 2 years, they began gradually raising them in mid-1947. Nor did they do any QE. And despite all that, the economy remained fine, with the unemployment rate fluctuating between 3% and 5% throughout 1946, 1947, and 1948, despite millions of men suddenly being discharged from the military. The Keynesian predictions did not come true.

I recognize that there are lots of differences between 1946 and today. But even so, it should give us all pause to consider that well-informed Keynesian economists got this wrong. Maybe there is something wrong with the model.

And let's not forget that Johnson is also proposing many positive initiatives that would boost aggregate supply, such as tax reform.

This is not to say that I agree with everything Johnson is proposing. I disagree with him on the wisdom of balancing the budget so quickly, and I disagree with him on QE, at least as an option. Nonetheless, the post-WWII experience should make us all very cautious about predicting the impact of fiscal austerity.

PS. A few months back Trump proposed paying off the entire national debt in 8 years, which is an even more contractionary proposal. These sorts of proposals need to be taken with more than a grain of salt.

PPS. I recommend this David Beckworth interview of Jason Taylor, which touches on some of these issues. Taylor says that Keynesians predicted 25% to 35% unemployment if the government suddenly discharged 10 million soldiers, and also suddenly slashed massive military spending. The government did exactly that, and unemployment averaged 3.9% in 1946 and 1947. (The specific discussion occurs after the 47 minute mark.)"

Medical Marijuana Seems to Reduce Deaths From Pharmaceuticals

A new study adds to the evidence that patients are substituting marijuana for opioids

By Jacob Sullum of Reason.

"While sounding the alarm about an "opioid epidemic" that included a record number of painkiller-related deaths in 2014, the federal government insists marijuana has "no currently accepted medical use." As I explain in my latest Forbes column, that dogmatism may be deadly:
Insys Therapeutics, the Arizona-based pharmaceutical company that recently became the biggest financial supporter of the campaign against marijuana legalization in that state, makes an oral spray that delivers the opioid painkiller fentanyl and plans to market another one that contains dronabinol, a synthetic version of THC. Insys says it gave $500,000 to the main group opposing Arizona's legalization initiative because the measure "fails to protect the safety of Arizona's citizens, and particularly its children." But one needn't be terribly cynical to surmise that Insys also worries about the impact that legalization might have on its bottom line, since marijuana could compete with its products.
A new study suggests Insys has good reason to worry. In an article published last week by the American Journal of Public Health, Columbia University epidemiologist June Kim and her colleagues report that fatally injured drivers are less likely to test positive for opioids in states that allow medical use of marijuana. That finding, together with the results of earlier studies, indicates that making marijuana legally available to patients saves lives by reducing their consumption of more dangerous medications."

Thursday, September 22, 2016

Four Policy Implications of National Academies Report on Immigration

By David Bier of Cato.
"The National Academies of Sciences, Engineering and Medicine released a major new report on the fiscal and economic impacts of immigration on the United States yesterday. The report is being heralded by all sides of the immigration debate as the most important collection of research on this issue. This reception could be due to the Academies’ meticulously avoiding any policy implications from their research, allowing policy wonks to draw their own conclusions. Here are my top four policy implications of the new research:
1) Dramatically expanded high skilled immigration would improve federal and state budgets, while spurring economic growth. The fiscal and economic benefits of high skilled immigration are tremendous. The net value to the federal budget is between $210,000 and $503,000 for each immigrant with a bachelor’s degree over their lifetime (the full chart below highlights the overall impact). The sections on immigrant entrepreneurship and innovation are also universally positive. “High-skilled immigrants raise patenting per capita, which is likely to boost productivity and per capita economic growth,” they conclude (p. 205).


Exempting spouses and children of legal immigrants, as Congress intended, would double the flow of high skilled immigrants, allowing the United States to capture these benefits.

2) Legalization could hasten assimilation. One conclusion of the report is that wage and language assimilation is lower among the 1995-1999 cohort of immigrants than among the 1975-1979 cohort. The rise of illegal immigration likely explains much of this difference. More than one in four immigrants today is illegally present in the United States. As Douglas Massey has shown, documented and undocumented immigrants had roughly the same wages until the 1986 law banning employment of undocumented immigrants, which depressed the wages of undocumented immigrants. Legalization would reverse this.

Moreover, other studies have shown that immigrants who are legalized rapidly increase their earnings and invest in skills, including language acquisition. A legalization program that specifically required language classes, education, and workforce participation while restricting welfare, as the 2013 Senate-passed bill did, would further enhance the gains from legalization.

3) A large guest worker program can mitigate the negative fiscal impacts of low-skilled immigration. The most negative finding in the report is that the lowest skilled immigrants have negative fiscal impacts, but those impacts are entirely driven by costs in childhood and retirement, as the figure below from the report shows (p. 331). A large guest worker program that allowed low-skilled immigrants with less than a high school degree to enter during their prime years and retire in their home country would be a strong fiscal gain for the United States.

4) Governments should strengthen the wall around the welfare state. The positive fiscal gains from immigration could be improved by limiting immigrants’ access to benefits. As I have shown before, immigrants overall did very well after benefits were partially restricted in 1996, and my colleagues have detailed a number of ways that these barriers could be reinforced. One particular insight of the report is that most of the welfare usage comes after retirement, so that should be a focus of reform.

There are many other implications of this report, but these four are enough for Congress to get started on."

Firms that Discriminate are More Likely to Go Bust

From Alex Tabarrok.
"Discrimination is costly, especially in a competitive market. If the wages of X-type workers are 25% lower than those of Y-type workers, for example, then a greedy capitalist can increase profits by hiring more X workers. If Y workers cost $15 per hour and X workers cost $11.25 per hour then a firm with 100 workers could make an extra $750,000 a year. In fact, a greedy capitalist could earn more than this by pricing just below the discriminating firms, taking over the market, and driving the discriminating firms under. The basic logic of employer wage discrimination was laid out by Becker in 1957. The logic implies that discrimination is costly, especially in the long-run, not that it doesn’t happen.

A nice test of the theory can be found in a paper just published in Sociological Science, Are Business Firms that Discriminate More Likely to Go Out of Business? The author, Devah Pager, is a pioneer in using field experiments to study discrimination. In 2004, she and her co-authors, Bruce Western and Bart Bonikowski, ran an audit study on discrimination in New York using job applicants with similar resumes but different races, and they found significant discrimination in callbacks. Now Pager has gone back to that data and asks what happened to those firms by 2010. She finds that 36% of the firms that discriminated failed but only 17% of the non-discriminatory firms failed.

The sample is small, but the results are statistically significant, and they continue to hold after controlling for size, sales, and industry.
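For intuition on why even a modest sample can yield a significant gap of this size, here is a standard two-proportion test with purely illustrative sample sizes (the paper's actual counts and methods differ):

```python
# Two-proportion z-test sketch. The failure rates are those quoted above;
# the sample sizes are purely illustrative, NOT the paper's actual counts.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

failures = np.array([36, 17])    # failed firms: discriminators vs. others
firms = np.array([100, 100])     # hypothetical firm counts per group

z_stat, p_value = proportions_ztest(failures, firms)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")   # roughly z = 3.0, p = 0.002
```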

As Pager notes, the cause of the business failure might not be the discrimination per se but rather that firms that discriminate are hiring based on non-rational gut feelings, while firms that don’t discriminate are using more systematic and rational methods of hiring.
As she concludes:
…whether because of discrimination or other associated decision making, we can more confidently conclude that the kinds of employers who discriminate are those more likely to go out of business. Discrimination may or may not be a direct cause of business failure, but it seems to be a reliable indicator of failure to come."

Wednesday, September 21, 2016

Globalization is not the problem, it's the solution

By Scott Sumner.
Here's Dani Rodrik:
A Chinese student once described his country's globalization strategy to me. China, he said, opened a window to the world economy, but placed a screen on it. The country got the fresh air it needed -- nearly 700 million people have been lifted from extreme poverty since the early 1980s -- but kept mosquitoes out. China benefited from the flourishing of trade and investment across national borders. For many, this was the magic of globalization.
But it's not the whole story. Look closely at the economies that converged with richer counterparts -- Japan, South Korea, China -- and you see that each engaged globally in a selective, strategic manner. China pushed exports, but it also placed barriers on imports to protect employment in state enterprises and required foreign investors to transfer know-how to domestic companies.
Other countries that relied on globalization as their growth engine but failed to put in place a domestic strategy became disillusioned. For example, few countries tried as hard as Mexico to integrate with the world economy, through Nafta and liberal trade and financial policies. Yet the country's economic growth in recent decades has been sluggish, even by the modest standards of Latin America.

This is a misleading comparison. Mexico is nothing like East Asia; for instance, its educational levels are much lower. Why not compare China, Korea and Japan to East Asian countries that were less protectionist, such as Hong Kong, Taiwan and Singapore? That would make more sense, but then that group of countries is richer than the three cited by Rodrik. (And even the three he cites were not particularly protectionist.)

In fact, economists who have studied East Asia tend to find that protectionist policies actually slowed development. The parts of the Chinese economy that have done best are not the protected SOEs, but rather the more competitive private firms. The busybodies at MITI are now seen as having slowed Japanese growth.

Rodrik also ignores the fact that the protectionist policies adopted in Latin America in the 1950s through the 1970s were later regarded as an abject failure. At the time, East Asia was less protectionist and grew much faster than Latin America. Indeed, the stark comparison between "open" East Asia and "closed" Latin America is one thing that led to the neoliberal revolution after 1980. And the more neoliberal parts of Latin America (such as Chile) have tended to do better than the less neoliberal areas.

Given his concern about the slow growth in Mexico, you might expect Rodrik to propose policies that would help Mexico do better. But no such policy reforms are offered; instead he makes Trump-like arguments such as the following:
For example, imports from countries that are gross violators of labor rights, such as Pakistan or Vietnam, may face restrictions when those imports demonstrably threaten to damage labor standards at home.
I'm not sure what he means by "at home," but I very much doubt that workers in Germany, the Netherlands and Switzerland (with their massive current account surpluses) are worried about losing their jobs to exploited labor in Pakistan. Nor are workers in current-account-deficit countries such as Australia. There is a serious intellectual argument, based on the factor price equalization theorem, that a global labor market hurts a subset of workers in wealthier countries. But this argument has nothing to do with "gross violations of labor rights." Rather, it simply reflects the fact that wages tend to be much lower in very poor countries, and that capital can move more easily than labor. If protectionists want to slow the development of Asia with trade barriers that protect a few far richer Western workers from Vietnamese exports, they should simply say so. Talk about labor rights violations simply muddies the waters. And hasn't the West already done enough harm to poor Vietnam? How much more misery do we plan to inflict on those people?

Some simple principles would reorient us in the right direction. First, there is no single way to prosperity. Countries make their own choices about the institutions that suit them best. Some, like Britain, may tolerate, say, greater inequality and financial instability in return for higher growth and more financial innovation. They will opt for lower taxes on capital and more freewheeling financial systems. Others, like Continental European nations, will go for greater equity and financial conservatism. International firms will complain that differences in rules and regulations raise the costs of doing business across borders, but their claims must be traded off against the benefits of diversity.
That sounds fine, but how is it different from the status quo? There is already a great deal of diversity in places like Europe, with some countries having much more extensive welfare states than others. Some have minimum wage laws and capital gains taxes, while others do not. And that's within the EU, perhaps the most intrusive international organization on Earth.

One of the most basic ideas in international economics is that policies that reduce productivity, such as environmental controls, do not make a country uncompetitive. Rather, the real exchange rate adjusts via either a lower price level or currency devaluation, until the international flow of goods, services and assets is again in equilibrium. Yes, lots of health, safety and environmental regulations might lead to lower wages, but that's exactly as it should be. Voters should be told that there are no free lunches. You don't want GMO foods? Fine, but then you'll have to pay more for food.

I don't like either side of the current trade debate. I don't want to stop trading with countries that have objectionable policies, with the possible exception of those that threaten world peace (say, North Korea, and even on that issue I'm rather agnostic). But I also don't think we should use trade negotiations to force other countries to adopt our regulatory or intellectual property rights rules, as some of the advocates of globalization who work within government want to do. Let each country set its own course, and trade freely with the rest of the world. Anything else is a recipe for conflict."

Tuesday, September 20, 2016

Kocherlakota’s paper on the Fed and the Taylor Rule is wrong in a number of ways

See Kocherlakota on the Fed and the Taylor Rule by John Taylor.

"The use of policy rules to analyze monetary policy has been a growing area of research for several decades, and the pace has picked up recently. Last month Janet Yellen presented a policy framework for the future centered around a Taylor rule, noting that the Fed has deviated from such a rule in recent years.  A week later, her FOMC colleague, Jeff Lacker, also showed that the Fed has deviated from a Taylor rule benchmark, adding that now is the time to get back.  Last week, the Mercatus Center and the Cato Institute hosted a conference with the theme that deviations from policy rules—including the Taylor rule discussed in my lunch talk—have caused serious problems in recent years.  And this week former FOMC member Narayana Kocherlakota argued that the problem with monetary policy in recent years has not been that it has deviated from a Taylor rule but that it has been too close to a Taylor rule! Debating monetary issues within a policy rule framework is helpful, but Kocherlakota’s amazingly contrarian paper is wrong in a number of ways

First, the paper ignores many of the advantages of policy rules discovered over the years and focuses only on time inconsistency and inflation bias. I listed the many other advantages in my comment on the first version of Kocherlakota's paper with the same title: "Rules Versus Discretion: A Reconsideration." Research by me and others on policy rules preceded the time inconsistency research, and, contrary to Kocherlakota's claim, the Taylor rule was derived from monetary theory as embodied in estimated models, not from regressions or curve fitting during particular periods. (Here is a review of the research.)

Second, Kocherlakota ignores much historical and empirical research showing that the Fed deviated from Taylor-type rules in recent years, both before and after the crisis, including work by Meltzer and by Nikolsko-Rzhevskyy, Papell, and Prodan.

Third, he rests his argument on an informal and judgmental comparison of the Fed staff's model simulations and a survey of FOMC members' future interest rate predictions at two points in time (2009 and 2010). He observes that the Fed staff's model simulations for future years were based on a Taylor rule, and FOMC participants were asked, "Does your view of the appropriate path for monetary policy [or interest rates in 2009] differ materially from that [or the interest rate in 2009] assumed by the staff?" However, a majority (20 out of 35) of the answers were "yes," which hardly sounds like the Fed was following the Taylor rule. Moreover, these are forecasts of future decisions, not actual decisions, and the actual decisions turned out much different from the forecasts.

Fourth, he argues that the FOMC's reluctance to use more forward guidance "seems in no little part due to its unwillingness to commit to a pronounced deviation from the prescriptions of its pre-2007 policy framework – that is, the Taylor Rule." To the contrary, however, well-known work by Reifschneider and Williams had already shown how forward guidance is perfectly consistent with the use of a Taylor rule with prescribed deviations. I would also note that there is considerable evidence that the Fed significantly deviated from its Taylor rule framework in 2003-2005.

Fifth, Kocherlakota argues that the Taylor rule is based on interest rate smoothing in which weight is put on an “interest rate gap.” He argues that this slows down adjustments to inflation and output. But that is not how the rule was originally derived, and more recent work by Ball and Woodford deriving the Taylor rule in simple models does not have such a weight on interest rate gaps.
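(To see the "interest rate gap" point concretely, here is a minimal sketch of the smoothed variant Kocherlakota describes, in its standard textbook form; the smoothing weight rho is illustrative. With rho well above zero, the policy rate closes only a fraction of the gap between last period's rate and the rule's prescription each period, which is the sluggish adjustment at issue.)

```python
def taylor_prescription(inflation, output_gap):
    # Classic Taylor (1993) prescription, as in the earlier sketch.
    return inflation + 2.0 + 0.5 * (inflation - 2.0) + 0.5 * output_gap

def smoothed_rule(prev_rate, inflation, output_gap, rho=0.8):
    """Inertial rule: weight rho on last period's rate slows adjustment.
    rho = 0 recovers the original, unsmoothed rule."""
    target = taylor_prescription(inflation, output_gap)
    return rho * prev_rate + (1.0 - rho) * target

# If inflation jumps to 4% with a zero output gap, the unsmoothed rule
# prescribes 7%, but a smoothed rule starting from a 2% rate moves only to 3%.
print(taylor_prescription(4.0, 0.0))           # -> 7.0
print(smoothed_rule(2.0, 4.0, 0.0, rho=0.8))   # -> 3.0
```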

The last part of Kocherlakota's paper delves into the classic rules-versus-discretion debate. Here he mistakenly assumes that rules-based policy must be based on a mathematical formula, and this leads him to advocate pure discretion and thereby object to recent policy rules legislation such as the FORM Act that recently passed the House of Representatives. However, as I explain in my 1993 paper and in critiques of the critiques of the recent legislation, a monetary policy rule need not be mechanical in practice."

Want Lower Drug Prices? Start by Fixing the FDA

By Marc Joffe of Mercatus.

"In the early 1960s, pictures of babies with tragic birth defects, including shortened or missing limbs, caused an international scare. The malformations were quickly linked to the drug thalidomide, an over-the-counter sedative that had recently been developed in Germany and was being used by pregnant women to alleviate morning sickness.

One result of the thalidomide crisis was the passage of a new law, the Kefauver Harris Amendment, that gave the U.S. Food and Drug Administration most of the power it now exerts in regulating drugs.
Today, we face a drug crisis of a much different sort. Recent drug pricing scandals are leading to calls for government action. But the prices causing so much outrage today are often the unintended result of that 1962 law and others that gave the FDA its current power. By limiting the number of drugs and other treatments available, the FDA reduces the options available to patients and gives pharmaceutical firms excessive pricing power.
Some have responded to soaring drug prices by calling for government price controls. That's a risky option, as we can see in Venezuela, where residents are struggling to find daily necessities because of government price restrictions. In our country, the gas lines common during the 1970s owed their origins to price controls imposed by Richard Nixon. The problem with price controls is that the government entity imposing them lacks the information on supply and demand necessary to set them optimally. Thus, price ceilings will most likely be set either too high to have an impact or so low that they trigger a shortage.

A better alternative is to rely on competition to drive prices down. As many of us learned in our introductory economics classes, price equals the marginal cost of production in a perfectly competitive market. While this may not happen in the real world of imperfect competition, prices well above production costs represent an invitation to new firms to enter the market.
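(A toy sketch of that entry logic, with every number invented for illustration: under linear demand, firms keep entering while price exceeds marginal cost, and price converges to the cost of production.)

```python
# Toy free-entry simulation; all parameters here are illustrative.
# Inverse demand: P = 100 - Q. Each entrant supplies 5 units at a
# marginal cost of 20. Entry continues while price exceeds marginal cost.
MARGINAL_COST = 20.0
FIRM_OUTPUT = 5.0

def price(total_quantity):
    return max(0.0, 100.0 - total_quantity)   # linear inverse demand

firms = 1
while price(firms * FIRM_OUTPUT) > MARGINAL_COST:
    firms += 1          # positive profit invites another entrant

print(f"{firms} firms, price = {price(firms * FIRM_OUTPUT):.2f}")
# -> 16 firms, price = 20.00
```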

But FDA regulations restrict market entry. Often, the FDA gives only one company the right to produce a generic drug, creating an artificial monopoly not justified by the usual intellectual property arguments. In some cases, under an FDA program launched ten years ago, drugmakers have been granted exclusive rights to market medications that have been commonly used for decades or longer — typically at increased prices.

A drug whose patent has expired has already financially rewarded its inventor. Once patent protection lapses, any company that can manufacture the drug safely and inexpensively should be free to do so. Yet in a number of cases, pharmaceutical companies have been able to raise prices on older drugs that no longer enjoy patent protection because of a lack of competition. Before the recent outrage over Mylan's EpiPen price hike, Turing Pharmaceuticals raised the price of Daraprim, used to treat parasitic infections, from $13.50 to $750 per pill, and Valeant Pharmaceuticals jacked up the price of Isuprel, a heart medication, by 525 percent.

The FDA has also imposed an expensive and onerous new drug approval process that is preventing patients from accessing many life-saving and life-enhancing tests and treatments. Among the examples that Richard Williams, Ariel Slonim and I report in a new Mercatus Center study are a treatment for diabetic foot ulcers, cultured stem-cell therapies for orthopedic conditions, genetic tests and anti-aging treatments.

This last category is particularly telling. Because the FDA has not considered aging to be a disease, it is not clear what criteria the agency might apply to anti-aging treatments or whether it would consider them at all. Much the same is the case with treatments that increase our physical capacity: They don’t treat a specific disease, so they lack a clear path to approval.

In other cases, FDA restrictions prevent terminally ill patients from taking new medications that are under review. Since these patients may not survive through the clinical trial period, they should have the opportunity to try new medications before it’s too late. The FDA provides exceptions to some terminal patients under its expanded use program, but qualifying for expanded use involves a lengthy bureaucratic process of its own. A more promising alternative is “right to try” laws enacted at the state level, which allow patients to take investigational new drugs without FDA approval.

The justification for the FDA’s drug approval process is that it protects patients from dangerous or ineffective drugs. But the FDA cannot guarantee safety: Approved drugs used individually or in tandem can have unexpected, and sometimes fatal, side effects. Meanwhile, overly restrictive regulations can kill patients by preventing them from accessing new medications.

Even thalidomide, the widely vilified sedative, ultimately proved to be a useful treatment. In 1964, an Israeli doctor gave thalidomide tablets to a leprosy patient suffering extreme pain. The medication not only allowed the patient to sleep, but reversed his symptoms. Eventually, thalidomide became a common treatment for leprosy and was later found to be effective against AIDS and cancer. None of these indications would have been possible had thalidomide not been approved in Germany (and elsewhere) in what proponents of the Kefauver Harris Amendment regarded as an overly lax regulatory regime.

Congress is considering legislation that would expedite FDA approvals for new drugs. But even if this legislation is enacted, securing new drug approvals will continue to be an expensive and onerous process for pharmaceutical companies, with many choosing not to undertake the effort at all. The result will be continued monopoly pricing of off-patent drugs and delays in the availability of new medical innovations.

Some more fundamental reforms would be more effective. One would be to allow multiple organizations to approve drugs, providing competition to the FDA. A private drug adjudication industry would have to be carefully structured and regulated to ensure that approving organizations do not have perverse incentives. Another option is to rely on the courts: Let pharmaceutical companies sell whichever medications they believe to be safe and effective — with the understanding that patients can win large judgments if the companies fail to produce and market their treatments responsibly.

The American people deserve access to the widest variety of affordable treatments. Rather than demanding that the government do more to achieve this goal, perhaps it is time that we ask one government agency — the FDA — to do less."

Monday, September 19, 2016

Sound Economics Protects Us from Locos

From Don Boudreaux.
"Here’s a passage from pages 84-85 of Arnold Kling’s hot-off-the-press – and superb – new book, Specialization and Trade: A Re-introduction to Economics:
Many people believe intuitively that it saves resources to “buy local.”  Surely, we think, cheese and vegetables from a local farm must save on the energy required for transportation.  However, if the grocery store sells cheaper produce that comes from hundreds of miles away, some factor must offset the higher transportation costs.  Chances are, the land elsewhere is more suited to growing crops, so that fewer acres are used to produce a given amount of output.  The local land might be better used for housing or as wilderness.
Water or other resources may be used more heavily locally than on distant farms.  Whenever produce from distant farms is cheaper than locally grown produce, the price system is telling us that “buying local” wastes resources.
Some of my friends respond to the above argument by insisting that the case for buying local rests on the fact that locally grown and locally butchered foods taste better than, or are more nutritious than, ‘distantly’ grown and slaughtered foods.  This fact might well be true; indeed, I’m sure that it’s true in some cases.  And when it is true, it makes economic sense for someone to pay the higher prices for these tastier and more nutritious local foods if that someone values the better taste or higher nutrition by more than he or she values whatever it is that he or she gives up by spending more money on these local foods.

I myself, for example, typically pay a premium for better tasting foods and wines.
But this case for buying local is misleading, for at least two reasons.

First, this case is not really one for “buying local”; instead, it’s a case for “buying tastier” or “buying healthier.”  Why conflate one’s understandable desire for better taste and better nutrition with a desire to buy local?  “Buy tastier” or “buy healthier” fully capture the goal of the consumer.  Calling it “buy local” only confuses the issue.

It won’t do to respond that “buy local” is nevertheless a good goal and guide because the taste and nutritional quality of locally grown and slaughtered foods are so generally superior to ‘distantly’ grown and slaughtered foods that “buy local” suffices to describe an economically sensible action.  This response would be true only if its premise were true.  But the premise – namely, that locally grown and slaughtered foods typically taste better than, or are more nutritious than, ‘distantly’ grown and slaughtered food – strikes me as false.  Locally grown corn, tomatoes, eggplant, and strawberries are, to my taste, often better than ones bought from supermarkets.  But are locally grown bell peppers, chili peppers, pineapples, apples, oranges, ornamental pumpkins, cherries, peaches, cranberries, cauliflower, broccoli, and onions better than ‘distantly’ grown ones?  If so, my taste buds are too incompetent to detect this difference when they’ve tried.  (In some cases, they’ve never tried: living all my life east of the Mississippi,* but never in Florida, I’ve never tasted a locally grown – as in, for example, a Louisiana or Virginia grown – orange or pineapple.)

Likewise, my taste buds detect no difference between high-quality ‘distant’ meats and fish bought at supermarkets and ‘local’ meats and fish bought at farmers’ markets.

Second, and according to the logic of the environmentalist creed that often is inextricably intertwined with the buy-local movement, to buy local because locally produced foods taste better is to selfishly damage the environment.  The lower prices of ‘distantly’ grown foods sold in supermarkets mean that their production and distribution consumes fewer resources than do their locally grown alternatives.  That is, supplying these lower-priced ‘distantly’ grown foods is better for the environment than is supplying their locally grown alternatives.  Because of this fact, environmentalists should condemn as greedy, thoughtless, and environmentally careless anyone who pays a premium for foods simply because such foods taste better!

Please understand that I don’t share this typical environmentalist arrogance: if you so value the better taste of some locally grown kumquat or a locally slaughtered pig over the tastes of their ‘distantly’ supplied alternatives, the fact that more resources are required to produce these tastier local foods does not mean that such resource use is wasteful or otherwise to be condemned.  But the (il)logic and arrogance that is at the heart of the typical modern environmentalist’s assessment of food production and distribution should lead these environmentalists to be in the front lines of those who criticize people who, simply for taste reasons, buy higher-priced local foods.
….
I remind readers that the single best book on the many myths and illogical turns of reasoning of the 'buy local' movement is Pierre Desrochers and Hiroko Shimizu's 2012 volume, The Locavore's Dilemma."

State marijuana legalizations show an absence of significant adverse consequences

By Jeffrey Miron, writing for Cato. He teaches economics at Harvard.

"From my new policy analysis (joint with Angela Dills and Sietse Goffard) on state marijuana legalizations:
In November 2012 voters in the states of Colorado and Washington approved ballot initiatives that legalized marijuana for recreational use. Two years later, Alaska and Oregon followed suit. As many as 11 other states may consider similar measures in November 2016, through either ballot initiative or legislative action. Supporters and opponents of such initiatives make numerous claims about state-level marijuana legalization.
Advocates think legalization reduces crime, raises tax revenue, lowers criminal justice expenditures, improves public health, bolsters traffic safety, and stimulates the economy. Critics argue that legalization spurs marijuana and other drug or alcohol use, increases crime, diminishes traffic safety, harms public health, and lowers teen educational achievement. Systematic evaluation of these claims, however, has been largely absent.
This paper assesses recent marijuana legalizations and related policies in Colorado, Washington, Oregon, and Alaska.
Our conclusion is that state marijuana legalizations have had minimal effect on marijuana use and related outcomes. We cannot rule out small effects of legalization, and insufficient time has elapsed since the four initial legalizations to allow strong inference. On the basis of available data, however, we find little support for the stronger claims made by either opponents or advocates of legalization. The absence of significant adverse consequences is especially striking given the sometimes dire predictions made by legalization opponents."