By David Neumark, in the WSJ. Mr. Neumark is an economics professor and director of the Center for Economics and Public Policy at the University of California, Irvine. Excerpts:
"Economists have written scores of papers on the topic dating back 100 years, and the vast majority of these studies point to job losses for the least-skilled. They are based on fundamental economic reasoning—that when you raise the price of something, in this case labor, less of it will be demanded, or in this case hired.
Among the many studies supporting this conclusion is one completed earlier this year by Texas A&M’s Jonathan Meer and MIT’s Jeremy West, which reaffirmed that “the minimum wage reduces job growth over a period of several years” and that “industries that tend to have a higher concentration of low-wage jobs show more deleterious effects on job growth from higher minimum wages.”
The broader research confirms this. An extensive survey of decades of minimum-wage research, published by William Wascher of the Federal Reserve Board and me in a 2008 book titled “Minimum Wages,” generally found a 1% or 2% reduction in teenage or very-low-skill employment for each 10% minimum-wage increase.
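The "1% or 2% reduction for each 10% increase" finding is an employment elasticity of roughly -0.1 to -0.2. A back-of-the-envelope sketch of what that implies, using a hypothetical baseline of one million teen jobs (the elasticity range is from the passage; the baseline figure is illustrative only):

```python
# Back-of-the-envelope: implied employment effect from a minimum-wage
# elasticity. The -0.1 to -0.2 range corresponds to the survey's
# "1% or 2% reduction per 10% increase"; the baseline job count is hypothetical.

def implied_job_change(baseline_jobs, wage_increase_pct, elasticity):
    """Percent change in employment = elasticity * percent change in wage."""
    employment_change_pct = elasticity * wage_increase_pct
    return baseline_jobs * employment_change_pct / 100

# A hypothetical market with 1,000,000 teen jobs facing a 10% wage increase:
for e in (-0.1, -0.2):
    lost = implied_job_change(1_000_000, 10, e)
    print(f"elasticity {e}: {lost:,.0f} jobs")
```

With the -0.1 elasticity this works out to about 10,000 fewer jobs, and with -0.2 about 20,000, against the one-million-job baseline.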
That has long been the view of most economists, although there are some outliers. In 1994 two Princeton economists, David Card (now at Berkeley) and Alan Krueger, published a study of changes in employment in fast-food restaurants in New Jersey and Pennsylvania after the minimum wage went up in New Jersey. The study not only failed to find employment losses in New Jersey but reported sharp employment gains. It has been widely cited by proponents of a higher minimum wage, even though further scrutiny showed that it was flawed. My work with William Wascher showed that the survey data collected were so inaccurate that they badly skewed the study’s findings.
More recently, a 2010 study by Arindrajit Dube of the University of Massachusetts-Amherst, T. William Lester of the University of North Carolina at Chapel Hill, and Michael Reich of the University of California, Berkeley, found “no detectable employment losses from the kind of minimum wage increases we have seen in the United States.”
This study and others by the same research team, all of whom support a higher minimum wage, strongly contest the conclusion that minimum wages reduce low-skill employment. The problem, they say, is that state policy makers tend to raise minimum wages in periods that happen to coincide with other negative shocks to low-skill labor markets, such as an economic downturn.
They argue that the only way to accurately discover whether minimum wages cause job losses is by limiting control groups to bordering states and counties because they’re most likely to have experienced similar economic conditions. This approach led to estimates of job losses from minimum wages that are effectively zero.
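The border-control design described above is a form of difference-in-differences: compare the change in employment in an area that raised its minimum wage with the change in a neighboring area that did not, so that shared local economic shocks cancel out. A minimal sketch, in which the counties and employment figures are entirely hypothetical:

```python
# Minimal difference-in-differences sketch of the border-county design
# described above. All employment figures here are hypothetical.

def diff_in_diff(treated_before, treated_after, control_before, control_after):
    """Estimated effect = change in the treated area minus change in the
    control area; shocks common to both areas difference out."""
    return (treated_after - treated_before) - (control_after - control_before)

# Teen employment (thousands) before and after a minimum-wage increase,
# in a treated county and its cross-border neighbor:
effect = diff_in_diff(treated_before=50.0, treated_after=48.5,
                      control_before=47.0, control_after=47.5)
print(effect)  # -2.0: the treated county lost 2,000 jobs relative to its neighbor
```

The dispute in the passage is not over this arithmetic but over which areas make valid controls: Dube et al. assume bordering counties do, while the critiques that follow let the data choose.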
But as Ian Salas of Johns Hopkins, William Wascher and I pointed out in a 2014 paper, there are serious problems with the research designs and control groups of the Dube et al. study. When we let the data determine the appropriate control states, rather than simply assuming—as Dube et al. do—that the bordering states are the best controls, the estimates again show lower teen employment following minimum-wage increases. A new study by David Powell of Rand, taking the same approach but with more elegant solutions to some of the statistical challenges, yields similar results.
Another recent study, by Shanshan Liu and Thomas Hyclak of Lehigh University and Krishna Regmi of Georgia College & State University, most directly mimics the Dube et al. approach. But crucially, it uses as control areas only those parts of states that the Bureau of Economic Analysis classifies as subject to the same economic shocks as the areas where minimum wages have increased. The resulting estimates point to job loss for the least-skilled workers studied, as do a number of other recent studies that address the Dube et al. criticisms."