Back to Work


The word “unemployment” wasn’t even invented until 1888. During the first three decades of this century, government did very little about unemployment. Indeed, government involvement in labor markets was negligible: there was no federal minimum wage, no Davis-Bacon Act, no federal unemployment insurance, no anti-discrimination laws, no laws promoting unionization, no AFDC or food stamps. Policymakers did not try to “stimulate” the economy with “countercyclical” fiscal policy during downturns.

Since the 1930s, all this has changed. Unemployment has become a dominant topic of public policy debate throughout the industrialized world. We have tried to stimulate demand by increasing disposable income and lowering interest rates; we have created a myriad of regulations to “protect” workers on and off the job; we have built massive training programs; we have passed anti-discrimination laws; we have introduced massive transfer payments to alleviate the problems associated with being out of work. In short, we have created a welfare state.

These efforts have been mostly well-intentioned — and largely unsuccessful. Indeed, they have probably destroyed more jobs than they have created, lowering both productivity and income in the process. In the past three decades — the years of the most active intervention — the average unemployment rate has been 6.1%, an increase of nearly 30% over the 4.7% rate of this century’s first three decades.

Some would argue that, despite this increase in unemployment, the negative consequences of joblessness have fallen because the government-provided safety net has made unemployment more bearable. This argument is weak. While it is true that unemployment insurance reduces the pain of being unemployed, it does so in a costly and inefficient fashion; a private system would do a far better job. Moreover, many Americans now have a new form of income protection: their spouses. The present unemployment compensation system was conceived in an era when two-worker households were rare. Today, few families face starvation if one member loses a job.

Other defenders of the status quo might argue that sharp fluctuations in unemployment are a thing of the past. But painstaking research by Christina Romer at Berkeley has shown that business fluctuations were no greater in the era prior to interventionist government policies than they are today.

In short, after expending much money and effort, we are left with higher unemployment and no greater economic stability. On top of that, taxes are higher, and all the activism they are financing is crowding out private investment: while public spending as a percentage of total output has risen since the 1920s, private investment as a percent of total output has fallen. This drop has reduced our ability to modernize our capital stock, slowing productivity and, therefore, keeping the standard of living from rising as quickly as it could.

Creating Unemployment

In our book Out of Work: Unemployment and Government in Twentieth Century America, we argue that unemployment results when the price of labor rises above the level that will clear labor markets. What we call the adjusted real wage rises if money wages go up, or if prices or productivity falls. Most of the book is given over to documenting that this relationship exists, and that many recessions were caused by well-intentioned but inappropriate government interventions that raised the price of labor.
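The relationship can be sketched as a simple calculation: the adjusted real wage is money wages deflated by both the price level and productivity. The function and figures below are our own illustration, not the book’s formal model.

```python
def adjusted_real_wage(money_wage, price_level, productivity):
    """Real cost of labor per unit of output: money wages deflated
    by the price level and by output per worker."""
    return money_wage / (price_level * productivity)

base = adjusted_real_wage(money_wage=10.0, price_level=1.0, productivity=1.0)

# The adjusted real wage rises if money wages go up...
wage_hike = adjusted_real_wage(12.0, 1.0, 1.0)
# ...if prices fall (deflation, as in 1929-33)...
deflation = adjusted_real_wage(10.0, 0.8, 1.0)
# ...or if productivity falls.
productivity_drop = adjusted_real_wage(10.0, 1.0, 0.8)

# Any of the three pushes labor costs above the market-clearing level.
assert wage_hike > base and deflation > base and productivity_drop > base
```

Hold money wages fixed while prices collapse, as in 1929–33, and the real cost of hiring rises even though no one’s paycheck changed.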

It is instructive to compare the depression of 1920-21 to the Great Depression of the 1930s. The 1920-21 downturn was, by most measures, more severe than the first six or seven quarters of the depression that followed 1929. Industrial production fell more, as did prices. Yet we recovered quickly from the 1920-21 downturn, while the Great Depression lingered for a decade.

What made the difference? In 1920-21, the government did little or nothing to end the recession. As the downturn began, President Woodrow Wilson had a stroke and refrained from intervention. (As one colleague once rather tastelessly put it, this was truly a stroke of luck.) He was replaced, five quarters into the downturn, by Warren Harding, who was committed to a policy of non-intervention.

Compare this with 1929. Within a month of the stock market crash, President Herbert Hoover summoned the nation’s business leaders to a conference at the White House, where he urged them to keep wages high in order to stimulate consumption. They followed his advice, which resulted in higher unemployment, deteriorating corporate balance sheets, and a decline in ability to repay bank loans. This in turn brought down the market value of bank assets, leading to a decline in investor and then depositor confidence in banks, which in turn ultimately led to the banking crisis of late 1930. Compounding the high-wage folly were the Smoot-Hawley tariff, deflationary policies at the Fed, and a large increase in the income tax.

In 1932, Franklin Roosevelt was elected president on a platform of fiscal conservatism. But once in office, FDR continued Hooverism with a vengeance, institutionalizing the high-wage policy in legislation that priced labor out of the market. From March to August 1933, the unemployment rate fell by more than five percentage points. Then Roosevelt’s National Industrial Recovery Act, passed in June, started to take effect. The NIRA’s minimum wage provisions led to an extraordinary increase in factory wages of over 20% in just six months. The fall in unemployment came to a screeching stop, and the country became mired in 20% unemployment for two years.

Finally, in 1935, the Supreme Court ruled the NIRA unconstitutional. The adjusted real wage then started to fall significantly, bringing unemployment rates down to around 13% by early 1937. At that point, though, the Wagner Act took effect, leading to massive unionization, another double-digit wage increase, and 20% unemployment rates. New Social Security and unemployment insurance taxes didn’t help matters either, as they pushed the real cost of hiring workers up further.

The New Deal didn’t restore prosperity. It prolonged the economic misery. The Great Depression is often depicted as a spectacular market failure. In fact, it is the best example of government failure in this century.

The Mark of Keynes

The old classical economics that had more or less prevailed until the late 1920s had argued that market adjustments in wages, prices, and/or productivity will reduce labor costs, increasing hiring and thereby alleviating unemployment. Undergirding the Hoover-Roosevelt policy of increasing government expenditures to offset “underspending” in private markets, however, was the new economics of John Maynard Keynes, who argued that massive increases in budget deficits will stimulate demand and thus employment, while reduced spending will lead to rising unemployment.

On a superficial level, World War II seemed to justify Keynes’s theory: U.S. budget deficits soared and unemployment decreased. Yet over half of this decline in unemployment actually occurred before Pearl Harbor. Moreover, much of the decline can be attributed to conscription. Take ten million people out of the civilian labor market and unemployment is bound to be reduced, but that’s hardly a long-term means of curing unemployment in a free society.

By 1945, all the Keynesian gurus were predicting that double-digit unemployment would return after the war. Government spending was reduced by about 75% almost overnight. Within a year, ten million people lost defense-related jobs. The huge deficit gave way to a sharp budget surplus, the equivalent today of moving from a more than $1 trillion deficit to a $250 billion surplus in one year. But despite the Keynesian warnings of a howling depression, unemployment never rose above 4%.

Why didn’t the depression come? Because markets worked. Real wages (adjusted for the change in productivity) fell, allowing millions of civilian workers to be absorbed back into the economy. Truman did not initiate a job stimulus program — indeed, he pursued the most severely contractionary fiscal policy in modern American history — but the expected depression did not come, because the falling real price of labor made hiring attractive.

Ever ingenious, the Keynesians quickly came up with a theory to explain what had happened: Postwar prosperity resulted from pent-up demand for consumer goods. But while that demand certainly existed, the relatively painless shift to a low-unemployment peacetime economy occurred long before civilian durable goods production and housing construction returned to normal levels.

In short, the most successful postwar economic transition in American history came during a period of sharp reduction in federal involvement in labor markets, a period when wartime wage and price controls ended.

The late ’40s and ’50s saw more economic good times. There were several increases in the federal minimum wage during this period, but on the whole, the government avoided serious new interventions in labor markets. The Taft-Hartley Act trimmed union power, and union membership began its ongoing decline. Keynesian-style fiscal policy was used tepidly, if at all. The real per capita federal public debt was smaller at the end of the Truman administration than at the beginning. By the end of the Eisenhower presidency, it was smaller still.

The ’60s marked the high tide of Keynesianism. The economy was strong, with unemployment falling — though not to levels as low as were sometimes reached in the ’40s and ’50s. Real output rose sharply, and the country went 100 months without a downturn. This success seemed to be accomplished with Keynesian policies. Presidents Kennedy and Johnson deliberately induced inflation, reducing real wages and increasing the attractiveness of labor. Richard Nixon not only continued the policies into the early ’70s, but expanded the role of government further. “We are all Keynesians now,” he explained.

Yet the prosperity of the late ‘60s set the stage for the stagflation of the next decade. It didn’t take long for workers and creditors to catch on to what was happening and alter their behavior. Workers demanded greater wage increases to compensate for the inflation. Bankers demanded higher interest rates. Inflation became anticipated, and government fiscal stimulus became ineffective. It had worked in the past only because people were fooled into accepting abnormally low real wages or interest rates.

The Keynesians of the stagflation era — Democrats and Republicans alike — thought the solution was still more stimulus, more jobs programs, more intervention. From 1970 on, budget deficits were the rule. But unemployment remained high. In 1980, we had both double-digit inflation and 7% unemployment for the first time in American history. The misery index — the sum of the unemployment and inflation rates — had been running in the single digits in the ’50s and ’60s, but hit 20 in 1980, the highest level in modern history.
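The misery index is simple enough to compute directly. The 1980 inflation figure below is our assumption (double-digit, consistent with the index hitting 20 at 7% unemployment), not a statistic quoted in the text.

```python
def misery_index(unemployment_rate, inflation_rate):
    """Sum of the unemployment and inflation rates, in percent."""
    return unemployment_rate + inflation_rate

# 1980: roughly 7% unemployment plus double-digit inflation
# (about 13%, our assumption) pushes the index to about 20.
index_1980 = misery_index(7.0, 13.0)
assert index_1980 == 20.0
```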

Enter Fed Chairman Paul Volcker and President Ronald Reagan. The Great Inflation was dramatically and unexpectedly reduced, leading to a temporary surge in the purchasing power of wages that triggered the short but potent 1982 recession. In 1983, however, the longest peacetime recovery in American history began.

The ‘80s boom was further fueled by a retreat from government intervention. Labor unions became even weaker, aided by Reagan’s symbolically important firing of air traffic controllers after their illegal strike. In real terms, the federal minimum wage fell by more than a fourth. Deregulation increased the mobility of resources and the ability of market forces to reallocate resources efficiently. Labor productivity rose much faster than in the 1970s, lowering the real cost of labor per unit of output produced. Nearly 20 million jobs were created, the stock market more than tripled in value, and prosperity reigned.

The new approach might best be understood by contrasting Reagan’s response to the 1987 stock market drop to Hoover’s response to the crash of 1929. Reagan did, roughly speaking, nothing, and the 1987 crash quickly passed into memory. Today the Dow Jones is more than double what it was after the events of October 1987.

The election of George Bush brought in renewed government activism, resulting in a new stagflation barely a year after Reagan left office. On May 1, 1990 the minimum wage went up for the first time in nine years, followed by another large increase a year later. The total increase was 27%, creating a wage inflation that made labor too expensive and led to rising unemployment.

Compounding the problem were some new federal policies that increased employer uncertainty and raised the potential financial liabilities of hiring workers. The Americans with Disabilities Act threatened labor productivity and thus the real cost of each unit of labor-produced output. A new civil rights act raised fears that workers would have to be hired and paid according to considerations other than productivity. New environmental legislation also raised costs, as did a big 1990 tax increase.

You can shock markets, you can mutilate markets, you can spook markets, but they always come back. Despite the awful policies of the Bush era, labor markets adjusted. Declining real wages gradually made American labor more attractive, as did soaring labor costs in other nations. A year or so ago, job expansion resumed at a healthy clip. A recovery was underway.

More than four million jobs have been created over the past couple of years, and nationwide unemployment has fallen below 6%. This job expansion has been aided by what some would call obstructionism and gridlock. Bill Clinton and his key aides have an interventionist domestic agenda that would dramatically increase labor costs. Fear of those plans kept job growth modest in the early months of the administration, but there has been a growing realization recently that Clinton will not be able to push through the more extreme elements of his program.

In particular, the tabling of health care “reform” removed, at least temporarily, a threat of significantly increased labor costs. Had that legislation passed, as many as one million jobs would have been lost, and there would have been a sharp reduction in real wages. In addition, Labor Secretary Robert Reich has failed to persuade President Clinton to push for a higher minimum wage, and the 10% decline in the real minimum wage over the past 41 months has aided job expansion. Other threats to employment, such as the striker replacement bill and a stronger plant-closing law, have diminished with the president’s shrinking popularity.

But some fear remains. Employers worry that under Clinton, full-time workers will become a fixed cost that cannot be reduced in downturns. To an unprecedented extent, they have resorted to paying existing workers overtime and hiring part-time and temporary help.

The lessons of history are clear. Unemployment is generally low when markets are left unfettered. Major upsurges in joblessness often reflect labor cost shocks, usually resulting from well-intentioned but damaging government policies.

Paying People Not To Work

Meanwhile, the modern welfare state offers an alternate source of income, allowing many to choose to be unemployed. Unemployment compensation raises unemployment rates by as much as a full percentage point by raising what in economics jargon is called the “reservation wage” — the minimum acceptable wage. If someone making $300 a week loses her job, why should she take another job at $200 a week when she can receive about that amount in unemployment compensation? Why work when you can stay home and watch General Hospital and live just as well?
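The arithmetic behind the example above is stark. The weekly figures come from the text; the exact benefit level is the text’s rough approximation.

```python
old_wage = 300      # weekly wage at the lost job
offered_wage = 200  # weekly wage at the job on offer
benefit = 200       # weekly unemployment compensation, roughly the same

# Net weekly gain from taking the offered job instead of staying home:
gain_from_working = offered_wage - benefit
assert gain_from_working == 0  # the benefit fully offsets the offer

# So the reservation wage stays anchored near the old wage, and any
# offer below it is rationally refused.
```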

The welfare state’s impact on joblessness varies enormously among different groups. In the era of Jim Crow laws, KKK terror, and lynch mobs, the unemployment rate among black Americans was about the same as it was for whites. But in the past 40 years, as the lives of African Americans have improved in so many other respects, the black unemployment rate has typically been about twice that of whites. Why has black unemployment risen so much?

Part of the answer relates to demographics, but much of it reflects the fact that the modern welfare state induces disproportionate numbers of blacks to choose public assistance payments over work. Welfare has raised the reservation wage. If a head of a household making $1,200 a month in cash and noncash welfare benefits takes a job paying the same amount, he will lose virtually all of his benefits. His short-run gain from working is zero. In essence, the government is imposing a 100% work tax on the poor. This applies to people of all races, of course, but African Americans are disproportionately poor, and thus disproportionately eligible for welfare. Accordingly, a large number of blacks have, quite rationally, chosen to be unemployed.
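The implicit work tax in the example can be made explicit. The monthly figures are from the text; the assumption that benefits phase out dollar for dollar is the text’s own.

```python
monthly_benefits = 1200  # cash plus noncash welfare benefits
monthly_wage = 1200      # a job paying the same amount

# Taking the job forfeits virtually all benefits, so:
net_gain = monthly_wage - monthly_benefits
implicit_tax_rate = monthly_benefits / monthly_wage  # benefits lost per dollar earned

assert net_gain == 0
assert implicit_tax_rate == 1.0  # in effect, a 100% tax on working
```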

Thus, the welfare state has destroyed job opportunities for the most disadvantaged groups in society. The welfare state is keeping people poor. When a person goes on relief, he may improve his immediate economic position, but he will miss out on far more important long run benefits. Job experience leads to higher incomes. Welfare experience does not.

Toward Full Employment

In August 1994, West Virginia’s unemployment rate was 9%; Nebraska’s was 2.4%. This difference is not new; throughout the past third of a century, unemployment rates have always been dramatically lower in Nebraska than West Virginia. Why?

While our analysis is still preliminary, statistical evidence suggests that states with relatively high long-term unemployment rates tend to have relatively high levels of public assistance, rising state and local tax burdens, and very high rates of unionization. In short, anywhere public policy and institutional rigidities tend to make labor expensive, more people will be without jobs.

To improve job opportunities, many restraints on labor markets, such as minimum wages, should simply be abolished. In addition, welfare should be privatized. John Fund of The Wall Street Journal has proposed a system that would freeze welfare spending but grant tax credits to citizens wishing to help the poor through private charities such as the Salvation Army. This would dramatically reduce the work disincentives of the current welfare system; few private charities would give someone $1,200 in monthly benefits for extended periods of time.

There are other ingenious policies that would maintain the intent of current legislation but lead to much greater efficiencies and higher employment in labor markets. Americans would be immensely better off if they privatized the unemployment compensation system — and we do not mean merely turning its administration over to private insurance companies. One promising approach would build on the Individual Retirement Account. Individuals could deposit tax-free dollars in an Income Security Account, or ISA; withdrawals could be made after a worker loses his or her job or retires. Withdrawals might also be made for non-routine medical expenses, as in a Medical Savings Account, easing many of our health care problems.
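The proposed ISA would work much like an IRA with different withdrawal triggers. The sketch below is our own reading of the proposal; the class, its event names, and its rules are illustrative, not a specification.

```python
class IncomeSecurityAccount:
    """Sketch of the proposed ISA: tax-free deposits, with withdrawals
    permitted after job loss, at retirement, or for non-routine medical
    expenses. The qualifying events are our reading of the text."""

    QUALIFYING_EVENTS = {"job_loss", "retirement", "medical"}

    def __init__(self):
        self.balance = 0.0

    def deposit(self, amount):
        # Deposited from pre-tax dollars under the proposal.
        self.balance += amount

    def withdraw(self, amount, event):
        if event not in self.QUALIFYING_EVENTS:
            raise ValueError("withdrawals limited to qualifying events")
        if amount > self.balance:
            raise ValueError("cannot draw more than personal savings")
        self.balance -= amount
        return amount
```

Because a jobless worker draws down her own balance rather than a tax-financed benefit, every week of idleness has a visible personal cost, and unspent funds remain her savings.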

By privatizing unemployment insurance, we would reduce the work disincentives in the present system. A worker who lost her job could draw on her ISA, and thus receive income protection. Yet she would have every incentive to quickly find a new job, as she would be living off personal savings, not off the taxpayer.

Another sensible reform would be to reduce the hidden taxes of regulation by forcing regulators to take the private costs of their actions into account when making pronouncements about private economic behavior. We can end damaging and ineffective attempts at fiscal policy stimulus through a variety of constitutional means, ranging from a balanced budget amendment to line-item vetoes to requirements for voter or supermajority approval of major tax or spending initiatives.

In short, the worthy goals of the welfare state can be achieved in ways that empower individuals, that restore a sense of individual responsibility, that unleash the spirit of enterprise, and that reduce disincentives to employment of both physical and human capital. We can aim to achieve the low unemployment of the pre-Keynesian era, when the joblessness rate averaged under 5% and periods of prolonged high unemployment were rare. We can restore the labor market vitality that once reigned in our nation, a vitality that made the United States the leading economic power in the world.

Lowell E. Gallaway is Professor of Economics at Ohio University and co-author of the award-winning Institute book, Out of Work: Unemployment and Government in Twentieth-Century America.

Richard K. Vedder is a Senior Fellow at the Independent Institute in Oakland, Calif., Distinguished Professor of Economics at Ohio University, and co-author (with Lowell Gallaway) of the award-winning Institute book, Out of Work: Unemployment and Government in Twentieth-Century America.
