The Central Economic Fallacy of the Century
The Economy Cannot Be Micromanaged from a Central Point
NOVEMBER 01, 1997 by STEVEN YATES
Dr. Yates is adjunct research fellow with the Acton Institute for the Study of Religion and Liberty and the author of Civil Wrongs: What Went Wrong with Affirmative Action (San Francisco: ICS Press, 1994).
The late Murray N. Rothbard once published a major article titled “Ten Great Economic Myths.” Included on Rothbard’s hit list were the notions that deficits are the cause of inflation and that economists can accurately forecast the future. One myth that he didn’t cite dominates Washington today: that the economy can be successfully “managed” from some central point. This idea underlies, directly or indirectly, all of the others Rothbard mentions.
Unfortunately, society’s intellectual, political, and economic “mainstream” still accepts what should be called the Central Economic Fallacy of the Twentieth Century. The “mainstream” just doesn’t get it. Thus, we continue to see a basic progression. First, government subsidizes x or regulates y to correct for some government-diagnosed problem z. Unwanted side effects result, and z, assuming it exists, often grows worse. Government intervenes again to fix the side effects and redouble its efforts to battle z. More undesirable side effects result. And the process continues, with government growing inexorably as interventions accumulate. More and more of the economy is micromanaged through increasing webs of subsidy, regulation, and quick fix. The logical end result, as Ludwig von Mises has shown in great detail, is socialism.
Economic micromanagement has been developing at a steadily increasing pace since the Progressive Era, which initiated the social-activist view of government—that only government can effectively address social problems like poverty. Progressivism began a new round of interventions in an economy in which major industries were already well subsidized. Federal Reserve manipulation of the currency—namely massive credit expansion followed by deflation—caused the stock market crash of 1929 and the Great Depression. Then Franklin Delano Roosevelt’s interventionist policies deepened rather than relieved the economic crisis. (See, for instance, Rothbard’s America’s Great Depression.)
World War II gave an entire generation of young men and women something to do when there were few jobs at home. But what would veterans do when they returned home? The federal government quick-fixed the problem with the G.I. Bill, creating a new national myth: everyone should go to college. Colleges, armed with massive quantities of federal and state dollars, became universities and opened their doors to more and more people. The supply of college graduates in the labor market soared, and soon college degrees began to decline in value.
Here we see perhaps the worst feature of the Central Economic Fallacy: massive overproduction in certain areas and equally significant shortages in others. (The Soviet economy was only the extreme case of this phenomenon.) In the United States, the growth of university graduate programs has led to a glut of Ph.D.s, many of whom are unable to find desired academic employment. This situation has now spread to the hard sciences and includes people such as Alan Hale, co-discoverer of the much-watched Hale-Bopp comet. On the other hand, labor shortages have developed in a variety of occupations not requiring a college degree: carpentry, masonry, and other skilled trades best learned through apprenticeship and therefore not amenable to the assembly-line approach taken by government-supported schools.
The welfare system is another consequence of the Central Economic Fallacy. The War on Poverty, one of the mainstays of the 1960s, has failed. It left an entire generation with a sense of entitlement and destroyed families by making fathers superfluous. Overall, the system rewarded a range of irresponsible conduct and encouraged dependency, reducing recipients’ need to mature, set goals, and become productive members of society. Sons, in particular, lacked responsible role models. An unfulfilled sense of entitlement helped generate resentment and encouraged criminal violence.
Dimly aware that something is wrong, the federal government is now desperately maneuvering to cut at least some of its dependents loose through “welfare reform.” Thus far, these efforts do not question the Central Economic Fallacy. For government needs to end, not reorganize, welfare, and at the same time dismantle the subsidies and regulations that make jobs so hard to come by.
The Central Economic Fallacy has given the country a soaring national debt and myriad job-destroying regulations, diminished the value of higher education, inflamed racial turmoil and other social divisions, pushed taxes upward, devalued the currency (“inflation”), increased the population of chronic dependents, and worsened crime. In fact, as documented by James Bovard in Lost Rights, the federal government now undertakes many activities more worthy of a police state than a free society. At the same time, our nation faces serious moral and cultural crises, threatening its very foundations.
For decades, critics of the Central Economic Fallacy have been ignored or dismissed out of hand. But so disastrous have been its consequences that even fans of expanded government have a difficult time denying that the Central Economic Fallacy has run its course. That anything as complex, intricate, and constantly changing as the American economy in the 1990s can be micromanaged from a central point is the overwhelming folly of our time. We have no alternative but to get rid of it. And we have to do so while recognizing that many leaders in academia, business, the media, and politics may never get it.