The Growth of Government in the United States


Big government—we heard a lot about it when Ronald Reagan was first seeking the Presidency. Lately the topic has attracted less attention from politicians, commentators, and scholars. But the thing itself has not disappeared. Over the past decade, as over the past century, American government has continued to grow.

Our nation was founded by men who believed in limited government, especially limited central government. They were not anarchists; nor did they espouse laissez faire. But they did believe that rulers ought to be restrained and accountable to the people they govern. If the founders could see what has happened to the relation between the citizens and the government in the United States during the past two centuries, they would be appalled.

The size and scope of government are important for many reasons. By virtue of their taxing, spending, and regulating, governments affect the allocation of economic resources, the distribution of wealth, and the rate of economic growth. Governments determine the very nature of our political economy, the character of the social organization within which we may lawfully conduct our affairs and pursue our goals. The size and scope of government determine—they are, so to speak, the opposite side of—our freedoms.

All but the few anarchists among us recognize that effective liberty requires some government, if only to define and protect rights to life and property. Beyond a point, however, bigger government begins to cut into our liberties; then the growth of government becomes synonymous with the sacrifice of liberty. In the United States, we entered this stage a long time ago.

Charting Government’s Growth

When we say that government has grown, what do we mean? Government is not a single thing, measurable along a scale like inches of height or pounds of weight. The size of government can change in different dimensions, many of them incommensurable.

One dimension of government is the burden of taxation. In the early years of the 20th century, federal, state, and local governments took in revenues equal to 6 to 7 percent of the gross national product (GNP). By 1950, revenues had risen to 24 percent of GNP. Over the past 40 years the tax proportion has drifted irregularly upward, and now stands at about 32 percent of GNP.

Many people seem to think that taxes were cut in the 1980s, perhaps because certain politicians worked hard to create the impression and to take credit for it. The truth, however, is that overall—that is, considering all taxes together—taxes as a percentage of GNP were slightly higher at the end of the 1980s than they were at the beginning of the decade. The tax laws were changed repeatedly, and some tax rates were reduced, most notably the top rate of the Federal individual income tax. But other taxes were increased, most notably the payroll tax rate and the base earnings on which it is levied. Altogether, there has been no tax cut.

Another dimension of government—and an even more appropriate index of its fiscal burden than tax revenues—is government spending. In the early years of the 20th century, federal, state, and local governments spent an amount equal to 6 to 7 percent of the gross national product. By 1950, government outlays, net of intergovernmental grants, had risen to 21 percent of GNP. Over the past 40 years the spending proportion has drifted irregularly upward, and now stands at about 34 percent of GNP.

Many people seem to think that a so-called Reagan Revolution cut government spending in the 1980s. In fact, nothing of the kind happened. Government spending continued to grow rapidly, and relative to GNP it was slightly greater at the end of the 1980s than at the beginning of the decade. Of course, some forms of spending grew relatively slowly, some relatively rapidly, but overall government spending grew faster than the private economy.

Still another index of the size of government is government employment. Early in the 20th century, federal, state, and local governments employed about 4 percent of the civilian labor force. By 1950, government employment had risen to about 10 percent. During the past 40 years, government employment rose and fell: it reached a peak in the mid-1970s at nearly 16 percent, then fell to its present level of roughly 14 percent—that is, one worker in every seven. (This figure doesn’t include the two million members of the armed forces.)

Although the employment ratio seems at first glance to indicate a recent decline in the government’s size, one should not jump to that conclusion. Many people who are classified as members of the private labor force actually work for governments as contractors (or employees or subcontractors of contractors). Between 1980 and 1987, for example, about a million additional workers found jobs in the defense industries—virtually all of them, obviously, working on projects set in motion and financed by the government. Similar changes have occurred elsewhere as governments have “privatized” more functions by contracting out the performance of tasks previously performed by workers on the regular government payroll. It would be a mistake to suppose that government has shrunk just because regular government employment hasn’t kept pace with the growth of the labor force.

Increased Regulation

Indexes of government taxing, spending, and employing tell us something about the growth of government, but what they tell us is far from the whole story. Even if government had grown in none of these dimensions, it might have become a bigger factor in determining the allocation of economic resources, the distribution of wealth, and the rate of economic growth. It could have done so—and in fact it has done so—by means of increased regulation.

I refer to regulation in its broadest sense, including the legal requirements expressed in statutes, executive orders, and judicial decisions as well as the directives of regulatory agencies such as the Environmental Protection Agency and the Securities and Exchange Commission. The shelves are sagging under the growing weight of such authoritative commands—just skim through the Federal Register some time if you are willing to risk becoming deeply depressed, or, for an even more profoundly depressing experience, attempt to read any recent regulatory statute, say, a consumer protection law. Tax laws also are de facto regulatory statutes, and perhaps the most incomprehensible of all legislation.

In regulation we come face to face with the visible hand of government at work imposing largely hidden costs. The costs are hidden in part because they are borne by private parties in the process of compliance—meeting prescribed standards, avoiding prohibited actions, and so forth—and spread across the consuming public in the form of higher prices for goods and services.

But that cost, great as it is, is not the whole. In large part the costs of an economy extensively controlled by government requirements and prohibitions take a form no one can compute: these costs arise when governments distort the price structure so that mutually beneficial exchanges are never made, so that new products never reach the market, so that new competitors never gain entrance into an industry, so that innovations of countless different sorts are never made, there being no prospect of profit to stimulate their development in the first place.

Americans, despite much habitual grumbling, are proud of their economy, which continues to rank among the world’s most productive. What we have done, we can see and appreciate. But what we might have done, the miracles we might have wrought had we been free to do so, we shall never know. In this sense, the costs of an ever more regulated economy are truly incalculable.

But what of the recent deregulation we have heard so much about? Yes, something did happen along those lines. In energy, communications, transportation, and certain financial services, the heavy hand of government lightened somewhat in the late 1970s and early 1980s. By the mid-1980s, however, the steam had gone out of the deregulation movement, and little significant progress has occurred during the past five years.

Meanwhile, offsetting increases of regulation were taking place in other areas, including international trade and finance, environment, safety, agriculture, aerospace, insurance, and health services. The multi-billion-dollar bailout of the farm credit system might have deserved prominent mention had it not been completely overshadowed by the gargantuan bailout of bankrupt savings and loan institutions. In view of the mind-boggling magnitude, it is remarkable how little political debate surrounded this transfer of—who knows? $300 billion? $400 billion?—from the taxpayers to a select group of hard-lobbying beneficiaries.

Standing back and surveying all the regulatory changes, plus as well as minus, during the past decade, what can we conclude? I defer to the judgments of William Niskanen and Murray Weidenbaum, two of the best informed students of the subject. Niskanen concludes that “the net amount of regulations and trade restraints has increased” since 1980, and Weidenbaum says that “the federal government, objectively measured, is a bigger presence in the American economy today than when Jimmy Carter left office.” So much for deregulation.

The Reasons for the Growth

So, government is still big, and government is still growing in the United States. Why? To answer this question, we need to understand some history. To start with, we need to find out whether American government was ever really small and, if it was, what made it get bigger.

You may recall from your school history text that the United States government in the mid- and late-19th century adhered to the doctrine of laissez faire—the doctrine of hands off. Well, that lesson conceals more than it reveals. In fact, in important respects the label of laissez faire shouldn’t be applied at all. At no time did the United States fully achieve the condition denoted by the term laissez faire.

From about the 1840s to the 1890s, however, the United States approximated perhaps as closely as any large society ever did a condition we might call the minimal state. Certainly, governments didn’t spend or tax on anything like the modern scale—5 percent of GNP would probably overstate the ratio. (We must rely on imprecise estimates, so we can’t say for sure.) Not many people worked either directly or indirectly for governments, certainly no more than a few percent of the labor force even at the end of the 19th century. By these familiar indexes of the size of government, the 19th-century government appears to qualify as a minimal state.

And yet, to say that government was much smaller in these dimensions is not to say that the governments of the 19th century were unimportant or that Americans were reluctant to politicize essential economic questions.

Most important, governments established the legal framework of property rights within which the price system could operate to allocate resources. As economic conditions changed, governments molded the law to new conditions and allowed economic growth to continue relatively unimpeded by obsolete legal restraints. Innovation of the doctrine of eminent domain, for example, played a crucial role in permitting construction of the canals and railroads that did so much to facilitate economic development.

In addition, the central government episodically committed the nation to war—at times to wars of conquest such as the Mexican War that added vast territories to the country’s expanse. Governments disposed of the public domain, transferring the bulk of it, by sale and giveaway, into private ownership. Governments collected various taxes, including the tariff that was employed to carry out what nowadays would be called an “industrial policy.” Governments invested in and regulated banks and transport systems; they supplied education; and at the local level they conducted the countless interventions subsumed under the heading of “the police powers,” many of which would be found intolerable in our time. Before the Civil War, governments sustained the institution of slavery, a matter of momentous economic consequence as well as an arrangement so violently incompatible with higher American ideals that today no one would defend it. In sum, governments in the 19th century, though in most respects far more limited than governments today, were hardly insignificant.

In certain respects, 19th-century conditions made it possible for governments to be much smaller than they are today and still wield great power over the economy. Nineteenth-century governments didn’t spend a lot of money to enrich politically influential parties. But they enforced slavery, a momentous matter all by itself; they disposed of the public domain (Federal railroad subsidies alone transferred an area roughly twice the size of Colorado); and they managed property rights in fundamentally important ways. One reason modern governments do so much more by means of taxes, subsidies, and pecuniary transfers is that they lack some of the powerful means available to their predecessors—millions of workers to hold in thrall, a continent to give away.

Having acknowledged that laissez faire never was the case, and that even at its smallest, American government engaged in extremely important activities, we must recognize that governments still might have been much bigger than they were during the 19th century. One way to confirm this potential is to notice that, occasionally, government did get much bigger. During the Civil War the U.S. government not only increased its taxing and spending hugely; it also printed and spent fiat paper money, overrode a variety of civil rights, including the writ of habeas corpus, and conscripted men to serve in the army. After the war, however, the government shrank—not quite to its prewar dimensions, but back to a more traditional scope nonetheless.

At the end of the 19th century, James Bryce, a perspicacious British commentator, noted that America’s poor, long invested with political rights, might easily have turned on the rich and “thrown the whole burden of taxation upon them, and disregarded in the supposed interest of the masses what are called the rights of property.” But, Bryce went on to say, “not only has this not been attempted—it has been scarcely even suggested . . . and it excites no serious apprehension.” There was, he observed, “nothing in the machinery of government that could do more than delay it for a time, did the masses desire it.” What prevented such sweeping redistribution was, in Bryce’s judgment (and mine), the prevailing ideology. In Bryce’s words, “equality, open competition, a fair field to everybody, every stimulus to industry, and every security for its fruits, these [the Americans] hold to be the self-evident principles of national prosperity.”

A Revolution in Ideology

Obviously, somewhere along the line, the dominant ideology of the United States has undergone a complete revolution. I exaggerate only a little if I say that now most Americans believe that governments may legitimately give to people or take away from them virtually anything, any time, any place—checked only by the license conveyed by government officials’ having been elected in the Constitutionally sanctioned manner. Where once Americans viewed the powers of government as properly limited and the rights of individuals as primary and natural, Americans now view the powers of government as properly unlimited and the rights of individuals as subordinate to the pursuit of any declared “public policy.” How did so many activities once viewed as “not the proper business of government” come to be undertaken by governments and accepted as legitimate?

I have no short, definitive answer. The process by which the dominant ideology of the American people changed over the past century is surely complex, and no one understands it fully. It is possible, however, to identify certain critical aspects of the process.

Ideologies are, to a large degree, the product of people’s social experience. Although polemicists and propagandists are always at work, they don’t work in a vacuum. Ideas appeal to the public more or less, depending on how they seem to fit the broad contours of reality. When great events happen, ideologues always stand ready to interpret in a preconceived way what has happened, but again they are constrained by undeniable facts. It just wasn’t possible, for example, to interpret the Great Depression as a triumph of capitalism.

With the economic transformation of the United States in the late 19th century, a process that witnessed rapid urbanization and the growth of big business as well as many other striking developments, collectivist views began to gain adherents here, as they did throughout the Western world. The ideological shift became quite striking during the Progressive Era at the beginning of the 20th century. It was not unconnected with such consequential institutional changes as the Income Tax Amendment to the Constitution and the creation of the Federal Reserve System. So, clearly a tendency existed, rooted in the changing character of American society and economy and related developments abroad, moving American opinion leaders away from support for individualism and private property rights and toward support for collectivism and more active government involvement in economic affairs. By itself this tendency would have helped to promote the growth of government. But the secular tendency of ideology was hardly the only aspect of ideological change to affect our political economy in the 20th century.

National Crises Contribute to Shifting Views

Even more important, in my view, was the succession of national emergencies that struck the country between 1914 and 1945, and to a lesser degree during the postwar era as well. Clearly the world wars and the Great Depression had the greatest impact, although the period from the mid-1960s to the mid-1970s also witnessed many significant events. How did these crises contribute to shifting American views about the proper role of government in economic life?

In brief, the process worked as follows. First, each crisis gave rise to public clamor that the government “do something.” In the post-Progressive era, no government wished, nor could rulers expect to prosper politically if they chose, to keep their hands off the economy when a problem of overriding public concern had arisen. So, whether to prosecute a war or to alleviate a depression or to suppress a great labor upheaval, the government adopted interventionist policies to deal with the crisis.

Any government policy entails costs. The greater the costs, the less willing the public is to support the policy. Hence governments face a hard choice: on the one hand they cannot just stand by, because the public demands that they act; on the other hand, any policy they adopt is subject to the law of demand, which means that, in the extreme, the public will reject a government that imposes unbearably large sacrifices on the citizenry. How can the government get off the horns of this dilemma?

The short answer is: adopt policies that obscure the costs as much as possible. One way to do so is to avoid policies that entail pecuniary (and therefore easily counted, aggregated, and publicized) costs; instead, adopt command-and-control policies whose costs tend to be hidden or extremely difficult to compute and aggregate.

For example, if an outright gift of public funds is made to farmers, everyone can see how much the government is taking from taxpayers in order to give to farmers. But if the government adopts crop restriction programs, the costs are spread across all those, including foreigners, who purchase the farm products whose supplies have been restricted. Who can say how great the transfer is? Indeed, many people will never appreciate the redistributive aspect of the program, as they would if an explicit farmer-benefit tax had been enacted and added, say, as a separate item on the income tax return.

Other prominent examples include the conscription of men into the armed forces, the suppression or restriction of certain industries or products during wartime, the establishment of priorities for the supply of selected goods and services, the rationing of consumer goods, and a whole array of price, wage, and rent controls that distort the structure of prices and alter the allocation of resources, benefiting some while placing burdens on others. The common aspect of all such policies is that their costs are more or less hidden, and hence the political reaction to them is muted.

When the government adopted cost-obscuring policies during the great national emergencies, officials also undertook massive propaganda efforts to justify their actions. No doubt many citizens believed what their leaders told them about the virtues of the policies. In addition, during each crisis the administrators of the controls finessed them to eliminate some of their most objectionable aspects, and people more readily swallowed the medicine when its bitterness was diminished. All the while, people tried to make the best of a bad situation, and many discovered that even a controlled economy offers certain avenues to personal success, either within the government itself or within the favored sectors of the remaining “private” economy. People adapt.

But here is the crux: they adapt not just their actions but eventually their thinking, too. In William Graham Sumner’s telling phrase, “it is not possible to experiment with a society and just drop the experiment whenever we choose. The experiment enters into the life of the society and never can be got out again.” People who had experienced the abruptly enlarged government programs of the national emergency periods came away from those experiences with an altered view of the benefits and costs, virtues and vices, of an expanded government presence in American economic life. Further, in each case, committed collectivists took advantage of the event to hammer home the point that what the government had done, apparently with success, during the crisis demonstrated the great potential for good that lay in expanded government action even during normal times. To many people, the argument seemed to make sense. After all, we had won the war, we had got out of the depression.

In retrospect and with careful study, one can see that people were committing the fallacy of post hoc, ergo propter hoc. In many cases, if not in all, the genuine benefits accruing to the nation as a whole—escape from the depression, defeat of the Nazis—came forth in spite of, not because of, the government’s imposition of sweeping controls. But, again, people in general didn’t reach this conclusion. Rather, they tended to accept the collectivist claims or, more cynically, to appreciate that even if the country at large might suffer, they themselves now had a grip on a personally rewarding piece of the statist program.

By the end of World War II the United States had altered its effective Constitution radically from the regime in place at the beginning of the 20th century. Now virtually any government action whatever in relation to the economy could be taken, so long as an electoral majority could be obtained by its initiators. In a political economy so corrupted by interest-group politics, an electoral majority was itself something that could be bought, indirectly if not directly. There remained no fundamental check on the growth of government—nothing to perform the restraining functions that the old Constitution and the dominant, limited-government ideology had performed in the 19th century. Politicians now could offer to sell virtually any economic policy whatever, no matter how few the gainers and how many the losers. Of course, such conditions were tailor-made to bring forth special interests prepared to buy the policies from the politician-suppliers.

In the welter of largely contradictory policies, deadweight costs mounted enormously. More and more resources were devoted simply to working for or against economic policies and to circumventing or adapting to the proliferating requirements imposed by government. Not surprisingly, more and more latent interest groups saw the need to organize for political action. By the 1970s the entire economy had been thoroughly politicized, so that even the most intimate matters, such as babysitting services or nursing homes or the religious calendars of employees, had become subject to government intervention. A few years ago, Grace Commission investigators discovered that the federal government alone was conducting 963 separate social programs, many of them designated “entitlements.” America’s political process has become the locus of organized predation on a massive scale.

The growth of government cannot continue forever. An economy totally dominated by government isn’t viable—even the Communists now recognize this. Eventually the government will eat up so much of the private sector that it will destroy the means of its own sustenance. At some point the balance of political power will swing away from support for bigger government in an effort to revive the dying goose that lays the golden eggs. If such reaction can occur in the Soviet Union and Eastern Europe, it certainly can occur here.

But that glorious day, in my judgment, is not yet in sight. Despite the plethora of burdens laid on the American people, the private economy retains sufficient vitality to limp along at a modest pace, albeit far slower than a truly free economy could progress. And the American people continue to demand, or at least to tolerate, a multitude of government programs promising solutions to almost every conceivable problem. So long as the dominant ideology lends support to collectivist measures and acquiesces in a political system dominated by special-interest deals, no far-reaching reform of our political economy is possible. So, as we look into the future of the United States in 1990, as far as the eye can see, we behold only big government and more big government.


August 1990



Robert Higgs is a Senior Fellow in Political Economy at The Independent Institute. He is also the Editor at Large for The Independent Review, the Institute's quarterly journal.
