The Savings Crisis

Taxation and Welfare Have Eroded Thrift and Self-Reliance

MARCH 01, 1999 by JOHN HOOD

John Hood is president of the John Locke Foundation, a non-profit think tank based in Raleigh, North Carolina, and the author of The Heroic Enterprise: Business and the Common Good (The Free Press).

It’s a constant refrain among politicians and the news media: America has a low savings rate. This, it is said, has dire consequences for the long-term health of the economy. Some analysts put the personal savings rate—the percentage of disposable income that isn’t consumed in a given year—as low as 2.1 percent in 1997, a 63-year low. Although this undercounts the real amount of American savings and investment, there is no doubt that the private savings rate is near an all-time low.

Why is this? Are Americans simply irresponsible? Should government force households to save more money for a rainy day, as some doomsayers have suggested?

On the contrary. America’s savings crisis is almost totally manufactured by government itself. Through punitive taxation and the growth of the welfare state, big government has legislated against the habits of thrift and delayed gratification on which human progress is based. The decline in the savings rate over the course of the twentieth century isn’t just a minor economic trend. It represents the threat that government expansion poses to the principles on which successful human civilization rests.

What Is Savings?

It’s worth pausing to consider the definition and origin of savings. After all, economists aren’t talking simply about a bank account, or even an investment in stocks or bonds, when they address the issue of savings. For example, one’s home is a form of savings, in that the value of the home usually appreciates even as one uses the home for current consumption (shelter).

Indeed, the official savings rates don’t factor in the appreciating value of assets; they simply reflect the difference between reported income and reported expenditures. The Commerce Department, which generated the 2.1 percent rate cited above, doesn’t actually track inflows and outflows of wealth. Its measure excludes gains in the value of individual retirement accounts, 401(k) plans, mutual funds, and corporate stock, as well as real estate appreciation, and it doesn’t count the implicit wealth people accumulate in pension plans. The Federal Reserve computes an alternative measure incorporating some of these factors; it estimated that the 1997 savings rate was actually 7.1 percent of disposable income. Still, that’s small compared with the savings rates of other countries. In Asia, rates reach as high as 20 percent.
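To make the two measures concrete, here is a minimal sketch in Python. The household figures are hypothetical, chosen only so that the arithmetic reproduces the 2.1 and 7.1 percent rates cited above; they are not the actual 1997 national accounts data.

```python
def narrow_savings_rate(disposable_income, expenditures):
    """Commerce-style rate: income not consumed, ignoring asset gains."""
    return (disposable_income - expenditures) / disposable_income

def broad_savings_rate(disposable_income, expenditures, asset_gains):
    """Fed-style alternative: also counts gains on stocks, homes, pensions."""
    return (disposable_income - expenditures + asset_gains) / disposable_income

# Hypothetical household figures, scaled so the two definitions land on
# the article's 2.1 and 7.1 percent rates.
income = 50_000      # disposable income
spending = 48_950    # consumption
gains = 2_500        # appreciation of home, 401(k), stocks (unrealized)

print(f"Narrow rate: {narrow_savings_rate(income, spending):.1%}")        # 2.1%
print(f"Broad rate:  {broad_savings_rate(income, spending, gains):.1%}")  # 7.1%
```

The design point is simple: the two rates differ only in whether unrealized asset gains enter the numerator.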

The reason the issue of savings is so important is that deferring consumption—which is probably the best definition of savings—is essential to human happiness. That statement may seem paradoxical. How can forgoing current pleasure or satisfaction by choosing not to buy a good or service today increase one’s happiness? To answer that question, it’s important to look at how four closely related economic behaviors—savings, insurance, investment, and specialization—arose among prehistoric human cultures.

A Short Economic Prehistory

The earliest societies historians can identify are the nomadic hunter-gatherers who inhabited various regions of Africa, Asia, Europe, and the Americas. Throughout most of their history, these hunter-gatherers traveled in small groups of 20 to 30 related individuals—a patriarch and matriarch, their children and grandchildren, mates from neighboring groups, and other assorted relatives. They worked together both to hunt animals for meat and to gather fruits, vegetables, nuts, simple tools, and other useful objects. Because these groups were constantly on the move, their housing consisted of either portable shelters (tents or yurts) or structures they found or could construct quickly (caves, lean-tos).

In this earliest of human societies, the line separating political, economic, and spiritual institutions was blurred or nonexistent. Few of the tools future civilizations would use to draw such lines were available to the hunter-gatherers. Their nomadic lifestyle, for example, provided little opportunity to develop or enforce private property rights, not only over land but over the products that land might produce. Private property rights are indispensable if societies are to separate the public from the private, the sphere of human activity over which political power is freely wielded and the sphere over which it is limited or excluded.

Furthermore, the small size of these societies necessarily reduced the extent to which differences in knowledge or talent resulted in the development of markets or a division of labor beyond the most rudimentary one. Finally, because they lacked any practical means of saving and transporting food of more than limited quantities, these earliest human societies provided little reward or incentive for hard work. All that was hunted or gathered was consumed relatively quickly. An individual who was exceptionally talented as a hunter or gatherer had no reason to invest much of his time performing those tasks.

Stages of Economic Discovery

Even under these conditions, however, developments occurred in economic exchange and institutions. The earliest hunting and gathering humans cooperated with each other to derive sustenance from a sometimes-forbidding environment. They had to develop terms of trade and rules of work and reward in order to put members of the clan or band to productive use. A simple division of labor did evolve, based on age, sex, and physical prowess (or infirmity). We know that these small societies, while primitive and sometimes brutal, also afforded opportunities to those with different natural talents and skills. And while the invention of money was far in the future, these early humans engaged in mutually beneficial trade in valuable goods that sometimes spanned surprising distances.

Archaeologists and anthropologists divide human prehistory before metallurgy into three periods: Paleolithic (Old Stone Age), Mesolithic (Middle Stone Age), and Neolithic (New Stone Age). The names suggest a categorization based on stone technology, but the real dividing lines among these periods involve changes in geography and corresponding changes in economic technology and organization.

For the purposes of understanding economic history, the Paleolithic Period is critical. It lasted from 2.5 million years ago until about 10,000 B.C. Humans during this period included Homo erectus and archaic Homo sapiens. Early in the period, humans used simple tools made either from the core of a stone after pieces had been flaked off or from the flakes themselves. They used fire (discovered sometime between 1 million and 500,000 years ago) to drive or prod animals, to prepare food, and of course to keep warm.

Later, in the so-called Middle Paleolithic Period, flake tools were developed into a wide array of scrapers, borers, and points used to manipulate animal kills and other materials. This was the period of Neanderthal Homo sapiens in Europe and some humans even closer in appearance to modern humans. Their tools may have improved their hunting but, more important, there is evidence of extensive social cooperation not only in hunting but in storing and exchanging food. As one writer has noted, hunting is not just about killing animals. Killing animals is the easy part, whether it is by using weapons, by trapping, or by driving prey over cliffs. The difficult part lies in the organization of personnel so that they are in the right place at the right time and with the right gear to ensure a better-than-average chance of success.

Even more difficult is the task of coping with failure, weather, and the decline of animal stocks. Remember that these societies lacked effective technologies for storing food, and a diet of nuts, berries, and other gathered foods was of insufficient quantity or nutritional value to tide families and clans over in the event of a seasonal decline of game.

Human Beings Invent Savings

It is here, in the Middle Paleolithic populated by humans whose language skills were limited or nonexistent, that economic institutions extending beyond kinship probably developed. More specifically, these Middle Paleolithic people pioneered the crucial economic behavior of managing risk—in this case, the risk of starvation. They did it with two innovations: savings and group insurance. In the former case, they learned to put stores of food in secure places in their hunting and gathering territory, and then budget them to cover subsistence needs over the lean times. In the latter, nearby small kinship groups established social networks through alliance, intermarriage, visiting, and feasting that served as regional insurance policies. A group facing a decline in game (or lack of success in hunting) could make claims on the food stores of neighbors with whom they had previously invested time, attention, or their own food. This served to spread the risk of starvation over a broader number of hunters and gatherers, thus improving the chances of survival for all.

The concepts of savings and insurance are closely related; indeed, one might think of insurance as a way of realizing the benefits of savings without 1) having to develop technologies for saving food over a long period and 2) having to wait until a stock of savings is sufficiently large to tide a family or group over during hard times. Insurance-type arrangements, if they are of large enough breadth for their members to escape simultaneous misfortunes, represent a reasonable technique for coping with risk under difficult technological conditions.
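The arithmetic behind such risk pooling is worth a moment. Here is a toy sketch, with both numbers invented for illustration: assume each band independently faces a 20 percent chance of a failed season, and that allied bands share food whenever at least one of them succeeds.

```python
# Toy model of Paleolithic risk pooling. The 20 percent failure chance
# and the independence of each band's fortunes are assumptions made
# purely for illustration.
failure_prob = 0.20

for allies in (1, 2, 3, 5):
    # The whole alliance starves only if every band fails at once.
    pooled_risk = failure_prob ** allies
    print(f"{allies} allied band(s): chance of starvation = {pooled_risk:.4f}")

# 1 band:  0.2000
# 2 bands: 0.0400
# 3 bands: 0.0080
# 5 bands: 0.0003
```

The caveat in the text about sufficient “breadth” is precisely the independence assumption: if neighboring bands tend to fail together—in a region-wide drought, say—the pooled risk shrinks far more slowly than this.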

The Upper Paleolithic, beginning about 50,000 years ago in various regions, marked the appearance of humans roughly identical to us today. It also brought two additional economic innovations of critical importance: specialization and investment. The Cro-Magnon and other advanced Homo sapiens of the Upper Paleolithic, unlike their predecessors, seem to have specialized in the hunting or gathering of specific species, rather than being purely opportunistic and pursuing whatever was visible in their range areas. Like savings and insurance, specialization was a form of risk management in the sense that it usually involved careful study of particular animals with the goal of predicting their behavior—and thus increasing the likelihood of catching them. Specialization took some of the risk out of hunting and gathering by allowing for the accumulation of knowledge with which small human societies could ensure a more stable and predictable food supply. It required not only study of animal habits and movements, but also a more detailed understanding of the climate and topography of their range areas.

Accumulating such knowledge had a cost. It took a great deal of time and effort, which earlier human societies had probably devoted to less taxing and more recreational activities. Indeed, many people assume that economic progress throughout history has meant a progressive increase in leisure time, but until recently the opposite has been true. In earlier Paleolithic groups, there was little to be gained by spending more time on hunting and gathering activities rather than on leisure activities. Hunting and gathering was such a hit-or-miss process that increased time and effort invested in it wouldn’t have yielded much more in the way of results.

But with the introduction of specialization, increased time and effort devoted to the study of quarry and environment could pay off in higher returns. So the more sophisticated hunters and gatherers of the Upper Paleolithic discovered the value of investment—forgoing consumption or leisure in the short run for greater material benefit in the long run. Until it became practical to store information (through better language skills and specialization in a relatively narrow field of inquiry), investment would have had paltry returns.

This revolution in hunting and gathering had repercussions beyond improvements in the food supply. It was also associated with the development of larger, multi-family groups. For one thing, specialization and investment made population growth much more desirable. Additional people meant additional opportunities to gather useful information and carry out the more complex strategies required to catch more game. At the same time, the improved food supply made it possible to support larger populations. Specialization and the increased scale of human populations led to larger and more permanent settlements, and the beginnings of a territoriality that would eventually lead to concepts of private property. And both the larger populations and the increased productivity of hunting and gathering made possible a greater division of labor within the groups, allowing some individuals the option of specializing in tasks other than hunting and gathering.

Savings and Investment Create Trade

Finally, the development of larger groups with a higher level of specialization created the conditions required to inaugurate trade among unrelated groups. After all, if one group chooses to specialize in a particular form of hunting and gathering, then the occasional acquisition of game or artifacts outside the specialty becomes more attractive. It represents variety and diversion. A coastal group specializing in catching fish and gathering shellfish would have something of value to offer an inland group specializing in hunting deer, horses, or other big game. This increased value of trade would apply not only to foodstuffs but also to other goods. In both western and central Europe of the Upper Paleolithic, for example, there is evidence that several species of seashells were traded or exchanged over hundreds of miles. Similarly, there is evidence of relatively long-distance trade in high-quality flints and other raw materials.

The Upper Paleolithic Period ended with the end of the last Ice Age and the impact of global warming on the environment. The initial stages of this warming can be traced back to about 11,000 B.C., and affected different parts of the Old and New Worlds at different times. The oceans rose as ice melted, forests expanded over the expanses left behind by the retreating glaciers, and portions of the globe saw their climates shift from cold to temperate or from temperate to hot. These changes led either to human migration or to significant changes in the way humans lived in their home areas. One further note about the end of the Upper Paleolithic is that it coincides with the final settlement of every habitable continent.

The key point for our purposes is that the development of multi-family societies, and later towns and cities, was based on the invention of economic practices involving delayed gratification, or savings. The habit of savings and investment paid its greatest returns once human settlements developed around new notions of private property. Once families could work their own land or tend their own herds, the economic value of hard work soared. After all, at a very basic level, agriculture and animal husbandry were the epitome of savings and investment. Rather than literally “eating the seed corn,” farmers planted it in the ground, tended it for months (perhaps nearly starving the whole time), and then reaped a far greater bounty at harvest time. With herders, delaying gratification meant feeding scarce grains and water to stock animals rather than to their own families. It meant tending, not eating, the stock. Those who could learn to do this would, in the long run, be far better off than those hunter-gatherers who worried only about today’s meal.

If one considers the broad sweep of human history, this radical revolution in how human beings live has occurred only recently. The passage of time from the last Ice Age and early human settlements to the modern day is, geologically speaking, a blink of the eye. Yet consider how radically different our lives are compared with our hunter-gatherer ancestors.

The Savings-Rate Decline

Although no data exist on the personal savings rate per se throughout recorded history, one can safely say that it was extremely high. Even as recently as the nineteenth century, most Americans still lived and worked on farms. They spent most of their waking hours saving and investing—working the land for the promise of future return. Few could afford luxury items, vacations, or other forms of current consumption.

The Industrial Revolution changed all that. Mass production, first of clothes and later of other goods, significantly reduced their cost and thus reduced the amount of time and effort consumers had to save and invest in order to afford them. Advances in transportation technology came next, facilitating trade and thus allowing consumers to capture the advantages in price and quality of buying goods from others rather than making them themselves. Finally, dramatic advances in agricultural technology and practices so increased the productivity of farming that prices fell, far more food and fiber could be produced per person, and farmers gradually found their time and effort more profitably invested elsewhere.

The purpose of savings and investment changed from acquiring basic necessities to addressing “quality of life” goals. As workers moved to cities, then to suburbs, in the new industrial economy, savings and investment became a means of purchasing a new automobile, buying a nice home, sending children to college, affording higher levels of health care, dealing with sudden disability or unemployment, and providing for a comfortable retirement.

Unfortunately, the advent of the New Deal in the 1930s, followed by the Great Society programs of the 1960s, began to suggest to Americans that these big-ticket items in their lives might be paid for by someone else. Roosevelt’s Social Security and unemployment insurance promised income in case of sudden loss of employment, disability, or old age. Johnson’s Medicare and Medicaid promised families that they wouldn’t have to save and invest to take care of elderly relatives who needed lots of medical attention or institutional care.

At the same time, these expansions of government spending required increasing amounts of revenue. The government got this revenue in large part by taxing private savings and investment. The first permanent income taxes arose in the early 1900s. Marginal rates soared during World War I and then rose again during the New Deal and World War II. Payroll taxes were added to fund entitlement programs, taking money that families would otherwise have saved for their own needs. Today, the income tax code is strongly biased in favor of current consumption and against long-term savings. For example, a worker who earns $1,000 will pay income and payroll taxes only once on that money if he chooses to spend it immediately. If he invests it in corporate stock, however, he will likely pay income tax on the money three or more additional times, as the corporation pays taxes on earnings, the worker pays taxes on his dividends, and then pays again if he leaves the investment to his heirs.
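A back-of-the-envelope sketch shows how the layers compound. The rates below are hypothetical round numbers, not the actual 1999 tax schedule; the point is simply to count how many times the same $1,000 of earnings gets taxed along each path.

```python
# Hypothetical round-number tax rates, chosen for illustration only.
earnings = 1_000.00
income_payroll_rate = 0.30   # assumed combined income + payroll tax
corporate_rate = 0.35        # assumed tax on corporate earnings
dividend_rate = 0.30         # dividends taxed again as personal income
estate_rate = 0.40           # assumed estate tax if left to heirs

# Path 1: spend it now -- the money is taxed once.
spend_now = earnings * (1 - income_payroll_rate)
print(f"Spend now, keep:  ${spend_now:,.2f} (taxed once)")

# Path 2: invest the after-tax dollars in corporate stock.
invested = earnings * (1 - income_payroll_rate)            # tax layer 1
pretax_profit = invested * 0.10                            # assume a 10% return
after_corporate = pretax_profit * (1 - corporate_rate)     # tax layer 2
after_dividend = after_corporate * (1 - dividend_rate)     # tax layer 3
bequest = (invested + after_dividend) * (1 - estate_rate)  # tax layer 4
print(f"Invest, bequeath: ${bequest:,.2f} (taxed four times)")
```

Under these assumed rates, the worker who spends keeps 70 cents of each dollar earned, while the worker who saves and bequeaths passes on well under half—the bias against deferred consumption the paragraph describes.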

Is it any wonder that the rise of the welfare state has coincided with what many observers, of varying political persuasions, call a new ethic of instant gratification and self-indulgence? Public opinion polls show that Americans have become convinced that government will help them through the big expenses they will face in their lives, be it unemployment, education, health care, home-buying, disability, or retirement.

This is a mirage, of course. If one thinks of the money taxed out of families and “invested” by government for these future needs as the real rate of personal savings, then it’s not being invested very well. Everyone knows the crisis that faces Social Security in the long run, as Baby Boomers retire and base their expectations of a comfortable old age on the prospect of government transfer payments from a smaller cohort of working Americans. But the problem isn’t just limited to retirement. Medicare will go broke far earlier, perhaps in the next decade. Medicaid—increasingly a program of nursing-home payments for the middle-class elderly instead of an anti-poverty program—will collapse next. Unemployment insurance is already an awful deal for most workers, who are eligible to draw money only if they are terminated without cause, and even then they get paltry returns. The money Americans are forced to save and invest in education also gets poor returns, as the mediocre test scores of American schoolchildren and the declining academic standards of American universities illustrate.

So even if one defines the savings rate in the most expansive manner—as not only private savings but also the amount of money government “saves” on your behalf—there remains a savings crisis. Too little of our seed corn is being invested in long-term productive enterprise, with a promise of future harvests. Instead, government is feeding this seed corn to current beneficiaries of entitlement programs. Americans have lost the impulse that industrious human beings throughout history have maintained to work and save for the future.

This is the real savings crisis. It isn’t just one of dollars and cents, but of thrift and self-reliance. The solutions are clear: end the tax code’s bias against private savings and investment, and end government’s false promise of saving and investing on our behalf. There is no more serious challenge facing American society and American freedom.

For further background, see Barry Cunliffe, The Oxford Illustrated Prehistory of Europe (Oxford: Oxford University Press, 1994), and Mark Kishlansky, Civilization in the West, Volume 1 (New York: HarperCollins Publishers, 1991).
