Being out of work in the old days usually brought shame and humiliation. How—and why—have we changed our feelings about unemployment?
December 1978
Volume 30, Issue 1
During the early 1960’s, Professor Paul Jacobs of the University of California, intent upon discovering “what life as an unemployed worker means,” passed himself off as a job seeker among unemployed miners in West Virginia, factory hands in an upstate New York town, and migrant farm laborers in northern California. He found that while being without work was a sobering experience (tending, for example, to make people withdrawn and uncommunicative), it was difficult to discover many substantial changes in the standard of living or personal attitudes of the unemployed so long as they were receiving insurance benefits. Jacobs saw no sign that the jobless preferred unemployment to working, but neither did he conclude that the condition was a traumatic experience for those who had to endure it. If the behavior of the workers Jacobs observed was typical, and there is every reason to believe that it was (recently an observer of the unemployed in Germany described their mood as “carefree,” and a New York Times headline read: MANY JOBLESS IN FRENCH CITY BUT FEW WORRY), perhaps the current concern about high unemployment is somewhat exaggerated.
The concern, however, is serious and rooted in history. For the postwar era has seen a dramatic, indeed revolutionary, change both in the way the United States deals with the unemployed and in the way the unemployed regard themselves.
During the Great Depression of the 1930’s America suffered from unemployment on an unprecedented scale. At its peak in early 1933 somewhere between 13,000,000 and 16,000,000 people (about one-fourth of the work force) were idle, and throughout the decade, despite New Deal relief and recovery measures, the unemployment rate never fell below 10 per cent. But in this respect the Depression was merely the most profound and long-lasting of many such catastrophes. Although far back into the nineteenth century the United States had been rightly regarded as a prosperous country and the land of opportunity par excellence, periodic “panics” had caused much unemployment. The record reveals, for example, that as early as 1737 many “honest and industrious tradesmen” in New York City were “reduced to poverty for want of employ,” and that during the unsettled period at the beginning of the American Revolution, joblessness was so common that a group of citizens in that city founded the “New York Society for Employing the Industrious Poor.”
For many reasons, however, the federal government provided no assistance for the unemployed until the New Deal period, and even local public assistance was very skimpy. In general, when workers lost their jobs they had to depend upon their savings and the assistance of relatives until they found new ones; these resources exhausted, they could turn only to charity and the municipal poor-relief agencies. Whatever assistance was doled out by either private or public bodies was always hedged about with demeaning restrictions, designed to separate the “deserving” from what were variously called “incorrigible idlers,” or “vagrants,” or “bummers.”
The chief concern was, as one charitable organization put it during the depression of 1857, to avoid “the injurious effects of indiscriminate aid.” During the depression year of 1894, for example, Denver was flooded with unemployed miners. A local clergyman established a relief program, offering them three square meals and a bed for a quarter, with the proviso that those without the quarter could work off their “obligation” by putting in three hours of labor in the clergyman’s woodyard. The work was essentially useless—with a hundred men a day chopping, the wood piled up faster than it could be disposed of. But it was assumed that only by exacting this labor could “idlers” be deterred from taking advantage of the clergyman’s largess. The social worker Philip W. Ayers summed up the view of most nineteenth-century relief organizations. Work relief, he said, “must be… so unattractive as to guarantee that, when other work can be had, the laborer will seek it.”
Such attitudes were common to all the industrial nations in the nineteenth century. They were reinforced by the prevailing dogmas of classical economics: that governments should not meddle any more than was absolutely necessary in economic affairs; that work was available for all who were willing to labor; that public assistance to the destitute only encouraged heedless reproduction. It was “very rare,” said the respected American economist Francis Wayland in 1877, that anyone who was “really anxious” to work could not find a job.
All these beliefs were so pervasive that they affected how the unemployed viewed themselves. At one level the loss of one’s job was frightening—it threatened not merely economic deprivation but also the possibility that the suspicious scrutinizers who alone could relieve the deprivation might refuse to do so. At another level it was psychologically depressing; to admit that one could not care for oneself and one’s family aroused feelings of profound inadequacy. And at still a deeper level, unemployment stirred guilt feelings. Everyone has somewhat ambivalent feelings about working, an urge to get something for nothing; so when the authorities assumed as a matter of course that applicants might not really need or deserve help, many of the jobless at least unconsciously felt that the suspicion was justified.
But in the early twentieth century a more sophisticated understanding of the causes and character of industrial depressions changed the way unemployment was dealt with. By the late 1920’s, Great Britain, Germany, and a number of other countries had unemployment insurance, and others operated government employment offices, so that some kind of national assistance for the destitute was available nearly everywhere. Only the United States still left such matters entirely to local public and private agencies. Therefore, when the depression of the 1930’s struck and American unemployment soared from under 5 per cent of the work force to more than 20 per cent in little more than two years, President Herbert Hoover was reacting in the traditional American manner when he announced that “the principles of individual and local responsibility” would be applied. “Each community and each State should assume its full responsibilities for organization of employment and relief of distress with that sturdiness and independence which built a great nation,” Hoover told Congress.
As everyone recalls, even the unprecedented severity of the unemployment of the Great Depression could not change Hoover’s mind. What is less well remembered is the extent to which the unemployed themselves remained prisoners of the old values and traditions. We know a great deal about how the unemployed felt and acted during the 1930’s because psychologists, sociologists, and public health officials, as well as journalists, political scientists, and novelists, studied their behavior closely. While individuals responded in widely differing ways, nearly every survey revealed essentially the same pattern. When workers lost their jobs, the first thing they did, perhaps after a brief period of waiting for something to turn up, was to search feverishly for new ones. Then they became discouraged, sometimes also emotionally disturbed. Finally, after some months of idleness, the majority either sank into apathy or adjusted to an extremely circumscribed existence. Only the strongest retained their energy and determination unimpaired.
Nearly every investigator agreed that apathy and shame rather than anger or aggression were the most common reactions to long-term unemployment. New York City social workers noted this tendency as early as 1931, and in 1940 a sociologist who had studied fifty-nine unemployed New York families was still calling attention to the “deep humiliation” that the condition had caused. The psychologist Abram Kardiner also described jobless people as being afflicted by anxiety and shame and then descending into apathy. The writer Sherwood Anderson, who toured the United States in the mid-thirties gathering material for his book Puzzled America, noted a “profound humbleness” among the citizenry. “People seem to blame themselves,” Anderson wrote. The historian Ray Billington, who ran a New Deal relief project during the Depression, still recalled in the 1960’s the “bleak, downcast eyes” and “broken spirit” of unemployed people seeking aid.
These examples, culled from contemporary sources, make clear that Franklin Roosevelt’s unemployment and relief policies, although the antithesis of Hoover’s, did not have much effect on how the unemployed felt about themselves and their condition. During the years of the New Deal, Congress appropriated billions for direct relief and for public employment projects. By 1943 (when wartime demand had restored full employment) the Works Progress Administration alone had spent $11 billion and found employment for 8,500,000 people. In 1935 Congress also passed the Social Security Act, which provided unemployment insurance as well as old-age pensions for a large majority of the work force. The idea that mammoth outside forces rather than any supposed personal inadequacy were responsible for most of the unemployment of the era won from Roosevelt and his administration both vocal endorsement and huge financial commitments. These attitudes and actions gave essential material help to millions and restored the faith of millions more in American democracy and the capitalist system, as the New Deal sweep of the 1936 elections demonstrated. They did not, however, do away with the proclivity of unemployed workers to draw back into apathy and self-recrimination.
This strange effect of idleness goes far toward explaining why so few of the unemployed were radicalized by their suffering. Logically, as radicals of the period were quick to point out, the jobless had every reason to blame “the system” rather than themselves; the more than 10,000,000 idle people who had previously labored steadily all their adult lives could not all have suddenly lost their skills or their will to work. But when radicals and reformers went among the unemployed, hoping to recruit them for the revolution or to organize them to demand change, they received a rude shock. The sociologist E. W. Bakke attended a May Day meeting on the New Haven town green in 1934. A Communist orator delivered a powerful speech, but when, at the end, he urged the crowd to march on city hall to protest, only a handful followed him. Puzzled by such behavior, Bakke later posed as a Communist in talks with unemployed men. When he did so, he was told that Communists were either crazy or traitors. “In the face of Communism,” he concluded, “the most insecure American worker becomes a hero by defending American conditions.”
Matthew Josephson, author of the Depression best seller The Robber Barons, spent considerable time talking with down-and-outers in municipal lodging houses. Once he treated a group of them to a meal. “Things just can’t go on like this,” he said to them. Why didn’t they take part in the protests that radicals were organizing? They told Josephson that they were “‘good Americans’ and didn’t go for that ‘Communist stuff.’” Another radical journalist, back from a tour of the country, wrote of the unemployed: “The more I saw of them … the more I was depressed and outraged both by their physical and spiritual wretchedness and by their passive acceptance of their condition.” The trouble with the unemployed, the Marxist labor organizer A. J. Muste complained, was that they were “devoted to mere self-preservation.”
There was, of course, considerable active protest by unemployed groups—mass meetings, hunger marches, rent strikes, along with a good deal of petty theft and some violence. But much of the violence of the times was related to strikes; that is, the participants were workers, not persons who had lost their jobs. The famous bonus army of World War I veterans who marched on Washington in 1932 to demand relief, only to be dispersed at President Hoover’s order by troops, was remarkably passive, an “army of bewilderment,” according to one newspaper correspondent, their behavior marked by “a curious melancholy.” When informed that the Senate had defeated the bonus bill, the marchers, gathered outside the Capitol, meekly accepted the suggestion of their leader that they sing “America,” and then returned to their ramshackle camp on Anacostia Flats. The violence associated with the bonus incident was supplied by General Douglas MacArthur’s soldiers, not by the poor veterans. Election statistics tell a similar story—in the bleak days of November, 1932, with Hoover discredited and Franklin Roosevelt still attacking him for reckless spending that had unbalanced the federal budget, and with over 12,000,000 Americans out of work, the Communist party received only 102,000 votes, and the Socialist party fewer than 900,000.
The unemployed did vote their own interests as they saw them; certainly they supported Roosevelt’s New Deal overwhelmingly in later elections. But they nowhere became an effective pressure group or an independent political force. Political activism was apparently incompatible with joblessness. Insecurity caused the unemployed to be fearful and dependent. Fear and dependence eroded their confidence and destroyed hope. Lack of confidence and hopelessness undermined their expectations. Typically, when workers first lost their jobs, they had not yet suffered enough to become rebels. By the time they had suffered, they had somehow lost the capacity for militant protest.
World War II finally brought an end to the Great Depression and to the high unemployment that had plagued the United States and the rest of the world for a decade. But what changed the way unemployment was seen and experienced by society, including the unemployed themselves, was not the war but the economic theory that wartime policies had proved viable—the so-called General Theory of the British economist John Maynard Keynes. According to Keynes, whose chief work, The General Theory of Employment, Interest and Money, was published in 1936, governments could stimulate economic growth in bad times and thus create more jobs by deliberately spending more money than they took in, and by increasing the amount of money in circulation. Conversely, in boom periods, when inflation became a problem, governments could check the expansion by restraining the growth of the money supply and by reducing expenditures or raising taxes in order to create a budget surplus. These latter actions might cause a rise in unemployment; indeed deliberately inducing unemployment was seen as a necessary device for “cooling off” an “overheated” economy. But the basic idea behind Keynesian economics was that by properly manipulating monetary and fiscal policies (by what economists in the 1960’s called “fine-tuning the economy”), something very close to full employment could be maintained continuously without triggering serious inflation.
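Stripped to its essentials, the countercyclical logic just described can be caricatured as a simple feedback rule. The sketch below is only an illustration of that logic, not anything drawn from Keynes himself; the thresholds and responses are invented for the example.

```python
# A toy caricature of postwar "fine-tuning": lean against slumps with
# deficits and easy money, against booms with surpluses and tight money.
# All thresholds and responses here are invented for illustration.

def fiscal_stance(unemployment_rate: float, inflation_rate: float) -> str:
    """Return a stylized Keynesian policy response to current conditions."""
    if unemployment_rate > 0.06:
        # Slack economy: spend more than is taken in and expand the
        # money supply to stimulate demand and create jobs.
        return "deficit spending, easier money"
    if inflation_rate > 0.05:
        # Overheated economy: run a budget surplus and restrain the money
        # supply, deliberately accepting some rise in unemployment.
        return "budget surplus, tighter money"
    # Near full employment with stable prices: hold steady.
    return "neutral stance"

print(fiscal_stance(0.09, 0.02))  # slump -> deficit spending, easier money
print(fiscal_stance(0.04, 0.08))  # boom  -> budget surplus, tighter money
```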
In 1946 Congress passed the Employment Act. Although hedged about with more than the usual share of political compromises and equivocations, this law, besides establishing a Council of Economic Advisers to provide expert information about economic trends and guidance for the President in applying the proper policies, made the government formally responsible for “creating and maintaining” full employment. With the New Deal Social Security system of old-age and unemployment insurance now fully in operation, this Employment Act appeared to herald the dawn of a new era. Henceforth American workers need not fear prolonged idleness. Should slack times cause unemployment, the government was obligated to swiftly check the trend by stimulating demand, and while the medicine was having its effect, insurance payments would protect the unemployed from serious deprivation.
Moreover, Keynesian economics and the political changes it brought about altered the way the unemployed reacted psychologically and emotionally. If the state claimed to have the power to create jobs by its manipulations, as well as the right to deliberately cause workers to lose jobs, then those who lost their jobs were unlikely to blame themselves for their idle state. The logic of Keynes’s argument permeated the consciousness of thousands who never had read a word he wrote. As a result, the tendency of the unemployed of the thirties to consider themselves somehow responsible for their unfortunate condition, the tendency that puzzled and exasperated social scientists like Bakke and radicals like Matthew Josephson, weakened greatly in the postwar years. Being unemployed certainly remained seriously disturbing; it was no joke. But the typical attitude of those without work was expressed by one unemployed man whose benefit payments had been withheld because of a disagreement about his eligibility. “I feel,” he told a reporter, “that these people are just holding me back from something I deserve.”
This attitude surely represented an improvement over that of the jobless of earlier times, but there was a certain irony in the situation. At a time when protection against unemployment was more and more considered to be a right, events were demonstrating that the government did not have the ability to keep everyone at work. For a decade or more after World War II the new economic mechanisms functioned smoothly. The proportion of idle Americans mostly hovered around four per cent of the work force, roughly what experts considered full employment, and it never remotely approached the rates of the 1930’s. Inflation troubled some economists (and more consumers), and unemployment among blacks, youths, and people without much education was ominously high; but these problems seemed manageable.
In the late 1960’s, however, the rate of inflation began to quicken, pushed up by world shortages of raw materials and the enormous budget deficits resulting from President Lyndon B. Johnson’s unwillingness to call attention to the huge costs of the war in Vietnam by seeking a tax increase. By the early 1970’s an alarming new term, “double-digit inflation,” was entering the lexicon of economics. Still worse, unemployment increased along with prices, in direct violation of the “laws” of Keynesian economics. Then the Arab oil embargo of 1973-74 and the subsequent quadrupling of petroleum prices further disrupted the economy. The United States experienced a bad recession, by far the worst economic downturn since the Great Depression.
Unemployment rose abruptly to about nine per cent; then, as the economy recovered, it fell back with agonizing slowness, remaining for month after month in the seven to eight per cent range. To deal with the large number of jobless people who needed larger dollar benefits to keep up with inflation and who were idled for longer periods of time, Congress extended benefit periods and increased the size of weekly grants. Soon there was little relationship between what the unemployed were taking out of the system and what they and their employers had paid in; in other words, the system ceased to be actuarially sound and therefore it no longer insured, in the sense of “made certain.” When, in May, 1977, President Carter proposed that general federal revenues be used to bolster the shrinking insurance fund, it was no longer possible to ignore what was happening. A second great irony was now apparent: while workers were finally coming to accept unemployment insurance payments without shame or self-doubt, the “insurance” was in fact becoming a euphemism for public charity or welfare.
Then there was the question of whether freeing the jobless from irrational guilt and self-doubt was having unintended side effects. Was the system actually increasing unemployment? Nineteenth-century theorists had been sure that insurance would do so, both by siphoning off resources from the “wages fund” that supposedly financed all payments for labor, and by satisfying the urgent needs that otherwise forced unaided unemployed people to search for work. The first of these notions was long ago exploded by economists, but as presently constituted, the insurance system does sometimes come close to encouraging idleness.
Conditions vary from state to state, but because benefit payments are never subject to income tax, the difference between the take-home pay of a worker and the spendable income of the same person when collecting unemployment insurance can be quite small. Peter Passell, the economist, has offered a striking example. Take a family of four in New York State, the husband and wife each holding down jobs paying $190 a week. Such a family’s after-tax income would be about $270 a week. Suppose the wife lost her job. She would draw $95 a week in unemployment benefit, but since the money would not be taxed, and because the family’s tax bracket would also be substantially lower, the actual income available to the family would be reduced by only $30 a week. “Thirty dollars a week isn’t much of an inducement to look for work,” Passell comments, adding that particularly for middle-income workers in cyclical industries where relatively brief layoffs are common, the tendency when discharged is to sit back and wait to be rehired rather than to search hard for a new post.
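Passell’s arithmetic is easy to verify. The sketch below is only a back-of-the-envelope check of his example; the figures come from the article itself, except the single-earner tax bill, which is an assumption inferred so that the totals balance.

```python
# Verifying Passell's example with the article's own figures. The $45
# weekly tax bill for the one-earner household is an assumption, inferred
# so that the totals match the numbers Passell reports.

dual_after_tax = 270          # family's stated after-tax income, both working
benefit = 95                  # wife's weekly unemployment benefit, untaxed
single_after_tax = 190 - 45   # husband's assumed take-home pay working alone

income_both_working = dual_after_tax                # $270 a week
income_after_layoff = single_after_tax + benefit    # $145 + $95 = $240 a week

print(income_both_working - income_after_layoff)    # -> 30 (dollars a week)
```

The point of the arithmetic is that the untaxed benefit, combined with the family’s drop into a lower tax bracket, replaces nearly all of the lost take-home pay.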
This same tendency seems to characterize the behavior of workers in resort areas and others with seasonal occupations, many of whom make no serious effort to find jobs when the season ends, patiently (some would say complacently) collecting unemployment insurance until the new season begins. The humane tendency of Congress to adjust insurance payments upward when the cost of living rises, to extend benefit periods in prolonged depressions, and to eliminate waiting periods in “emergencies” may also increase the amount of unemployment, because it blunts that goad to effort, the knowledge that assistance is limited and will end at a predetermined date.
Despite the ironies of the present situation, few if any critics have proposed the abolition of unemployment insurance as a way out of the dilemma. It is in the general interest as well as that of the unemployed that persons who lose their jobs be maintained decently while they look for new ones, and that the payments be made and received as honestly earned and thus unaccompanied by shame or stigma. But the system’s most cogent critics suggest that insurance ought to be insurance, not the dole it has become. If benefits were based on the contributions of workers and their employers and the sums were determined on actuarial principles, there probably would be fewer people collecting benefits (and for shorter periods), and more people working.