A Clean Break With The Past


“In terms of change in American attitudes and American values, these last five years have surely been the crucial ones in the quarter century since V-J Day. And these changes seem of such a magnitude that every American except the very young, the very empty, and the very enclosed must now, to some extent, feel himself a foreigner in his native land.”


Something very strange has happened in the United States very recently. Traditional attitudes and values that have prevailed and come down from generation to generation in all but unbroken succession since the founding of the republic have suddenly been overturned or are in the process of being overturned. Traditional American ways of looking at things—including the traditional way of looking at our own past—have suddenly been reversed. A startling discontinuity, as stark as a geologic fault, has occurred in our cultural history since 1964.

It is a temptation, and one constantly yielded to by social commentators, to look upon these things (like the geologic fault) as having simply happened—as having occurred without human volition or control. The environment has changed, it is said; no wonder people and their attitudes change. The process is made to appear as inexorable as changes in the phase of the moon.

What has “happened” in America has been largely the doing of the older half of our present population—those born before the Second World War. Through their ingenuity and enterprise and with the help of their equally ingenious and enterprising predecessors of the generations before, the members of the present older generation have changed the country so radically that the old conditions under which the old values obtained are simply not there any more. True enough, the change was brought about (in traditional American fashion) entirely without planning, and, indeed, its social effects have by and large not been felt by the generation responsible. Not really understanding what it has wrought (still quaintly anthropomorphizing computers, for example) and being beyond the age when long-held attitudes and values are easily surrendered, the older half of the country mostly clings to the old ideas. In the meantime the younger half, those born since the war, have grown up in a whole new world and, in a triumph of environment over heredity, have become a new breed. The fathers, clever enough to invent computers, jet planes, moon ships, antibiotics, and weapons of race suicide, are not wise enough to know their own sons, who are now shaping the values of America and will soon be America.

A quarter century ago next month, with V-J Day, the United States emerged from the war into modern times. In the subsequent twenty years, while the nation’s adults, the prewar generation, were unwittingly removing from their own and their children’s lives the physical underpinnings of the old national faiths and attitudes, they were also continuing—in fact, accelerating—the long process of social amelioration that had gone on, though not uninterruptedly, since not long after the Civil War. The first postwar quarter century was one of outstanding social as well as material progress.

About five years ago, I undertook a study of social change in the United States over the twenty-five years between the outbreak of the Second World War in 1939 and the mid-sixties. I found, among other things, that the great corporations, considered so gigantic and sinister in 1939, had become many times more gigantic and—in the pretty-well-substantiated public view—a good deal less sinister. (In the late thirties the Temporary National Economic Committee had reported with awe that General Motors, perhaps the archetypical American corporation, had assets of one billion dollars; less than two decades later General Motors would have annual profits after taxes of one billion dollars. It would meanwhile have abandoned its former rather surly attitude toward society and become a corporation as enlightened as most in its social attitudes.) The gross national product over the period had gone from 90 billion to 630 billion dollars a year; the federal budget had swollen from around nine billion to over one hundred billion; national income per capita had risen from nine hundred to well over three thousand dollars, while the national population (in sharp contradiction to the glum demographic predictions of 1939 that the nation faced a people shortage) had risen from 130,000,000 to over 190,000,000. Taxes and other forces had brought about a vast and generally beneficent redistribution of national wealth. Computer technology, in 1939 just a gleam in a few scientists’ eyes, was already on the way to bringing about a new era in science and technology and, more obviously at first, in business methods; and the initial fears that computers would throw millions of people out of their jobs were beginning to prove unfounded. Poverty had by no means been eliminated, but by almost any fair-minded standard it had been sharply reduced; indeed, my calculations showed that by the standards applied in 1964, Franklin D. Roosevelt’s one third of a nation ill fed, ill housed, and ill clothed in the thirties had been a gross understatement.

I found that over the period under study there had been a vast tidal migration from farms to cities. Thirty-one million Americans, or a quarter of the population, were farmers in 1939; only thirteen million, or less than 7 per cent, were still on the farms by 1964. The effect of this influx on the cities and on the new urbanites themselves, despite crime and overcrowding and suburbia and urban sprawl, had not proved to be all bad. I found a tremendous rise in formal education: the average American was an elementary-school graduate in 1939, a high-school graduate by 1964; 15 per cent of college-age Americans attended college in 1939, well over 40 per cent in 1964.

Further, I argued that anti-Semitism, a strong and ominous thread in our national warp in 1939, had ceased by 1964 to be an important factor—permanently, I patriotically supposed. On the question of Negro rights and privileges the evidence of progress, though more equivocal, was nevertheless present. In 1964, in the nation’s capital city, where in 1939 no black man had been suffered to eat in a public restaurant used by whites or to register in any hotel, Congress was passing a wide-ranging civil rights act, and the next year it would pass a far wider-ranging one. A long, painful campaign for civil rights in the South, beginning with the Supreme Court’s first desegregation decision in 1954, had caught the national imagination and that of our Texan President himself, and as a result of increased black-voter registration Negroes were being elected to office at many levels in most parts of the country. Economically, to be sure, the average black man was only slightly better off in relation to the average white man than he had been in 1939. Formerly his income had been somewhat less than half of the white man’s; now it was slightly more than half. But in 1964 the country seemed to have the will to tackle even this anomaly. One felt, buoyantly, that with the political liberation of the Negro virtually accomplished, his economic liberation was next on the agenda.

“… the national mood reversed itself as dramatically as a manic-depressive”

In its foreign affairs I said that the United States, which with the end of the war had assumed free-world leadership for the first time, had handled this generally unwanted responsibility fairly well in spite of some spectacular bungling. There were, despite moral arguments advanced in their behalf, such egregious disasters as the Bay of Pigs, the ill-starred U-2 reconnaissance flight, the unfortunate (but then not yet overwhelmingly tragic) miscalculation of our involvement in Vietnam. But there was also the nuclear test-ban treaty of 1963, the relief programs all over the world, and, of course, the Marshall Plan, which through its backers’ statesmanlike vision of where enlightened self-interest lay, had done so much to set flattened Europe back on its feet.

In sum, my research convinced me that “the quarter-century … had seen such rapid and far-reaching changes in many aspects of American life as are not only unprecedented in our own national experience, but may well be unprecedented in that of any nation other than those that have been suddenly transformed by … war or plague.” And I concluded that while the enormous material gains of the quarter century had unquestionably had their moral costs, the moral loss was far less clear than the material gain. America could not patly be said to have “sold its soul for mediocrity.”

So I wrote then. Between then and now, over the past five years, many but not all of the trends I noted have continued. Economic growth has gone on to the point where most economists believe that 1971 will be the year when our gross national product will pass the all but inscrutable figure of a trillion dollars a year. Our 1964 federal budget is now almost doubled. Poverty, more and more in the news, is nevertheless still decreasing in fact.

On the other hand the migration from farms to cities has slowed sharply. Anti-Semitism in a new form has made an ominous appearance among Negro militants. Racial integration of schools has failed tragically, as shown by the fact that at the beginning of 1970—sixteen years after the Supreme Court’s desegregation decision—less than one fifth of southern Negro pupils and hardly more than one fourth of northern and western Negro pupils were attending predominantly white schools. The stagnation of black economic status is shown by the persistence of two familiar statistics—black income still just over half that of whites, black unemployment still double that among whites.

But statistics are not all. There exist also national moods, and they rather than statistics reflect attitudes and values. There are fashions in statistics; appropriate ones can be found to fit any mood, to buttress any conventional wisdom, and it can be argued that the moods give birth to the figures rather than vice versa. At any rate, some time recently, probably in 1965, the national mood reversed itself as dramatically as a manic-depressive patient goes from the heights to the depths, and I see my study as having been completed in the last, climactic days of a period of national euphoria.

The trigger for the mood change is harder to identify. Many would equate it with the hateful trigger of Lee Harvey Oswald’s mail-order gun in Dallas in November, 1963, and contend that the accomplishments of 1964 and early 1965 were the result of accumulated momentum—that, indeed, the productive phase of the Kennedy administration was actually the year and a half after John Kennedy’s death. Others would choose the Watts riots of August, 1965, the first time the murderous and suicidal rage and despair of urban blacks outside the South was revealed; perhaps the largest number would choose the escalation of the Vietnam war, which began with the bombing of North Vietnam that February. At all events the change occurred, and the nation went into the valley of a shadow from which it has not yet emerged as this is written.

In terms of inner change, of change in American attitudes and American values, these last five years have surely been the crucial ones in the quarter century since V-J Day. And these changes, which I propose to examine here as cheerfully as possible, seem of such a magnitude that every American except the very young, the very empty, and the very enclosed must now, to some extent, feel himself a foreigner in his native land.

Better than statistics, as a starting point for a study of moods, are words. The 1947 Britannica Book of the Year gave a list of words that, according to its authorities, “became prominent or were seemingly used for the first time” in either 1945 or 1946. Predictably enough, some of the listed words were of only ephemeral usefulness and have vanished without leaving a trace; for example, athodyd (a ramjet engine), cuddle seat (a contrivance for carrying small children, both word and device introduced by Australian war brides), and Huff-Duff (a navigation aid for ships and planes that was quickly superseded). But a surprising number of the new coinages survive, and a listing of some of them gives a remarkable picture of the preoccupations of the time: atomic cloud, be-bop, buyers’ strike, existentialism, fact finder (as in a labor dispute), fissionable, gray market, iron curtain, operation (as in Operation This-or-That), push-button (as a metaphorical adjective), shock wave, sitter (for babysitter), truth serum, U.N., UNESCO.

Fact finder, fissionable, sitter: talismans of the time, casting strange shafts of light into the future. It was a time of getting settled. That, of course, meant more than veterans coming home; it also meant industrial workers demanding the raises that had been deferred by wartime controls, and therefore strikes. In November, 1945, there began a series of crippling strikes in key industries. Meanwhile, as the government vacillated on price controls, meat disappeared from grocery shelves for days at a time because of speculative withholding by suppliers. None of these inconveniences held back the business of nest building. The year 1946 stands out as the all-time record year for marriages in the nation’s history, not only relative to population but in absolute numbers—2,291,000 marriages all told, or almost 700,000 more than in 1945, and almost twice as many as there had been in the deep Depression years before the war. The first nest in 1946 was usually an apartment rather than a house; material shortages held up the beginning of the great postwar home-building boom, but even so, construction of one-family dwellings tripled between 1945 and 1946. And whatever their nature, the new nests were quickly fruitful. The national birth rate went up 20 per cent in 1946 over 1945 (that November, New York City actually ran out of birth certificates) and another 10 per cent in 1947 over 1946, as the celebrated postwar baby boom got under way.

So the ex-serviceman, in college on the GI Bill, with his pregnant wife struggling to make a palatable dinner on short meat rations in their barracks apartment, was earnestly trying to sop up the knowledge that would get him a civilian job, with no thought farther from his mind than questioning, much less protesting against, the social framework or the institution in which he worked. Nest-building time is not a time for rebellion. Also in 1946 the government was paring back its budget from 1945’s one hundred billion to sixty billion dollars, and the next year it would spend less than forty billion; the infant United Nations was trying out its unsteady legs at the optimistically named Lake Success on Long Island; there were four lynchings in the South; the Bikini bomb tests were appalling us, and the Cold War was taking shape; radio was still the great national diversion, with Jack Benny first in the Hooperatings, Fibber McGee and Molly second, and—incredible as it now seems—Amos ’n’ Andy seventh. And while all these quaint happenings were in process, the word existentialism was coming into the American language.

Of such was the nest-building mood, the nation’s first in the postwar period. There have been five more since then that I can distinguish: the Korean-war mood, the McCarthy mood, the Eisenhower-prosperity mood, the Kennedy go-go mood, and finally the present one of paralysis, gloom, and reappraisal.

Beginning with the North Korean invasion of South Korea on June 25, 1950, the Korean war was a time of nightmare. There was a kind of déjà vu about finding ourselves again embroiled in a war when we had just settled down to peace, and for thousands of veterans of the Second World War who had signed up for the reserves without thinking twice about it (I remember, for example, that when I was separated from the Army at Fort Dix, New Jersey, in 1945, they encouraged reserve enlistment by letting you out one day sooner if you signed up), it meant an actual return to combat. It was a new kind of war—not even officially called a war, but rather a “police action”—as frustrating as an unpleasant dream, that we could not win and apparently were not supposed to win. (We would learn more about that kind of war later.) The rumors we heard, later confirmed, that American prisoners were being subjected to a new and horrifying form of mental torture called brainwashing were literally the stuff of nightmare. So was the vision of an endless mass of humanity, bent on killing and seemingly unconcerned about being killed, that was embodied in the phrase “human wave,” used to describe the Chinese Communist hordes that streamed south across the Yalu River in November, 1950. Finally, during the two years that the armistice talks dragged on at Panmunjom while the shooting continued, there was the nightmare sense of trying to wake up to a pleasanter reality and being unable to do so.

Shaken but relieved, the country finally awoke with the signing of the armistice on July 27, 1953—but awoke merely, as sometimes happens, from one level of nightmare to another. The time of the paid informer and the false witness had already come. As early as 1948 Whittaker Chambers had first made his charges of Communist spying against Alger Hiss, the apparently exemplary young statesman who had been a framer of the United Nations charter, and the Dreyfus case of modern America was launched. In 1949 eleven leaders of the U.S. Communist Party had been sent to prison; the following year Judith Coplon and Dr. Klaus Fuchs had been convicted of spying, the latter with reference to vital atomic secrets, and the young Senator Joseph McCarthy, seeing his chance, had made his famous series of accusations that there were 205 (or 57 or 81 or 10 or 116) Communists in the State Department. With that, the hounds of fear and distrust slipped their leashes, and by the time of the Korean armistice Senator McCarthy had made the nightmarishly irrational term “Fifth Amendment Communist” into a household expression; hardly any professional in the country could feel his job or his way of life safe from the random malice of almost anyone, and constitutional guarantees against just this sort of mischief were becoming all but meaningless.

“… the word existentialism was coming into the American language”

That nightmare almost drove us crazy—perhaps came closer than we care to admit, even now. But finally our underlying national health asserted itself, and we awoke at last, this time definitively, in December, 1954, when the Senate censured McCarthy and McCarthyism went into decline. Small wonder, after such horrors, that the next mood should have been a recessive one, one of huddling in our shells and comforting ourselves with material things while remaining heedless of the mess we were making. The essence of the Eisenhower mood was long-deferred self-indulgence. It was a time of soaring stock-market prices and soaring participation in the boodle. The members of the middle class, the hugely expanding group that dominated the country, were becoming capitalists at last and were doing very well at it. It was a time of rocketing corporate profits and resulting fat dividends—at the cost of inflation and polluted air and water. It was a time of greatly increased leisure for millions—at the cost of littered roadsides and tamed and uglified national parks and forests. It was a time of more and more automobiles for more and more people—at the cost of traffic jams, more air pollution, eyesore automobile graveyards, and neglected public transportation. It was a time of bursting cities and proliferating suburbs—at the cost of increasingly neglected slums full of explosive anger quietly ticking away. It was a time when we thought of our “race problem” as being mainly a political matter confined to the South; when, in foreign policy, we fatalistically hid behind the dangerously provocative shield of “massive retaliation” and “brinkmanship” (and meanwhile were sowing the seeds of our Asian disaster); when college students kept a low profile, politically and otherwise, so as not to jeopardize their chances of flowing smoothly onto the production line to affluence right after graduation; and when—not so paradoxically as it may seem at first glance—the federal budget grew year by year and social security and other public benefits were greatly widened. The Eisenhower era is not to be compared too closely to that of Coolidge in terms of free enterprise’s running wild. In the earlier time the country had been all too truly committed to unrestricted free enterprise, but by the late fifties, despite Fourth of July paeans to the “American system” as fulsome as ever, the notion of cradle-to-grave security for most people had been thoroughly accepted and, indeed, assimilated into the system. The mood was heedless hedonism.

Next, in abrupt reaction, came the Kennedy years with their quite opposite mood of responsibility and hope. It is tempting now to think of those years as a golden age, though if we look closely we find they were scarcely that in practical terms; after all, Kennedy’s domestic legislative defeats—on civil rights, on tax reform, on Medicare—far outweighed his victories, and he died leaving unsolved most of the problems he had inherited, including, of course, Vietnam. But his successful conclusion of the 1962 Cuban missile crisis, along with the limited nuclear test-ban treaty that followed the next summer, did much to allay the fear of nuclear war that had overhung the country all through the postwar period up to then. Much more important, he and his administration, through the almost magically inspiring quality of their very style, succeeded in regenerating the old American faith, not in the perfection of man or his nation but in their perfectibility. No one despaired under Kennedy; somehow everything seemed possible. “I have a dream that one day this nation will rise up, [and] live out the true meaning of its creed …” Martin Luther King, Jr., said at the interracial March on Washington in August, 1963—a fitting epitome of the Kennedy mood, in a climax that no one could know came near the end of the last act.

Then everything went wrong. With Kennedy’s death that November began an age of assassination; within five years probably the two most admired black men in the country, King and Malcolm X, and almost certainly the most admired white man, John Kennedy’s brother Robert, would be dead from the same horrifying and dispiriting cause. During the same period more and more Negro leaders turned against King’s dream, rejecting the American creed for a cynical, angry separatism; the hopeless war in Vietnam was escalated, and revelations about its conduct led many Americans to a similarly escalating sense of horror, disillusion, and shame; political colloquy at home became violent rather than reasonable; Americans achieved the technical masterwork of flying to the moon and back while failing to accomplish the technically simple one of giving all their citizens proper food and clothing. The sixth postwar mood was, and is, one of violence, disillusion, and doubt verging on despair such as has not been felt since the time of the Civil War.

It is my thesis, then, that while material change has generally been steady, continuous, and for the most part beneficent over the postwar period, the past five years or so have seen an explosive—and morally equivocal—increase in the rate of change in values and attitudes. It is in these last five years that most of our moral history since V-J Day has been written, and it is since 1965 that many Americans have come to feel like expatriates in America. In support of the thesis, let me tick off a few current American attitudes—now accepted widely enough among the influential, especially in the communications media, to constitute what might be called leadership opinion, if not national consensus—that would have been unthinkable not only on V-J Day but on the day of John Kennedy’s death as well.

The attitude toward military affairs, and in particular toward our own military, has to a large extent undergone a reversal. My own generation, the one whose coming of age coincided with U.S. entry into the Second World War, had thought itself pacifist; we had been brought up on Dos Passos’ Three Soldiers and Hemingway’s A Farewell to Arms and the Nye investigation with its implication that wars are fought for the profits of munitions makers. But it turned out that our pacifism was only skin-deep; when the call to arms came, it found us full of sanguine enthusiasm. We wanted to be in it, and quickly, and we hurried to the recruiting offices; we thought of draft-dodging as contemptible and conscientious objection as respectable but, to say the least, highly eccentric. After Pearl Harbor a uniform, even that of an ordinary soldier or sailor, was a clear-cut asset in the pursuit of girls.

In the postwar period up until recently a uniform was neutral, considered neither glamorous nor unappealing. Not so now. There are no American “heroes” of Vietnam (not that there has been no actual heroism), and the sporadic efforts of the military to create some have failed utterly. On the contrary, among the heroes to today’s youth, or a significant segment of it, are the evaders who are hiding out illegally in Canada or Sweden. Idealistic young people casually and openly discuss and choose among the legal and illegal ways of avoiding induction, and many of them consider the act of draft avoidance or evasion to be a moral one. As for the sexual aspect: the son of some friends of mine, living in a conservative eastern community, complained soon after he was drafted that girls who had formerly gone out with him would no longer do so. The old taunt of “Why aren’t you in uniform?” has become the opposite: “Why aren’t you in Sweden or in jail?” Soldiers on leave these days wear mufti.

Again, certain broad, vague expressions of patriotic sentiment that in 1945 would have been considered commendable and in 1963 at least harmless have now become specifically distasteful to many as indicative of “extremist” beliefs. To a liberal—and liberals, on political record, are something like half of our voters—the display of a bumper sticker reading “Honor America” now suggests that the owner of the car is a full-fledged reactionary, ready to jail dissenters against the war and to use atomic weapons in its prosecution. “Support Your Local Police,” which until a few years ago might have been an advertisement for a cake-sale benefit, now suggests racial prejudice. Even more to the point, display of the American flag itself in many unofficial settings has come to have disturbing implications. I confess, with some reluctance, that a flag decal posted in the window of a car or a barbershop now arouses in me feelings of hostility toward the owner. It would emphatically not have done so in 1945.

True enough, the practice called flag-waving has been in bad repute in sophisticated American circles for generations. But the expression was metaphorical, usually referring to overly florid oratory. That the display of the flag itself should come to suggest extremist political and social views is surely an anomaly without precedent. Try to imagine any other democratically ruled nation in which such a condition exists—or ever has existed.

“Why aren’t you in Sweden or in jail?”

The reason behind these changes is hardly obscure. On V-J Day we were triumphantly concluding a war in which the moral imperative had been clear to just about everyone. On the one hand our territory had been attacked in the Pacific, and on the other a barbaric aggressor who clearly represented a threat to us as well as to our allies was at large in Europe. Now we are engaged in a military adventure in a distant country in which I believe tortuous logic is required to find the threat to ourselves and in which, threats aside, the moral imperative is certainly not clear to many millions. Is the change, then, only temporary and expedient—like, say, the 1930’s pacifism of my generation? I rather think not.

The computer revolution, filtering through from technology to culture, has recently come to change ways of thinking, perhaps more than we usually realize. Norman Macrae, deputy editor of the British Economist, commented after a recent U.S. visit on “the greater air of professionalism which runs through all ranks of American society; the greater instinct among ordinary individuals to say ‘Now here is a problem, how can I solve it by a systematic approach?’” We have learned that computers can not only imitate the human brain (play chess, choose marriage partners) but can in many ways far exceed it (retrieve material from huge library collections or scan the contents of a fat telephone book in a fraction of a second; predict election results in an instant; put men on the moon). Is it not logical, then, that we should try to improve our minds by making them as much like computers as possible? The young executive or computer programmer who has learned the meaning and value of the systems approach to problems tries to apply it in every area of his personal life—in choosing schools for his children, in mowing his lawn, in pleasing his wife. It may well be that the current cult of irrationality is partly a reaction against this computer-spawned mimicry of mechanical thinking in everyday life.

Whether or not television and its concomitants in mass communications and world travel have done what Marshall McLuhan says they have—destroyed the “linear” habit of thinking imposed by the printed page and returned the whole world to the instinctual communication methods of the primitive tribal village—they have, it seems evident enough, changed our living and thinking habits in the direction of passive receptivity. I suggest that, with the first generation of television children now coming of age, we are just beginning to feel the force of this change.

While the Negro-rights movement has passed through its various stages—full integration of the armed forces (1948), the fight for integration of schools and public facilities (1954 et seq.), and finally “black power”—white attitudes toward aid to the Negro cause have gone through a spectrum of changes. In 1945 the majority of us, to judge from our actions, still clung to the thought that such aid through federal intervention was unnecessary or inappropriate. During the civil-rights decade beginning in 1954 most of us permitted ourselves to think of such aid as morally commendable on our part—that is to say, to think of it as having at least a component of charity. Now, in the black-power era when integration as a goal and the possible perfectibility of American society are being increasingly rejected by the more militant black leaders, it has been borne in on more and more of us that giving things to minorities is and always was at best mere political expediency and at worst blackmail. Such ideas were unthinkable for nearly everyone in 1945; for all but a few in 1964. (President Johnson, it is interesting to note, was very much in the avant-garde of American thought in 1965 when he said at Howard University, “You do not wipe away the scars of centuries by saying, ‘Now, you are free to go where you want, do what you desire, and choose the leaders you please.’ You do not take a man who, for years, has been hobbled by chains, liberate him, bring him to the starting line of the race, saying, ‘You are free to compete with the others. …’” Might not those words—had they not been spoken by an American President—serve as a black-power slogan?)

Along with the change in white attitudes toward blacks is a profound and unsettling change in the attitude of liberals toward our national history. Blacks and others, but mainly blacks, have persuaded liberals that ours is in crucial ways a racist society, and that it always has been. Formerly we thought of the American past, broadly, in terms of rural individualism, fanatical independence, and anti-intellectualism combined with visceral folk wisdom and an inherent sense of fairness—thought of it, that is, in a way that was both affectionate and patronizing. We minimized or dismissed particular instances of racism (lynchings, the Scottsboro case, or the wartime detention camps for Nisei) as being confined to a particular geographical area or attributable to the bad judgment of particular leaders. Now, for many Americans, almost any tintype glimpse of the American past—the village band concert with its handful of tentatively smiling black faces in the back row, the political rally with no black faces anywhere—suggests racism. To a degree our history has been poisoned for us. And I believe that the consequences of this, in the light of our current national demoralization, can hardly be overemphasized at this time in America’s life.

“… the defense of irrationality is often put on rational grounds”

Our leaders themselves have become demoralized to an extent surely without precedent since the Civil War. “We know our lakes are dying, our rivers growing filthier daily, our atmosphere increasingly polluted,” John Gardner, former Cabinet member and more recently head of the Urban Coalition, said not long ago. “We are aware of racial tensions that could tear the nation apart. We understand that oppressive poverty in the midst of affluence is intolerable. We see that our cities are sliding toward disaster. … But we are seized by a kind of paralysis of the will.” Does not such language, in the nation of Fourth of July oratory, and coming from not just an Establishment figure but to some the Establishment figure of the present moment, represent a clear break with the past, even the very recent past?

Naturally, the demoralization of the leaders is felt among the people. “Most people no longer seem to care—if, indeed, they know—what is happening to their country,” Richard Harris wrote late last year in The New Yorker magazine. “Exhausted by the demands of modern life and muddled by the fearful discord tearing at society, they seem to have turned their common fate over to their leaders in a way that would have been inconceivable five years ago.” But when the leaders talk of paralysis of the will, who will lead?

I come now to recent changes in attitudes and values among the young, where we may find a key to what is happening to the country. To review briefly, then, the most obvious manifestations of these changes:
Youth on the campus has discovered its previously unsuspected and therefore untested power to change its environment and the conditions of its life. From the Berkeley revolt (1964) to the one at Columbia (1968) to the one at Harvard (1969) we have seen the content of such campus uprisings gradually broaden from demands for the right to use dirty words to demands for changes in the course of study, insistence on sexual and other forms of personal freedom, demands for revision of admissions policies, and ultimatums about the reorganization of entire curricula. The rebels have developed their own jargon—largely mindless and question-begging like all political jargon: in pursuit of “restructuring” (getting their own way) the dissidents resort to “confrontation” (violence or the threat of it), make “nonnegotiable demands” (refuse to engage in reasonable discussion), and, if they get what they want, sometimes complain with what seems to be a certain disappointment that they have been “co-opted” (yielded to). A comical aspect of their behavior is that they frequently ask those in authority to help them revolt against that very authority; they want, for example, to be offered formal courses in the techniques of campus disruption as well as guerrilla warfare. (A university president told me recently of a student delegation that had come to ask him, not without an attractive diffidence, that he help them by giving them the benefit of his political experience. “What they wanted me to help them rebel against was me,” he commented.) But campus revolts are not a joke. They are evidence of an idea completely new in the United States, poles apart from the passive orthodoxy of the silent generation of a decade earlier, that teaching authority is not absolute but fluid and malleable, that the young can move the sun and the moon in their heavens if they try, that their universe in spite of its ordered surface is basically anarchic. And the authorities, by yielding to them again and again, have confirmed their most disturbing suspicions.

Recent statistics compiled by the Urban Research Corporation of Chicago give a striking picture of how widespread campus revolts have been. Covering 232 campuses over the first half of 1969, the study showed that during that period 215,000 students, or about one tenth of all those enrolled at the institutions studied, actively participated in a total of nearly three hundred major protests—all in just six months. Before the fact that only one student in ten was active in the uprisings is taken to indicate that the youth revolt is just the phenomenon of a small but visible minority, we would do well to consider that historically the passive sympathizers with new movements have usually far outnumbered the activists.

The young have turned against careers in business, particularly in big and long-established business, to such an extent that some campus recruiters have expressed concern as to where the managers of the future will come from—although up to now there have been enough exceptions to keep the business schools from being depopulated and the lower ranks of corporate management from being undermanned.

They have made a cult of irrationality, what with astrology, Oriental occultism, and above all the use of drugs. (“We never needed drugs to drive us crazy,” the middle-aged social commentator David T. Bazelon once told me.) This tendency runs deep and wide, cutting across economic, social, and intellectual lines among the young. The sheltered, conservatively brought-up white southern darling and the would-be hippie son of liberal northern suburbanites yearn alike for the experience of New York City’s East Village, and the young Harvard intellectual is almost as likely as the high-school dropout to express or imply hostility to the traditional intellectual materials, abstract ideas, and rational comment. Curiously, the defense of irrationality is often put—persuasively—on rational grounds: that logical thought in foreign policy led to Vietnam, that logical thought in economic development led to pollution, and so on.

The young are apparently in the process of radically redefining sex roles. The question of which forces (the Pill, the obscenity explosion in the media set off by the courts’ anticensorship decisions, or the general air of permissiveness in the land) have brought about a radical change in sexual customs among both the young and their elders, remains undecided. No one really knows. What is much clearer, and perhaps more interesting, is that the traditional aggressiveness of the young American male about his maleness, which has so often led to anti-intellectualism, Babbittry, and cultural self-deprivation in general—for example, the American he-man’s hostility to most of the arts on grounds that they are effete—seems to have been emphatically reversed. The short hair and pointedly different clothing that he always used to set himself unmistakably apart from girls are more and more being abandoned in favor of long hair, fur coats, beads, and other adornments that were formerly considered feminine. The American male’s dread of appearing to be unmanly seems to be lessening. More significantly, one is struck by the new sense of community that boys and girls feel. The growing insistence of the young on coeducation is not just a matter of having sex available but one of principle, growing out of a new conviction that the sexes are not so different as American culture has decreed them to be and that the old segregation is therefore absurd.

The symptoms I have been recording are, of course, parts of a syndrome, and one that may be viewed in two diametrically opposed ways. Looked at in a favorable light, the new youth are gentle, loving, natural, intuitive, opposed only to war and obsession with money, to hypocrisy and the other agreed-upon weaknesses of modern society as organized by their elders. In a different perspective they represent progressive-school permissiveness and self-indulgence run wild: their causes are merely self-serving (opposition to the draft, for example), their attitudes are self-righteous (“Just because you are growing older do not be afraid of change,” they gravely lecture their parents and teachers), their manners are deplorable or nonexistent, their minds are flabby, their herding together indicates fear of standing alone, and the manner of their protests sometimes appears ominously antidemocratic. Macrae of The Economist goes so far as to say that some of the actions of black-power and radical white students during the winter of 1968-69 “invited the most direct comparison with the way that Hitler’s brownshirts operated in the Weimar Republic.” On the other hand Ralph Nader’s consumer-protection crusade, which clearly appeals strongly to the brightest and most idealistic among the young, might fairly be described as passionately pro-democratic in that its aim is to save that most characteristic democratic institution, the business corporation, from its own shortcomings. Paradoxes and contradictions, then; and it is quite possible—indeed, perhaps it is inevitable—for a liberal of the previous generation to see the young in both lights at the same time.

For such an observer, analysis is more profitable than judgment. Consider, then, the vital statistics. The median age of the American population at present is a bit under twenty-eight years. Half of us, roughly speaking, were born before the middle of World War II and half since it. Half of us were of an age to be percipient before V-J Day, and half were not. The distinction is not arbitrary, because it was with the end of the war that the new era, the modern world, began. The time has come when “over thirty” means precisely “born before the war.” Only the younger half of the American people have never known the world of traditional values as it was without the disrupters of those values—television, computers, jet travel, space travel, the threat of nuclear extinction. Only the younger half truly belong to the new world—that is, accept it instinctively, without mental or emotional effort, because they have not any old world to compare it with.

And consider this: the five postwar moods before the present one were conjoined as well as consecutive—each had its roots in reaction to the previous one, as have the moods of most nations through most of past history. Wartime family disruptions led logically and naturally to early postwar domesticity. The Cold War, which really began in 1945 at Yalta, bore its bitter fruit five years later in Korea. Armed conflict with our former allies, the Communists, led logically to the era of suspicion. The eventual relaxation of that crisis cleared the way for the Eisenhower years of self-indulgence. And the new energy and responsibility of the Kennedy term was clearly enough a reaction to that. In such a linear way did our history unfold for almost two decades.

And then—snap! The chain of events seemed to be broken. Suddenly we flew off in directions that seemed to be neither a continuation of nor a reaction to anything that had gone before. Disillusion with uniform and flag did not appear to be rooted in reaction to any particular superpatriotism of the preceding period; mechanized thinking was not new, but the existence, indeed the ubiquitous presence, of actual thinking machines was; the new youth rebellion could be seen as a reaction to youth passivity a decade earlier, but the breadth and depth of the response was so far out of proportion to the challenge as to make such an explanation seem entirely inadequate. The present American mood, then, in many of its aspects, has had no precedents or antecedents; it represents almost a clear break; it seems to have come out of the blue. Meanwhile, let us remember, it has not been accompanied by sharp breaks in or reversals of the broad ameliorative trends that have marched through the whole postwar period. There are no jolts or breaks around 1964 or 1965 in the charts of social progress. The nation seems to have changed its mind, or to be in the process of changing its mind, on many of the most basic matters for no immediately discernible material reason. And this occurs precisely at the time when the new post-V-J Day generation is coming of age.

“Will the affairs of General Motors be managed by men smoking pot?”

Can this conjunction of facts be more than coincidental? Indeed, must it not be? If so, then the new generation, the generation that is in tune with the new world because it never knew the old one, appears, for better or worse, as the basic force behind the new, unprecedented American attitudes. As for the statistical charts, their relatively smooth continuance through this period of violent cultural upheaval may be explained by the fact that the charts and the things recorded in the charts—matters of business, government, philanthropy—remain in the hands of the old prewar generation. It does not really live in the new world it has made, yet it still nervously holds all the levers of national power.

One who accepts such an analysis is Margaret Mead. In her recent book, Culture and Commitment: A Study of the Generation Gap, she declares that “our present situation is unique, without any parallel in the past,” and that—not just in the United States but world-wide—the human race is arriving through the youth revolt at an entirely new phase of cultural evolution. Putting her argument in a context of rigorous anthropological study rather than in the familiar one of parlor sociology, she describes the new phase as a “prefigurative” society: one in which the traditional handing down of knowledge and belief from the elder generation to the younger is being reversed and in which “it will be the child and not the parent or grandparent that represents what is to come.” No longer anywhere in the world, Dr. Mead says, can the elders, born before the Second World War, know and understand what the children know and understand: “Even very recently the elders could say, ‘You know, I have been young and you never have been old.’ But today’s young people can reply, ‘You never have been young in the world I am young in, and you never can be.’” The prefigurative society she sees emerging is, Dr. Mead says, the first one in human history.

It is a persuasive case, and, fitted together with the vital statistics I have cited, it leads to a persuasive explanation of why changes in our values and attitudes, after years of poking along like a donkey cart in a time of great transformation in our material situation, have recently taken off as steeply as a jet plane. So it comes about that the elders—whether they conservatively wring their hands over the new changes or liberally try to understand, absorb, and temper them—feel like expatriate visitors in their own country. Like expatriates, we of the prewar generation are inclined to spend our days wavering between wonder, exasperation, apprehension, disgust, and superiority toward what we see around us. Again like expatriates, we tend to cling together in enclaves, to propitiate our sense of loneliness by finding islands of our own within the new world that conform as closely as possible to the old one. The turned-on headlights of daytime drivers that have been so familiar a sight in many parts of the country in recent months are supposed to mean support of our Vietnam policy, but they mean more than that. They are a symbol by which the loneliest of the lonely expatriates reassure themselves that they are not wholly alone; they are the club ties of the American Club in Samarkand.

Even if the analysis is right, all of this is, of course, a change still in process, and indeed still in an early stage of process, rather than an accomplished fact. The “silent majority” apparently still is a majority—of poll respondents and of voters—and even if it were not, traditional methods of succession to power have survived up to now to the extent that the older generation would still hold business and government power and might be expected to continue to do so for some years to come. Even among the young themselves Dr. Mead’s prefigurative culture is still very far from universal. There are whole campuses in “middle America” where a long-haired boy is an object of derision, where revolt against university authority never crosses anyone’s mind, where books and magazines containing four-letter words are missing from the library shelves, and where “God Bless America” is sung without irony.

But such attitudes among the postwar generation seem to represent cultural lag. They are scarcely the wave of the future. It is those older than the nation’s median age who make up the bulk of the silent majority. In purely actuarial terms, this majority is living on borrowed time; it is a majority under a death sentence.

What happens next?

One line of thought holds that the strange new attitudes and values are attributable not to the influence of youth but to the Vietnam war and the disruptions, frustrations, and loss of morale attendant upon it—its ability, as James Reston of the New York Times has written, to “poison everything.” This interpretation is reassuring to the prewar generation because it implies that when the war is over everything will revert to the way it was before. But those born in the years immediately after V-J Day, who were entering college when the Vietnam war was escalated and are leaving it now, and who have lived only in the strange new world, can scarcely be expected to go back where they have never been. I am convinced that Vietnam is not the root cause of our current malaise and that if there had been no Vietnam the young would have found plenty of other reasons to dissociate themselves violently from their elders and their elders’ regime. Certainly the end of the war, when it blessedly comes, will mark the end of our current paralysis and the beginning of a seventh and more hopeful postwar mood; but I expect it to be a mood not of returning to the familiar but of pushing forward to something new and unknown. In the traditional American cultural pattern youth has always been allowed its fling with the tacit understanding between youngsters and elders that after graduation the youngsters would “put away childish things” and “settle down.” The wild young buck who had been proud of his capacity for beer and beer-inspired pranks would sink quickly into sober, hardworking domesticity, and the pretty blonde who had found it amusing to flirt with Communism while in college would become his meekly Republican, upwardly mobile bride. It is impossible for me to imagine the post-V-J Day generation following this familiar pattern. One can, for example, visualize their male hairstyle going from shoulder length to shaved heads—but not to crewcuts; one can visualize their politics doing a flip-flop to dangerously radical rightist positions—but not to traditional conservatism or traditional liberalism.

How, then, can they be expected to react to being older and to assuming power and responsibility instead of defying them? Will they, in their turn, be “prefigured” by the new younger generation that will consist of their children? How will they run the Ford Foundation? the Institute for Advanced Study? the Bureau of the Census? Will they continue the broad liberal trends initiated by the older generation that they now revile—trends toward more social-minded corporations, better-distributed wealth, more general education, less pervasive bigotry? Will they bring to reality The Economist’s prophecy that “the United States in this last third of the twentieth century is the place where man’s long economic problem is ending”? Will, say, the affairs of General Motors be managed by men (or women) wearing long hair and beads and smoking pot during sales conferences? Or will there be no General Motors?

The fact that it sounds like material for a musical-comedy skit indicates how little we know what to expect. Adolf A. Berle said recently, speaking of economic and social affairs in the United States, “We are beginning to evolve a new ball game.” Whether we like it or not, the rules of the new game will not be our rules. They will be devised by those born since V-J Day.
