February/March 1987 | Volume 38, Issue 2
If the historians themselves are no longer interested in defining the structure of the American past, how can the citizenry understand its heritage? The author examines the disrepair in which the professors have left their subject.
In the mid-sixteenth century, a blind and deaf old Spanish soldier named Bernal Díaz del Castillo set out to write an account of what he had seen and done as a follower of Hernando Cortés during the conquest of Mexico. “Unfortunately,” he noted by way of introduction, “I have gained no wealth to leave to my children and descendants except this story, which is a remarkable one.”
There is little doubt that Castillo would have enjoyed having gold, silver, and estates to pass along with his narrative. But he recognized that the simple tale of what happened was a treasure in itself. In that respect he was much smarter than most of today’s Americans.
The remarkable story of the American past is not being handed down in any satisfactory way to our descendants. Most of the blame must fall on American educators. At the highest levels of the university system, a majority of professors no longer appear to believe that there is such a thing as the story of the past. They see history as a mosaic without pattern or, at a minimum, with many small ones that never add up to much. At the grade school and high school level, the men and women who create the curriculum see history as a grab bag of what they call “material” for lessons in how society works, and not as a tale to be told.
As a result there is a vacuum of information about our shared past. Apart from a few serious historians (in and out of the academy, though mostly out), the emptiness is filled by the sound and fury of commercialized anniversary hype, in the course of which we are losing one of the prime sources of national identity and civic spirit. If people are ignorant of who they are and how they came to be, they cannot judge their leaders and are subject to every passing foolishness, every plausible scoundrel who would divide and delude them.
I was reminded of all this last year at a meeting of the Society of American Historians held at the New York Yacht Club. I had been invited to present the society’s Bruce Catton Prize for Lifetime Achievement in history to C. Vann Woodward. The society, an organization limited to two hundred and fifty elected members, was founded in 1939 “to encourage literary distinction in the writing of history and biography.”
The evening was very satisfying. Woodward, who is professor emeritus at Yale, was clearly a perfect choice for the award. Not only has he written books that changed the way historians and lawmakers view the history of the South after the Civil War, he stands in a line of historians and teachers who once could take pride that the mark of the educated American was an interest in their subject. During the presentation the approving ghosts of Catton, the first editor of American Heritage, and of Allan Nevins, founder of the society, could almost be sensed hovering around the guests in the room. These included not only distinguished professors but many historians and biographers whose names have appeared on best-seller lists during the last two decades, including Barbara Tuchman, Nancy Milford, Walter Lord, David McCullough, and Robert Caro. Also on hand were representatives of the major publishers. History, and the business of history, it appeared, were prospering.
Next morning I found a scene not nearly so exciting at the New York Penta Hotel, only a few blocks away, where the seventy-ninth annual meeting of the Organization of American Historians was in progress. Unlike the society, the organization is open to any dues-paying member and is composed mostly of the men and women who teach the history of the United States in colleges and universities throughout the country. (To confuse matters slightly, there is also the American Historical Association [AHA], which enrolls college-level teachers of the history of any part of the world whatever.)
I had attended many OAH conventions myself as a young professor in the early 1960s. They had been lively affairs. Throngs of rapid talkers swirled and roared up and down hotel corridors and formed cheerful eddies in the bars. Expectant, optimistic job seekers hurried past on their way to interviews in rooms where hiring committees sat behind piles of dossiers spilling over the remnants of room-service breakfasts. At cocktail time publishers threw large, hard-drinking parties in hopes that the sheer weight of conviviality would persuade the best-known professors to sign contracts for textbooks potentially worth millions.
Not in 1986. Now a subdued pessimism hung in the air. There were fewer young faces. The booming academic “slave market” had shrunk to a job directory on a single bulletin board that offered lean pickings. There was, for example, a one-year nontenured opening at the University of Connecticut, mainly to teach Western Civilization, doctorate required, publications desired. Another one-year slot in Constitutional and Early National History at the University of Toledo was posted. There was one plum, a department chairmanship at Drexel University. Other openings were for public—that is to say, nonacademic—historians. The Franklin D. Roosevelt Four Freedoms Foundation in Hyde Park, New York, and the Rhode Island Black Heritage Society both were looking for executive directors. In New York City, the Fiorello H. La Guardia Archives was seeking an administrator who was expected to have “superior interpersonal communications skills, experience in grant writing, public relations, preparation of fund-raising materials and copy for public speeches, exhibits and correspondence.” I tried to imagine some giant of American history like the reclusive Brahmin Francis Parkman exercising superior interpersonal communications skills.
The convention program gave notice of roughly a hundred sessions, not counting lunches, workshops, and walking tours. Many of the subjects were intriguing, but many others seemed remote from what I, in common with most people, had once thought of as history. On the agenda were such offerings as Historical Perspective on Middle-Class Formation; Changing Sexuality in a Changing Society; The Social Construction of Domestic Space in the Early Twentieth Century; Computer Graphics and History; Gender and Technology; Exploring Important Issues on the Micro Level—Southeastern Pennsylvania in the Eighteenth Century.
Micro, indeed. I left the hotel in a far different mood from the buoyancy of the night before. The atmosphere of the convention seemed to be that of a gathering of medical and legal specialists around the body of a very frail patient named history.
Best-selling and academic historians seemed to live in different worlds. And neither group showed much connection with the resounding battlefield dwelt in by thousands of high school and elementary school teachers. The news from that front was not good either. One of the papers presented at the convention was by Diane Ravitch, professor of the history of education at Teachers College, Columbia University. Ravitch had already revealed, in an article in The New York Times Magazine, that we are losing the struggle to keep historical knowledge alive in the schools. According to her, the amount of time given to history in all schools is steadily declining. And what students do learn is frequently presented by teachers without historical training. Ravitch cited a study showing that among a sample group of seventeen-year-olds, most of whom were taking an eleventh-grade U.S. history course, “about two-thirds could not place the Civil War within the period 1850–1900; a third did not know that the Declaration of Independence was signed between 1750–1800; a third did not know that Columbus sailed for the New World before 1750.”
How did we come to such a pass? How do the worlds of popular history, the graduate school, and the elementary classroom mesh, if at all? What is the place of the nation’s past in the nation’s future? My efforts to find out took me not only to offices and libraries but on a trip through parts of my own past. I have been in this business a long time. If I speak here in the first person, it is not out of vanity but as a witness to change.
Let’s start with some history. Hardly ever did a profession ride such a roller coaster of boom and slump as college and university instruction in United States history during the forty-year period following the end of World War II. Before that, in the 1920s, history Ph.D.’s were produced at a rate of something like fifty to sixty-five per year. There were few historians, but their prestige and status were considerable. As late as 1950 there were fewer than a thousand undergraduate history majors in all the colleges of the United States. But suddenly the number of majors—and the doctorate-holding professors to teach them—began to climb: 881 Ph.D.’s were granted in history in 1969; the following year some 45,000 students enrolled as history majors. The graduate schools rolled on like World War II assembly lines—1,183 Ph.D.’s in 1973 and 1,157 in 1975. Almost half of all Ph.D.’s in history between 1920 and 1980 were awarded in that last decade.
And then the great collapse. Enrollments in all the liberal arts fell. The number of history majors sank to fewer than 20,000 by 1979, and the ebb continues in the 1980s. Production of Ph.D.’s was cut back—to 932 in 1977, then to 665 in 1981. But the job market shrank faster, and it was already too late for many of those who had finished their degree work. Thousands of highly trained historical specialists were stranded by the receding tide. As one of them told me, “The country hasn’t experienced anything like the Depression of the 1930s, but our profession sure did.”
What accounted for the extraordinary growth after 1950 was a shower of prosperity that descended on every campus, thanks directly to World War II and the Cold War. First there had been the GI Bill, bringing into the classrooms a tidal wave of government-financed students; that was the wave I had ridden. In 1958, the year after the Soviets had launched their Sputnik satellite, the federal government opened the sluice gates of the Treasury again and provided grants and fellowships for defense-related advanced studies—a definition that charitably included the history of potential enemies and allies around the globe.
Demography did its work too. Approximately two years after Sputnik, the advance guard of the postwar baby boom poured out of the high schools to occupy the heights of higher learning. Swarms of them majored in history, and United States history was especially popular. It was attractive to learn how we had come to be what we perceived ourselves to be—the world’s treasurer, workshop, success model, and armed guardian of liberty.
Historians with Ph.D.’s were soon so freely available to college administrators that it was automatically assumed that only those with, or on the verge of, the doctorate were qualified to teach undergraduates, which up to then had rarely been the case. The Ph.D.’s themselves had additional ideas. Their proper function, it was taken for granted, was to rise above grading sophomores, to continue advanced research within a major university if possible, and to select, encourage, and train successors. With plenty of money available for libraries, publications, fellowships, and faculty enlargement, the professors dreamed big. They saw themselves as part of a permanent and self-perpetuating guild, with few responsibilities outside the academy.
Meanwhile, professional societies like the American Historical Association and the Organization of American Historians did little to attract, hold, or promote members who taught at the pre-college level or who worked in museums, libraries, historical societies, and archives around the country. And although the holders of leading chairs wrote high school and college textbooks that presumably reflected what the academy thought was the essence of history, few of the major professional journals reviewed these works. Since texts for grade schools and high schools were, of necessity, highly condensed and simplified (and profitable), it was considered bad form to scrutinize them too closely. A colleague had the right to supplement his salary without embarrassment. Thanks to this lack of criticism by one’s peers, defined by the current executive director of the AHA, Samuel R. Gammon, as “lamentable politesse,” professional historians—which meant, overwhelmingly, college-teaching historians—shook off any responsibility for what American children were being taught about their past.
At the same time, most of the success-flushed, textbook-writing professors were not very hospitable to those who wrote for a general audience. In faculty clubs, authors without doctorates who had the temerity to write history and biography were referred to, unflatteringly, as “laymen.” It did not matter whether the books were trashy—as some undoubtedly were—or meticulously researched. I remember a loud and contentious lunch where I failed to impress anyone with the argument that if such outsiders were laymen, we were priests, and was that really how my colleagues imagined themselves? A few top-ranked professors did challenge the two-culture theory. Among them was Arthur M. Schlesinger, Jr., whose sins included bestsellerdom, two Pulitzer Prizes, and service in the White House. To boot, he had not taken a Ph.D. And there was the tireless and venerated Allan Nevins (likewise un-doctored), who labored to bridge the gap by helping to found the Society of American Historians and who acted as a godfather to American Heritage in 1954. In common with such other exponents of lucidity and literacy as Richard B. Morris, Samuel Eliot Morison, and Henry Steele Commager, Nevins wrote frequently for American Heritage. In fact, many well-regarded academics contributed articles to the magazine at one time or another.
But the word for such efforts was popularization, intoned with a sniff of condescension and the implication that the author was buying into the fat life at the expense of integrity and virtue. One night, during an AHA convention in Philadelphia, a distinguished professor from the University of Pennsylvania told me amiably but firmly that it was a betrayal of my training to waste time writing for this magazine, whose circulation then was around three hundred thousand. Our job as social scientists was to find and shape data to be used by others trained to interpret and analyze it. The public be damned.
Busily talking to themselves, the mandarins forgot some important lessons from their own past. The great nineteenth-century historians—Richard Hildreth, William Hickling Prescott, George Bancroft, Francis Parkman, Henry Adams—had not been professors, though Adams did put in a stint of teaching at Harvard, where his colleagues bored him. But he, like the others, thought of himself as a man of letters and an independent scholar—a proud title. He believed in a science of history, but one to be presented with the maximum of art. Historical writing still straddles those two camps.
History did not become a profession until the formation of the AHA in 1884. For a long time its membership was not dominated by those who had undergone graduate training, which was an exotic import from German universities. As late as 1911 less than 30 percent of the members were college academics. In 1912 the association’s president was a “layman” named Theodore Roosevelt.
The 1960s professors were as heedless of the future as investors in 1929. They ignored obvious danger signals. One was demographic: it was self-evident that the bull market for future teachers would end with the baby boom, but there was no sign of self-restraint in the production of Ph.D.’s. On the contrary, history departments at institutions like Wayne State and Rochester, where I taught, either started or enlarged Ph.D. programs. They claimed it was important to break the “elitist” domination of the profession by the handful of universities—Harvard, Yale, Columbia, Princeton, Chicago, Penn, Wisconsin, Hopkins, Berkeley—that awarded almost all the nation’s history doctorates. Such democratic rhetoric was irresistible in the optimistic 1960s. Once, during a 1962 AHA convention at which I was present, W. Stull Holt of the University of Washington rose, like the Ancient Mariner, to detain the revelers with a question: “What happens to all these programs when the money runs out?” He was more or less ignored as a graybeard, out of touch with the times.
There were also booby traps in the rapid changes of curriculum that the good times brought about. The “equality revolution,” as it was called, spurred demand for studies in the history of previously overlooked groups—women, blacks, immigrant workers, Indians. Computers made it possible to analyze transfers of people, property, and votes with a subtlety that made older investigations seem as crude as shooting sparrows with a blunderbuss. Young historians, fresh from graduate school, arrived on campus eager to teach what they had learned in so-called relevant—and undoubtedly intriguing—new researches into the psyches of the Founding Fathers, or the ecological and sexual practices of the Age of Jackson.
Courses in these subjects shouldered aside old-fashioned staples of the history curriculum—the frontier, the Constitution, American diplomacy, the Civil War, Progressive politics, the rise of Big Business. Such changes were especially easy in the iconoclastic sixties. Good enough; change is the law of life. But those older courses did have some connection with themes already familiar to students from childhood—cowboys and Indians, covered wagons, great inventors and builders, the march of democracy. And they could be fitted into some kind of coherent pattern.
The newer courses, more accurate in some ways, nonetheless aroused neither echoes nor pride, and were somehow disconnected from one another. Since it did not matter in what order they were taken, departments began to drop requirements for sequences and prerequisites. Many also gave up their sweeping introductory surveys for freshmen as hopelessly unsophisticated and probably unnecessary. But it had been those very courses that had offered a common historical ground for all college graduates. They gave graduates something to stand on. Survey courses were also magnets for recruiting history majors. Often they were conducted by the most forceful, humorous, and exciting lecturers. Abandoning them not only had educational ramifications; it weakened history enrollments. “When departments dropped the Western Civ or the U.S. surveys,” said one of my former students at Rochester, “they lost the barkers in front of the carnival tent.”
The discipline of history was becoming more fragmented and—to use the jargon—less accessible. The process was described to me by Clyde Griffen—a friend and colleague at Vassar, where I taught part-time for most of the 1970s—when I spoke to him in his home in Dutchess County one chilly afternoon last spring. Griffen had come to Vassar from Columbia, where Richard Hofstadter—the best-known intellectual and political historian writing in the 1950s—had supervised his dissertation. At Vassar, Griffen became interested in studying social mobility. With money from the Ford Foundation, he and his wife—also a Columbia-trained historian—undertook a formidable, computer-based study of what people worked at and where they lived in nineteenth-century Poughkeepsie—a sketch of the ladder of opportunity, as they called it.
Griffen then drifted further away from conventional history by doing work at Yale on how to teach American studies from an interdisciplinary point of view. “After all those years digging in the Poughkeepsie records,” he explained, “I wanted to get out of the tunnel.” Now his courses at Vassar deal heavily in family, work, and leisure patterns among urban and rural Americans and are flavored with a strong brew of geography, statistics, economics, and sociology. He is a dedicated and popular teacher but admits that such courses do not enhance the appeal of history to the student looking for a broad picture. At Vassar the department of history largely has given up on synthesis. There are no longer comprehensive examinations for senior history majors that make them pull together what they have learned in their diverse courses. Meanwhile, Griffen notes, “it is no longer assumed that there are certain historical works that the generally literate population ought to know.”
Griffen is not apologetic about the new order of things. “We’ve reached a level of sophistication,” he argues, “where close analysis of social data is indispensable.” But he recognizes that a greater challenge lies ahead: “It takes art of the highest kind to interweave that kind of analysis with narrative.” Griffen is convinced it can be done. Still, he admits, there is a “hunger for an old-fashioned kind of history.” A hunger that goes unappeased in an age of specialization.
The economic shrinkage of the seventies compounded the problem by chasing more and more students from the humanities into professional and business courses that qualify them for jobs. Ph.D. candidates now tended to be more dedicated—and more frustrated. The end of the long, sleepless road of seminars and dissertation preparation—often involving years of costly travel—was a stone wall. There were few careers open to talent. Departments were already full of professors in early middle age who had achieved tenure, the guarantee of lifetime employment. Designed originally to protect academic freedom, tenure had turned into a bottleneck as soon as growth no longer created new jobs. In order to guarantee quality, colleges and universities had adopted an “up or out” rule: a young scholar not promoted to tenure within a reasonable time span—usually six years—had to leave. But without tenured slots to fill, quality was irrelevant; the good and the mediocre alike were told, like loitering bums, to keep moving.
By the late 1970s, when thousands had given up, it looked as if the future of history as an academic discipline could be summed up in a scene from some novel of a post-doomsday future—a handful of very old gurus teaching an even smaller handful of disciples the secrets of an art that nobody cared much about. In 1980 the critic John Lukacs, in the pages of Harper’s, charged the historical profession in America, which had allowed the “virtual elimination of history from American public and other secondary schools, as well as the elimination of history requirements from colleges and universities,” with being “gnarled and ossified.” Joan Hoff-Wilson, now the executive secretary of the OAH, could even ask in a 1980 article: “Is the historical profession an ‘endangered species?’ ”
But six years later, plenty of people insist that the patient is alive, hungry, and kicking off the bedclothes. Samuel Gammon for one. He is the cheerful, bespectacled head of the AHA, and he thinks it possible that history enrollments have “bottomed out.” He believes that “in the 1960s the Vietnam misadventure and technological hubris led to a disenchantment with the wisdom of the elders” but that the negative mood is passing. John A. Garraty, chairman of the Department of History at Columbia, said much the same thing a few weeks later. “When the country is in bad odor,” he observed, “then U.S. history is in bad odor.”
Gammon also argues that the battle is not over and lost in the elementary and high schools. It is true, he says, that traditional nuts-and-bolts history was elbowed out of school curricula to make way for social studies, whose content is more elastic and therefore more susceptible to pressure for the inclusion of trendy themes. “A secondary school curriculum,” he notes, “is a political act. Hispanics vote. Constitutions don’t.” What is Gammon saying? That the American people have voted themselves the kind of inadequate history curricula their children are stuck with? If so, it would be news to the populace at large. Nevertheless, the AHA and OAH are no longer indifferent. They have joined, along with the National Council for the Social Studies (NCSS), in the History Teaching Alliance to improve pre-college instructional programs in history by bringing together college professors and secondary teachers.
Donald Bragaw, the former president of the NCSS, is now chief of the New York State Bureau of Social Studies Education. He spoke to me in his office in a colonnaded building in Albany that he referred to as “the Greek Temple.” He agrees that the pendulum had swung away from chronological history in the 1960s and toward “case studies.” But he adds that, in New York at least, there had always been a mandatory year of straight United States history in the eighth grade and a follow-up year of American studies in the eleventh, and that a new 1987 syllabus proposed for both grade levels is going back to a “fairly traditional setup.”
However, the draft of the eleventh-grade syllabus that he showed me—which could be modified by individual school districts—compressed everything from 1607 to 1865 into one-eighth of the school year, except for what it calls the “enduring themes” of the Constitution, which are introduced throughout the modern material. And the suggested content of the rest (which shared billing with “Major Ideas” and “Model Activities”) looked very unspecific to my eye. But Bragaw offered the familiar argument that there was no sense in “force-feeding” students with facts divorced from context or purpose. For many years students were taught the so-called basics, yet to what effect? (As if to say, because the nature of American government had not changed very much, why be bothered by the details of its development or its constitutional parts.) Besides, Bragaw continued, “kids like big ideas. Big ideas hang around in their minds.” He does not think that the ignorance of basic information cited by Ravitch is new or frightening. He had debated her on the issue at the OAH convention and believes she was being too much of an alarmist. There never had been a golden age when every student was a fountain of historical knowledge (as Ravitch herself acknowledged in the Times article). “Every ten years,” said Bragaw, “we ‘prove’ that students don’t know their history. Well, what do we want them to know? And why?”
That seemed, to me, to be exactly the question on which the multiplying cliques of academic historians could not agree.
Still another sign of returning professional health, according to some observers, is the surge of interest in what is termed “public history,” a phrase used interchangeably with “applied history.” There had always been a few openings for trained historians to work in archives, museums, and historical associations, and in government policy-making and planning agencies where they could use their skills in research, analysis, and presentation to do background papers on current problems. In 1950, when I got my Ph.D. from the University of Chicago, several of my fellow students went instead to the State Department. One of them, after two years of slashing his way through the dense Teutonic prose of the German historian Friedrich Meinecke for his dissertation, said he found it a cinch thereafter to study the daily East German press. Another friend joined the Central Intelligence Agency with an untroubled conscience, for this was before Guatemala, Iran, the Bay of Pigs, and Chile, and we were still the good guys.
Yet another friend of mine was Dick Hewlett, whose easy smile disguised what a ferocious competitor he was on the drafty old handball courts under Stagg Field. He eventually joined what was then the Atomic Energy Commission. By 1977 he had become the chief historian of the Department of Energy and had lost none of his edge. At a symposium titled “Broadening the Scope of History,” he lashed out at the academic tendency to downgrade his kind of work. “What do you think would be the outlook for the legal profession today,” he asked, “if the only purpose of law schools were to train law professors? … Why is it that lawyers, physicians, economists, and scientists can involve themselves as professionals in government planning and administration but historians cannot? … The profession has not yet ventured much beyond the hypothetical world of the classroom.”
The time-honored answer of the academics was that the government- or corporation-employed historian (or, less politely, “court historian”) had his research topics, and sometimes the preferred answers, chosen by his superiors, so that he lacked the independent curiosity and creativity of the tenured campus scholar. (It was a little difficult to sustain this argument in cases like that of Adm. Samuel Eliot Morison, official historian of the United States Navy in World War II.) When the wellsprings of teaching jobs dried up, however, the parched profession became more flexible in its outlook. The AHA set up the National Coordinating Committee for the Promotion of History, which, among its other duties, encourages high-quality community programs of historical research, preservation, and publication. Public historians—including some involved in business research—organized their own association in 1979, the National Council on Public History (NCPH), and held its 1986 meeting jointly with the OAH in New York City. So the gathering at the Penta was the historians’ counterpart of the merger of the American and National football leagues, or an alliance between osteopathic and general medical associations.
The NCPH has its own journals, institutes, and conferences. Each group is also pushing for programs to train college and graduate students specifically in public history techniques—so that an advanced student might prepare a computer-assisted environmental-impact statement instead of a dissertation in order to graduate. In the end the public historians claim to be a robust and growing company of scholars outside the frozen professorial ranks. They say that in their custodial hands the study of how to find and apply pieces of the past is doing quite well.
But perhaps the most visible sign of vitality in history is the popular demand for it. The fires were stoked by the long series of bicentennials that began in 1976 and will run through 1989 (after which the 500th anniversary of Columbus’s first voyage will be sailing toward us). Historical anniversaries like the recent birthday celebration of the Statue of Liberty offer irresistible opportunities to editors, entertainers, and politicians as well as to the historians whom they consult. The reading public is apparently eager to share in lives of the past. Larry Shapiro, of the Book-of-the-Month Club, says that some of the club’s most popular titles over the years have been in history and biography.
Ever since Roots, historical “docudramas” and “mini-series” have flourished on television and competed successfully for top ratings. And both television and movie screens have been lit by hundreds of hours’ worth of award-winning documentary and feature films on such diverse topics as the Brooklyn Bridge, the IWW, and women factory workers during the Second World War.
“History” is not sick. On the other hand, the teaching of history may be.
History teaching is in trouble at the lowest levels partly because it shares in the problems overwhelming a school system that is, for the most part, old, overcrowded, and broke—and not certain of whether it is supposed to be teaching history or good citizenship, social hygiene, and self-awareness. The draft of the eleventh-grade social studies curriculum for New York State, for example, says that on completion of the program, “the student will be a person who can demonstrate the ability to make rational and informed decisions about economic, social and political questions confronting himself or herself, the society and the interdependent world. Such decisions will draw upon the lessons of history and the social sciences.” There isn’t much room for a sense of the living past in that ambitious prescription—much less in the sanitized patriotism that some right-wingers want to push back into the classroom.
History teaching is also in trouble at the highest levels because, over time, academic arrogance allowed many professors to lose touch with their base in popular culture. That is unfortunate, for the academy does teach things worth taking to heart: respect for facts; awareness of complexity and change; caution in judgment; the willingness to submit to the criticism of informed peers. In other times the campus historians had a chance to popularize these values—as well as new ideas and discoveries—by addressing as many people as possible who enjoyed the drama of history. They blew their opportunity by largely disregarding anyone outside their guild. This they did for reasons of fear, snobbery, laziness, vested interests—the usual suspects.
So we have a paradox: The subject matter of history is more popular than ever, but it is taught to schoolchildren only in dilution and is no longer required as part of a college education. At the same time, more than ever, continuity is needed. The bicentennials and other spectacles must be fitted into a context. The historical awareness of a whole people can’t be left entirely to the chance of which books and television shows win the biggest audiences in any one year; the most enjoyable biographies and most readable accounts of a single historical event inevitably deal with isolated points in a landscape of time. But without the map provided by a general knowledge of the past, their connections and overall meaning to us are lost.
A sense of the wholeness of history has to be restored and passed on, whatever that may take in the way of providing more trained teachers and demanding more years of study. It is that sense of wholeness that makes for good judgments, for good ground for a nation to stand upon.
The tide of the culture is running against such an effort. The old faith that history is a record of progress is gone. The eighteenth-century idea, inherited from the Renaissance, that one learns virtue and character from the lives of the ancients is also gone. No one to whom I spoke saw any hope of finding equivalents for the old grand syntheses of the textbooks of my student years, like Morison and Commager’s Growth of the American Republic or John D. Hicks’s The American Nation. These books, in turn, distilled the views of older historians like Frederick Jackson Turner, Charles A. Beard, Albert Bushnell Hart, Edward Channing, and Herbert Baxter Adams. All were men who quarreled endlessly among themselves about what gave motion and direction to American history—was it the frontier? the battle between classes? the English background?—but they were united in believing that it had direction. There was a story there that made sense. The Santa Maria, Jamestown, Bunker Hill, Marbury v. Madison, Jackson’s war on the Second Bank of the United States, the carpetbaggers, the railroad builders, the Big Stick—the good people and the bad, the names and dates and slogans we had to remember were parts that added up to a whole adventure in which everyone shared.
For the last forty years academically trained historians have been picking apart this fabric. They have justifiably criticized its oversights, its self-centered nationalism, its insensitivity. But nothing has been left in its place. There is no mural called “The Story of America” for the children to look at and remember, only the uncoordinated bits and pieces represented by the works of popular, public, and academic historians.
Surely those who are involved in history in any way can do better. If they accept this state of affairs, they lose touch with the unifying sense of a shared past. And then they have abandoned themselves and their fellow citizens to a world with “neither joy, nor love, nor light/Nor certitude, nor peace, nor help for pain.…” It may be intellectually fashionable to talk of living in a posthistoric age, but it is a bit like Noah’s family commenting that they seem to be in for an extended period of wet weather. Unless we can remember how to build an ark, we’re going under.