Agents of Change

March 2024

You’ve probably never heard of them, but these ten people changed your life. Each of them is a big reason why your world today is so different from anyone’s world in 1954.

For want of nails, kingdoms are won and lost. We all know that. The shoe slips, the horse stumbles, the army dissolves in retreat. But who designed the nails? Who hammered the nails? Who invented the nail-making machinery? Who figured out how to market the nails in neat plastic blister packs hung from standardized wire racks in hardware stores?

The house of history, that clever balloon frame of statistics and biographies in which we shelter our sense of tradition, of progress, of values gained and lost, is nailed together with anonymity. Too often we look at history instead as a half-timbered castellated structure, focusing on the carved keystones above the doors bearing the faces of Napoleon or Lincoln, Voltaire or Descartes, Michelangelo or Machiavelli. History tends to neglect the nails, the nuts and bolts of daily life. But in the last few years the captains and kings have, if not departed, been joined in the history books by butchers and bakers and nail makers.

As the lens of contemporary media creates instant history, with more and more focus on the individual and “human interest,” it replaces fame with celebrity. The last forty years have been a time when individuality itself has become more problematic, in which questions of individual rights and responsibilities have come to the fore, when the issue of “conformity” has been eclipsed by doing your own thing and the Me Decade. And history has seemed to gel too quickly. Not for us the reflective distance of Gibbon or Parkman. Kids come home from school announcing they are “doing a unit on the sixties.” One class of these units is decades—the fictions into which we package years and subsume details. The fifties are trapped in a 1957 Chevrolet Bel Air permanently wired to a drive-in theater speaker with Rebel Without a Cause on the screen. The sixties remain lost in the mud and flowers of Woodstock and Vietnam.

Also lost in the fast cutting and the sound bites are those nails, the nuts and bolts of ordinary life. One can look at the making of nails—real nails, not metaphorical ones—as Adam Smith in The Wealth of Nations looked at the making of pins through the division of labor, to understand fundamental changes in an economy. One can see the paradoxes of Thomas Jefferson’s world view summed up in the nail-making operation his slaves ran on Mulberry Row at Monticello, and the impact of Jacob Perkins’s nail-making machines in making possible George Snow’s creation of the American balloon-frame house, and how those houses in turn made possible the model towns of John Nolen, America’s most prolific and least known town planner, and the suburbs William Levitt built, using special machines to make nails on the site.

Time travel, like physical travel, turns on such details. That’s why the Back to the Future films seem inherently more humane than, say, the writings of Edward Bellamy and possess a charm that lifts them above their genre. The wonders we encounter in them are designer underwear, panty hose, and boom boxes.

To look at the nails is to see how history can make unwitting heroes of such people as—to take an extreme example—Hub Reese, a Tennessee mule merchant. In the early 1980s, when the United States sent Stinger anti-aircraft missiles to the Afghan rebels, it found there was no way to move them through the country’s rough mountains. It turned to Reese, who loaded C-5 transports full of Tennessee’s finest muleflesh.

Carried on the backs of Reese’s mules, the Stingers proved so effective that Soviet fighter and helicopter pilots soon refused to fly, crippling the operations of Soviet ground forces. The Soviet army pulled back to the cities and finally out of the country altogether. The result was a flow of bodies back to the U.S.S.R., along with wounded and disaffected veterans, and a lost sense of invulnerability. And these Vietnam-like effects surely were a factor in the end of the U.S.S.R. Which is to say, the Stinger was a nail in the coffin of the Soviet Empire, and Reese was one of the unsung nailers.

Ordinary life is shaped for the most part by ordinary people. Since its beginnings forty years ago, this magazine has tried to highlight the human side of history, as reflected, for instance, in a soldier’s diary or his letters home to his sweetheart as much as in the general staff’s orders. To reaffirm this tradition, we’ve sampled some of the great nail makers of the history of our time—people you’ve likely never heard of but who in one way or another, directly or indirectly, by intention or accident, changed the way you lead your daily life.

These are truly important people who remain unknown, but they are not a representative cross section of anything, and they are not players in great, well-known historical movements like civil rights or feminism. Nor are they any kind of top ten. But they are a core sample from a neglected stratum of the record, a layer of history where quiet people quietly change our lives in ways that affect us through and through, every day.

1 MALCOM MCLEAN

Malcom McLean engineered one of the two or three vital shifts in transportation in the last century. More than GATT or the 747, his development of containerized shipping changed world trade. It has been compared to the transition from sail to steam. It reduced shipping times from the United States to Europe by some four weeks, cutting loading and unloading at the docks from days to hours and enabling a vessel to carry four to five times as much freight as before. Workers no longer dreaded descending into ship holds, and cargoes rode in the far greater security of sealed containers.

Containerization, though, can best be viewed as a kind of technical Northwest Passage or Suez Canal, changing not only economics but geopolitics by connecting land and sea transport more profoundly than canals connect bodies of water. As “intermodality”—linkage between modes of transportation—became a favorite word not only of the freight industry but of executives and politicians who understood that clichés about the global economy could only be made tangible at docksides and on tarmacs, McLean proved himself a kind of Cortés and de Lesseps in one, a mental explorer and engineer.

Born in 1914 in rural Maxton, North Carolina, McLean bought his first truck in 1931. Six years later he found himself cooling his heels while his truck’s contents were loaded onto a ship in Hoboken, New Jersey. It occurred to him, he would later recall, that there must be some way simply to lift the trailer right onto the vessel and save enormous time and labor.

The idea stayed with him over two decades as that single truck multiplied to several, tens, and then hundreds. After he had built McLean Trucking into one of the nation’s largest freight fleets, he had the resources to return to his idea, designing containers and a specially fitted ship. On April 26, 1956, the first of his container ships, the Ideal X, left Port Newark a few miles from the Hoboken pierside where McLean had had his brainstorm.

It would take a full decade of battles against entrenched shipping firms, railroads, and unions before McLean went international, dispatching a container ship to Rotterdam in 1966. Soon the lower shipping costs produced by containerization made it possible for Americans to eat apples from New Zealand, record on Japanese VCRs, wear Hong Kong-produced jeans, and drink French Perrier water. Today, if you use it, eat it, or wear it, it probably reached you via a shipping container.

Containerization changed the landscape as well. You can see the revolution McLean wrought from the air above New York Harbor. To the east the crumbling piers of Brooklyn and Manhattan signal the old way of doing business. In the early 1950s the city had extensively redeveloped Brooklyn’s Furman Street piers, the setting for On the Waterfront. But immediately afterward McLean’s containerized shipping arrived, and the investment in the old break-bulk system was obsolete. On the western side of New York Harbor, by contrast, you see the flourishing ports of Elizabeth and Newark, New Jersey. There great stacks of containers sit straddled by the gantry cranes that pile them. Black tracks of rails and asphalt run toward them like furrows of earth. Any number of older cities have been so altered.

In 1969 McLean sold his containership company, Sea-Land, to R. J. Reynolds, which in turn sold it to CSX. The industry he had established would soon be dominated by huge Asian firms. The ships grew larger and larger—stacked so high with containers they seemed ready to capsize—and the system grew more and more sophisticated.

The cargo that once arrived in boxes, bales, barrels, and bags now comes in blank containers, with no indication to the human eye of their contents, but only a product code that machines can scan and computers trace. That tracking has become so exact that a two-week journey can be timed for arrival within fifteen minutes. That fact has helped make possible another fundamental change in the world economy: global “just-in-time” manufacturing, in which, for instance, automobile engines from Japan arrive at the Toyota plant in Georgetown, Kentucky, still sealed in their containers, less than an hour before they are placed beneath the hoods of cars.

2 LEONARD AND PHIL CHESS

Highway 61, the legendary blacktop road celebrated in blues lyrics, was the highway of freedom from the Mississippi Delta. And its northern terminus, at least spiritually, was Maxwell Street, the jostling marketplace that was Chicago’s equivalent of New York’s Lower East Side. There the sons of sharecroppers mingled with the sons of kulaks. Maxwell Street was the eventual destination of two boys, Leonard and Phil, who arrived at Ellis Island from a village near Pinsk, Poland, on Columbus Day 1928 and traded their last name for Chess.

Graduating from their father’s Chicago junk business, they began running bars and clubs around Maxwell Street. Their prize was the Macomba, a comparatively classy joint where the likes of Billy Eckstine and Ella Fitzgerald would sometimes appear. The talent they saw in their clubs gave the brothers another idea. They bought a record label called Aristocrat and began making records and selling them out of the back of their car. Before long, college kids and other white suburbanites were discovering the music popular on Maxwell Street.

The Chess brothers made records that helped transport African-American culture, especially its language and music, to its central place in American culture, so that decades later whites across the nation would dance to beats from the Delta and use words like funky without thinking of their origins. The Chesses helped turn the blues into rock ’n’ roll and set off a musical revolution. They gave us Muddy Waters, Howlin’ Wolf, Willie Dixon, Elmore James, and Chuck Berry. As the producer Jerry Wexler put it, “they created the single finest archive of the blues in existence.” When that archive found its way to England, the Rolling Stones, Eric Clapton, and others listened to it and then beamed it back to America in transmuted form.

The Chesses influenced jazz: Muddy Waters’s stop time seeped into jazz riffs and the sound track of the 1955 film The Man with the Golden Arm. And they joined in pioneering the sound called doo-wop. In 1951 they released what is arguably the first rock ’n’ roll song, Jackie Brenston’s “Rocket 88.”

They were not graceful personalities. They were never overly sensitive to niceties of contracts and royalties, and they knew that to get their records played on national radio, they needed the services of the legendary mob-connected promoter Morris Levy. They even listed payola as an expense on their tax returns.

Muddy Waters was working as a Chicago venetian-blinds deliveryman when they signed him on in 1948. Waters had been recorded in Mississippi by Alan Lomax for the Library of Congress, but now he was playing a more urban blues. His arrival in the studios the Chesses usually rented began the electrification and urbanization of the blues and the birth of rock ’n’ roll. For the Chess brothers, Waters recorded “Rollin’ Stone” in 1950, the song that gave the culture a phrase and a rock group and a magazine their names. When Waters appeared at the Newport Jazz Festival on July 4, 1960, the crossover of the blues to white audiences was symbolically complete.

The Chess brothers were widely viewed in the music business as superstitious, crude, and crass. Leonard Chess was famous for answering the telephone and addressing even good friends with a genial obscenity. He claimed to care not for the blues but only for the green they produced. But he knew what he liked. At one recording session he shoved a drummer aside and picked up the sticks himself.

The Chess brothers’ story is one in which greed and inspiration swirled together in a characteristically American pot where the ingredients did not so much melt as alloy in a metallurgical sense: steel guitar, electricity, and vinyl transmuted into a wholly new cultural substance. By the early 1960s the major record labels were discovering the Chess brothers’ market, and the era of the small operation was fading. After Leonard died, in 1969, the company was sold. But the recordings keep coming back to us, imported from Japan and Europe or reissued here by MCA. And the music the Chess brothers pursued for commerce has become utterly commercial itself. Today Muddy Waters’s songs show up in commercials for Timberland boots, Diet Coke, and the Gap.

3 LUTHER TERRY

Before 1964, when Luther Terry issued his famous report on smoking, few people knew that there was such a thing as the United States Surgeon General, and fewer still knew his name. Today everyone knows of the Surgeon General, but who can name the one behind the Surgeon General’s report?

That report pushed government’s involvement in policing the health of its citizens to a new level, and it did so by attacking America’s first cash crop. The American economy was literally founded on the hogsheads of tobacco that departed Jamestown. And the plant gave the South its second birth after the Civil War, thanks to the cigarette and the cigarette-rolling machines that James B. Duke put into operation.

By 1964 smoking had long been an essential feature of American life, coloring Hollywood’s vision of us, sustaining us through the Depression and World War II (“if you’ve got ’em, smoke ’em . . . Lucky Strike Green has gone to war”). To be sure, the tobacco companies, seeing concern growing, had pushed filters, but they did so by brilliantly grafting tobacco to essential American tropes through such images as the Marlboro man.

For the Marlboro man or the Camel to draw the fire of the Surgeon General seemed at first the height of absurdity. His office was one of those odd nooks and corners of government that had quietly grown huge—to almost twenty thousand people by the time Terry became Surgeon General. Established in 1798 as the United States Marine Hospital Service, it had become the Public Health Service in 1912 and had been made part of the Department of Health, Education and Welfare in 1953.

As its head the Surgeon General wore what seemed like a Salvation Army officer’s uniform. Terry was no costume general, however; his research specialty had been hypertension, which was clearly linked to smoking. He had joined the Public Health Service in 1941, when he was in his early thirties; he finished his career, which lasted until 1982, as a professor of medicine at the University of Pennsylvania. After Terry, the Surgeon General became a character on the political stage, and such successors as C. Everett Koop in the 1980s and Joycelyn Elders in the 1990s frequently took the center of it in debates about issues ranging from abortion to drug legalization.

The response to the 1964 report was slow and tortuous. Only after years of battling did the tobacco industry, under threat of tough legislation, agree to warning labels and a ban on television commercials. The original warning—“Caution—the Surgeon General has determined that cigarette smoking may be hazardous to your health”—bore all the signs of committee-crafted prose. It was stiffened in 1971, and the TV-advertising ban took effect on January 2, 1971, granting the tobacco companies one last day of football bowl games for their ads.

The label and ban struck some as a horrendous government intrusion into private affairs and others as a sorely needed offensive against an addictive drug that each year took more American lives than World War II had claimed.

At first the warning labels appeared only on ads. The 1971 policy put them on the cigarette packs and banned TV advertising. That, in retrospect, was a turning point. It was the beginning of the appearance of tiny warnings on the objects of our lives, little footnotes of caution to the prose of daily affairs. Soon came seat-belt warning chimes and legends on rearview mirrors and graphic propaganda like the nutrition pyramid.

The debate over public health became part of a wider debate between those advocating limits to government power, citing John Stuart Mill’s injunction against protecting individuals against themselves, and those arguing that in a modern complex society, where we all breathe the same air, we can’t escape the costs of insurance, health care, law enforcement, and so on, forced on us by the acts of others. It is a debate that reaches to the core of American ideals, one that will likely continue as long as the Republic.

4 FRANCIS TURNER

Frank Turner shaped the creation of the largest public-works project in the history of the world, the network of interstate highways that changed the country subtly as much as the transcontinental railroad did overtly. He was secretary to the Clay Committee, which formulated the interstate system plan, and later he rose to head the Federal Highway Administration.

Turner grew up in the era of the farm-to-market road, when highway builders thought in terms of routes to link the country to the city, and the interstates were born as the farm-to-market road writ large, though two-thirds of interstate money was ultimately spent in the cities. When he was young, in Texas, part of the state poll tax could still be a day’s work on the roads, and he labored alongside his father on those roads. At Texas A&M he studied soil science and the dynamics of asphalt. He worked on key military highways in Alaska during World War II, scouting terrain from a light airplane, and was sent to rebuild the highway system of the Philippines after the war.

In 1954, when President Eisenhower appointed Gen. Lucius Clay, architect of the Berlin Airlift, head of a commission charged with formulating an interstate-highway plan, Turner was made its executive secretary, the key staff position. He carefully helped draft the legislation, then spent days in patient testimony explaining it to members of Congress. Turner prided himself on his role in the surveys that broke the country down into grids in which citizens were scientifically polled on their transportation patterns and their desires. He and his staff drew what they called “desire lines” on the national map. Those lines were paved into the interstates.

But Turner’s maps were deceptive. Although most of the interstates’ length was in the heartland, most of the dollars and the most difficult and expensive miles of the system ended up in the cities and in beltways around them. The highways helped shift the balance of economic power to the Sun Belt, where the roads could be built without the interference of old water- and rail-based transportation patterns. Designed in part to empty cities quickly in the event of imminent nuclear attack, they helped empty old row-house neighborhoods of residents heading for the suburbs. Thanks to the interstates, you could see a lot more of the U.S.A. in your Chevrolet and commute farther to work. Detroit’s automakers prospered, and new sectors of the economy grew up around interchanges and along beltways.

Turner shared the vision of the man he worked for. Dwight Eisenhower, too, championed the system in the context of farm-to-city routes. When Ike once found his limousine stalled by interstate construction outside Washington, he was baffled; he thought the interstates were supposed to stay away from the cities. Eisenhower’s vision had been shaped by the muddy roads of his boyhood and a famous cross-country Army truck convoy to which he was assigned after World War I. It took the vehicles weeks to make it from coast to coast over dirt roads.

Turner became head of the Federal Highway Administration in 1969—just as a backlash set in. That year a Boston downtown artery was killed in a battle that set a national pattern. Under fire from Jane Jacobs, Lewis Mumford, William Whyte, and others who saw urban highways destroying old neighborhoods, the interstates found themselves effectively brought to a halt. Having conquered mountain ranges, rivers, and swamps, they were being stopped by human forces. Soon no mayor could support a downtown interstate.

Maintenance funding became hard to pry loose from a Congress that preferred cutting ribbons to patching potholes. The system fell for a while into decline, and funds were diverted to mass transit. The era of the building of the interstate system came to an end.

Turner, who had outlived that era, was both bitter and bewildered. He thought vital opportunities were being lost. To tap into the Highway Trust Fund for mass transit offended him morally. He took the word trust literally, he said. He had turned down under-the-table offers from developers for advance word on the location of highways that could have made him rich. And among the people whose homes were razed to make way for the new highways were a couple near I-45 in Texas, his own parents. He privately derided what he called “rabbit transit.” “If you like waiting for elevators, you’ll love rabbit transit,” he would say.

Today, although his monument is widely criticized as dull to drive and as a homogenizing factor in American life, the interstates’ benefits are so taken for granted as to be beneath the level of consciousness. And there is testimony to their power in the contemporary metaphor for the latest infrastructure dream: the information superhighway.

If a national system of wires could provide such benefits as its advocates say, doesn’t that point to the benefits of the national system of roads? Doesn’t all the talk of the wonders of the Internet point implicitly to the wonders of the interstates? There is a neat parallel between Al Gore’s advocacy of the information superhighway and his father’s role in shaping the interstate legislation, questioning and listening to Frank Turner and others. The Vice President recalls as a child being “in the committee room when the signs were made green” on the system.

5 VICTOR GRUEN

On the ocean liner Leonardo da Vinci, contemplating the Renaissance artist’s drawings of ideas for organizing traffic systems, Victor Gruen was reconsidering. It was the early sixties, and his great innovation, the covered shopping mall, had become a disappointment to him. He had hoped to bring the amenities of European-style public spaces to Americans, but Americans had taken his ideas and used them as weapons against the very sort of urban life he had sought.

Intellectually Gruen might have felt most at home halfway between Europe and the United States. In July of 1938 he had made the crossing westward on the SS Statendam, with eight dollars in his pocket and little more than his T square as his luggage. Before that, in Austria, he had championed innovative public housing and other elements of modern architecture’s high-minded social program. He had studied with Peter Behrens—the godfather of modernism, teacher of Mies van der Rohe and Le Corbusier—in the Vienna where Adolf Loos had declared that ornament was a crime. Gruen had just begun to get good architectural commissions from Vienna department stores when the Anschluss came. He fled to the United States and there created that most American of environments, the mall. It would, he was convinced, help halt suburban sprawl.

The idea was inspired by the markets of medieval Austrian and Swiss towns he had visited on bicycle as a young man and by the stately Galleria Vittorio Emanuele II in Milan (whose name was echoed first in a Houston mall and then in gallerias all over the country). But in lesser hands malls came to take on a form that reflected American efficiency and disdain for frills and showcased the amazing power of American marketing.

When it all began, in Minnesota in the mid-fifties, Gruen was dealing with a more immediate problem: balancing the competing claims of two “anchor” department stores. In shopping centers, one anchor was the rule. Gruen responded with a central courtyard equidistant from both stores. The result, Southdale, opened on October 8, 1956. It had a roof because the local climate offered only 126 days of ideal weather a year. “In Southdale Center,” the first ads read, “every day will be a perfect shopping day!”

Like Frank Lloyd Wright, Gruen envisioned retail centers as civilizing nodes in the open space of America. But to developers the mall had a different appeal: Covering it with one giant roof meant that construction costs for individual stores could be reduced, making the whole complex cheaper to build.

Soon a new generation of developers was building on Gruen’s ideas. Edward DeBartolo took to the air to scout locations for new malls; Melvin Simon and Alfred Taubman built fortunes in malls. James Rouse combined the basic mall with historical themes in old Eastern cities—with Quincy Market in Boston, Harborplace in Baltimore, and South Street Seaport in New York. So many developers hired Gruen himself that soon his firm had offices in six major cities. He was called in to replan whole downtowns, notably in Fort Worth. At the height of his influence, the Shah of Iran hired him to redesign Teheran.

But Gruen, looking on what he had wrought, was not pleased. He believed that malls “represent a great step forward. They are clearly defined urban organisms to fight sprawl.” But he also realized their deficiencies, chief among them the fact that no mall had “treated successfully the appearance of the area immediately surrounding the building core, which appears like an asphalt desert.” He had dreamed of social, cultural, and recreational crystallization points for amorphous, sprawling suburbs. But instead, the rise of the mall had freed suburbanites from the need to visit cities. When offices followed, the process was complete.

By 1964 Gruen was downplaying suburban malls in favor of the downtown pedestrian mall—the cheapest form of “urban redevelopment,” requiring simply the exclusion of traffic from an old Main Street, some paving blocks, and a few planters. But life had already left Main Street for Gruen’s malls.

His formula for the beginning of a richer, more humane environment had become something that could be reduced to the lowest common denominator, so totally planned that men’s shoe stores stood right next to women’s shoe stores, allowing couples to browse separately without getting lost. (So many mall stores seem to sell shoes; where do Americans ever walk enough to wear out so many shoes, except perhaps at malls?) Each store space was standardized into a module, so that if one enterprise failed, it could be jerked out and replaced like a defective computer chip. The stores became the same across the country, and so did the malls, sealed off from their local landscapes and finally coming to resemble huge ocean liners or spaceships—the starship Free Enterprise.

In 1967 Gruen returned somewhat downcast to Vienna, with its orderly urban plan of forts and battlements and ring roads, its cafés and plazas. He died there in 1980.

6 ALEXANDER PONIATOFF

Bing Crosby loved his golf game even more than he loved his radio show. So in 1948 he bought the first tape recorder marketed in America. It was made by Ampex, a company founded by a Russian émigré, Alexander M. Poniatoff, and named with his initials plus “ex for excellence.” Now Bing could hook and slice and three-putt to his heart’s delight while a nation of fans heard his voice, taped, on their radios.

Liberating GIs had brought the tape recorder back from Germany, along with Wernher von Braun and rocket science. It was clear that the next step would be a recorder for video. Crosby put his money into the videotape recorder (VTR) effort. So did Gen. David Sarnoff of RCA, who poured millions of dollars into a huge team of researchers during the fifties.

Poniatoff took a different approach to the problem. He was a former pilot in the czar’s air force who had fought on the White side during the Russian Civil War and later worked as an engineer in China before coming to the United States in 1927 and finding it much more amenable to his ambitions. He believed in hiring good engineers and giving them freedom. “You have to create an atmosphere where engineers do their work with some sort of deep personal interest and enthusiasm,” he said in 1960. His videotape-recorder effort was headed up by two young whizzes, Charles Ginsburg and Ray M. Dolby, a nineteen-year-old prodigy whose name would one day be synonymous with electronics used in virtually every tape recorder.

Their work paid off when they gave their machine its premiere at a national broadcasting industry convention in April 1956. Giant RCA had been beaten out by a small company from the so-called hobby lobby. Ampex’s key innovation had been the idea of recording on a bias, a strategy not unlike slicing carrots on the slant so that more of their surface is exposed for cooking. The method proved far more efficient.

Like photography and film, the videotape recorder subtly changed experience. Within months after Poniatoff showed his prototype machine, a VTR was used to tape-delay a national news program. The demand for Ampex machines exceeded all Poniatoff’s projections, and within a dozen years the value of his company more than doubled, to $220 million.

The machine inaugurated the era of the media event when the famous Nixon-Khrushchev “kitchen debate” took place in July 1959 at the Ampex booth of an American trade exposition in Moscow. It had been carefully set up by aides to the then Vice President, including the journalist William Safire, and was recorded on the VTR.

In the fall of 1963 the instant replay arrived as television highlighted the scrambles of the Navy quarterback Roger Staubach. Within weeks the nation watched Jack Ruby shoot Lee Harvey Oswald over and over again. Poniatoff’s machines, installed in stations all over the country, had arrived just in time to turn the events of the sixties into a repetitious nightmare. A few weeks after the assassination of President Kennedy, Robert Kennedy greeted a visitor to his home with the words “Hi, come in. We’re just watching my brother’s funeral on TV.”

By the time Poniatoff died, in 1980, Ampex was a half-billion-dollar-a-year company. But his engineers had been tube men, unfamiliar and uneasy with the world of transistors, and Ampex had licensed its tape technology to a rising Japanese company called Sony in return for Sony’s developing a smaller machine. Sony introduced the videocassette, while Ampex, used to selling only to radio and television stations, floundered in the new markets it opened. By 1972 Sony and other Japanese firms had already put Ampex to rout, just as the home market was about to open up. By the early eighties the Japanese home video-cassette recorder stood as a symbol of America’s economic decline.

But Poniatoff’s creation had helped change our sense of experience and especially the public experience of history by making it repeatable and editable, completing changes begun by phonograph, film, and audiotape. To see films at home, reduced in size but available on demand, both demystified them and made them objects of almost universal study. Not long ago Gene Kelly expressed amazement at the interest VCRs had created in the movie musicals he had starred in years earlier; they had been designed to survive for only a season or two at most. Old films were becoming more and more part of a common heritage, a language shared by a society that had lost many other mutual frames of reference.

Few Americans have noticed that the videotape business Poniatoff’s machine made possible has become a $15 billion industry. That’s more than twice the box-office receipts of the Hollywood feature-film business, the original American dream factory that now exists in part to feed the home machine.

7 JOHN FLYNN

He was called the Muhammad Ali of Arizona for his combative courtroom style but was celebrated only by other local attorneys, who often showed up in court to watch him in action. Then a public defender and the American Civil Liberties Union brought him in to argue the case of Miranda v. Arizona. The half-hour he spent before the U.S. Supreme Court one spring afternoon in 1966 helped change forever the debate over law and order.

The man on whose behalf he nominally spoke was languishing in a Spanish-style state prison in Florence, Arizona. Ernesto Miranda had confessed to rape and kidnapping and had even signed a statement that said he was aware of his legal rights. But since he had not specifically been informed of his right to counsel, the Supreme Court ruled, he had been deprived of due process. The wider consequences of the decision, which actually dealt with four cases raising related issues of the rights of the accused, were as much symbolic as practical. No one could show that police procedure really changed very much, but the decision, the Miranda warning, and the Miranda card became a focus for the debate over law and order.

There was a certain irony in Flynn’s involvement. A decorated Marine, he had gone AWOL in the Pacific during World War II—to return to the front. He had served as a prosecutor as well as a defender in Arizona, and when called into the Miranda case, he had had no ambition to change the nature of American law. But Miranda’s name became part of the language, as did the very phrase “You have the right to remain silent.” The card’s language was taken almost verbatim from the Court decision written by Chief Justice Earl Warren, whose references ranged as far afield as the Code of Hammurabi and the English Star Chamber.

The decision arose in the context of police third-degree practices that had long been a target of reform. You have only to read of the interrogations undergone by pre-Miranda fictional detectives—Philip Marlowe, say, in The Long Goodbye—to get an idea of the excesses of police practice at the time. But documenting the extent of those excesses, or the extent of change after Miranda, has always been hard.

The reading of a Miranda card has served nightly on American movie and TV screens to dramatize the contradictions at the heart of our system. The chase is followed by the collar, the suspect spread-eagled against car or alley wall, and then, all of a sudden, the grudging warning: “You have the right, you miserable —, to remain silent. Anything you say can and will be used against you. ...” The warning provides an absurdist caesura, a parenthesis in the action that sums up the paradoxes and, to many, the absurdities of the American system. It emerged from a whirlpool spun by tides both of desire for law and order and of pressure to end the days of back-room police tactics.

To the “silent majority” of the Nixon era, Miranda became a symbol not of law but of lawlessness and the “coddling” of criminals. The search for villains latched on to the Warren Court and everyone involved in the decision—and some who were not—from the ACLU to President Johnson’s Attorney General, Ramsey Clark. Sen. John McClellan rose in the Senate to declare the decision something close to the end of civilization, and Nixon’s Attorney General, John Mitchell, worked to create test cases to challenge the ruling.

But some law-enforcement leaders believed Miranda actually made their job easier, by standardizing methods of informing suspects of their rights and therefore validating confessions. Others argued that it kept the police honest, that the old ways had brutalized the police and made them lazy about developing evidence independent of confessions.

Miranda himself had a retrial and a prison term and then returned to society as an appliance deliveryman. He was playing cards at a Phoenix bar called La Amapola on January 31, 1976, when a fight broke out and within minutes he was stabbed to death. His attackers were immediately apprehended and their rights read to them in both English and Spanish from the Miranda card issued to all Phoenix police officers.

Flynn became a judge on the Arizona Court of Appeals, a widely respected figure in the state but never again a national figure and not a wealthy man. After he died, on a ski vacation in 1983, he was found to be in substantial debt. One of the few signs of his brief brush with greatness hung on the wall of his office: a ceremonial quill pen purloined from the offices of the Supreme Court, a misdemeanor for which he was never charged.

8 JACK EWALT

Only in the late 1970s did we begin to call them the homeless. There had long been American drifters, such as the railroad “bummlers” of the 1890s and the Okies and Texies of the Dust Bowl, but in the seventies the old images of Bowery bum and wino were replaced with those of an underclass of the permanently and structurally lost, a group pitied and feared, sleeping on heating grates, living in cardboard boxes.

The causes of homelessness have been much debated, and even the numbers are unclear, but the social and spiritual dimension of the problem is not. What has been clear is that whatever role drug and alcohol abuse and family disintegration play, mental illness is a key component. One of the few agreed-upon statistics about the homeless is that 30 to 40 percent of them are people who have been in or need treatment for mental illness.

The presence on the streets of so many of the mentally ill can be traced back to the good intentions of the Joint Commission on Mental Illness and Health and its chairman, Jack Ewalt, in the early 1960s. The commission’s report, five years in the making, set the Kennedy administration’s mental-health-policy priorities. John F. Kennedy had been one of the sponsors of the legislation that established the commission in 1955, and he was President when its report was delivered.

Ewalt, the mental-health commissioner of Massachusetts, directed the commission to a series of conclusions that would, in time, lead to the national policy its advocates called deinstitutionalization and its critics called dumping. Ewalt’s story is one of goodwill and high ideals gone awry. It begins in the late forties, when the public was encouraged to think of mental hospitals as medieval places of shackles and shock treatments. Books such as Albert Deutsch’s The Shame of the States and Mary Jane Ward’s novel The Snake Pit strengthened this public image. A new generation of psychological theorists, from R. D. Laing to Erving Goffman, questioned the concept of “madness,” while state administrators despaired at their rising mental-health expenses.

Ewalt’s report was delivered to President Kennedy in March 1961. “It is recommended that all existing state [mental] hospitals of more than 1,000 beds be gradually and progressively converted into centers for the long-term, combined care of persons with chronic diseases, including mental illness,” the report urged. The commission also recommended replacing old-style hospitals with community health centers. “With present knowledge put to use, the nation could more than double the number of chronically ill mental patients returned to the community.”

Kennedy declared support for the plan. “When [it is] carried out,” he proclaimed, “reliance on the cold mercy of custodial isolation will be supplanted by the open warmth of community concern and capability.” He went so far as to argue that two-thirds of all schizophrenics could be treated and cured within six months.

Such optimism was nourished by theories expounded in several influential books about mental illness. The same year Ewalt’s report was delivered to President Kennedy, Erving Goffman’s Asylums and Thomas Szasz’s The Myth of Mental Illness were published, and a series of legal cases established the civil rights of the mentally ill and proscribed holding mental patients against their will.

In October 1963 Congress implemented the plan, in the Community Mental Health Centers Act of 1963. It was the last law signed by President Kennedy, and it began the reversal of national health policy—from “stockpiling” patients to what would be known as dumping.

It can be argued that the plan was never given a chance. The report called for some two thousand community mental-health centers, but fewer than seven hundred were ever built. Smaller intensive-care hospitals were not built at all. And though the report recommended tripling spending on the mentally ill, Congress and the states used it to justify cutting spending.

Legislators and mental-health administrators were hoping to reduce, not to increase, their budgets. They could use one new treatment the commission advocated as a means to that end. In 1954 the drug chlorpromazine, with the trade name Thorazine, was introduced. Pushed heavily by its maker, Smith Kline & French, whose salesmen argued that its use would more than pay for itself in reduced staff injuries, damage to furniture, and broken glass from violent patients, it was called by some doctors a “medicinal lobotomy.” But Ewalt supported its use, saying, “Unquestionably, the drugs [in its class] have delivered the greatest blow for patient freedom, in terms of nonrestraint, since Pinel struck off the chains of the lunatics in the Paris asylum 168 years ago.”

To administrators, Thorazine and its ilk offered another sort of hope. Drugs might pacify patients outside the institution while contact with family and community helped heal them, all while money was saved.

The results took time to become evident. The discharges began in such Northern states as Massachusetts and New York and progressed to less prosperous and less urbanized ones. A whole generation of highly motivated professionals worked to establish successful community health centers, but few of them were created, and there was almost no coordination with existing mental-health hospitals, which nonetheless began discharging patients and cutting their admissions.

At first most of the discharged patients, roughly two-thirds of them, went back to their families. But by 1975 the figure had dropped. By then some 28 percent of discharged patients in New York were headed for destinations listed as “places unknown.” Gradually, the anecdotal evidence suggests, families became unable to deal with patients. Many stopped taking their drugs, which tended to have debilitating side effects, and many who did take them wandered city streets with a zombielike gait that became known as the Thorazine shuffle.

The process was abetted by the 1972 broadening of Social Security benefits called Supplemental Security Income, which provided checks to the mentally disabled and gave deinstitutionalization added momentum. In 1955 an estimated 0.47 percent of the population was sleeping in a mental hospital on an average night; by 1990 the figure had fallen to just 0.05 percent.

The larger public-policy consequence, as Sen. Daniel Patrick Moynihan has argued, was a miscasting of a problem: what appeared to be a shortage of housing was in large part a lack of mental-health care. The nation wrestled with its conscience, but with little awareness of the events that had brought on the crisis. And the crisis continues.

9 MARCIAN HOFF

“The engines of personal computers” they are often called, but microprocessors now also run our toasters, preventing golden brown from becoming char black. They control the mixture of gasoline and air in automobile engines. (Souping up a hot rod is no longer a matter of greasy muscle work with wrench and drill but one of finding a nerdish specialist in aftermarket controller chips.) Like the insects their packages resemble, microprocessors have spread around the world on gossamer wings of marketing and innovation.

The man behind these tiny bits of reason etched in sand was Ted Hoff. Hoff worked for Intel, which had led the way from integrated circuits made of separate transistors wired together to printed circuits on single chips of silicon. Intel’s client was a Japanese calculator maker with the awful name Busicom. Asked in 1970 to design a dozen chips to fit inside a hand-held calculator, Hoff and his colleagues instead managed to combine onto one fingernail-size chip not only all the functions of a pocket calculator but the logic unit of a mainframe computer—the equivalent of a three-hundred-thousand-dollar room-size IBM machine of only a decade earlier.

Ironically, the executives of Busicom at first resisted his idea; later they went bankrupt. No laurels descended on Hoff’s head then or later. He continued to labor valiantly for Intel, then went on to work for Atari, one of the first personal-computer companies, before he retired to tinker in his garage, that proverbial gestator of Silicon Valley entrepreneurialism.

Born in 1937, he had grown up in North Chili, New York, near Rochester, and attended Rensselaer Polytechnic Institute. Hired by Intel’s Robert Noyce as the company’s twelfth employee, he had been delegated to figure out new applications for the company’s electronic chips. His microprocessor could do much more than serve as the heart of a pocket calculator, but neither he nor his bosses were sure just what. On November 15, 1971, Intel ran ads in Electronic News announcing the new kind of chip. It invited the world, almost desperately, to figure out what use to make of it. Eventually the world responded.

Intel never even pushed for patent rights, and Hoff never became wealthy from his invention. A self-effacing man, he often said that if he hadn’t invented the microprocessor, someone else would have, because the idea “was in the air.” In 1990 there surfaced a rival claim by a man named Gilbert Hyatt, an inventor no one in Silicon Valley had ever heard of. Hyatt obtained the patent Hoff had never sought.

The debate suggests how the nature of modern technology has changed the role of individual invention and innovation. Intel itself formally credits Hoff, Stan Mazor, Masatoshi Shima, and Federico Faggin as the team behind the microprocessor, but in Intel’s publicity Hoff is clearly the hero.

Like the larger transformation of the planet through computerization, the development of the microprocessor has taken on a sense of inevitability unlike that enjoyed by technology since the nineteenth century, when the progress of the locomotive seemed as natural as the evolution of Darwin’s finch. At Intel they quote Moore’s Law, formulated by one of the company’s founders, that the number of transistors on a chip doubles every two years. The companies behind the chips seem to do the same. The Electronic Industries Association for years handed out a family tree of Silicon Valley firms that sprang from Bell Labs, birthplace of the transistor, and multiplied like an atomic chain reaction.

Faggin put the matter in perspective when he wrote that the microprocessor was one of those “turning points” in technology that occur when “something unstoppable . . . becomes the catalyst for sweeping social and technological changes. We call these inventions, and yet they appear at once to be too big and too obvious to be described as such. Such examples are the car, the airplane, the microprocessor. These are the inventions that do not result from the establishment of any new scientific principles, but rather from the synthesis of existing principles into a new form that extends in both foreseeable and unforeseeable ways what could be done before. . . . Because there is a certain inevitability about inventions of this sort, which have often been anticipated by futurists, the real contribution lies first in making the idea work.”

Today the microprocessor is one high-tech realm where America clearly dominates the world. It has become so vital to the economy, packing so much power and value into so little size and weight, that it is the booty of one of the newest forms of crime. It is like drugs or jewels. Gangs hold up warehouses, steal their Intel microprocessors, and sell them on the black markets of Taiwan and Korea. It is perhaps the second-highest form of flattery.

10 ALAN STILLMAN

It was one of the very few good ideas ever born in conversation across a bar. After a day’s work as a perfume and flavor salesman, Alan Stillman would drop in at his neighborhood bar, the imaginatively named Good Tavern, on First Avenue in Manhattan, a place with a bullet hole in the window. Stillman told the bartender, “You ought to work on luring single people in here with food.” The idea still sounded good the next morning, at least to Stillman. With five thousand dollars of his own money and five thousand borrowed from his mother, he bought the Good Tavern, and on March 15, 1965, he opened T.G.I. Friday’s, the world’s first singles bar.

A new generation of white-brick apartment towers was rising along First Avenue. Stillman, who had grown up in the city, could see the change they represented. Single people, especially single women, were moving into those big buildings. One next door to the Good Tavern was known as the Stew Zoo. “There were a hundred and eighty apartments in that building, but there were four hundred and eighty stewardesses,” Stillman recalls. “No one knew that, but I did because I was part of that culture. We were always saying, ‘Where’s the party tonight?’” It was later estimated that 780,000 single people were living on Manhattan’s East Side between Thirtieth and Ninetieth streets.

Many of these singles had come to New York from small towns searching for what Stillman called “a constant cocktail party,” and he decorated T.G.I. Friday’s to convey a party air, selling inexpensive hamburgers and other food—for these singles had little interest in cooking—along with liquor. Stillman pleads guilty to having created the fern bar at T.G.I. Friday’s; poverty is his excuse. There was little he could do inexpensively to redecorate the old tavern except hang plants and mock-Tiffany lamps.

Within a few months T.G.I. Friday’s was joined by Maxwell’s Plum and Mister Laffs. A moribund old bar called Sullivan’s switched to the singles format and came alive. The two blocks of First Avenue in the Sixties on which they all stood were soon known as “the body exchange,” but from today’s perspective the scene was innocent; most of the exchanging was only of telephone numbers. Newsweek and Life sent reporters who depicted T.G.I. Friday’s as the front line of the sexual revolution. The stigma of being single, especially for women, was being softened, and that of venturing alone to a bar was vanishing.

“You really should credit the man who developed the birth control pill,” Stillman says. The pill was as much a symbol as substance in the 1960s. And the man behind it might be said to be either Carl Djerassi, a biochemist at the Syntex lab in Mexico City who in 1951 made the key chemical discovery (a substance in Mexican yams that simulated the female hormone progesterone), or Gregory Pincus of the Worcester Foundation in Massachusetts, who in 1956 first tested it on fifty women (paving the way for the introduction of the commercial product in 1963).

But the spiritual shift was deeper. Stillman had effectively begun the commercialization of sexual freedom, a course that had been advanced by Helen Gurley Brown and others. Brown arrived as the editor of Cosmopolitan, a magazine that gave her a forum for “frank talk” about sex, in 1965, two years after The Feminine Mystique came out. T.G.I. Friday’s was hardly a protofeminist place, but the unfolding of feminism was unimaginable without the changes that places like Friday’s helped engender.

The revolution led as always to the counterrevolution, beginning with horror stories like that of the 1975 novel Looking for Mr. Goodbar and fueled by outbreaks of venereal diseases, culminating in AIDS. A decade after Stillman opened for business, the worst imaginings of conservatives in the heartland came true at Plato’s Retreat, the Manhattan club that invited absolutely anyone to join the naked gropings on mattresses on the floor.

The changes Stillman had tapped into were unstoppable. Within five years he had opened Tuesday’s and Wednesday’s. He would expand to Nashville and Dallas—“our first night there you couldn’t get into the place”—and several other American cities before selling the T.G.I. name and format to Curt Carlson Industries in 1974. Today there are 294 T.G.I.’s all over the world; one of the latest opened recently in the former East Berlin. That very first one, on First Avenue, closed in the fall of 1994; business had tapered off, the owners said. Stillman has gone on to other worlds in the restaurant business and is president of the New York Restaurant Group.

The full list of those who, working in anonymity, have changed our lives is of course much longer than the above. And of course it can never be known well enough to be drawn up. But contemplating it can be uplifting. Obscurity itself has value in a culture of overexposure, and the freshness of the innovator’s first steps is never to be recovered once historians begin to bronze his achievements like a pair of baby shoes. Looking at what ordinary folk have done makes us look afresh at every stranger and imagine possibilities. And looking at the history of everyday life can help us remember that every day is history, and perhaps even live it a little more intensely.
