The 50 Biggest Changes In The Last 50 Years
October 2004 | Volume 55, Issue 5
With American Heritage approaching its fiftieth birthday in December 2004, we asked five leading historians and cultural commentators to each pick 10 leading developments in American life in the last half-century. In this fifth installment, Phil Patton—whose books include Made in USA: The Secret History of the Things That Made America and Bug: The Strange Mutations of the World’s Most Famous Automobile—selects the 10 biggest changes in the realm of innovation and technology. In previous issues we presented our other authorities’ choices of the half-century’s biggest transformations in politics, business, home and the family, and entertainment and culture.
“I can’t imagine how we lived without it.” So we often say about an innovation that has changed our lives. But about the changes that have been most deeply absorbed into the pores of daily routine, we could also often say, “I can’t remember how we lived without it.”
My finger no longer retains the muscle memory of a rotary dial phone. I can no longer remember walking over to a television set to change the channel. When I think of slipping into the back seat of my father’s Oldsmobile, I falsely remember fastening a seat belt. Old television shows are magically remembered in color, and when I recall typing college term papers in the early 1970s, I do so on a click-clacking plastic computer keyboard rather than a massive metal Royal.
Such distortions may be the very definition of what has changed the world most. The year 1954 saw the arrival of the first solar cells, developed at Bell Labs. Boeing was testing a prototype of the 707, the intercontinental jet airliner that would so change patterns of travel and consumption. Elvis was cutting his first records. And computers were just starting to be connected by telephone lines in the creation of the Cold War SAGE air defense system. The broader implications of that development were hardly imagined.
The impact of some innovations, such as jet planes, has been striking in its predictability. But small innovations have wrought surprisingly large and unexpected changes in daily life too. Here are enough innovations, large and small, to count on all 10 of what used to be called digits—your fingers.
It was all there in Arthur C. Clarke’s famous article “Extra-Terrestrial Relays” in Wireless World magazine in October 1945. Inspired by the discovery of German V2 rockets, which he believed could serve as boosters, Clarke proposed launching earth satellites into geosynchronous orbit to handle radio, telephone, and television communications. By 1962 Telstar was beaming TV images between Europe and the United States.
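Clarke’s proposed orbit falls directly out of Kepler’s third law: a satellite whose period matches one sidereal day hangs over a fixed spot on the equator. A quick sketch (using standard textbook values for Earth’s gravitational parameter and radius, not figures from the article) recovers the familiar altitude of roughly 36,000 kilometers:

```python
import math

# Kepler's third law for a circular orbit: r^3 = GM * T^2 / (4 * pi^2).
# A geosynchronous satellite's period T equals one sidereal day.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86164.1              # sidereal day, seconds
EARTH_RADIUS_KM = 6378.1 # equatorial radius

r_m = (GM * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = r_m / 1000 - EARTH_RADIUS_KM
print(f"{altitude_km:,.0f} km")  # about 35,786 km above the equator
```

Three satellites spaced around that ring can see essentially the whole inhabited Earth, which is why Clarke needed so few relays.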
Clarke understood that building ground networks no longer made economic sense, a truth realized as countries all over the Third World leapfrogged straight to wireless phones and satellite TV. The echoes of that article are still resonating in such events as Rupert Murdoch’s installation as the TV baron of China. Satellite phones remain challenged by cost and power demands, but their potential impact was illustrated a few years ago by the poignant final moments of a trapped Mount Everest climber phoning his wife with his last words and more recently by the pixelated pictures from the Iraqi war front generated by satellite phones.
In the western North Carolina valley where my ancestors lived for a century and a half, television reception was long limited by the mountains, and the population was too poor and too sparse to justify investment by cable companies. My cousins and neighbors could see only two fuzzy channels before the arrival of the TV satellite dish. But then this area of Appalachia quickly came to have a remarkably high number of the dishes. Now the mountaineers can keep up with gossip about Hollywood stars as easily as with that about their cousins in the valley.
We’ve all heard by now of Moore’s Law, the dictum laid down by the Intel cofounder Gordon Moore in 1965 that holds that the number of transistors on a silicon chip, and therefore its capacity, will double at regular intervals, rising exponentially. The Intel 8088 processor in the first IBM PC had 29,000 transistors. Today’s Pentium 4 has up to 178 million.
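Those two data points track the law remarkably well. A back-of-the-envelope sketch (assuming the 8088’s 1979 debut and the commonly quoted doubling period of two years, neither of which the article states) lands within sight of the Pentium 4’s count:

```python
def moores_law(transistors_start, year_start, year_end, doubling_years=2.0):
    """Project a transistor count forward under exponential doubling."""
    periods = (year_end - year_start) / doubling_years
    return transistors_start * 2 ** periods

# The Intel 8088 (introduced 1979) had about 29,000 transistors.
projected = moores_law(29_000, 1979, 2004)
print(f"{projected:,.0f}")  # about 168 million -- the same order of
                            # magnitude as the Pentium 4's 178 million
```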
The importance of Moore’s Law, however, lies not just in what chips have done better and better—like running automobile engines more efficiently, regulating the browning of toast, and printing professional-looking flyers for the high school dance—but also in the pace at which their power has advanced, as relentlessly as did the frontier in the nineteenth century. Because of this, marketing and sales staffs have been able to set up a steady pattern of declining prices and new fashions in technology. “Adoption curves” have shot upward on the chart of time. Today’s cutting-edge device for the “early adopter” is tomorrow’s, or even today’s, strip-mall commodity.
Technical advances just over the horizon are like the empty lands of the nineteenth century. Exploitation of the manifest destiny of silicon has reinforced all the patterns of the Old West: speculation, competition, shootouts, and boomtowns and ghost towns.
For those of us who grew up on the promise of the laser as a powerful ray gun, slicing up steel plate and boring holes through stone, the unexpected turn has been instead the spread of the low-power, low-cost laser.
It comes as no surprise that Boeing wants to mount anti-missile lasers on jets, but it’s astonishing that the soldier in the field can pick out targets with his red laser pointer—and the regional sales manager can target data on his PowerPoint presentation with a pocket-size version of the same thing. We might have guessed that lasers would reshape the corneas of the myopic, but who would have anticipated the laser in a $30 device at the local Wal-Mart playing music or movies from discs?
At Seaside, the planned town in the Florida Panhandle built in the 1980s to elaborate the ideas of the New Urbanism, the architecture melds old Charleston galleries with bungalows and farmhouses in an American village so archetypical it was used as the backdrop for the film The Truman Show. Picket fences are required by town ordinance. But look behind the fence of the majority of houses in Seaside, and you’ll encounter the jarring sight of a mechanical minitower—a heat pump.
The heat pump changed Everytown, U.S.A., and helped create what we began in the early 1970s to call the Sunbelt. The device was developed just after World War II by Professor Carl Nielsen of Ohio State University and an engineer named J. Donald Kroeker, whose engineering firm installed the first commercial unit in the Equitable Building in Portland, Oregon, in 1948. Heat pumps were soon to be found in motels across America.
Basically air conditioners that can be reversed to provide low-demand heating systems, they made life tolerable in the Sunbelt, and at low cost. The heat pump removed the need for radiators or vented-air heat in much of the southern half of the country while supplanting the window-installed air-conditioning unit. It has flourished everywhere cooling is more important than heating and has supported our national dependence on low energy prices to make life sustainable in our fastest-growing areas.
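The low cost comes from a thermodynamic bargain: a heat pump moves heat rather than generating it, so it can deliver more heat energy than the electrical work it consumes. A sketch of the idealized Carnot limit makes the point (the temperatures below are illustrative, not from the text; real units fall far short of the ideal but still beat resistive heating several times over):

```python
def carnot_cop_heating(t_indoor_c: float, t_outdoor_c: float) -> float:
    """Ideal (Carnot) coefficient of performance for heating:
    heat delivered per unit of electrical work, computed from
    absolute temperatures of the warm and cold reservoirs."""
    t_hot = t_indoor_c + 273.15
    t_cold = t_outdoor_c + 273.15
    return t_hot / (t_hot - t_cold)

# A mild Sunbelt winter day: 20 C indoors, 5 C outdoors.
print(round(carnot_cop_heating(20, 5), 1))  # -> 19.5 ideal; practical
# units manage roughly 3-4, versus exactly 1.0 for resistive heating
```

The milder the climate, the smaller the temperature gap and the better the performance, which is one reason the device flourished in the Sunbelt rather than in Minnesota.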
The mechanical cotton picker killed Broadway, believes Jimmy Breslin. By driving poor blacks off the fields of the South to “Trailways and Greyhound bus depots for the long ride to New York City,” he argues, it sent blacks moving “into the tenements that were vacated by whites,” who themselves moved to the suburbs and abandoned Times Square. “Broadway would no longer be the place of guys and dolls.”
The migration of African-Americans north and west out of the South is the greatest in American history, larger than that from the Dust Bowl to California. Cotton-picking machinery, pioneered in the 1930s by the brothers John and Mack Rust, was mature by the late 1940s, but not until 1960 was a majority of the cotton crop harvested by machine.
The cotton picker soon became a key focus for historians studying the interaction of social and technological forces. The debate is charted in The Second Great Emancipation: The Mechanical Cotton Picker, Black Migration, and How They Shaped the Modern South, by Donald Holley. Did the migration of workers out of the South trigger the adoption of the picker and push the maturation of its technology? Or did the machine displace the workers? Did the appeal of greater freedom and prosperity in the rest of the country pull people off the land and into cities? Or did the disappearance of an agricultural society create a classic displaced proletariat?
What is not in doubt are the consequences: The growth of frequently depressed inner-city neighborhoods and expanding suburban ones, and the transformation of the blues, in its new homes in Chicago and elsewhere, into rock ’n’ roll and hip-hop.
Scanning your own groceries and avoiding the gum-chewing gossiping checkout girl may be worth it for you, but it’s even more worth it for the supermarket, with its just-in-time inventory. Much of America’s recent productivity growth has been built on new sets of standards and means of marking products. The bar code is the most visible example of this.
The Universal Product Code was the first bar-code symbology widely adopted, endorsed by the grocery industry in 1973. Product coding allows for quick price changes and has abetted the growth of the big-box discount store. Items can be tracked from port to rail to loading dock to shelf, thanks to containerized shipping that uses the codes. The consequence is lowered living costs.
Bar codes are just one of many industry standardizations that have lowered costs and changed life. The American home has doubled in average square footage thanks in large part to standardized building materials (4-by-8-foot gypsum board and plywood, 2-by-4 studs 16 inches apart). Electronics is built on standards such as Windows compatibility, VHS, DVD, and so on. Coded product standards even rule the food in our kitchens. A banana that was once just a Chiquita is now a #4011.
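The UPC even checks itself: the twelfth digit of a UPC-A code is computed from the first eleven, so a misread at the register is caught instantly. The standard check-digit rule can be sketched as follows (the sample digits are a commonly cited textbook example, not a code from the article):

```python
def upc_check_digit(first_eleven: str) -> int:
    """Compute the 12th (check) digit of a UPC-A code.

    Digits in odd positions (1st, 3rd, ...) are weighted 3 and
    even positions weighted 1; the check digit raises the
    weighted sum to the next multiple of 10.
    """
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(first_eleven))
    return (10 - total % 10) % 10

print(upc_check_digit("03600029145"))  # -> 2, completing 036000291452
```

A scanner that reads a code whose check digit doesn’t match simply rejects the scan, which is why the clerk waves the package past the glass a second time rather than keying in a wrong price.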
Can you recall a car without a seat belt? The movement to put seat belts in the car began in 1954, when the American Medical Association first recommended them. Ford and Chrysler began to offer them as options a year later. By 1965 they were standard.
The push by safety advocates to require seat belts helped establish the adversarial relationship between government and the automobile industry, which was accelerated by the Clean Air Act of 1970. Detroit grumbled, but the engineering achievement involved in developing the catalytic converter and the air bag, both of which Detroit argued were impractical, suggested that under pressure industry could do far more than it thought. For historians, the story indicated how effective “force-fed” technology, demanded by government, could be. For philosophers, it challenged John Stuart Mill’s classic liberal precept that government should not protect the individual from himself. Harley-riding libertarians, agreeing with Mill, have forced a rollback of mandatory helmet laws in some states. Will belt laws be unbuckled next?
Today’s children watch television in a wholly different way from those of the 1950s. The remote control makes television an environment to be moved through, not a schedule of successive programs. The result is grab-’em-quick programming and short attention spans. Once families clustered together to watch Ed Sullivan. Now a program waited for and seen straight through is the exception rather than the rule.
While scientists at the remote Naval Ordnance Test Station at China Lake were developing infrared heat-seeking guidance for the Sidewinder air-to-air missile in the early 1950s, TV designers were struggling to find a way to change channels from a distance. The first remote control, still wired to the set, bore the apt name Lazy Bones. In 1955 a Zenith engineer named Eugene Polley did away with the wire; his Flash-Matic used light, but it didn’t work very well, so it was replaced by the Space Command, which relied on ultrasound—frequencies beyond the range of the human ear. The sounds were generated mechanically in a system that was part chime, part tuning fork, because batteries were inadequate to power a wireless electric ultrasound system.
Not until the 1980s did cheap and dependable infrared technology take over. Today 99 percent of all TV sets come with remote controls, and restless fingers seek hot news and hot new stars unceasingly.
We forget how much bigger and slower our portable devices used to be. Remote controls and mobile phones and Game Boys have become possible only with improvements in batteries. Hefty boom boxes are loaded with ranks of chunky C cells, but hearing aids, watches, and automobile key fobs contain tiny button batteries that often outlast the devices they power. The change began with the introduction of alkaline and nickel-cadmium cells in the 1960s. Later decades saw nickel-metal hydride and then lithium chemistries produce order-of-magnitude extensions in battery life. But there have been tradeoffs. Most of the substances that make the best batteries are environmental hazards. Nickel, mercury, cadmium, and other heavy metals tossed into landfills and incinerators are among the most dangerous sources of pollutants. And while cell phones can remain on standby for weeks, running a laptop for a whole airline flight across the United States remains a challenge. The hope? That in the future miniature fuel cells will replace batteries altogether.
In 1954 the first TV dinner arrived. It was a turkey-and-dressing meal packaged in a segmented foil tray in a box printed up to look like a television screen. Frozen industrialized dinners heated in the home kitchen looked like the culinary future. But in 1955 Ray Kroc began the national franchising of McDonald’s and signaled a different pattern, the industrialization of the restaurant kitchen, with machinery and methods allowing the use of untrained labor. More and more meals would be eaten outside the home as standardized chains spread.
Kroc’s kitchen engineer, James Schindler, first broke down the burger production system, the way Henry Ford had broken down auto manufacturing. Then he refined it, the way Toyota had with its just-in-time automaking. Nothing better exemplified the system than the engineer Ralph Weimer’s fry scoop, a metal device that, when slipped onto a waxed bag, measured out an order of fries with a single unskilled swipe.
McDonald’s success has turned less on burgers than on fries, and the fries in turn have depended on a whole supporting infrastructure. As critical to McDonald’s as Ray Kroc himself was the spud king J. R. Simplot, who produced Idaho russets with just the right water and sugar content for proper caramelizing in cooking fat with just a touch of beef lard added. And the potatoes created by a vast growing, freezing, and transportation network end up in the hands of the worker wielding the scoop.
The scoop is an apt symbol of the power of the franchise itself, the business-in-a-box approach that has sprinkled monad-like restaurants and clothing stores across America and the world in the last half-century. What McDonald’s pioneered has been carried out by Starbucks and the Gap and other chains. The colored signal lights that regulate restaurant machinery, the step-by-step photos on training charts in fast-food kitchens, and the just-in-time shelf arrangements at Gap stores—all are exact counterparts of elements in modern automobile factories.
In the franchise nothing is left to chance—or to sheer stupidity. Not long ago, after happily munching our Roy Rogers burgers, we smoothed out the wrapper to discover a small circle printed on its interior. Inside the circle were printed the words PLACE SANDWICH HERE.