
Tech, High and Higher

July 2024

No small part of the new prosperity was generated by new technology. Although World War II was the greatest human disaster of the twentieth century, it was not an entirely unalloyed one. The enormous pressure of total war always accelerates technological development, and what emerges often turns out to have major civilian applications. The development of radar and of very large airframes for bombers made the modern air-travel industry possible years before it would otherwise have grown up.

The jet engine, developed too late to be important in the war, revolutionized air travel a decade later (in the process killing both the ocean liner and the long-distance passenger train). Not only did air travel become one of the driving forces of the postwar American economy, but aircraft construction became a major enterprise and a vital part of America’s exports. American planes, especially those manufactured by the Boeing Corporation, continue to dominate this extremely capital-intensive industry.

The jet also shrank the world by an order of magnitude, as the railroads had done a century earlier. Traveling from New York to Los Angeles had taken three days in the 1930s. By the 1960s it required only five hours. Europe, nearly a week’s journey from the East Coast by ship, was only about seven hours away by plane. Foreign travel, heretofore the privilege of the rich, became commonplace.

Out of the V-2 rocket, developed by Germany as a terror weapon, emerged the modern space industry, which has become nearly as vital a part of the American economy as agriculture or automobiles. Hundreds of satellites carry vast data streams, knitting the country and the world together in ways never possible before, and at a fraction of the price of undersea cables.

The fall in the cost of moving data is vividly illustrated in the number of overseas telephone calls originating in the United States. In 1950 we placed about one million overseas calls. By 1970 the number had risen to 23 million. In 1980 it was 200 million. By 1994 the number was up to 3 billion. This is all the more remarkable when one considers that the first year in which more than half of American households had any telephone at all was 1946.

THE NEW TECHNOLOGY WAS POSSIBLE ONLY BECAUSE OF WORLD WAR II’S GREAT GIFT TO THE FUTURE.

Space has also become a platform from which to measure and monitor the earth, both as a whole and nearly every square inch of it separately. Weather satellites now allow more careful storm tracking and far more accurate long-range predictions than ever before possible. Other satellites keep track of land use, forest fires, pack ice, many forms of traffic, and a thousand other things, including, of course, the activities of potential enemies.

Top-of-the-line automobiles these days come with geo-positioning systems that use signals from satellites to determine the car’s exact location and then give the driver directions to his destination. Farmers now use satellite-derived data to tell them precisely where extra fertilizer is needed as they tend their fields with tractors linked to sensors in space.

And, of course, the technology of the rocket ended this country’s long immunity to foreign attack. As a consequence of this, and because the Soviet Union proved an aggressively hostile power, the United States was forced to spend billions on something it had never needed before, a vast peacetime military establishment. In the 1950s, at the height of the Cold War, 58.7 percent of the federal budget was devoted to military spending, up from just 15.6 percent in 1939.

But none of this extraordinary new technology would have been possible except for World War II’s greatest gift to the future, the computer. The word computer has been in the English language since the middle of the seventeenth century, but before the middle of the twentieth it meant a person who did calculations for a living, compiling such things as actuarial tables for life-insurance companies.

The problem was that human computers (usually women, by the way, who were thought to be steadier and more accurate workers) could not perform calculations fast enough, and of course made mistakes. (The great nineteenth-century mathematician William Shanks calculated the value of pi to 707 decimal places in 1873. Seventy-two years passed before anyone found out that he had made a mistake after digit 527 and the next 180 were therefore wrong.)

The idea of a calculating machine went back as far as Charles Babbage, in early-nineteenth-century England (his machine, a wonder of intricate brass gearing, is on display at the Science Museum in London). But only in World War II did a pressing need, vast government money, and the requisite underlying technology combine to make possible a true electronic computer. The first successful one, called ENIAC, for Electronic Numerical Integrator And Computer, was completed at the University of Pennsylvania by Presper Eckert and John Mauchly in 1946, after three years of effort.

ENIAC was the size of a bus, filling 40 filing cabinets, each nine feet high, with 18,000 vacuum tubes and uncounted miles of wiring. It sucked up enough electricity to power a small town. It was, by modern standards, glacially slow. To program it, its operators had to change its wiring by hand on switchboard-like grids. But it worked (although people had to stand by constantly to replace vacuum tubes as they blew and to remove the occasional errant insect—the origin of the term debugging). Computers rapidly shrank in size, especially after the far smaller—and far more reliable and cheaper—transistor, invented at Bell Labs in 1947, replaced the vacuum tube.

Computers quickly spread to laboratories and military installations that needed the ability to do millions of calculations quickly. But they also found uses in offices that had to handle heavy amounts of data processing. By the 1960s, banks, insurance companies, and large corporations were depending on computers, which replaced hundreds of thousands of mind-numbing clerical jobs.

But computers remained big and mysterious, hidden away in air-conditioned rooms and tended by technicians in white coats. And they remained very expensive. The reason they were so expensive is what is known as the tyranny of numbers. A computer’s power depends not only on the total number of transistors it has but also on the number of connections between them. If there are only 2 transistors, only 1 connection is needed to link them. But 3 transistors need 3 connections to be fully linked, 4 transistors need 6, 5 need 10, 6 need 15, and so on. As long as these connections had to be made by hand, the cost of building more powerful computers increased far faster than did their power, limiting how effective they could be.
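
A quick way to see the pattern: if every one of n transistors had to be linked directly to every other by hand, the number of connections would follow the familiar pairwise count,

\[
C(n) = \binom{n}{2} = \frac{n(n-1)}{2}, \qquad C(2)=1,\quad C(3)=3,\quad C(4)=6,\quad C(5)=10,\quad C(6)=15,\quad C(100)=4{,}950,
\]

so the hand-wiring burden grows roughly with the square of the transistor count, far faster than the machine’s raw capacity.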

The microprocessor changed everything. The integrated circuit, first developed in 1959, allowed many transistors to be wired together simultaneously when they were manufactured. In 1971 Intel marketed the first microprocessor, in effect a small computer on a chip of silicon built from integrated circuits. Although the price of designing a microprocessor and of building the machine tools necessary to produce each design was very high indeed, once that investment was made, microprocessors could be turned out like so many high-tech cookies. The tyranny of numbers was broken, and the computer age began. Intel’s first chip had 2,300 transistors on it. Its newest one for personal computers, the Pentium 4, has 42,000,000.

The computer age developed with astonishing speed, as the cost of computing, thanks to the microprocessor, collapsed. Computing power that would have cost a thousand dollars in the 1950s today costs a penny or two. Microprocessors are now found in nearly everything more complex than a reading lamp. And the personal computer, something nonexistent outside of science fiction 30 years ago, now resides in more than half of all American homes. American schools have one computer for every five children, 25 times as many as they did as recently as 1983.

The most important spinoff of the computer revolution, the Internet, spread explosively in the 1990s and now envelops the globe in a wholly new communications medium of vast and still barely discerned potential. Its influence on the future will be at least as great as that which the most important spinoff of the steam engine, the railroad, had on its own time. Already we are deep into the information age, and it drives the American economy. Manufacturing, the very heart and soul of the economy as late as the 1960s, now accounts for only 14 percent of the gross national product. And while agriculture remains the single largest component of American foreign commerce, information and expertise dominate the country’s exports. Legal and accounting services run rich surpluses; Hollywood dominates the world markets in movies and television the way Southern cotton once dominated the world market in fibers.

The change wrought by the microprocessor can be easily grasped. As recently as the 1960s, if every computer in the world had suddenly shut down, the average person would not have noticed. Today, civilization would collapse. Telephones would not work, cars would not run, household appliances would freeze. Banks, stock markets, and governments would fail to operate. And broadcasters and newspapers would be unable to gather, write, and distribute the news of the calamity.
