
1947 | Fifty Years Ago

March 2024

The Transistor

On December 23 researchers at Bell Laboratories in Murray Hill, New Jersey, successfully built and tested the world’s first transistor. It was an ugly-looking affair, cobbled together from irregular chunks of metal and polystyrene and gnarled wires held in place with lumps of solder. But it worked. The researchers—John Bardeen, Walter Brattain, and William Shockley—had proven that a sliver of semiconducting germanium, suitably mounted and connected, could amplify electrical currents just as well as a vacuum tube.

Since their invention early in the century, vacuum tubes had brought great changes to American life, making possible radio, television, and many scientific and industrial devices. Yet they had their flaws, particularly when assembled in large numbers. Like their cousin the light bulb, vacuum tubes needed a constant supply of electricity and tended to burn out after a couple of thousand hours. The world’s first general-purpose electronic computer, ENIAC, which had been unveiled in 1946, contained more than seventeen thousand tubes. They ate up enormous amounts of power and required vigilant monitoring and replacement, as well as efficient air conditioning. The transistor, when brought to the point of mass production, promised to be smaller, more rugged, more reliable, and much, much longer-lasting than even the most advanced tubes. It would respond faster to high-frequency signals and would not need time to warm up.

The first major consumer use of transistors came in 1953, in hearing aids. Those early transistors were still no smaller than vacuum tubes, and much more expensive, but their reduced power consumption paid off the extra cost within a few months. The following year saw the earliest transistor radios, and by 1959 fully transistorized computers such as the IBM 7090 were on sale. Meanwhile, researchers were learning to make transistors smaller and cheaper, eventually etching them on thin wafers of silicon. This process led to the integrated circuit, first marketed in 1962, and the microprocessor, introduced in 1971, both of which have long since superseded the individual transistor for most uses.


When Bardeen, Brattain, and Shockley were developing their invention, most people expected the world to be transformed by atomic energy, which had made its dramatic appearance during the recent war. Atomic energy has found a niche in naval propulsion and still generates 17 percent of America’s electricity, but its impact has fallen far short of the atomic age that early enthusiasts predicted. Instead, wartime research in electronics, which produced radar and the proximity fuze, gave us the information age—a genuine revolution that continues to reorder our lives almost every day.
