1990s Intel

June 2024


During the 1990s the Internet became the fourth major communications medium, alongside the telephone, radio, and television. By decade's end, some 100 million computers were being purchased annually around the world, and 90 percent of them had microprocessors built by Intel.

Intel got its start in 1968 with an all-star group of founders. They included Robert Noyce, a co-inventor of the integrated circuit; Gordon Moore, the author of “Moore’s Law,” which states that the number of transistors on microchips doubles every 18 months; and Andrew Grove, subsequently the company’s chairman and chief executive.
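The doubling rule attributed to Moore compounds quickly: a doubling every 18 months works out to roughly a hundredfold increase per decade. A minimal sketch of that arithmetic (the function name and parameters are illustrative, not from the original):

```python
def growth_factor(years, doubling_period=1.5):
    """Return the multiplicative growth over `years`, assuming
    transistor counts double every `doubling_period` years
    (18 months, as stated in the text)."""
    return 2 ** (years / doubling_period)

# Over one decade, counts grow by about a factor of 100.
print(growth_factor(10))  # roughly 101.6
```

The exponential form makes clear why even a modest change in the assumed doubling period (18 months versus two years) leads to very different projections over a decade or more.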

Intel scored a major coup in 1971 by introducing the first microprocessor on a chip. This device, the 4004, was a 4-bit chip; it quickly gave way to more capable 8-bit ones that became the basis for some of the first personal computers.

Intel faced strong competition, but the company prevailed by introducing the 16-bit 8086 and, soon after, a cheaper variant, the 8088, which it sold to IBM in large numbers. That processor powered the IBM PC, which became the industry standard.

In 1993, Intel introduced the first version of the Pentium. Subsequent versions packed more than seven million transistors onto each chip and handled 588 million instructions per second, at a cost of only $500.

In 1997 Time magazine named Andrew Grove its Man of the Year. By making Moore’s Law come true, Intel has provided the engines for the Information Revolution.
