Paul van Gerven
1 December 2021

Fifty years ago, on 15 November 1971, a prophetic advertisement appeared in Electronic News announcing “a new era of integrated electronics.” The debut of Intel’s 4004 ushered in the revolution of the general-purpose programmable processor and, by extension, the modern computer age.

Gordon Moore and Bob Noyce didn’t found Intel in 1968 to design microprocessors. Instead, their focus was firmly on memories. In October 1970, Intel introduced the 1103 DRAM, which laid the foundation for the company to become the world’s largest memory maker later in the decade. While the startup waited for the memory business to take off, however, it needed all the business it could get.

And so, in 1969, Intel accepted a contract from the Japanese manufacturer Business Computer Corporation (Busicom) to manufacture the chips for its 141-PF calculator. Busicom had come up with a fixed-purpose design requiring twelve ICs, three of which were special-purpose processing units. The Japanese company wanted Intel to design and manufacture these chips.

Busicom 141 PF
Credit: Intel

Intel, however, determined that Busicom’s design was too complex and, crucially, couldn’t be realized in Intel’s standard 16-pin packaging. This prompted Intel engineer Marcian ‘Ted’ Hoff to come up with a more efficient design comprising only four chips. The processing unit in this chipset, the Intel 4004, was destined to become the world’s first monolithic commercial general-purpose microprocessor.

The qualifiers “monolithic” and “commercial” matter here because the term “microprocessor” was already in use by then. It was coined in 1968 by the Massachusetts startup Viatron Computer Systems to describe an 18-chip minicomputer. In addition, commercial products with a microprocessor on board had already been released, but the 4004 was the first processor to be sold as a component on the open market.


Ten times

The 4004 was extremely challenging to realize and there wasn’t much time to get it done. “When I saw the project schedules that were promised to Busicom, my jaw dropped: I had less than six months to design four chips, one of which, the CPU, was at the boundary of what was possible; a chip of that complexity had never been done before. I had nobody working for me to share the workload; Intel had never done random-logic custom chips before,” recalls Federico Faggin, who was hired to do the project in 1970.

The Italian was the right man for the job, however. Before joining Intel in 1970, he’d been working at Fairchild Semiconductor on metal-oxide-semiconductor (MOS) technology, which some believed could prove superior to the incumbent bipolar transistor technology. There, Faggin had focused on what proved to be an essential ingredient for making MOS-based microprocessors competitive.

Early MOS transistors were plagued by parasitic capacitance caused by excessive overlap between the gate and the source and drain regions. This could be avoided by forming the gate first and using it to define the source and drain regions, so that the three were always perfectly aligned. Unfortunately, the aluminum used for the gate electrode wasn’t compatible with that process, since the metal can’t withstand the high temperatures required to form the source and drain junctions.

Building on research at Bell Labs, Faggin managed to replace aluminum with polysilicon, opening the door to self-aligned gates. Supplemented by two additional innovations, chips featuring silicon-gate technology (SGT) proved far superior to their aluminum-gate predecessors. SGT allowed about twice as many random-logic transistors to be integrated on a die of the same size and achieved five to ten times the speed of the incumbent technology at the same power dissipation.

Easy to fix

Soon after Faggin’s SGT breakthrough, Gordon Moore and Bob Noyce left Fairchild and started Intel. A year later, in 1969, their company launched an SGT MOS product, a 256-bit RAM. Frustrated by the slow adoption of SGT at Fairchild, which had been crippled by the departure of Moore, Noyce and others, Faggin decided to jump ship as well. Looking to make the most of the technology he had made fab-ready, he applied for a job at Intel in April 1970.

“I was young and eager to prove myself in my newly chosen field. I understood computers, I could design both logic and circuits and I had a lot of experience in developing MOS processes and MOS ICs – a very rare combination indeed, even in those days – therefore, I felt that if I couldn’t do it, nobody could,” Faggin wrote.

Intel 4004
Credit: Intel

Faggin, with assistance from a Japanese Busicom engineer, worked frantically on all four designs concurrently, clocking 70-80-hour work weeks. “Federico worked at just a furious pace, I mean, he really, really went all out,” Ted Hoff said in an interview. The first silicon was ready by late 1970. The 4001 ROM chip came first, followed by the 4003 shift register and then the 4002 RAM; all of them worked fine. But there was something wrong with the most complex chip, the 4004 CPU, which arrived a few days before New Year’s Eve.

In an empty lab – most people had already gone home – Faggin probed one chip after another, only to find them lifeless. It wasn’t just a bad wafer, either: the young engineer couldn’t find a single working chip on any of the wafers. A glance through the microscope quickly revealed why: the manufacturing technicians had omitted an entire mask layer. The memory still makes Faggin smile.

Three weeks later, Faggin got a fresh batch of wafers. In another late-night session, he checked a large number of chips and, to his great delight, found only a few minor and easy-to-fix problems. The microprocessor, as we understand it today, had been born. Made in a 10 μm process, the 12 mm² pMOS die harbored 2,300 transistors and could execute about 92,000 instructions per second. That kind of performance was sure to turn heads in those days.
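For a rough sense of where that figure comes from: it is consistent with the 4004’s commonly cited maximum clock of 740 kHz and an instruction cycle of eight clock periods. The back-of-the-envelope sketch below (in Python) shows the arithmetic; the clock rate and cycle count are assumptions based on the 4004’s published specifications, not figures given in this article.

# Back-of-the-envelope check of the ~92,000 instructions/second figure.
# Assumed (not stated above): a 740 kHz maximum clock and eight clock
# periods per instruction cycle, per the 4004's published specifications.
CLOCK_HZ = 740_000           # maximum clock frequency
CLOCKS_PER_INSTRUCTION = 8   # one instruction cycle = 8 clock periods

instructions_per_second = CLOCK_HZ / CLOCKS_PER_INSTRUCTION
print(f"{instructions_per_second:,.0f} instructions per second")  # ~92,500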

Wildly successful

Originally, Busicom had exclusive rights to the Intel 4000 family, and nobody at Intel seemed to mind. Even if the customer could be persuaded to relinquish exclusivity, management didn’t think the chipset would do well on the open market. Faggin disagreed and developed a wafer-sort tester to prove that his chips were useful for industrial control applications. He then found out that Busicom was struggling financially and told his bosses, who proceeded to negotiate a release from exclusivity in exchange for a lower price.

On 15 November 1971, Intel then published an advertisement in Electronic News announcing the availability of the “micro-programmable computer on a chip,” which would mark the start of “a new era of integrated electronics.” For once, this wasn’t just another empty marketing superlative.

Intel 4004 advertisement
Credit: Intel

Besides Busicom’s calculator, the 4004 ended up in a wide range of products, including an electronic voting machine, a pinball machine and an arcade bowling simulator. But despite kickstarting the microprocessor revolution, Intel didn’t sell many of them. This was mostly because, only five months after announcing the 4004, Intel started shipping the more powerful 8008, which set off the hobby-computer era. In that sense, the 8008 may be the more fitting ancestor of the computer age.

Once its memory business took off, Intel’s focus shifted away from microprocessors, though it kept designing new ones – including the 8086, released in 1978, which gave rise to the famous x86 architecture that proved to be the company’s lifeline when Japanese manufacturers outcompeted it in the memory business in the late 80s. Faggin, however, wasn’t happy working for a company “making microprocessors in order to sell more memories.” He left Intel in 1974 to start his own dedicated microprocessor company, Zilog. Its first product was the wildly successful Z80, which powered many of the first PCs and remained in production for decades.