Paul van Gerven
7 May

A great pioneer of MOSFET technology has passed away.

Conceived in 1966 and patented in 1968, dynamic random access memory (DRAM) was an instant hit. Faster, higher in capacity and cheaper, the new type of memory wiped clunky magnetic technologies off the map in a matter of years. By the mid-1970s, DRAM was the standard. Since then, it has grown into a large family of modules and form factors found in consumer devices, servers and the world’s most advanced AI chips.

The invention of DRAM was the crowning achievement of Robert Dennard (1932), who spent his extraordinarily long professional career at IBM. As a student, he initially showed little interest in electrical engineering, but when he got a chance to enter university on a music scholarship (he played French horn), the young Texan took to the burgeoning fields of circuit design and semiconductor technology. He would bestow a lasting legacy upon both.

Robert Heath Dennard passed away on 23 April, aged 91.

Non-volatile cousin

Dennard joined IBM in 1958 and ventured into MOSFET-based memories in 1964. His colleagues had been working on a memory project for a while, but it didn’t seem the right way forward toward an efficient, high-density memory: it used six transistors to store a single bit and featured rather complex circuitry.


Dennard’s key insight was to store the information as a charge in a capacitor paired with a single transistor. At the cost of having to periodically ‘refresh’ the capacitor’s charge, which leaks away on its own, the principle opened the door to high-capacity memories using relatively simple circuits.
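The refresh requirement follows directly from the physics of the cell: the storage capacitor leaks, so the written voltage decays and must be rewritten before the sense amplifier can no longer read it. A minimal sketch, with purely illustrative numbers (the voltages and the leakage time constant are assumptions, not real process values):

```python
import math

# Toy model of a one-transistor, one-capacitor DRAM cell.
# The stored voltage decays exponentially through leakage,
# so it must be rewritten before it drops below the sense threshold.

V_WRITE = 1.0        # volts written for a logic '1' (assumed)
V_SENSE_MIN = 0.5    # minimum voltage still readable as '1' (assumed)
RC = 0.2             # leakage time constant in seconds (assumed)

def cell_voltage(t, v0=V_WRITE, rc=RC):
    """Stored voltage t seconds after a write, with exponential leakage."""
    return v0 * math.exp(-t / rc)

def max_refresh_interval(v0=V_WRITE, v_min=V_SENSE_MIN, rc=RC):
    """Longest safe time between refreshes: solve v0 * exp(-t/rc) = v_min."""
    return rc * math.log(v0 / v_min)

print(f"refresh at least every {max_refresh_interval() * 1000:.0f} ms")
```

Real DRAM refresh intervals are on the order of tens of milliseconds; the point of the sketch is only that the interval is set by how fast charge leaks relative to what the sense circuitry can still detect.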

Importantly, DRAM scaled well. “Over the next five-and-a-half decades, DRAM would, generation by generation, evolve … it became the foundational technology for an industry that has reshaped human society – from the way we work to the way we entertain ourselves and even to the way we fight wars,” IBM wrote in a 2019 tribute to Dennard.

Even today, DRAM densities continue to increase, although it’s expected that the technology will go 3D by the end of the decade, just like its non-volatile cousin NAND flash did years ago.

Multicore

DRAM wasn’t the only mark Dennard left on semiconductor technology. He also famously described a mechanism that partly underpins the expectation of chips getting better and cheaper every generation. This is commonly associated with Moore’s Law, but that famous observation was about the number of transistors per chip. Having more of them per chip does indeed yield more powerful chips, but there’s another mechanism at play.

Dennard observed that as transistor dimensions go down, so does power consumption: in the ideal case, a chip’s power density stays constant as it is filled with ever smaller transistors. Scaling transistors, therefore, not only yielded more powerful chips because more of them fit on a piece of silicon, but the reduced-size transistors also enabled chipmakers to raise the clock frequency without having to increase power significantly. Capable of doing more switching per second, chips ran faster every time transistors were made smaller.
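The arithmetic behind this observation can be sketched as follows. In the classic (idealized) scaling rules, shrinking a transistor by a factor k reduces its dimensions, voltage and capacitance by 1/k while the clock frequency rises by k; since switching power is proportional to C·V²·f, power per transistor falls by 1/k², exactly offsetting the k² more transistors per unit area. This is an illustrative sketch of the ideal rules, not a model of any real process:

```python
def dennard_scale(k):
    """Idealized Dennard scaling by factor k (k > 1 shrinks the transistor)."""
    dimensions = 1 / k            # linear dimensions shrink by 1/k
    voltage = 1 / k               # supply voltage scales down with dimensions
    capacitance = 1 / k           # gate capacitance scales down too
    frequency = k                 # smaller transistors switch faster
    # Dynamic switching power per transistor: P ~ C * V^2 * f
    power_per_transistor = capacitance * voltage**2 * frequency   # -> 1/k^2
    transistors_per_area = k**2   # k^2 more transistors fit per unit area
    # More transistors, each using proportionally less power:
    power_density = transistors_per_area * power_per_transistor   # -> constant
    return {
        "frequency": frequency,
        "power_per_transistor": power_per_transistor,
        "power_density": power_density,
    }

s = dennard_scale(2)  # one "shrink" by a factor of 2
print(s)  # frequency doubles, power per transistor quarters, density unchanged
```

The breakdown of this ideal picture, described in the next paragraph, is precisely that voltage (and with it leakage) stopped scaling with dimensions, so power density no longer stayed constant.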

Dennard scaling, as this powerful mechanism has come to be known, hit a wall around 2005. Leakage current and threshold voltage don’t scale with size, causing thermal issues. As a result, clock frequencies stalled and chip firms started making multicore processors.

Tribute

Dennard received many awards during his lifetime, including the US National Medal of Technology in 1988, the IEEE Medal of Honor in 2009 and the Robert N. Noyce Award in 2019, the latter having previously been awarded to industry giants such as Morris Chang (TSMC) and Martin van den Brink (ASML). A final tribute will be paid to Dennard on 7 June during a memorial service at IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York.

Main picture credit: Semiconductor Industry Association