Bram Nauta is a professor of IC design at the University of Twente.

Opinion

High risk, no gain – let’s stop using amplifiers


Going against 75 years of circuit design practices, Bram Nauta argues it’s time to ditch the amplifier.

Amplifiers are key building blocks in electronic circuit design. Whenever a system has a small input signal, our Pavlovian reaction is to amplify it for further processing. The advantage of an amplified signal is that the following blocks can tolerate more noise and thus need less power. Today, the actual signal processing is done in the digital domain, so a small input signal is amplified, filtered and further amplified to drive an analog-to-digital converter (ADC).
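The column doesn't spell this out, but the reasoning behind that statement is the classic Friis cascade formula: the noise contribution of every stage is divided by the total gain in front of it, so gain at the input relaxes the noise (and power) requirements of everything downstream.

$$F_\mathrm{total} = F_1 + \frac{F_2 - 1}{G_1} + \frac{F_3 - 1}{G_1 G_2} + \cdots$$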

These ADCs have been optimized by following a figure of merit that relates the power dissipation to the signal-to-noise ratio and the bandwidth of the converter. The ADCs with the best figure of merit have an immense input swing, often as large as the supply or even larger. As a result, they have extremely low power dissipation themselves but need a monster of an amplifier to drive them, one that dissipates an order of magnitude more power than the converter itself. If we look at the system level, the circuits before the ADC dissipate milliwatts, while the converter itself only needs microwatts. In that sense, the ADC is a spoiled child.
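The column doesn't name a specific figure of merit, but the two commonly used in the ADC literature are the Walden and Schreier FoMs; the exact definitions below are my addition, not the author's:

$$\mathrm{FoM_{Walden}} = \frac{P}{2^{\mathrm{ENOB}} \cdot f_s} \qquad \mathrm{FoM_{Schreier}} = \mathrm{SNDR}\,[\mathrm{dB}] + 10\log_{10}\!\left(\frac{\mathrm{BW}}{P}\right)$$

Both reward converters that reach a high signal-to-noise ratio at low power, and a high SNR over a given noise floor requires a large input swing, which is exactly what forces the power-hungry driver in front of the ADC.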

What if we put a system’s small input signal directly into an ADC? The ADC would then have a tiny input signal and would need to be low noise itself, so its power dissipation would end up in the milliwatt range! This would result in a horrible figure of merit – not an attractive scenario if you want to sell the ADC for stand-alone use. However, from a system point of view, it may make sense, because we save a whole chain of amplifiers and filters that would otherwise burn those same milliwatts.

In ADCs, the decisions are made with comparators. A comparator is a kind of clocked amplifier that compares its two input signals and outputs a digital one or zero depending on which input is larger. If we remove all amplification (or gain) before the ADC, the comparator becomes the first active circuit that adds noise to the signal. So, it’s interesting to compare the noise of a comparator in an ADC with that of an amplifier.
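As a rough behavioral illustration (a toy Python model of my own, not from the column), a clocked comparator can be modeled as a sign decision corrupted by its own input-referred noise; with no gain in front of it, that noise acts directly on the tiny input signal:

import random

def clocked_comparator(v_plus, v_minus, noise_rms=50e-6):
    """Return 1 if v_plus exceeds v_minus at the clock edge, else 0.
    noise_rms models the comparator's input-referred noise [Vrms] (assumed value)."""
    noise = random.gauss(0.0, noise_rms)
    return 1 if (v_plus - v_minus + noise) > 0.0 else 0

# A 100 uV input is only two noise standard deviations away from the
# decision threshold, so a small fraction of decisions comes out wrong:
decisions = [clocked_comparator(100e-6, 0.0) for _ in range(10_000)]
print("fraction decided 'one':", sum(decisions) / len(decisions))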

If we do the math and compare a basic differential amplifier with a comparator operating under the same noise and signal bandwidth constraints, we see that the power dissipation of the input stage is precisely the same for both. However, the comparator only needs to produce a digital one or zero, which is easy to do at low power, while the amplifier still has to deliver a linear output signal and drive its load. Therefore, for a given noise and bandwidth, an amplifier at the system’s input burns more power than a comparator in the same position.
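To make that concrete, here is a back-of-the-envelope sketch with assumed numbers (the noise target, bandwidth, thermal-noise factor and gm/ID below are mine, not the column’s): the bias power a differential input pair needs to reach a given integrated thermal noise in a given bandwidth, a requirement that applies equally to the input stage of an amplifier and of a comparator.

k = 1.38e-23          # Boltzmann constant [J/K]
T = 300.0             # temperature [K]
gamma = 1.0           # channel thermal-noise factor (assumed)
vdd = 0.8             # supply voltage [V], per the column's modern-CMOS figure
gm_over_id = 20.0     # transconductance efficiency [S/A] (assumed)

v_noise_rms = 50e-6   # target input-referred noise [Vrms] (assumed)
bandwidth = 100e6     # signal bandwidth [Hz] (assumed)

# Integrated thermal noise of a differential pair over a brick-wall
# bandwidth B, with both transistors contributing: v_n^2 = 8*k*T*gamma*B/gm
gm_required = 8 * k * T * gamma * bandwidth / v_noise_rms**2
i_tail = 2 * gm_required / gm_over_id   # tail current feeds two branches
p_input_stage = vdd * i_tail

print(f"required gm       : {gm_required * 1e3:.2f} mS")
print(f"tail current      : {i_tail * 1e3:.2f} mA")
print(f"input-stage power : {p_input_stage * 1e3:.2f} mW")

With these example numbers, the input stage alone lands at roughly a tenth of a milliwatt. The comparator can stop there, since the digital decision costs little extra, whereas the amplifier must spend additional power on a linear output stage, which is the column’s point.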

So, let’s get rid of these amplifiers. No more gain! Let’s remove the low-noise amplifiers (LNAs) and replace them with low-noise analog-to-digital converters (LNADCs). Let’s put that spoiled child to work!

This is a disruptive approach compared to what we’ve been doing for the past 75 years since the invention of the transistor. However, it does make sense. It especially makes sense if we realize that in modern CMOS, the supply voltage is already below 0.8 volts and is decreasing further. It is, therefore, getting hard to make amplifiers because we need to squeeze all transistors under that low-voltage roof. So, let’s forget about amplifiers!

The first LNADC prototypes won’t beat the state of the art, and their figures of merit will be terrible. But let’s try something different, which, in the long run, may bring us further. At least, it’s something new, and that’s always fun to explore. So let’s go for high risk, no gain!

This column is an adaptation of Bram Nauta’s 2024 ISSCC keynote speech entitled “Racing down the slopes of Moore’s Law,” which can be watched in its entirety on YouTube.
