AI surge demands that Moore’s Law transcend itself
While the exponential scaling of Moore’s Law is slowing, AI is driving an explosion of demand for computational power. Can the semiconductor industry deliver? Surprisingly, it’s not physics that poses the greatest threat.
As artificial intelligence (AI) gains traction, it is driving a tremendous demand for computational power. Until about five years ago, AI model size was outpacing Moore’s Law, albeit not by much. In the run-up to last year’s ChatGPT-4 bombshell, however, the number of parameters per model started doubling a whopping 20 times per year, AMD CEO Lisa Su showed at the Imec Technology Forum (ITF), held 20 and 21 May in Antwerp.
These massive models require massive amounts of energy. It already takes tens of thousands of GPUs to train them; as that number grows toward hundreds of thousands, ChatGPT-4’s successors could need one to several gigawatts of power for training, Su suggested. By today’s standards, that would require a dedicated power plant among the largest in the world.
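For a sense of scale, a rough back-of-envelope sketch shows how such clusters land in the gigawatt range. The per-GPU power figures below (roughly 1 to 2 kW per accelerator, including server, cooling, and networking overhead) are illustrative assumptions, not numbers from the talk.

```python
# Back-of-envelope estimate of training-cluster power draw.
# All inputs are assumed, illustrative values, not figures reported at ITF.

def training_power_gw(num_gpus: int, watts_per_gpu: float) -> float:
    """Return sustained cluster power in gigawatts."""
    return num_gpus * watts_per_gpu / 1e9

# Scenarios: today's tens-to-hundreds of thousands of GPUs, up to a hypothetical
# million-accelerator cluster at a heavier ~2 kW per GPU including overhead.
for gpus, watts in [(100_000, 1_000), (500_000, 1_500), (1_000_000, 2_000)]:
    gw = training_power_gw(gpus, watts)
    print(f"{gpus:>9,} GPUs at {watts / 1000:.1f} kW each -> {gw:.2f} GW")
```

Under these assumptions, the estimate runs from about 0.1 GW for a 100,000-GPU cluster to roughly 2 GW for a million accelerators, which is consistent with the one-to-several-gigawatt figure Su suggested.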