Rising power density and heat threaten the future of advanced semiconductors
Published on: 2025-04-22 14:06:00
In a nutshell: For more than half a century, the relentless progress of Moore's Law has driven engineers to double the number of transistors on a chip roughly every two years, fueling exponential growth in computing power. Yet as chips have grown denser and more powerful, a formidable adversary has emerged: heat. Rising temperatures inside modern CPUs and GPUs slow critical signal propagation, degrade performance, and increase current leakage – wasting power and undermining the efficiency gains that Moore's Law once promised.
The underlying issue is closely tied to the end of Dennard scaling, a principle that once allowed engineers to shrink transistors and reduce supply voltage in tandem, keeping power consumption in check. By the mid-2000s, however, further voltage reductions became impractical even as transistor density continued to climb. This divergence sent power density – and with it, heat – steadily upward.
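The trend can be sketched with the classic dynamic-power model P = C·V²·f. The snippet below is an illustrative toy model, not from the article: the scaling exponents are the textbook Dennard assumptions, and all numbers are hypothetical, chosen only to show why power density stayed flat while voltage scaled and climbs once it cannot.

```python
def power_density(kappa, voltage_scales):
    """Relative power density after shrinking features by factor `kappa`.

    Under Dennard scaling, capacitance C shrinks ~1/kappa, frequency f
    rises ~kappa, voltage V drops ~1/kappa, and area per transistor
    shrinks ~1/kappa^2 - so power density stays constant.  If V can no
    longer be reduced (post-2005), density grows ~kappa^2 instead.
    """
    C = 1.0 / kappa                                  # capacitance per transistor
    V = (1.0 / kappa) if voltage_scales else 1.0     # supply voltage
    f = kappa                                        # clock frequency
    power = C * V**2 * f                             # dynamic power per transistor
    area = 1.0 / kappa**2                            # die area per transistor
    return power / area

# Halving feature size (kappa = 2):
print(power_density(2.0, voltage_scales=True))   # Dennard era: 1.0 (flat)
print(power_density(2.0, voltage_scales=False))  # post-Dennard: 4.0 (kappa^2)
```

With voltage scaling, the 1/κ² drop in per-transistor power exactly cancels the 1/κ² drop in area; without it, each shrink multiplies power density by κ², which is the heat problem the article describes.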