How the von Neumann bottleneck is impeding AI computing
AI computing has a reputation for consuming epic quantities of energy. This is partly because of the sheer volume of data being handled: training often requires billions or trillions of pieces of information to produce a model with billions of parameters. But that’s not the whole story — it also comes down to how most computer chips are built. Modern processors are quite efficient at the discrete computations they’re usually tasked with, but their efficiency nosedives when they must sit idle waiting for data to shuttle back and forth between memory and the processor.