Earlier this month, it was reported that almost 80,000 workers were laid off in the first quarter of 2026, with companies pinning the blame on the rise of artificial intelligence.
Whether through improved task efficiency or cost savings from automation, deploying AI across the workforce is supposed to be the economically smart decision, even if that promise hasn't always held up in practice. But even that story is getting harder to tell, as Nvidia executive Bryan Catanzaro recently commented that, within his team, AI compute power now costs more than the workers themselves.
While Catanzaro's team is involved in making foundation models for Nvidia, AI usage is increasing among workers, with a reported 50% of U.S. employees using AI in some form, according to data released in mid-April. Several weeks ago, Uber's CTO revealed he had exhausted the company's annual AI budget in just a few weeks. If AI usage costs continue to rise, then those costs must be accounted for.
If the most useful AI models become too expensive without generating a return in productivity, their use in workplaces could fall dramatically, as token costs begin to pile up.
Hey big spender
If you asked Nvidia CEO Jensen Huang how much companies should spend on AI, his answer would probably be at least 50% of what you're paying your workers. He famously said in March that if an Nvidia engineer who's paid $500,000 a year weren't spending at least $250,000 on AI tokens over that same year, he'd be "alarmed."
In an interview with Axios, Nvidia's VP of Applied Deep Learning, Bryan Catanzaro, said that within his team, "the cost of compute is far beyond the costs of the employees." The scale of those costs can be glimpsed in the team's open vacancies: one posting for a Senior Software Engineer lists a salary band of $192,000 to $243,000 per year, which implies that per-employee compute spending on the team runs well into six figures.
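To see how compute spending could outstrip a salary in that band, here is a back-of-envelope sketch. Every figure in it other than the posted salary band is an assumption for illustration: the per-million-token price, the daily token volume, and the number of workdays are all hypothetical, not reported numbers.

```python
# Back-of-envelope sketch: annual AI token spend vs. an engineer's salary.
# All inputs except the posted salary band are illustrative assumptions.

def annual_token_cost(tokens_per_day: float, price_per_million: float,
                      workdays: int = 250) -> float:
    """Annual spend in dollars for a given daily token volume and an
    assumed blended price per million tokens."""
    return tokens_per_day * workdays * price_per_million / 1_000_000

salary = 243_000             # top of the posted Senior Software Engineer band
price = 15.0                 # assumed blended $/1M tokens for a frontier model
daily_tokens = 100_000_000   # hypothetical heavy agentic/research usage

spend = annual_token_cost(daily_tokens, price)
print(f"Annual token spend: ${spend:,.0f}")  # $375,000 at these assumptions
print(spend > salary)                        # True: compute exceeds the salary
```

Under these (assumed) numbers, a single engineer's token bill would exceed the top of the posted band, which is consistent with Catanzaro's claim, though real usage and pricing will vary widely.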
It's important to note that not every employee in the tech industry uses AI to the degree that Nvidia's Deep Learning team does, so that team's model usage and costs can't reasonably be equated with those of the average worker.
However, other tech firms are also seeing AI spending rise in 2026. A February study found that over 80% of companies using AI saw no productivity benefit, while research in the Harvard Business Review suggests AI use is increasing worker burnout rates.