Tech News

Google’s latest AI move shows Apple is on the right track, at least in one way


Today, Google announced its own version of what Apple is doing with Private Cloud Compute, in what may be a landmark moment for the consumer AI market. Here’s why.

Late to the game, but still influential where it counts

Over the past few months, I’ve come to the conclusion that what is really disappointing about Apple’s AI predicament is not just the general notion that it is “behind” on whatever amorphous concept of AI happens to be the trend du jour.

What is disappointing, to me anyway, is that for the first time in a long while, Apple seems genuinely incapable of delivering a novel(ish) piece of technology at the same level as its competitors.

And while it may be especially infuriating to watch Apple counter the idea that it is behind on AI by listing the many machine learning-based features it has shipped over the last few years, the company does have a point.

For much of the past decade, Apple has been exploring and shipping interesting machine learning-powered features, many of which are supported by its long-running body of in-house academic research, some of which it began sharing publicly through its Machine Learning Research blog.

But Apple did miss the tidal shift sparked by the launch of ChatGPT, which, it’s worth noting, debuted roughly four years after the release of the original GPT model.

In other words, Apple, like the rest of the industry, had four years to realize where the puck was going.

And while the rest of the industry managed to move and start shipping LLM-based products, each at its own pace (and with wildly varying levels of success), Apple remains largely stuck in the same spot as it was when ChatGPT came out.

But Apple has done some excellent behind-the-scenes technical work, some of which I’ve covered over the past few months. Chief among these is the Private Cloud Compute infrastructure, which is, hands down, Apple at its best.
