
He could just turn it off


Generative AI, we are repeatedly told, is a transformative and complicated technology. So complicated that its own creators are unable to explain why it acts the way it does, and so transformative that we'd be fools to stand in the way of progress. Even when progress resembles a machine for undressing strangers without their consent on an unprecedented scale, as has been the case of late with Elon Musk's Grok chatbot.

UK Prime Minister Keir Starmer seems to have so fully bought into the grand lie of the AI bubble that he was willing to announce:

"I have been informed this morning that X is acting to ensure full compliance with UK law."

Not that it currently is in compliance. Nor did he offer a timeline for when it is expected to be. Just that he seems satisfied that someday, eventually, Musk's pet robot will stop generating child sexual abuse material.


This statement comes just under two days after Starmer was quoted as saying, "If X cannot control Grok, we will." What could Elon possibly have said to earn this pathetic capitulation? AI is difficult? Solutions take time?

These are entirely cogent technical arguments until you remember: He could just turn it off.

Elon Musk has the power to disable Grok, if not in whole (we should be so lucky) then at least its image generation capabilities. We know this intuitively, but also because he rate-limited Grok's image generation after this latest scandal: after a few requests, free users are now prompted to pay $8 per month to continue enlisting a wasteful technology to remove articles of clothing from women. Sweep it under the rug, make a couple of bucks along the way.

Not only is it entirely possible to turn off image generation, it's the only responsible option. Software engineers regularly roll back updates or disable features that underperform; this one is still up and running despite likely running afoul of the law.
