Tech News

Microsoft Mocked for Terms of Service That Admit Copilot Is for “Entertainment Purposes Only”

Why This Matters

Microsoft's inclusion of the Copilot AI in Windows has drawn criticism because the company's own terms of service state that the tool is for "entertainment purposes only," raising doubts about its reliability for important tasks. The contradiction underscores broader industry challenges around AI transparency and accountability, with implications for consumer trust and adoption. Microsoft's acknowledgment that the language is outdated, and its promise of an update, reflect ongoing efforts to clarify what AI tools can and cannot do.

Key Takeaways


Users of Microsoft's Windows have grown frustrated with the company's insistence on stuffing its Copilot AI chatbot into almost every corner of the widely used operating system, earning it the pejorative nickname "Microslop."

That’s despite Microsoft admitting in its own Copilot terms of service that the AI shouldn’t be relied upon for virtually any important work.

“Copilot is for entertainment purposes only,” the lengthy document reads. “It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

It's a bizarre self-contradiction, considering how determined Microsoft has been to embed Copilot in even the simplest Windows apps, like Microsoft Paint and the text editor Notepad, as well as its productivity tools.

“Me personally, it’s not a good sign when a company won’t stand behind the accuracy of their product,” one Reddit user noted. “If Microsoft doesn’t trust Copilot, why should I?”

“1/3 of the entire American economy invested into a technology that’s for entertainment purposes only,” another user wrote. “Such confidence. I’m sure this will go well.”

“If a car came with a warning not to trust it and it has no specific purpose or design intent, you wouldn’t pay for it,” yet another argued.

A company spokesperson later clarified in a statement to PCMag that the odd phrasing is “legacy language from when Copilot originally launched as a search companion service in Bing.”

“As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update,” the spokesperson added.
