Tech News

Copilot is ‘for entertainment purposes only,’ according to Microsoft’s terms of use

Why This Matters

Microsoft's terms of use for Copilot explicitly state that the AI is "for entertainment purposes only," an acknowledgment of the technology's limitations and of the need for user discretion. The disclaimer underscores the growing demand for transparency and caution in AI deployment, especially as companies push for enterprise adoption. For consumers and the industry, it signals a shift toward clearer disclaimers that manage expectations and mitigate the risks of AI errors.

In Brief

AI skeptics aren’t the only ones warning users not to trust models’ outputs unthinkingly — the AI companies say so themselves in their terms of service.

Take Microsoft, which is currently focused on getting corporate customers to pay for Copilot. The company has also been getting dinged on social media over Copilot’s terms of use, which appear to have been last updated on October 24, 2025.

“Copilot is for entertainment purposes only,” the company warned. “It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

A Microsoft spokesperson told PCMag that the company will be updating what it described as “legacy language.”

“As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update,” the spokesperson said.

Tom’s Hardware noted that Microsoft isn’t the only company using this kind of disclaimer for AI. For example, both OpenAI and xAI caution users that they should not rely on their output as “the truth” (to quote xAI) or as “a sole service of truth or factual information” (OpenAI).