
Copilot Is 'For Entertainment Purposes Only,' According To Microsoft's ToS

Why This Matters

Microsoft's recent terms of service for Copilot emphasize that the tool is intended for entertainment and should not be relied upon for critical decisions, highlighting the ongoing need for caution with AI-generated content. This underscores the importance for consumers and businesses to understand AI limitations and use these tools responsibly. As AI continues to integrate into professional workflows, clear disclaimers like these are vital for managing expectations and legal liabilities.

Key Takeaways

An anonymous reader quotes a report from TechCrunch: AI skeptics aren't the only ones warning users not to unthinkingly trust models' outputs -- the AI companies say so themselves in their terms of service. Take Microsoft, which is currently focused on getting corporate customers to pay for Copilot. It has also been getting dinged on social media over Copilot's terms of use, which appear to have been last updated on October 24, 2025. "Copilot is for entertainment purposes only," the company warned. "It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk." Microsoft described the terms of service as "legacy language," saying it will be updated. Tom's Hardware notes that similar AI warnings remain common across the industry, with companies like OpenAI and xAI also cautioning users not to treat chatbot output as "the truth" or as "a sole source of truth or factual information."

Read more of this story at Slashdot.