AI skeptics aren’t the only ones warning users not to unthinkingly trust models’ outputs — that’s what the AI companies say themselves in their terms of service.

Take Microsoft, which is currently focused on getting corporate customers to pay for Copilot.

But it’s also been getting dinged on social media over Copilot’s terms of use, which appear to have been last updated on October 24, 2025.

“Copilot is for entertainment purposes only,” the company warned.

“It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”

A Microsoft spokesperson told PCMag that the company will be updating what they described as “legacy language.”

“As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update,” the spokesperson said.
