TL;DR
- Microsoft’s Copilot terms of use clearly state, “Copilot is for entertainment purposes only.”
- While other AI companies warn users to double-check AI output, this Copilot disclaimer goes much further.
- Despite the entertainment-only message, Microsoft is heavily promoting the commercial use of Copilot.
Whatever complaints people make about AI replacing human skills, there’s another side to them: the rise of AI has also forced humans to develop new skills, particularly the ability to sort useful AI output from inaccurate, hallucinatory garbage. Over the years, many of us have become quite good at it, learning to work around the many limitations of today’s AI agents. The companies behind these projects are just as aware of those limitations, but one of them is somewhat overcompensating in the legal department, as Copilot users are noticing some concerning language in Microsoft’s terms of service.
Anyone using AI for anything even remotely serious should know by now to check the program’s output – AI will confidently present mistakes as truth, and users need to be careful not to take its output at face value. Accordingly, all the major players disclaim this effect, promoting the benefits of their products while acknowledging their limitations. Google’s Gemini overview is a good example of this, explaining how Gemini does what it does while also drawing attention to the places where it still needs improvement.
And then there’s Microsoft. Like many other AI companies, it likes to advertise all the important tasks Copilot can help you with, like coming up with new strategies for your business:
At first glance, there’s nothing unusual there – it’s right on par with any modern AI platform. We just hope the business customers Microsoft is chasing with pitches like this never read the full Copilot Terms of Use (via Tom’s Hardware). Because if they did, they would see this disclaimer:
Copilot is for entertainment purposes only.
That “only” is doing a whole lot of work there. Microsoft continues:
This may lead to mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.
Now, that bit actually seems relatively in line with the disclaimers we’ve seen from other AI firms: be careful, and verify the correctness of any AI output before acting on it. On its own, it’s not something we’d stop and think twice about seeing in the context of Copilot.
But for whatever reason, Microsoft felt the need to go back and basically add an “LOL JK” to the entire document.
To be honest, this is probably a case of a lawyer feeling the need to cover Microsoft’s liabilities and going a bit too far in the process, inviting ridicule. We just wish someone had asked Copilot to review the terms and flag any potential embarrassments – except, as we now know, it’s not made for that!