Like most tech companies, Microsoft is going all-in on AI. Its flagship AI product, Copilot (in all its various forms), lets people use AI in their daily work to interact with Microsoft services and get tasks done. Unfortunately, this also creates a wide range of new security problems.
On July 4th, I came across a problem in M365 Copilot: Sometimes it would access a file and return the information, but the audit log would not reflect that. Upon testing further, I discovered that I could simply ask Copilot to behave in that manner, and it would. That made it possible to access a file without leaving a trace. Given the problems that creates, both for security and legal compliance, I immediately reported it to Microsoft through their MSRC portal.
Helpfully, Microsoft provides a clear guide on what to expect when reporting vulnerabilities to them. Less helpfully, they didn’t follow that guide at all. The entire process has been a mess. And while they did fix the issue, classifying it as an ‘important’ vulnerability, they also decided not to notify customers or publicize that this happened. What that means is that your audit log is wrong, and Microsoft doesn’t plan on telling you that.
This post is split into three parts. The first part explains the Copilot vulnerability and the problems it can cause. The second part outlines how Microsoft handled the case. And the third part discusses Microsoft’s decision not to publish this information, and why I consider that to be a huge disservice to Microsoft’s customers.
The Vulnerability: Copilot and Audit Logging
The vulnerability here is extremely simple. Normally, if you ask M365 Copilot to summarize a file for you, it will give you a summary and the audit log will show that Copilot accessed that file on your behalf.[1]
That’s good. Audit logs are important. Imagine someone downloaded a bunch of files before leaving your company to start a competitor; you’d want some record of that, and it would be bad if the person could use Copilot to go undetected.[2] Or maybe your company holds sensitive personal data, and you need a strict log of who accessed those files for legal and compliance purposes; again, you’d need to know about access that occurred via Copilot. Those are just two examples. Organizations rely on having an accurate audit log.
But what happens if you ask Copilot not to provide a link to the file it summarized? Well, in that case, the audit log is empty.
Just like that, your audit log is wrong. For a malicious insider, avoiding detection is as simple as asking Copilot.[3]
You might be thinking, “Yikes, but I guess not too many people figured that out, so it’s probably fine.” Unfortunately, you’d be wrong. When I found this, I wasn’t searching for ways to break the audit log. I was simply trying to trigger audit log entries so I could test functionality we are developing at Pistachio, and I noticed the logging was unreliable. In other words, this can happen by chance.[4] So if your organization has M365 Copilot licenses, your audit log is probably wrong.
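If you want to check what your own audit log actually recorded, something like the sketch below can help. It is a minimal, untested illustration, not a definitive implementation: it assumes you already have an Azure AD app with the ActivityFeed.Read permission, a valid OAuth access token, and an active subscription to the Audit.General content type of the Office 365 Management Activity API. The field names it reads (an Operation of "CopilotInteraction", and CopilotEventData with AccessedResources) reflect my reading of Microsoft’s published audit schema and should be treated as assumptions.

```python
import os
import requests

# Assumptions: tenant ID and a bearer token for the Office 365
# Management Activity API are provided via environment variables,
# and an Audit.General subscription already exists for this tenant.
TENANT_ID = os.environ["TENANT_ID"]
TOKEN = os.environ["MGMT_API_TOKEN"]
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def copilot_audit_records(start_time: str, end_time: str):
    """Yield Copilot interaction audit records for a UTC window
    (timestamps formatted like '2024-07-04T00:00:00')."""
    # List the content blobs available for the window.
    resp = requests.get(
        f"{BASE}/subscriptions/content",
        headers=HEADERS,
        params={
            "contentType": "Audit.General",
            "startTime": start_time,
            "endTime": end_time,
        },
        timeout=30,
    )
    resp.raise_for_status()
    for blob in resp.json():
        # Each blob's contentUri returns a JSON array of audit records.
        records = requests.get(
            blob["contentUri"], headers=HEADERS, timeout=30
        ).json()
        for record in records:
            if record.get("Operation") == "CopilotInteraction":
                yield record


for event in copilot_audit_records("2024-07-04T00:00:00", "2024-07-05T00:00:00"):
    # AccessedResources is where the files Copilot touched should be
    # listed; the file access described above never shows up here.
    accessed = event.get("CopilotEventData", {}).get("AccessedResources", [])
    print(event.get("UserId"), event.get("CreationTime"),
          [r.get("Name") for r in accessed])
```

Pagination (the API’s NextPageUri mechanism) and error handling are omitted for brevity. The point is simply that this record of accessed files is what organizations depend on, and it is exactly the trace the vulnerability let a user suppress.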