Securing Copilot for Microsoft 365: What You Need to Know
Over the past year, I’ve probably had more conversations about Microsoft Copilot for Microsoft 365 than any other topic. And the question that keeps coming up — every single time — isn’t “How do we turn it on?” It’s “How do we stop it showing people things they shouldn’t be seeing?”
Fair question. Most organisations have years of accumulated SharePoint sites, Teams channels, and OneDrive folders where permissions have drifted way beyond what anyone intended. Nobody noticed because users weren’t actively trawling through all that content. Then along comes Copilot — which is basically a very powerful search engine backed by a large language model — and suddenly every permission gap you’ve been ignoring becomes painfully visible.
The Oversharing Problem
Here’s what trips people up: Copilot doesn’t bypass security controls. It respects existing permissions. If a user can access a document, Copilot can surface it. If they can’t, it won’t.
So the problem isn’t Copilot. The problem is your permissions were too broad to begin with.
I keep saying this to organisations and I’ll say it again here — a Copilot rollout is a data governance project first, and a productivity project second. If you haven’t sorted your permissions and labelling, you’re not ready. Full stop.
Step 1: Audit Your SharePoint Permissions
Before you go anywhere near Copilot licensing, you need to know who has access to what. That means digging into SharePoint site permissions, sharing links, and group memberships.
Things to look for:
- “Everyone except external users” permissions. This is the one I see constantly. SharePoint sites and folders end up with this group added — usually because someone shared a link using “People in your organisation” and never thought twice about it. That means every employee has access. And that means Copilot gives every employee access too.
- Orphaned sharing links. Links shared months or years ago that nobody ever cleaned up. They’re just sitting there.
- Overly broad Teams. Every Team is backed by a SharePoint site, so if the Team membership is wide, the site access is wide too. People forget this.
- OneDrive sharing gone wrong. Users sharing folders from their OneDrive to half the company because it was easier than setting up a proper shared location.
Microsoft has added some decent tooling for this. The SharePoint Advanced Management features include site access reviews and data access governance reports. Use them. Seriously — run them before you enable Copilot and prepare yourself for what comes back.
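To show the kind of triage I mean, here's a minimal sketch in Python that scans a permissions export for tenant-wide grants. The CSV column names are assumptions for illustration, not the exact schema of any Microsoft report; adapt them to whatever your data access governance export actually contains.

```python
import csv
import io

# Groups that effectively mean "the whole tenant". The column names used
# below (SiteUrl, GrantedTo, PermissionLevel) are assumptions for this
# sketch, not a real report schema.
RISKY_GROUPS = {
    "everyone except external users",  # the classic tenant-wide grant
    "everyone",
}

def flag_overshared_sites(report_csv: str) -> list[dict]:
    """Return rows where a tenant-wide group has been granted access."""
    flagged = []
    for row in csv.DictReader(io.StringIO(report_csv)):
        if row["GrantedTo"].strip().lower() in RISKY_GROUPS:
            flagged.append(row)
    return flagged

sample = """SiteUrl,GrantedTo,PermissionLevel
https://contoso.sharepoint.com/sites/Finance,Everyone except external users,Edit
https://contoso.sharepoint.com/sites/HR,HR Team,Edit
"""

for hit in flag_overshared_sites(sample):
    print(f"{hit['SiteUrl']} -> {hit['GrantedTo']} ({hit['PermissionLevel']})")
```

Even a crude filter like this gives you a worst-offenders list to work through before any licences go out.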
Step 2: Sensitivity Labels
If you haven’t rolled out sensitivity labels yet, now is the time. No more putting it off. Sensitivity labels let you classify and protect content based on how sensitive it is. When a document is labelled “Confidential” with encryption applied, Copilot respects that encryption and won’t surface the content to unauthorised users.
What you want in place:
- Default labels on new documents and emails — classify content from the moment it’s created, don’t leave it up to users
- Auto-labelling policies to catch existing content that should’ve been labelled ages ago but wasn’t
- Container labels for Teams, SharePoint sites, and Microsoft 365 Groups — these control site-level settings
Sensitivity labels combined with proper permissions are your best defence against Copilot surfacing things it shouldn’t. The Microsoft Purview Information Protection documentation covers the details.
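To make the auto-labelling idea concrete, here's a toy sketch of the logic. Real auto-labelling runs server-side in Microsoft Purview against its built-in sensitive info types, so the label names and patterns below are purely illustrative assumptions.

```python
import re

# Illustrative rules only: real auto-labelling is configured in Purview,
# not hand-rolled. Label names and patterns here are assumptions.
LABEL_RULES = [
    ("Highly Confidential",
     re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b")),  # card-like number
    ("Confidential",
     re.compile(r"\b(salary|payroll|redundancy)\b", re.IGNORECASE)),
]
DEFAULT_LABEL = "General"  # mirrors a default-label policy on new documents

def suggest_label(text: str) -> str:
    """First matching rule wins; otherwise fall back to the default label."""
    for label, pattern in LABEL_RULES:
        if pattern.search(text):
            return label
    return DEFAULT_LABEL

print(suggest_label("Q3 payroll summary attached"))        # Confidential
print(suggest_label("Card on file: 4111 1111 1111 1111"))  # Highly Confidential
print(suggest_label("Minutes from the stand-up"))          # General
```

The fall-through to a default label is the important bit: it's the code equivalent of "classify content from the moment it's created, don't leave it up to users".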
Step 3: Restricted SharePoint Search
This one was built specifically for the Copilot rollout scenario. Restricted SharePoint Search lets you control which SharePoint sites Copilot and Microsoft Search can actually index and return content from.
The concept is simple: you maintain an allowed list of sites you’ve reviewed and confirmed are properly governed. Copilot only surfaces content from those sites. As you clean up more sites, you add them to the list.
It’s a pragmatic approach when you know your permissions aren’t perfect but you don’t want to hold off on Copilot indefinitely. Think of it as a phased rollout for your data, not just for licences. I’ve found it takes a lot of the anxiety out of early Copilot deployments.
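Conceptually, it's just an allowed list applied before results reach Copilot. This sketch models the idea only; the real feature is a tenant-level setting you configure in SharePoint administration, not something you implement yourself, and the site URLs are made up for the example.

```python
# Hypothetical allowed list: sites you've reviewed and confirmed are
# properly governed. Everything else is invisible to Copilot.
ALLOWED_SITES = {
    "https://contoso.sharepoint.com/sites/PolicyLibrary",  # reviewed, clean
    "https://contoso.sharepoint.com/sites/Handbook",       # reviewed, clean
}

def restrict_results(results: list[dict]) -> list[dict]:
    """Drop any result whose site is not on the reviewed allowed list."""
    return [r for r in results if r["site"] in ALLOWED_SITES]

candidates = [
    {"site": "https://contoso.sharepoint.com/sites/Handbook",
     "doc": "Leave policy.docx"},
    {"site": "https://contoso.sharepoint.com/sites/Finance",
     "doc": "Board pack.pptx"},
]
print(restrict_results(candidates))  # only the Handbook document survives
```

As you review and clean up more sites, they move onto the list and the reachable surface area grows in a controlled way.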
Step 4: Purview DLP for Copilot
Microsoft has extended Data Loss Prevention policies to cover Copilot interactions. So you can now write DLP policies that stop sensitive information from being pulled into Copilot responses.
You can set up policies that:
- Stop Copilot from summarising content containing specific sensitive information types (credit card numbers, National Insurance numbers, that sort of thing)
- Block Copilot from referencing content with certain sensitivity labels
- Fire alerts when Copilot interactions touch regulated data
This is still fairly new functionality, so keep checking the DLP documentation for updates. It’s evolving quickly.
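To illustrate what a sensitive-info-type match looks like under the hood, here's a toy stand-in for the detection step. Purview's real sensitive info types are far more sophisticated, and the simplified patterns below (including a cut-down National Insurance format) are assumptions for the sketch only.

```python
import re

def luhn_valid(digits: str) -> bool:
    """Luhn checksum, commonly used to cut false positives on card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Simplified patterns, for illustration only.
CARD = re.compile(r"\b(?:\d[ -]?){13,16}\b")
NI_NUMBER = re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b")  # cut-down NI format

def blocks_response(text: str) -> bool:
    """Return True if a draft response would trip this toy policy."""
    if NI_NUMBER.search(text):
        return True
    for m in CARD.finditer(text):
        if luhn_valid(re.sub(r"[ -]", "", m.group())):
            return True
    return False

print(blocks_response("Employee NI: QQ123456C"))            # True
print(blocks_response("Invoice 4111 1111 1111 1111 paid"))  # True
print(blocks_response("The Q3 roadmap looks solid"))        # False
```

The point of the checksum step is worth noting: pattern matching alone generates noisy alerts, which is exactly why Purview's built-in types layer validation on top of regex.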
Step 5: Monitoring Copilot Interactions
Once Copilot is live, you need to see what it’s doing. Microsoft Purview gives you audit logs for Copilot interactions, so you can track:
- What content is being surfaced to users through Copilot
- Who’s using Copilot the most
- Whether sensitive content is turning up in responses where it shouldn’t
The Copilot activity records in the Microsoft Purview audit log have what you need. I'd set up alerts for Copilot interactions involving your most sensitive content, at least during the first few weeks. Better to catch something early than find out about it from a worried user — or worse, not find out at all.
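A simple way to think about working the exported records is the sketch below. The record shape here is an assumption for illustration, not the actual Purview audit schema; map the field names to whatever your export contains.

```python
from collections import Counter

# Hypothetical, simplified interaction records. Real Purview audit records
# have a different, richer schema; field names here are assumptions.
records = [
    {"user": "priya@contoso.com", "resource": "Payroll FY24.xlsx",
     "label": "Highly Confidential"},
    {"user": "sam@contoso.com", "resource": "Leave policy.docx",
     "label": "General"},
    {"user": "priya@contoso.com", "resource": "Handbook.docx",
     "label": "General"},
]

ALERT_LABELS = {"Confidential", "Highly Confidential"}

usage = Counter(r["user"] for r in records)           # who uses Copilot most
alerts = [r for r in records if r["label"] in ALERT_LABELS]

print(usage.most_common())
for a in alerts:  # interactions that touched sensitive content
    print(f"ALERT: {a['user']} surfaced '{a['resource']}' ({a['label']})")
```

Two views, two questions answered: the usage counter tells you who's leaning on Copilot, and the alert list tells you when labelled content is turning up in responses.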
The Practical Reality
I’ve worked with a good number of organisations on Copilot readiness now, and the pattern is almost always the same. They come in wanting to get Copilot running quickly. Then they discover the amount of housekeeping they need to do first. What caught me off guard the first couple of times was just how bad the oversharing tends to be — and these were organisations that thought they had things under control.
Bottom line: Copilot is a brilliant productivity tool, but it amplifies whatever state your data governance is in. Good permissions and solid labelling? Copilot works beautifully. Messy permissions? Copilot will make that obvious to everyone, fast.
What I recommend — and this hasn’t changed across any of the rollouts I’ve been involved in:
- Audit SharePoint permissions and fix the worst offenders
- Get sensitivity labels deployed (default labelling at minimum)
- Turn on Restricted SharePoint Search as a safety net
- Set up DLP policies for Copilot
- Monitor usage through Purview audit logs
- Expand access gradually as you get more confident
Don’t rush it. Get the groundwork done properly and Copilot will deliver real value without the data exposure headaches.
If you’ve got questions or want to swap notes on your own Copilot rollout, drop a comment or find me on LinkedIn. Always happy to talk this stuff through.