Microsoft 365 Copilot can read every file, email, chat, and SharePoint document a user has permission to access. That sounds obvious until you realize most small businesses have years of accumulated oversharing in OneDrive and SharePoint that nobody noticed because nobody was searching for it. Copilot searches for it instantly, every time an employee asks a question. If your finance folder is shared with "Everyone" or your HR site has broken inheritance, Copilot will happily summarize salary data, performance reviews, and acquisition documents for the intern who asked an innocent question. Before you turn Copilot on, you need a permission audit. This guide walks through the audit we run for every Copilot rollout we manage in New Jersey.

We have been deploying Microsoft 365 environments for 25 years, and the permission sprawl we find inside SMB tenants is consistently worse than what executives expect. Copilot does not create new security problems. It just makes the existing ones impossible to ignore.

Why Does Copilot Expose Oversharing Problems?

Copilot is grounded in the Microsoft Graph, which respects the same permission model your tenant has used since you migrated. The phrase Microsoft uses is "Copilot honors existing permissions," and that is technically true. The practical reality is different. For a decade, oversharing inside SharePoint and OneDrive was a low-impact problem because nobody was indexing or surfacing those files at scale. An employee would have to know exactly where to look, type the right keywords, and have time to read through results. Copilot removes all three barriers at once.

Ask Copilot "what is our pricing for the Acme deal" and it will pull from any document in your tenant that the user has access to. If the sales VP shared their entire Deals folder with the company a year ago because it was easier than managing groups, Copilot will use it. The exposure is silent, fast, and indistinguishable from legitimate access, which means your DLP rules, your audit logs, and your SIEM alerts will not flag it as anomalous.

The Three Sources of Oversharing in a Typical SMB Tenant

In our M365 audits across Morris, Essex, and Bergen counties, we see the same three patterns over and over.

The first is the "Everyone except external users" link. This audience is offered throughout SharePoint sites and the OneDrive sharing dialog. It feels safe because it excludes outsiders, but it grants access to every internal employee, every contractor on a license, and every service account. We routinely find HR documents, financial models, and merger files shared this way, often by people who clicked through the dialog without reading it.

The second is broken permission inheritance. SharePoint allows site owners to break inheritance and grant unique permissions on subfolders or files. A common pattern is that someone needed to share a single file with an outside auditor, broke inheritance to do it, and then forgot to put it back. The auditor relationship ended five years ago. The file is still shared, and Copilot can use it.

The third is forgotten Microsoft Teams. Every Team has a SharePoint site behind it. When an employee creates a Team for a project and adds the company-wide group, the SharePoint site inherits that membership. The Team gets abandoned, the channel goes quiet, and the SharePoint site sits there with hundreds of files that everyone in the company can technically read. Copilot reads them.

Quick gut check: open the SharePoint admin center, filter the active sites list by external sharing, and count how many sites allow anything other than "Only people in your organization." If the number is over five, you have an audit to do before turning on Copilot.

The Pre-Copilot Permission Audit We Run

We use a four-stage audit before flipping the Copilot switch for any client. This takes most SMBs between two and four weeks of part-time work. It is not glamorous, but it is the difference between Copilot being a productivity tool and Copilot being a data leak.

Stage one: inventory. We pull a SharePoint and OneDrive sharing report from the M365 admin center, supplemented by a PowerShell script that walks every site and lists items shared with "Everyone," "Everyone except external users," company-wide groups, or anonymous links. The report typically runs to several thousand rows for a 50-person business. We do not try to fix everything in this stage. We just measure.
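As a sketch of the measurement step, the snippet below tallies broadly shared items per site from an exported sharing report CSV. The column headers ("Shared With", "Site URL") and group names here are assumptions for illustration; match them to whatever your actual export produces before running anything like this.

```python
import csv
from collections import Counter

# Hypothetical sharing scopes we treat as "overshared" -- the exact
# group names vary by tenant, so adjust this set to yours.
RISKY_SCOPES = {
    "Everyone",
    "Everyone except external users",
    "All Company",           # company-wide Microsoft 365 group
    "Anyone with the link",  # anonymous link
}

def summarize_sharing_report(path):
    """Count overshared items per site in an exported sharing report CSV.

    Assumes columns named 'Site URL' and 'Shared With' (hypothetical --
    rename to match your export).
    """
    per_site = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("Shared With", "").strip() in RISKY_SCOPES:
                per_site[row.get("Site URL", "unknown")] += 1
    return per_site
```

The point of stage one is exactly this kind of per-site count: a number you can track week over week, not a to-do list you try to clear immediately.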

Stage two: triage. We classify each oversharing finding by sensitivity. Anything in HR, Finance, Legal, or Executive sites moves to the top of the queue. Anything containing keywords like "salary," "compensation," "acquisition," "merger," "termination," or "PHI" gets flagged for immediate review. Marketing files shared too broadly can wait. Sensitive files cannot.
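The triage rules above can be sketched as a small classifier. The site and keyword lists mirror the ones described in this stage; the function itself is illustrative, not a Microsoft tool, and a real triage pass would also look inside file contents rather than just names.

```python
# Illustrative triage logic: assign a review priority to each
# oversharing finding based on site and filename keywords.
SENSITIVE_SITES = {"hr", "finance", "legal", "executive"}
FLAG_KEYWORDS = {"salary", "compensation", "acquisition",
                 "merger", "termination", "phi"}

def triage(site_name, file_name):
    """Return 'immediate', 'high', or 'deferred' for a finding."""
    text = file_name.lower()
    if any(kw in text for kw in FLAG_KEYWORDS):
        return "immediate"          # flagged keyword: review now
    if site_name.lower() in SENSITIVE_SITES:
        return "high"               # sensitive site: top of the queue
    return "deferred"               # e.g. overshared marketing files
```

Running every row of the stage-one inventory through something like this turns several thousand findings into three queues you can actually work.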

Stage three: remediation. For each high-priority finding, we either restrict the sharing scope, move the file to a properly governed location, or apply a sensitivity label with encryption so access is enforced regardless of the inherited permissions. Microsoft Purview sensitivity labels are the right long-term answer because the protection travels with the file regardless of where it ends up. Setting up Purview labels properly is part of our managed IT services for clients running Microsoft 365.

Stage four: governance. Once the high-priority findings are clean, we configure SharePoint site policies, default sharing settings, and data lifecycle policies so the problem does not come back. We also enable Restricted SharePoint Search, a control that limits organization-wide search and Copilot grounding to a curated allowlist of sites. For most SMBs, restricting Copilot to that curated list for the first 90 days of rollout is the safest default.
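A pre-flight check on the curated site list can be sketched as follows. The 100-site cap reflects the Restricted SharePoint Search limit at the time of writing; verify it against current Microsoft documentation before relying on it, and treat the function names here as our own, not part of any Microsoft tooling.

```python
# Sketch of a pre-flight check for a Restricted SharePoint Search
# allowlist: admit only sites that passed the audit, and respect the
# service cap (100 sites at the time of writing -- verify against
# current Microsoft documentation).
ALLOWED_LIST_CAP = 100

def build_allowlist(candidate_sites, audited_clean_sites):
    """Return the curated allowlist, dropping unaudited sites and
    refusing to exceed the cap."""
    curated = [s for s in candidate_sites if s in audited_clean_sites]
    if len(curated) > ALLOWED_LIST_CAP:
        raise ValueError(
            f"{len(curated)} sites exceeds the {ALLOWED_LIST_CAP}-site "
            "cap; trim the list before enabling restricted search")
    return curated
```

The switch itself is flipped in the SharePoint Online Management Shell (via the restricted search cmdlets, `Set-SPOTenantRestrictedSearchMode` and `Add-SPOTenantRestrictedSearchAllowedList` at the time of writing); the check above just keeps the list honest before you submit it.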

What About Sensitive Data Already Inside Files?

Permissions are only half the problem. The other half is what is sitting inside files that look harmless. A spreadsheet labeled "2024 budget v3" might contain customer credit card numbers in a tab nobody opens anymore. A PDF in a marketing folder might be an old offer letter someone forgot to delete. Copilot will summarize all of it in plain English.

Microsoft Purview Data Loss Prevention and auto-labeling policies are designed for this. They scan content for patterns like Social Security numbers, credit card numbers, PHI identifiers, or custom keyword sets, and apply labels or block access automatically. For NJ businesses subject to the New Jersey Data Privacy Act or HIPAA, having auto-labeling running before Copilot rollout is no longer optional. Our cybersecurity services practice configures Purview for clients alongside the permission audit, because doing one without the other leaves a major gap.
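To make the pattern-matching idea concrete, here is a toy scan for two of the patterns mentioned above. Purview's built-in sensitive information types are far more sophisticated, combining checksums, keyword proximity, and confidence levels; this sketch only illustrates the shape of the detection, including the Luhn checksum that separates real card numbers from random digit runs.

```python
import re

# Toy content scan illustrating the kind of pattern matching that
# auto-labeling performs. Not a substitute for Purview's classifiers.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(number):
    """Luhn checksum, used to cut false positives on card-like numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(digits[::2]) + sum(sum(divmod(d * 2, 10))
                                   for d in digits[1::2])
    return total % 10 == 0

def scan(text):
    """Return a list of pattern types found in a blob of text."""
    findings = []
    if SSN_RE.search(text):
        findings.append("ssn-pattern")
    for m in CARD_RE.finditer(text):
        if luhn_ok(m.group()):
            findings.append("credit-card")
    return findings
```

The "budget spreadsheet with a forgotten card-number tab" scenario above is exactly what a scan like this catches, which is why auto-labeling has to run across existing content, not just new files.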

Common Mistakes We See in DIY Copilot Rollouts

The fastest way for an SMB to create a regulatory incident is to buy Copilot licenses, hand them to the executive team, and let them start asking questions. We have walked into post-incident cleanups where this happened. The mistakes are predictable.

Companies often skip the audit because the SharePoint sharing report looks overwhelming. They turn on Copilot for "just a few people first" without restricting the data it can read, which means those few people can now query the entire tenant in plain English. They forget that OneDrive personal folders are also indexed, which means anything an employee dragged into their OneDrive over the past five years is now Copilot-accessible. They also forget about Teams chat history, which Copilot can summarize across every channel a user has joined.

The fix is not to avoid Copilot. The productivity gains are real and your competitors are using it. The fix is to do the audit, configure Purview, restrict initial scope, and only then start the rollout.

Frequently Asked Questions

Does Microsoft 365 Copilot train on our data?

No. Copilot does not use your tenant data to train Microsoft's foundation models. Your prompts and responses stay within your tenant boundary, and Microsoft contractually commits to data residency and isolation. The risk is not that Microsoft sees your data; it is that your own employees can now query data they should never have had access to in the first place.

How long does a permission audit take for a 50-person business?

Plan for two to four weeks of part-time effort with a competent M365 administrator, plus another week if you have heavy SharePoint customization or many Teams sites. We can usually compress that timeline by running the inventory and triage stages in parallel, but remediation depends on getting decisions from data owners, which is the actual bottleneck.

Can we use Copilot if we are subject to HIPAA or NJ DPA?

Yes, but only if you have configured Purview sensitivity labels and DLP for protected data, restricted Copilot's data sources to vetted sites, and updated your business associate agreements and privacy notices to reflect AI processing. Microsoft's HIPAA business associate agreement can extend to Microsoft 365 Copilot as an in-scope service, but confirm coverage under your specific plan and licensing terms. The compliance work is non-trivial and is a major reason we recommend a managed rollout rather than a self-service one.