Microsoft Copilot Cowork’s Security Blind Spots Are Worth Knowing About

Microsoft Copilot Cowork has real gaps around custom skills reliability, Anthropic API dependency, and Purview DLP enforcement. Here is what organizations should understand before rolling it out widely.

  • Ben Stegink
  • May 12, 2026


Microsoft Copilot Cowork is positioned as a powerful agentic workspace for knowledge workers, but some meaningful gaps around data protection and reliability deserve attention before organizations roll it out widely.

Custom Skills Are Rough Around the Edges

Getting started with Cowork is straightforward enough, but the custom skills experience still has friction. When a user attempts to create a skill through the Cowork frontier agent interface, the agent can confidently generate a polished HTML report scoring the skill’s quality and declaring it ready, without ever actually creating the skill file or placing it in the correct OneDrive directory. That kind of confident hallucination about its own actions is a meaningful usability problem, not a minor quirk.

Even after manually provisioning a skill into OneDrive, the agent’s ability to recognize it is inconsistent across chat sessions. A skill visible in one session can be completely unknown to the next, despite no changes to the underlying file. The output artifact experience compounds the frustration: session files land in GUID-named folders buried under a Cowork/sessions/ path in OneDrive, and the built-in option to open artifacts directly in OneDrive does not reliably work. Navigating to outputs requires manual file browsing.
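Until the open-in-OneDrive option works reliably, a small local script can take some of the sting out of the GUID-folder hunt. This is a minimal sketch that walks a synced OneDrive root for the Cowork/sessions/ path and surfaces the newest artifacts; the folder layout is assumed from what we observed and may differ in your tenant.

```python
from pathlib import Path

def find_session_artifacts(onedrive_root: str, limit: int = 5) -> list[Path]:
    """Return the most recently modified files under Cowork/sessions/,
    whose GUID-named subfolders make manual browsing tedious."""
    sessions = Path(onedrive_root) / "Cowork" / "sessions"
    if not sessions.is_dir():
        return []
    files = [p for p in sessions.rglob("*") if p.is_file()]
    # Newest first, so the artifact you just generated is at the top
    files.sort(key=lambda p: p.stat().st_mtime, reverse=True)
    return files[:limit]
```

Pointing this at your local OneDrive sync folder beats clicking through GUID directories every time a session finishes.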

The Anthropic API Dependency Has Real Consequences

Cowork routes requests through the Anthropic API rather than keeping processing within Microsoft’s cloud boundary. This has several practical implications that are easy to overlook until something breaks. Microsoft’s own FAQ confirms that Cowork connects to external models for processing, and a recent Anthropic API outage demonstrated what that means in practice: active sessions crashed, and the chat history for those sessions became inaccessible. The session files in OneDrive survived, but the conversational context did not.
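Because the session files survive an outage but the chat history does not, one pragmatic mitigation is keeping your own running snapshot of the conversation. The sketch below is hypothetical, not a Cowork feature: `snapshot_turn` and `restore_context` are illustrative names for a simple local JSONL journal you would feed from whatever client or wrapper you use.

```python
import json
from pathlib import Path

def snapshot_turn(log_path: str, role: str, content: str) -> None:
    """Append one chat turn to a local JSONL file so conversational
    context survives even if the hosted session crashes mid-stream."""
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps({"role": role, "content": content}) + "\n")

def restore_context(log_path: str) -> list[dict]:
    """Rebuild the turn history from the snapshot file; returns an
    empty history when no snapshot exists yet."""
    p = Path(log_path)
    if not p.exists():
        return []
    return [json.loads(line) for line in p.read_text(encoding="utf-8").splitlines() if line]
```

Append-only JSONL is deliberate here: a crash mid-write loses at most one turn, and the file can be replayed into a fresh session to rebuild lost context.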

This external routing also explains why Cowork cannot open Microsoft Purview Information Protection-encrypted documents. Even when a user has the appropriate credentials and permissions, the encrypted content cannot be passed to an external API. Standard Copilot chat sessions using GPT models handle Purview-protected files without issue because processing stays within Microsoft’s infrastructure. For organizations with significant volumes of protected content, this is a meaningful workflow limitation.
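If part of your corpus is Purview-encrypted, it helps to separate routable from non-routable content before a Cowork session ever touches it. The sketch below assumes a hypothetical inventory of file records with an `is_encrypted` flag; a real implementation would source that flag from Graph or Purview metadata rather than hard-coded dictionaries.

```python
def partition_by_protection(files: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a file inventory into items safe to route to an external
    model and items to hold back. `is_encrypted` is a hypothetical
    metadata flag standing in for real Purview encryption status."""
    routable, held = [], []
    for f in files:
        # Encrypted content cannot be passed outside the tenant boundary
        (held if f.get("is_encrypted") else routable).append(f)
    return routable, held
```

Running a pre-flight pass like this at least makes the limitation visible up front, instead of discovering mid-session that half the source documents are unreadable.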

Organizations in regions where GDPR or other data residency requirements apply should note that certain countries and regions may not have access to Cowork precisely because of this external data routing, creating uneven capability across global teams.

The DLP Policy Gap Is the Most Pressing Issue

The most significant finding is that Cowork does not appear to respect Microsoft Purview Data Loss Prevention policies configured for Copilot. A document with a DLP policy explicitly blocking Copilot access returns the expected blocked response in standard Copilot chat. The same document, queried through Cowork, returns the content without restriction.
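Before a wide rollout, it is worth scripting this comparison against your own protected documents. The harness below is a hypothetical sketch: `query_standard` and `query_cowork` are stand-ins for whatever mechanism you use to issue the same query against each surface, and the simple "blocked" string match reflects the policy-tip response we observed rather than any documented contract.

```python
def dlp_parity_check(query_standard, query_cowork, doc_id: str) -> dict:
    """Query the same DLP-protected document through two Copilot
    surfaces and flag any enforcement mismatch. The two callables
    should each return the response text for `doc_id`."""
    std_blocked = "blocked" in query_standard(doc_id).lower()
    cw_blocked = "blocked" in query_cowork(doc_id).lower()
    return {
        "standard_blocked": std_blocked,
        "cowork_blocked": cw_blocked,
        # True means the DLP policy held in one surface but not the other
        "policy_gap": std_blocked and not cw_blocked,
    }
```

Run it against a known-blocked document; any `policy_gap: True` result is something to raise with Microsoft support before trusting the deployment.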

Sensitivity labels for Microsoft 365 Copilot are documented as applying to Copilot and Copilot Chat, but the boundary of where those policies apply across Cowork, declarative agents, and Azure AI Foundry agents is not clearly defined. Purview-encrypted files offer a harder technical barrier. Sensitivity labels used for classification without encryption, however, do not appear to block Cowork from accessing the underlying content.
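For a quick exposure estimate, you can count how much labeled content relies on classification alone, since that is the content Cowork could still read. Again a sketch over a hypothetical inventory shape: `label` and `is_encrypted` are illustrative fields, not a real Purview schema.

```python
from collections import Counter

def classification_only_labels(files: list[dict]) -> Counter:
    """Count files per sensitivity label that classify without
    encrypting, i.e. labeled content an external surface could
    still access. Inventory fields here are hypothetical."""
    return Counter(
        f["label"] for f in files
        if f.get("label") and not f.get("is_encrypted")
    )
```

A large count against your highest-sensitivity labels is a signal to add encryption to those labels, or to hold off on Cowork for the teams that handle that content.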

Microsoft’s data handling documentation for Cowork covers tenant isolation, file storage in OneDrive and SharePoint, and data subject rights, but is less clear about how DLP and sensitivity label enforcement carries over to the Cowork surface, which is precisely the boundary that matters here.
