Did your shopping list just get exposed? Or your location tracked? This could very well happen to a Copilot user. Recent news suggests that Copilot AI has been quietly reading data from private emails.
Microsoft has acknowledged an issue with its Copilot AI service that allowed the system to process and summarize users' confidential emails without permission. Here is a detailed look at what happened.
Also read: Microsoft Recruits Ex-Google Cloud President Hayete Gallot for Top Cybersecurity Role
Reportedly, Copilot Chat has been able to access and generate summaries of personal and private emails. The issue appears to have started in January 2026 and persisted for weeks, raising concerns about how the company's AI tools handle users' confidential information.
Copilot Chat gives Microsoft 365 subscribers access to AI-powered chat capabilities inside Microsoft's most popular applications: Word, Excel, and PowerPoint.
The feature was introduced to help users draft content, analyze information, and quickly summarize conversations and documents. However, a bug led to confidential and private emails also being analyzed. The bug, tracked as CW1226324, allowed Copilot to read and draft from emails 'with a confidential label applied' that were 'being incorrectly processed by Microsoft 365 Copilot chat.'
Even in organizations that have implemented strict data-loss prevention (DLP) policies, Copilot has been able to pull this data without authorization. Most of these policies were put in place precisely to stop personal or confidential content from being shared with Microsoft.
If large organizations with sophisticated data-security policies cannot stay safe, how can individuals be assured of data security on their personal devices? Here are some immediate measures you can take:
Do not allow Copilot to access your entire device. Keep its permissions restricted, especially for apps that hold sensitive personal data.
Try sharing fewer details about yourself while using Copilot.
Delete old emails that contain sensitive data, or move them to external storage (a script like the sketch below can help find them).
The aim here is to prevent the AI tool from accessing personal data, not to stop using the tool entirely.
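For technically inclined readers, one way to act on the email tip above is to audit your mailbox programmatically. The following is a minimal Python sketch, assuming you have already obtained a Microsoft Graph access token with the Mail.Read permission (for example via an Azure app registration); the token placeholder is hypothetical. It lists messages whose classic Outlook sensitivity marking is anything other than "normal."

```python
import requests

GRAPH_URL = "https://graph.microsoft.com/v1.0/me/messages"

def find_sensitive_emails(access_token: str) -> list[dict]:
    """Return messages whose Outlook sensitivity is not 'normal'."""
    headers = {"Authorization": f"Bearer {access_token}"}
    # 'sensitivity' is a standard field on the Graph message resource:
    # normal, personal, private, or confidential.
    params = {"$select": "subject,receivedDateTime,sensitivity", "$top": "50"}
    url, flagged = GRAPH_URL, []
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        data = resp.json()
        for msg in data.get("value", []):
            if msg.get("sensitivity", "normal") != "normal":
                flagged.append(msg)
        # Graph paginates results; the nextLink already embeds the query options.
        url = data.get("@odata.nextLink")
        params = None
    return flagged

if __name__ == "__main__":
    token = "<paste an access token with Mail.Read scope>"  # hypothetical placeholder
    for m in find_sensitive_emails(token):
        print(m["receivedDateTime"], m["sensitivity"], "-", m["subject"])
```

Note that this legacy sensitivity marking is not the same thing as the Microsoft Purview confidentiality labels involved in the CW1226324 bug; the sketch is simply a quick first pass for finding mail you have flagged as private or confidential yourself, so you can archive or delete it.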
Also read: Zero-Day Bugs Under Attack; Microsoft Issues Alert to Windows and Office Users