How to Verify Copilot’s Privacy Settings and Understand Session Data

With AI tools like Microsoft Copilot becoming essential in the modern workplace, privacy has quickly risen to the top of users’ concerns. Whether you are an IT admin, a team leader, or an everyday user, knowing how to verify Copilot’s privacy settings and track session data is key to maintaining control, trust, and compliance.

As organizations integrate AI into daily workflows, the question isn't just what Copilot can do but what it does with your data. Despite Microsoft's emphasis on enterprise-grade security, the mechanics of session data handling, especially in tools powered by large language models (LLMs), can feel opaque. That lack of clarity creates friction in adoption and risks in governance.

This article provides a practical guide to verifying Copilot’s privacy settings, understanding how session data works, and interpreting Microsoft’s official statements on data handling.

Understanding Copilot’s Privacy Framework

Microsoft 365 Copilot operates within the same security and compliance boundaries as Microsoft 365, adhering to standards such as the General Data Protection Regulation (GDPR) and the Health Insurance Portability and Accountability Act (HIPAA), as well as various International Organisation for Standardisation (ISO) certifications. Its design emphasises key principles, including:

  • Data Minimisation: Copilot accesses only the data necessary for tasks, such as the document you’re working on or emails you have permission to view, reducing the possibility of unnecessary data exposure.
  • Encryption: All data, including prompts and responses, is encrypted in transit and at rest, ensuring protection against unauthorised access.
  • Role-Based Access Control (RBAC): Copilot respects your organisation’s permission model, only surfacing data you’re authorised to see.
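
To make the RBAC principle concrete, here is a minimal, purely illustrative Python sketch (not Copilot’s actual implementation): before any content can be used to ground a response, it is filtered down to items the requesting user already has permission to open. The Document class, user names, and file names are hypothetical.

    # Purely illustrative sketch of permission trimming (not Copilot's actual code):
    # content is filtered by what the requesting user can already open before it
    # could ever be used to ground a response.
    from dataclasses import dataclass, field

    @dataclass
    class Document:
        name: str
        allowed_users: set = field(default_factory=set)

    def grounding_candidates(user, documents):
        """Return only the documents this user is already authorised to read."""
        return [doc for doc in documents if user in doc.allowed_users]

    docs = [
        Document("Q3-budget.xlsx", {"finance@contoso.com"}),
        Document("team-notes.docx", {"finance@contoso.com", "dev@contoso.com"}),
    ]

    # dev@contoso.com can only ever surface team-notes.docx; the budget file
    # never reaches the model as grounding context.
    print([d.name for d in grounding_candidates("dev@contoso.com", docs)])

The practical implication is that Copilot is only as well scoped as your permissions: if a SharePoint site is overshared, Copilot will faithfully surface that overshared content, which is why the audit guidance later in this article matters.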

Now that you have a clearer view of the foundational safeguards behind Copilot, let’s walk through how to verify its privacy settings and ensure they align with your organisation’s security expectations.

How to Verify Copilot's Privacy Settings

Before you can fully trust any AI assistant with sensitive tasks, it's essential to understand and manage its privacy settings. Microsoft Copilot's personalisation feature gives users control over how their data is used.

The following section walks you through how to verify and adjust these settings across different platforms, ensuring your data preferences are respected and your AI experience aligns with your privacy expectations.

Verify and Control Personalisation Settings

Copilot’s personalisation feature tailors responses based on your past interactions. Depending on how much context you want Copilot to retain, you can toggle this setting on or off.

On copilot.com:

  • Click your profile icon in the top-right corner.
  • Select your profile name.
  • Navigate to Privacy > Personalisation.
  • Toggle personalisation on or off as desired.

In Copilot for Windows or macOS:

  • Click your profile icon.
  • Go to Settings > Privacy > Personalisation.
  • Adjust the personalisation toggle.

In the Copilot mobile app:

  • Open the menu and tap your profile icon.
  • Select Account > Privacy > Personalisation.
  • Enable or disable personalisation.

Note: When you disable personalisation, Copilot will stop remembering details from your conversations. While you can still access your past chats, future interactions won’t be tailored until you re-enable personalisation.

Managing Specific Conversation Data

Microsoft Copilot lets you manage not only your general privacy settings but also the specific information it keeps from past interactions. Whether you want to delete a single conversation or reset everything it knows about you, Copilot gives you full control.

1. Delete Specific Conversations

If you’d like to remove individual conversations from Copilot’s memory:

  • Open your conversation history in the Copilot interface.
  • Locate the chat you want to delete.
  • Remove it to ensure it no longer contributes to personalisation.

2. View, Add, or Delete Copilot’s Stored Information

You can ask Copilot directly what it knows about you and manage that memory conversationally:

  • View Stored Info: Ask, “What do you know about me?” to see what details Copilot has remembered.
  • Delete Specific Info: Say, “Forget that I like Fridays” or any other detail you want removed.
  • Add New Info: Say, “Remember that I love reading thriller novels” to personalise future responses.
  • Verify Changes: Re-ask “What do you know about me?” to confirm updates have taken effect.

3. Clear All Memories at Once

If you prefer to wipe everything and start fresh:

  • Turn off personalisation from your privacy settings.
  • This deletes all stored details and disables memory-based personalisation going forward.

Control Use of Conversations for Model Training

By default, Microsoft may use your interactions with Copilot, both text and voice, to improve its AI models. However, you have the option to opt out and ensure your conversations are not used for training purposes.

Steps to Manage Model Training Settings

The following explains how to manage these settings across different platforms.

On copilot.com:

  • Click your profile icon, then your profile name.
  • Go to Privacy > Model training on text and Model training on voice.
  • Toggle the settings to opt out.

In Copilot for Windows or macOS:

  • Click your profile icon.
  • Navigate to Settings > Privacy > Model training on text and Model training on voice.
  • Disable as needed.

In the Copilot mobile app:

  • Open the menu and tap your profile icon.
  • Select Account > Privacy > Model training on text and Model training on voice.
  • Turn off to opt out.

Note: Once disabled, your past, current, and future conversations will be excluded from training Microsoft’s AI models. Changes may take up to 30 days to take full effect.

Understanding Session Data in Copilot

Session data in Microsoft Copilot includes your prompts (what you ask), Copilot’s responses (what Copilot generates), and any files or content accessed during a session. This data is used to complete your requests, such as drafting documents, creating presentations, or summarizing meetings, while maintaining strict privacy controls.

Copilot only accesses the data it needs, respecting existing Microsoft 365 permissions so content doesn’t leak across users or applications. You can further control what Copilot accesses, including documents, emails, or third-party plug-ins.

Files uploaded to Copilot Chat are stored in your OneDrive for up to 30 days and are visible only to you. All session data is encrypted in transit and at rest to prevent unauthorised access and safeguard sensitive information.
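
If you want to check those uploads yourself, a short script against Microsoft Graph can list them. Treat the sketch below as an assumption-laden starting point rather than an official procedure: it assumes an Entra app registration with the delegated Files.Read scope and that uploads land in a OneDrive folder named “Microsoft Copilot Chat Files” (verify the folder name in your own OneDrive); the client ID is a placeholder.

    # Minimal sketch: list files in the OneDrive folder used for Copilot Chat uploads.
    # Assumptions (verify in your tenant): delegated Files.Read permission, and an
    # upload folder named "Microsoft Copilot Chat Files".
    import urllib.parse
    import msal
    import requests

    CLIENT_ID = "<your-app-registration-client-id>"   # hypothetical placeholder
    FOLDER = "Microsoft Copilot Chat Files"           # assumed folder name

    app = msal.PublicClientApplication(
        CLIENT_ID, authority="https://login.microsoftonline.com/common"
    )
    flow = app.initiate_device_flow(scopes=["Files.Read"])
    print(flow["message"])                            # sign in with the code shown
    token = app.acquire_token_by_device_flow(flow)["access_token"]

    url = (
        "https://graph.microsoft.com/v1.0/me/drive/root:/"
        + urllib.parse.quote(FOLDER)
        + ":/children"
    )
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    for item in resp.json().get("value", []):
        print(item["name"], item.get("createdDateTime"))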

To monitor session activity or data use, check the Microsoft Privacy Dashboard or ask your administrator to review Microsoft Purview audit logs.
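
Admins who prefer a repeatable check can query the audit log programmatically. The sketch below assumes the Microsoft Purview audit search API exposed through Microsoft Graph (POST /security/auditLog/queries) and an app registration granted an audit-query read permission such as AuditLogsQuery.Read.All; endpoint availability, permission names, and property names such as recordTypeFilters can vary by tenant and API version, so confirm the details on Microsoft Learn before relying on it.

    # Hedged sketch: ask Microsoft Graph's audit search API for Copilot interaction
    # records from the last 7 days. Assumptions: app-only auth, an audit-query read
    # permission granted, and the /security/auditLog/queries endpoint available.
    import datetime
    import msal
    import requests

    TENANT_ID = "<tenant-id>"            # hypothetical placeholders
    CLIENT_ID = "<client-id>"
    CLIENT_SECRET = "<client-secret>"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"]
    )["access_token"]

    now = datetime.datetime.now(datetime.timezone.utc)
    body = {
        "displayName": "Copilot interactions - last 7 days",
        "filterStartDateTime": (now - datetime.timedelta(days=7)).isoformat(),
        "filterEndDateTime": now.isoformat(),
        "recordTypeFilters": ["copilotInteraction"],  # assumed record type name
    }
    resp = requests.post(
        "https://graph.microsoft.com/v1.0/security/auditLog/queries",
        headers={"Authorization": f"Bearer {token}"},
        json=body,
        timeout=30,
    )
    resp.raise_for_status()
    # The query runs asynchronously; poll the returned query id, then read its
    # /records collection once the status reports completion.
    print(resp.json().get("id"), resp.json().get("status"))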

Interpreting Microsoft’s Data Handling Practices

As previously mentioned, Microsoft 365 Copilot is designed with privacy, security, and compliance at its core. The data used by Copilot is task-specific, blending your prompt, related Microsoft 365 content, and AI-generated responses to deliver accurate output.

For work and school accounts, your prompts, responses, and Microsoft 365 content are not used to train the underlying models unless your organisation explicitly permits it. For consumer experiences, however, Microsoft may use de-identified data from Bing, ads, or Copilot interactions to improve its products unless you opt out, as described in the model training section above.

Crucially, Copilot operates within Microsoft’s secure service boundaries using Azure OpenAI, not public OpenAI systems. Microsoft complies with global data regulations, including GDPR and the EU Data Boundary, ensuring data residency where required.

EU user data stays within the EU by default, though some requests may be routed globally during periods of high demand. Users can manage or delete their Copilot activity history via the My Account portal, while IT admins can configure retention policies through Microsoft Purview.

Best Practices for Privacy-Conscious Use

While Microsoft 365 Copilot is built with enterprise-grade privacy and compliance in mind, proactive steps to protect sensitive information are still essential, especially in today’s threat landscape.

To get the most out of Copilot while remaining privacy-conscious:

  • Avoid directly entering sensitive or confidential data such as passwords, financials, or client details into prompts.
  • Use Microsoft Purview sensitivity labels to classify and protect documents Copilot might access.
  • Run regular audits using Microsoft Purview to catch misconfigured permissions, overshared SharePoint sites, or risky Teams settings.
  • Apply multi-factor authentication through Microsoft Entra to enhance identity security and prevent unauthorised access (see the sketch after this list).
  • Verify Copilot’s responses, especially in external communications, presentations, or reports.
  • Stay updated on new privacy features and changes to Copilot’s behavior via Microsoft Learn or the Microsoft Privacy Statement.
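
As one concrete follow-up to the multi-factor authentication bullet above, the hedged sketch below lists Microsoft Entra Conditional Access policies through Microsoft Graph and flags the ones that require MFA, so you can confirm the control actually applies to your Copilot users. It assumes an app registration with the Policy.Read.All application permission; the tenant ID, client ID, and secret are placeholders.

    # Hedged sketch: list Conditional Access policies and flag those requiring MFA.
    # Assumption: app-only auth with the Policy.Read.All application permission.
    import msal
    import requests

    TENANT_ID = "<tenant-id>"            # hypothetical placeholders
    CLIENT_ID = "<client-id>"
    CLIENT_SECRET = "<client-secret>"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )
    token = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"]
    )["access_token"]

    resp = requests.get(
        "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    for policy in resp.json().get("value", []):
        controls = (policy.get("grantControls") or {}).get("builtInControls", [])
        requires_mfa = "mfa" in controls
        print(f"{policy['displayName']}: state={policy['state']}, requires MFA={requires_mfa}")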

By combining Copilot’s intelligent support with these best practices, business users and IT teams can strike the right balance, unlocking productivity without compromising data security or trust.

