Microsoft Copilot privacy concerns have been in the spotlight recently. The technology has quickly become a powerful example of how AI-enhanced tools are transforming the capabilities of Software as a Service (SaaS) platforms. Integrated within Microsoft 365 applications such as Word, Excel, PowerPoint and Outlook, Copilot uses generative AI to automate tasks, summarise information, generate content and analyse data, offering businesses new ways to streamline operations and boost productivity.
From drafting documents and spreadsheets to analysing complex datasets, automating meeting scheduling, summarising lengthy texts, and even suggesting email responses, Microsoft Copilot provides a range of features designed to simplify previously time-consuming tasks.
However, Microsoft Copilot’s rapid adoption across multiple industry sectors has sparked debate and concerns around data privacy and compliance.
In this blog, we explore these privacy concerns and offer practical compliance tips for 2025 for organisations operating in the EU or UK under the General Data Protection Regulation (GDPR).
But first, let’s explore the reasons why there are Microsoft Copilot privacy concerns and how the controversy emerged.
Copilot works by accessing data that already exists within an organisation’s Microsoft environment to train and refine its capabilities.
Copilot offers several advantages for businesses, in terms of both convenience and data protection.
Whilst Copilot offers businesses several advantages, as discussed above, it is a double-edged sword, with certain privacy risks that need careful consideration.
Below, we outline four key risks to be aware of and offer practical tips to address them effectively. There are additional risks that we do not list here. If your organisation would benefit from comprehensive advice and support from our privacy experts, contact us for more details.
One of the most significant risks with Microsoft Copilot relates to poor permission management. Many organisations struggle to maintain tight control over user permissions, which can result in sensitive data being exposed to individuals who should not have access.
According to Microsoft’s 2023 State of Cloud Permissions Report, less than 1% of granted permissions are actually used.
Solution: Implement strict, role-based access controls and regularly review permissions across your organisation to ensure data is only accessed by authorised users.
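As an illustration of what such a permission review might look like in practice, the sketch below uses the Microsoft Graph API to list who holds sharing permissions on files in a SharePoint document library, so that anything broader than read access can be flagged for manual review. It is a minimal, hedged example: the access token, site ID and the “flag anything beyond read” rule are placeholder assumptions, it only covers the top-level folder and does not handle paging, and it is not a substitute for the permission reporting tools built into Microsoft 365.

```python
"""Minimal sketch: review sharing permissions on files in a SharePoint
document library via Microsoft Graph. Assumes you already hold a valid
Graph access token with appropriate read permissions (e.g. Sites.Read.All);
token acquisition (for example with MSAL) is out of scope here."""

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<your-access-token>"   # assumption: obtained separately
SITE_ID = "<your-sharepoint-site-id>"  # assumption: placeholder site identifier

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List items in the site's default document library (top level only).
items = requests.get(
    f"{GRAPH}/sites/{SITE_ID}/drive/root/children", headers=headers, timeout=30
).json().get("value", [])

for item in items:
    # Fetch the permission entries (direct grants and sharing links) for each item.
    perms = requests.get(
        f"{GRAPH}/sites/{SITE_ID}/drive/items/{item['id']}/permissions",
        headers=headers,
        timeout=30,
    ).json().get("value", [])

    for perm in perms:
        granted_to = (
            perm.get("grantedToV2", {}).get("user", {}).get("displayName", "link/group")
        )
        roles = ", ".join(perm.get("roles", []))
        # Flag anything broader than read access for manual review against your RBAC model.
        marker = "REVIEW" if roles and roles != "read" else "ok"
        print(f"[{marker}] {item['name']}: {granted_to} -> {roles}")
```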
Another concern is the potential for data repurposing. Copilot could inadvertently use data that was collected for a specific and lawful purpose and use it for another incompatible reason.
For example, if you use personal data in a data set to improve Copilot’s AI (i.e. training it to perform better), it could create a compliance issue, because the data wasn’t collected for that purpose.
Employees might also start using data via Copilot for different business reasons, leading to situations where data is used in ways that fall outside the original purpose for which it was collected.
Solution: Train employees on the appropriate use of Copilot, emphasising the importance of adhering to the original purpose of data collection and the risks of repurposing data. Also give guidance on tasks that may not be suitable for Copilot, such as comparing CVs against job specifications, which may amount to automated decision-making if done without proper safeguards.
There is also a possibility that, if Copilot is used to analyse large-scale data, it could amplify any implicit biases contained in that data, leading to inaccurate or unfair outcomes. If you have large quantities of old or outdated information, Copilot may also use these as the basis of decision making. Users may also over-rely on the output, without checking or questioning its validity or accuracy.
Solution: Conduct regular dataset audits to identify any biases in historical data or outdated information. Encourage users to validate Copilot’s outputs by cross-checking with reliable and reputable sources. And ensure regular reviews of your data retention policies.
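To make the retention review more concrete, the sketch below flags files in a OneDrive or SharePoint drive that have not been modified within a chosen retention window, as candidates for review or deletion before Copilot can surface stale content. Again, this is only an illustrative sketch: the access token, drive ID and five-year threshold are placeholder assumptions, it only inspects the top-level folder, and your actual thresholds should come from your retention schedule.

```python
"""Minimal sketch: flag files that fall outside an assumed retention window,
using the lastModifiedDateTime field exposed by Microsoft Graph drive items."""

from datetime import datetime, timedelta, timezone

import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<your-access-token>"    # assumption: obtained separately
DRIVE_ID = "<your-drive-id>"            # assumption: placeholder drive identifier
STALE_AFTER = timedelta(days=5 * 365)   # assumption: align with your retention schedule

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
cutoff = datetime.now(timezone.utc) - STALE_AFTER

# List items at the top level of the drive (no paging handled in this sketch).
items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=headers, timeout=30
).json().get("value", [])

for item in items:
    # Parse the ISO 8601 timestamp returned by Graph (e.g. "2019-03-21T20:01:37Z").
    modified = datetime.fromisoformat(item["lastModifiedDateTime"].replace("Z", "+00:00"))
    if modified < cutoff:
        print(f"STALE: {item['name']} (last modified {modified.date()})")
```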
Copilot is not a single product, but a suite of products governed by different sets of terms. Where Copilot can’t answer a query internally, it will use Microsoft’s search engine, Bing, to search the web and summarise results back to the user. This means the search term used will leave the security of your tenancy and is governed by a different, less stringent set of terms. In highly regulated industries, or when using highly sensitive data, this could be a regulatory violation or reputational risk to an organisation.
Solution: Configure Copilot settings to limit or block external searches where feasible to reduce the risk of sensitive data leaving your tenancy. Also establish clear privacy policies prohibiting the use of external queries for sensitive or regulated data.
A successful Copilot rollout involves more than simply adopting the tool. To identify and mitigate the risks, it is essential to regularly review your data management and retention procedures to ensure they align with your objectives and comply with data protection regulations.
Here are some useful reminders for organisations operating in the EU or UK under the GDPR:
If you’re using Microsoft Copilot for large-scale processing of personal data or sensitive data, you will probably need to carry out a Data Protection Impact Assessment (DPIA). A DPIA is mandatory under the GDPR when data processing is likely to result in a high risk to individuals’ rights and freedoms.
A well-completed DPIA helps identify and mitigate risks, allowing you to assess whether the benefits of Copilot outweigh the potential risks based on your data processing and objectives. For some organisations, the benefits may justify the costs.
Read more information about conducting a DPIA
If you’re using Legitimate Interests as the lawful basis for Copilot’s business use or data access for training and testing, you must complete a Legitimate Interests Assessment (LIA).
This will be informed by your DPIA and must show that your data processing is not unreasonably intrusive to individuals. If you are processing special category data, you must ensure you have specific justification, as this type of data has stricter processing requirements.
To maintain GDPR compliance, you must also update your Record of Processing Activities (RoPA) to include Copilot’s data use. If you allow users to personalise their use of Copilot, ensure that the data usage is tracked, recorded, and assessed for legal compliance.
You must also clearly articulate how Copilot is used and update your Privacy Notices accordingly. This ensures you can properly handle and respond to data subject rights requests, such as Data Subject Access Requests (DSARs) or the Right to Object.
Microsoft Copilot can drive efficiency and provide businesses with operational advantages. However, as its use expands, and without proper oversight, Copilot could expose organisations to certain privacy risks.
Having effective data governance structures in place and following data protection best practices is essential, not only for compliance but also for maximising the tool’s potential whilst safeguarding personal information. By implementing clear guidelines for data use, access, and retention, businesses can mitigate risks such as data repurposing, bias, and improper permission management.
With the right privacy safeguards in place, organisations can take full advantage of Copilot’s capabilities.
If your organisation would benefit from expert data protection guidance and support, The DPO Centre offers a range of outsourced data protection and privacy services.
For more news and insights about data protection, follow The DPO Centre on LinkedIn.