GDPR Risks and Microsoft

By Raymond Girbes
- Jan 27, 2026 - 3 minute read

You're at risk of GDPR violations if you use Microsoft Windows this way

In this blog post, I will explain the dos and don'ts of safeguarding your own privacy and that of your employees and customers.

Many companies still rely on Windows. The issue isn't Windows itself, but the fact that Windows and the wider Microsoft ecosystem collect and process ever more data by default, which is precisely what you want to minimise as a business or individual. In particular, be wary of diagnostic data (telemetry), cloud synchronisation and new features that track screen and user activity.


Telemetry: Limited transparency, limited control
Windows 11 (and other versions) sends diagnostic information to Microsoft. In practice, however, your organisation cannot check exactly what is being sent because much of this data is encrypted, and the telemetry is neither fully transparent nor independently auditable. Furthermore, most versions of Windows do not allow you to turn off telemetry completely. This raises privacy and compliance issues because you need to be able to explain what data is processed, why and how.
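To illustrate what "limiting" looks like in practice, here is a minimal sketch that pins the diagnostic data level to its lowest value through the DataCollection policy key, using Python's standard winreg module. Note my assumptions: the policy value is named AllowTelemetry, level 0 ("Security", i.e. diagnostic data off) is only honoured on Enterprise and Education editions, and other editions fall back to level 1 ("Required"). Verify the current policy names against Microsoft's documentation for your Windows release; this is a sketch, not a complete hardening guide.

```python
# Sketch: pin the Windows diagnostic data level to its minimum via the
# DataCollection policy key. Run from an elevated (administrator) Python
# on Windows; winreg is part of the standard library there.
import winreg

POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

def set_diagnostic_data_level(level: int = 0) -> None:
    """Write the AllowTelemetry policy value.

    0 = Security (diagnostic data off; honoured on Enterprise/Education only),
    1 = Required (the effective minimum on Pro/Home editions).
    """
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY,
                            0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AllowTelemetry", 0, winreg.REG_DWORD, level)

if __name__ == "__main__":
    set_diagnostic_data_level(0)
    print("Diagnostic data policy written; a reboot or gpupdate may be needed.")
```

Even with this policy in place, the underlying point stands: on most editions you can reduce telemetry, but you cannot switch it off entirely, and you still cannot independently verify what is sent.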


Consent through buttons and licenses is not a privacy strategy
Microsoft frequently asks for consent to enable features that it claims you will find "useful". Many businesses find it hard to make an informed decision because the impact on data and privacy is unclear. On top of that, feature behaviour can change with updates, and previously chosen settings can be modified or reset.


GDPR: Purpose limitation and data minimisation
The GDPR requires that data processing has a legitimate purpose, and that no more data is processed than is necessary. The design choice of 'capturing everything and filtering later' significantly increases risk. Organisations should manage data through structured storage, good document management, clear permissions, logging and search functionality designed for this purpose. This is preferable to risky methods such as periodically capturing screen content, which often results in more data being captured than is necessary.


Recall and Windows 11: Screenshots of your screen as a new risk
When you activate Recall, Windows 11 periodically saves snapshots of your screen to help you recall past activities. In practice, this also captures moments when sensitive information is visible, such as names, personal data, contracts, emails, orders, payments, customer records, internal systems, usernames and, occasionally, passwords that are briefly displayed for verification purposes. Even with filters, every additional place where data is captured increases the risk of exposing sensitive information, especially in the event of a security incident.


My main point
I'm more concerned with the approach than with "new useful features". A system that inherently leans towards capturing and syncing more data while offering less control is not viable for businesses that take privacy and security seriously.


Handle data and software responsibly
Treat data and software seriously: if everything goes well, there's no problem, but if something goes wrong, the consequences can be severe.
  • If you can later demonstrate that you made a conscious decision and took appropriate measures, you'll be in good standing.
  • Saying "Oh, it should be fine" is not making a decision; it's gambling with other people's data.

Conclusions
  • Disable functions like Recall if you process personal data (see the configuration sketch after this list).
  • Consider limiting cloud synchronisation via Microsoft services such as OneDrive, or stop using them altogether. Consider alternatives such as your own NAS or services like Tresorit. If you want end-to-end encrypted synchronisation, take a look at Proton Drive.
  • Consider a platform that can be configured transparently and with full control: Linux.
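As a starting point for the first two items, here is a minimal sketch that enforces them through policy registry values, again using Python's standard winreg module. It assumes the documented "Turn off saving snapshots" policy value for Recall (DisableAIDataAnalysis) and the "Prevent the usage of OneDrive for file storage" policy (DisableFileSyncNGSC); policy names and locations can change between Windows releases, so verify them against current Microsoft documentation and your edition before relying on them.

```python
# Sketch: disable Recall snapshots and block OneDrive sync via policy values.
# Assumes the documented policy names below; verify them for your Windows
# release. Writing to HKEY_LOCAL_MACHINE requires administrator rights.
import winreg

def write_dword(root, key_path: str, name: str, value: int) -> None:
    # Create (or open) the policy key and write a REG_DWORD value.
    with winreg.CreateKeyEx(root, key_path, 0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, value)

if __name__ == "__main__":
    # "Turn off saving snapshots for Windows" (Recall), per-user policy.
    write_dword(winreg.HKEY_CURRENT_USER,
                r"Software\Policies\Microsoft\Windows\WindowsAI",
                "DisableAIDataAnalysis", 1)

    # "Prevent the usage of OneDrive for file storage", machine-wide policy.
    write_dword(winreg.HKEY_LOCAL_MACHINE,
                r"SOFTWARE\Policies\Microsoft\Windows\OneDrive",
                "DisableFileSyncNGSC", 1)

    print("Policies written; sign out or run gpupdate for them to take effect.")
```

In a managed environment you would normally push these settings through Group Policy or MDM rather than a script, but the point is the same: make an explicit, documented decision instead of leaving the defaults in place.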