Law is complicated. Privacy is complicated. But I feel like in the case of X v Russmedia and Inform Media Press (Case C-492/23) [1], we’ve worried so much about the details that we’ve forgotten the principle underpinning GDPR: the right to privacy. The case’s ironic outcome is that the very organisations with a history of abusing personal data are now required to process more of it in order to align with the CJEU’s ruling.
Case Background
A bit of background: the case involves a malicious ad falsely portraying a woman as offering sexual services, complete with her photograph and telephone number. Russmedia (the ad platform) neither posted the content nor knew of it at the time of publication, and removed it when the woman notified the platform. Unfortunately, the advertisement had already been scraped and duplicated elsewhere before it was taken down. The legal position at the time was that platforms hosting user-generated content (classified ads, social media sites, blogs, and similar websites) had to remove illegal content promptly once notified, but were under no general obligation to monitor for it, as set out in Articles 14 and 15 of the e-Commerce Directive (2000/31/EC) [2].
The goal of notice and takedown is not to proactively limit the amount of harmful content on a website through ex ante measures, but to respond quickly and effectively once illegal content is identified. This ex-post measure worked as intended here: the platform co-operated with the request and the ad came down. Judged against that aim, limiting the damage of false and malicious content once it is flagged, the approach did its job.
Prior to this case, the individual posting the ad would have been considered the controller of the personal data, whilst the platform operated as an intermediary entitled to protection under Article 14 of the e-Commerce Directive. Under GDPR, the controller is responsible for ensuring the data is accurate, identifying the purpose for which it is used, and meeting various other obligations [3]. A processor is responsible for processing the data on behalf of the controller, following its instructions and ensuring data security [4]. In this scenario, therefore, the platform would not be held liable for harmful content posted by users, provided it lacked knowledge of the illegality and acted swiftly to remove the content upon notification.
The Russmedia Ruling
The Russmedia case changed this approach: it identified platforms not as processors but as joint controllers, sharing responsibility with users for the content on the platform rather than acting as mere intermediaries. Requirements that previously fell on the user alone are now shared with the platform. This seems like a good thing, right? That platforms are now responsible for making sure content is accurate before it’s even posted, that we can limit these revenge posts before they even pop up on websites.
This is a good thing, but the issue is the how. How can we do that? Ultimately, the tools are more general monitoring and identity verification by platforms: more surveillance and more personal data captured by these websites. And the ruling didn’t stop at joint controllership; it also triggers the GDPR’s requirement to define the respective responsibilities of platform and user under Article 26. If an ad might contain special category data [5], the platform needs to verify that the person posting is actually the person in the ad, or obtain proof of explicit consent from the actual data subject.
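To make concrete what this means in engineering terms, here is a minimal, purely hypothetical sketch of the kind of pre-publication gate the ruling implies. Every name in it (AdSubmission, may_publish, the keyword list) is my own invention, not anything drawn from the judgment or from any real platform.

```python
# Illustrative sketch only: a pre-publication check of the kind the ruling
# implies. All names and thresholds are hypothetical.
from dataclasses import dataclass

SPECIAL_CATEGORY_TERMS = {"health", "religion", "sexual", "political", "ethnic"}

@dataclass
class AdSubmission:
    poster_id: str
    text: str
    has_photo_of_person: bool
    identity_verified: bool          # e.g. an ID document checked by the platform
    explicit_consent_on_file: bool   # consent recorded from the pictured/named data subject

def may_publish(ad: AdSubmission) -> tuple[bool, str]:
    """Decide whether an ad clears the (hypothetical) pre-publication gate."""
    mentions_special_category = any(term in ad.text.lower() for term in SPECIAL_CATEGORY_TERMS)
    if mentions_special_category or ad.has_photo_of_person:
        # The ruling's logic: before anything goes live, the platform must know
        # who is posting and that the data subject actually consented.
        if not ad.identity_verified:
            return False, "identity verification required before publication"
        if not ad.explicit_consent_on_file:
            return False, "explicit consent of the data subject required"
    return True, "ok"
```

Notice what the gate itself requires: ID documents and consent records the platform never needed to hold before. The compliance measure is itself a new trove of personal data.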
On top of all this, platforms are expected to implement technical measures to prevent these ads from being scraped and republished elsewhere, something that seems close to impossible at the moment. And just to make sure platforms can’t escape these obligations, the Court ruled they cannot rely on the e-Commerce Directive’s safe harbour provisions to avoid GDPR compliance. This reasoning would logically extend to the Digital Services Act’s equivalent provisions as well.
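For a sense of why “prevent scraping” is such a tall order, here is a sketch of one common defensive measure, a per-client sliding-window rate limiter. It is illustrative only (the names and thresholds are arbitrary), and it is exactly the kind of control a determined scraper defeats by rotating IP addresses and mimicking ordinary browsers.

```python
# Minimal sketch of one common anti-scraping measure: per-client rate limiting.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60           # length of the sliding window
MAX_REQUESTS_PER_WINDOW = 30  # arbitrary illustrative threshold

_request_log: dict[str, deque] = defaultdict(deque)

def allow_request(client_ip: str) -> bool:
    """Return True if this client is still under the sliding-window limit."""
    now = time.monotonic()
    log = _request_log[client_ip]
    # Discard timestamps that have fallen outside the window.
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()
    if len(log) >= MAX_REQUESTS_PER_WINDOW:
        return False  # likely automated scraping: throttle or challenge
    log.append(now)
    return True
```

Even stacked with CAPTCHAs, bot detection and licensing terms, none of this guarantees an ad will never be copied; once a single copy is made, the platform has lost control of it, which is why the obligation reads more like an aspiration than an achievable engineering requirement.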
General Monitoring
This new approach appears to conflict with Article 15 of the e-Commerce Directive, which explicitly prohibits Member States from imposing general monitoring obligations on platforms. The Court addressed the tension in a single sentence, stating that these obligations ‘cannot, in any event, be classified as such a general monitoring obligation’ [6]. With respect to the Court, declaring that systematically checking every single advertisement before publication does not constitute ‘general monitoring’, without explaining why, is not a convincing argument.
That semantic move does not hold up under scrutiny. The ruling makes clear that platforms must screen every ad, verify every identity, and scan for sensitive data across their entire service before anything goes live. Whatever the Court chooses to call it, that is general monitoring. By requiring such extensive pre-screening, the Court is effectively mandating the very oversight Article 15 was meant to prevent.
The Death of the Old Internet
These requirements demand broader monitoring of user content and the collection and verification of additional personal information, which will chill speech in digital spaces. They pretty much kill throwaway accounts: if you need ID verification to post an ad or comment, anonymity is dead. (In)famous spaces online like Reddit’s “Am I the asshole” would see far less activity if accounts were connected to real identities, even if those identities weren’t publicly revealed. People share messy divorces, workplace conflicts, and family drama precisely because no one knows who they are. Take that away, and those conversations disappear.
There’s a notable tension here. The GDPR derives from Articles 7 and 8 of the EU Charter of Fundamental Rights [7], which protect the right to private and family life and the right to the protection of personal data. These provisions were established to safeguard individuals from intrusive surveillance. Yet the practical effect of Russmedia is to require systematic monitoring of user-generated content online. Privacy protection now mandates privacy intrusion.
This speaks to something larger. One of the early promises of the internet was the ability to reinvent yourself, to shed your offline identity and be someone else entirely. Pseudonymous Wikipedia editors, anonymous whistleblowers, and people seeking advice in support forums without judgment all relied on being able to participate without connecting their real names to their words. That’s dying. The internet has centralised to a handful of platforms, content has become increasingly sanitised and algorithm-optimised (or, if we’re being honest, enshittified), and the sense of freedom that once existed is fading. Russmedia accelerates this. The requirement for identity verification before publication doesn’t just affect classified ad sites; it sets a precedent that could reshape how we interact online, trading the messy, anonymous internet of the past for something cleaner, more controlled, and far less free.
Conclusion
It feels like we are losing sight of the forest for the trees. Ireland has issued approximately €3.5 billion in fines since May 2018, whilst total EU-wide GDPR fines have reached around €6.7 billion as of December 2025 [8]. Companies that have been fined billions for excessive data collection and inadequate transparency will now be required to collect additional personal data through identity verification systems and to build monitoring infrastructure. The very practices behind some of those fines are now, in effect, being mandated as compliance measures. It is a reminder that the road to hell is paved with good intentions.
1. Case C-492/23, Russmedia Digital
2. Directive 2000/31/EC (e-Commerce Directive)
3. GDPR Article 24 (Responsibility of the controller)
4. GDPR Article 28 (Processor)
5. GDPR Article 9 (Processing of special categories of personal data)
6. Case C-492/23, Russmedia Digital, para 132
7. Charter of Fundamental Rights of the European Union, Articles 7 and 8
8. GDPR Enforcement Tracker, www.enforcementtracker.com
