
Do we have to sacrifice privacy to keep children safe? An interview with Ella Jakubowska

Can we protect children online without turning the internet into a surveillance tool? The EU’s proposed Child Sexual Abuse (CSA) regulation has sparked intense debate. Supporters call it a crucial step against online child exploitation, while critics warn that this regulation could weaken encryption, undermine privacy, and put everyone’s security at risk. Finding common ground remains a challenge.

Mar 20, 2025


Meet Ella Jakubowska, Head of Policy at European Digital Rights (EDRi), who believes that the regulation may do more harm than good. She argues that while protecting children is crucial, breaking encryption and monitoring private messages aren’t the answer. In this interview, she explains why this proposal is so dangerous, what’s at stake for everyday users, and how we can fight child abuse without sacrificing digital privacy.

Note: The original interview has been edited with permission from the interviewee.

You've been working on the EU's proposed Child Sexual Abuse (CSA) regulation. For those unfamiliar with it, could you briefly explain its goals and why it's currently a major topic of debate?

“This law was proposed a couple of years ago by the European Commission. Its stated aim is to scan people’s private messages — WhatsApp messages, emails, Facebook Messenger chats, Signal messages, and so on — to detect child sexual abuse material and report it to law enforcement.”

“This material is of course illegal, it’s a terrible crime, and it has profoundly harmful consequences for the victims. It’s not just legitimate but absolutely necessary for the EU to act against this horrific crime. The problem, however, is with the measures being proposed purportedly to achieve this aim.”

“These measures are not technically feasible. The proposal wasn’t developed in consultation with tech experts but was instead drafted without much regard for how technology actually works. Worse, it wouldn’t even be effective in its goal of protecting children.”

“Essentially, the regulation proposes measures requiring platforms like Facebook, Google, or Signal to scan all private messages of their users — just in case someone is doing something illegal. This approach undermines the EU’s foundational principle of the presumption of innocence. Even more concerning is the requirement to scan encrypted messages.”

“As the technical community knows, you cannot scan encrypted messages without breaking the encryption itself. Once you do that, the messages are no longer protected — whether from governments, hackers, child abusers, or other malicious actors who might exploit this access.”

What specific aspects of the CSA proposal concern you the most?

“From the beginning, we had three major reasons why we opposed ‘chat control.’”


Chat control

Chat control is a term used to describe the EU’s proposed Child Sexual Abuse (CSA) regulation, which would require online platforms to scan all private chats, messages, and emails for illegal content, including child abuse material.


“The first is that it treats everyone in the EU as a suspect. The regulation suggests that people’s private communications could be scanned on a massive scale, which undermines the presumption of innocence. Once you start treating everyone as a suspect, you’ll need to look through enormous volumes of material.”


“We know that technology isn’t perfect. It generates errors all the time. AI-based scanning of everyone’s private messages is bound to make mistakes. Even if an AI system were theoretically 99.99% accurate, with billions of messages being sent daily, that small error margin would still amount to millions of wrongly flagged images every day.”

“This floods already under-resourced law enforcement with false reports. A picture of a child playing on the beach or even a kitten could be wrongly flagged. The police, who should be on the front lines protecting children, are instead stuck behind computers, sorting through these false reports.”
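To make the scale of that error margin concrete, here is a back-of-the-envelope sketch in Python. The daily message volume and the accuracy figure are illustrative assumptions, not numbers from the interview:

```python
# Rough estimate of daily false positives from mass message scanning.
# Both inputs are illustrative assumptions, not official figures.
daily_messages = 100_000_000_000    # assume ~100 billion messages sent per day
accuracy = 0.9999                   # hypothetical 99.99% accurate classifier
false_positive_rate = 1 - accuracy  # 0.01% of clean messages wrongly flagged

false_positives_per_day = daily_messages * false_positive_rate
print(f"Wrongly flagged messages per day: {false_positives_per_day:,.0f}")
# Wrongly flagged messages per day: 10,000,000
```

Even under these generous assumptions, a 0.01% error rate yields roughly ten million false flags per day, each of which a human reviewer would need to assess.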

“The second issue is encryption. For decades, governments have wanted to break encryption, arguing that encrypted messages make crime difficult to police. Recently, though, there’s been a concerted push from European police to position encryption as something bad, suggesting only ‘bad guys’ use it to hide their activities.”


“Encryption is crucial for safeguarding our human rights in the digital age. It's a fundamental part of our right to privacy, just like VPNs and other privacy-preserving tools. Encryption ensures our right to remain anonymous in digital public spaces, unless we've done something wrong.”

“This is vital for everyone, but especially for politicians, journalists, human rights defenders, doctors, lawyers, and even individuals seeking healthcare, particularly in contexts like reproductive rights.”

“Finally, the third issue with the CSA regulation is the idea of mandatory age verification. The proposal suggests that many platforms and services operating in the EU should use technical means to verify the ages of their users. This raises major concerns.”


“Mandatory age verification could lock people out of the internet entirely: those without identity documents, such as undocumented individuals, and those without the right smartphone tools. Even for people who do have access to the right tools, it would require processing sensitive data.”

“There’s also the question of who gets to decide what’s ‘age-appropriate.’ It’s a slippery slope. The idea of mandatory age verification could even involve using facial recognition to determine if someone is old enough to use a service. We’ve fought against biometric mass surveillance for years, and this proposal feels like a step toward that.”

Why do you believe there’s no safe way to implement this regulation without compromising encryption?

“On the most basic technical level, if you insert a third party into an encrypted message between two people, it’s no longer end-to-end encrypted. It doesn’t matter where you try to insert this ‘back door’ or ‘front door’ into the message chain.”

“You’re breaking the very promise of encryption, which is that only the sender and recipient can access the information. Introducing a third party creates a vulnerability — a cybersecurity risk — that can let malicious actors in. The technology is built on this premise, and altering it undermines security for everyone relying on it.”
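To illustrate the promise Jakubowska describes, here is a minimal sketch of public-key end-to-end encryption using the PyNaCl library (Python bindings for libsodium). The participants and keys are hypothetical, and this shows only the property at stake, not the proposal’s mechanism:

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Only a holder of one of the two private keys can read the message.
from nacl.public import PrivateKey, Box
from nacl.exceptions import CryptoError

alice_key = PrivateKey.generate()  # sender's key pair
bob_key = PrivateKey.generate()    # recipient's key pair
eve_key = PrivateKey.generate()    # a third party: scanner, hacker, anyone

# Alice encrypts for Bob with her private key and his public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"private message")

# Bob decrypts with his private key and Alice's public key.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"private message"

# A third party holding neither private key cannot decrypt.
try:
    Box(eve_key, alice_key.public_key).decrypt(ciphertext)
except CryptoError:
    print("Third party cannot read the message")
```

Any scanning step that could read the plaintext would need one of those private keys, or a copy of the message before encryption, and that access point is exactly the vulnerability described above.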

You mentioned that law enforcement is already overwhelmed with alerts. How could the CSA regulation make this issue worse?

“Some law enforcement agencies … have said that they already receive far more reports of child sexual abuse than they can handle. They spend a huge amount of time sifting through these reports, and a very conservative estimate suggests that if this regulation were to pass, they would receive at least double the number of reports they currently get from the internet, many of which would be false alerts.”

“For instance, police officers would be forced to go through family pictures, kitten photos, or consensual images that young people, such as two 16-year-olds, sent to one another. These cases would still need to be investigated, which creates a lot of unnecessary work and, at times, very invasive investigations into lawful and consensual sexual self-expression.”

What alternatives do you see to address this problem?

“Technology can assist in triaging reports, helping to identify whether an abuse material report matches one that has already been flagged, without requiring a human to review it. This would help speed up assessments and lessen the number of horrific images that professionals need to see.”
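The kind of triage described here is commonly done by hash matching against a database of already-confirmed material. The sketch below is simplified, using exact SHA-256 hashes and a hypothetical hash list; real systems such as Microsoft’s PhotoDNA use perceptual hashes so that resized or re-encoded copies still match:

```python
# Simplified hash-matching triage sketch. The known-hash list is
# hypothetical; real deployments use perceptual hashes (e.g. PhotoDNA)
# so altered copies of known material still match.
import hashlib

# Hashes of already-confirmed material (here: just the SHA-256 of b"foo").
known_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def triage(file_bytes: bytes) -> str:
    """Route a report: known matches need no fresh human review."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in known_hashes:
        return "match with known material: confirmed without human review"
    return "no match: queued for human assessment"

print(triage(b"foo"))  # matches the known hash above
print(triage(b"bar"))  # unknown content goes to a reviewer
```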

“However, the real issue is more systemic and societal. If we don’t have strong welfare systems that tackle the root causes of crime, justice systems that treat survivors with respect and dignity, or education systems that teach young people, parents, and carers properly, we’re not addressing the problem at its core. These are some of the fundamental issues that need to be addressed…”

Do you think the public understands the potential risks of this regulation, or does a lack of awareness obscure its impact on everyday internet use?

“I think very few people are aware of this proposed regulation. Germany might be an exception, but even there, the general public often doesn't have full knowledge of what ‘chat control’ could mean for their daily lives… To many, this kind of obscure law from Brussels seems far removed from their everyday concerns.”

“However, organizations like ours, EDRi, are trying to raise awareness before the law impacts people’s lives directly. It’s important to have a democratic voice in these discussions, and part of that is educating people about what's at stake.”

“We co-developed a poll with a research agency to ask young people how they'd feel about this regulation. We asked, for example, whether they'd feel comfortable expressing themselves politically or exploring their sexual identity if scanning were happening in the background. Overwhelmingly, the response was no — young people said they wouldn’t feel safe engaging in these activities under such surveillance.”

“In contrast, the European Commission conducted its own survey, which I find very biased. It asked an unrealistic, abstract question: ‘If you had to choose between protecting children or protecting privacy, what would you choose?’ Of course, most people said they would protect children because who wouldn’t?”

“This framing presents a false dichotomy. The European Commission presents it as though you have to choose between children and privacy, which isn’t the case… The truth is that some European countries are aware of the flaws in this law but feel pressured to support it to appear like they’re doing something for children. The need to be seen as taking action often trumps the need for balanced, thoughtful debate on the issue.”

Looking ahead, if policies like this regulation are implemented, how do you think they will shape the internet and digital freedom in the EU over the next decade?

“It’s really hard to predict exactly what would happen if a law like this is passed — especially because it’s asking companies to do something that isn’t technically possible. I sometimes compare it to telling doctors they’re now legally required to cure death. Nobody is allowed to die anymore. But that’s not something they — or anyone — can realistically do.”

“In this case, some companies providing private messaging services have already said they wouldn’t be able to comply with these rules while still protecting their users’ privacy and security. So if this law is passed, they may stop operating in the EU altogether. This could mean people here would lose access to encrypted communications entirely.”

“In the long term, this kind of regulation could break trust in the internet. People might stop feeling safe expressing their thoughts or sharing personal information online. Privacy would go from being a fundamental right to something people can no longer count on.”

As a final thought, what is the one key message you would like to share with our audience?

“Sometimes the challenges to our privacy feel overwhelming, and it can seem like there’s nothing we can do. But when I look at the progress we’ve made in recent years, I’m reminded that change is possible.”

“As a movement, we’ve been able to influence laws and push back against harmful surveillance practices. The CSA regulation, for example, has been in limbo for two and a half years — that's no small achievement. We’ve made sure governments listened to scientific and legal evidence.”

“So despite how daunting things may seem, all is not lost. There’s a powerful community working together to protect digital human rights, and we can continue to make a difference.”


Violeta Lyskoit

Violeta is a copywriter who is keen on showing readers how to navigate the web safely, making sure their digital footprint stays private.