Should social media companies such as Facebook continue to scan for evidence of child sexual abuse, exploitation and grooming on their platforms?
That question is at the heart of an explosive, polarised argument in Europe over online privacy, and everyone is being roped in: from actor-turned-tech investor Ashton Kutcher, to the EU’s top privacy regulator, to online safety experts as far afield as Australia, and also ourselves.
What’s it all about?
On one side there is the EU’s executive branch and its defenders, including Kutcher, who want such automatic digital-DNA image scanning to continue. This tool enables investigators to identify known child sexual exploitation content. As new content comes to investigators’ attention, it is added to the search database.
We would put forward the argument that the use of these essential tools for scanning big data doesn’t infringe upon privacy at all. They are algorithms that essentially can’t “understand” the content; they seek out specific content, which, once identified, is flagged for human review. The scanning matches the digital fingerprint, or photo DNA, of known child pornography, and also seeks to identify certain related keywords or terminology.
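To make concrete why such scanning involves no “understanding” of content, here is a minimal sketch of the fingerprint-matching idea. Real systems such as Microsoft’s PhotoDNA use proprietary perceptual hashes that survive resizing and re-encoding; this sketch substitutes an ordinary cryptographic hash, and the database entries and keyword list are purely hypothetical placeholders.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# (Stand-in for a PhotoDNA-style hash list maintained by investigators;
# new fingerprints are added as new content comes to light.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

# Placeholder list of flagged terms (real lists are curated by experts).
FLAGGED_KEYWORDS = {"examplekeyword1", "examplekeyword2"}


def scan_image(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint matches a known entry.

    The algorithm never interprets the image; it only compares a
    fingerprint against a database, then defers to human review.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES


def scan_text(message: str) -> bool:
    """Return True if the message contains any flagged keyword."""
    return bool(set(message.lower().split()) & FLAGGED_KEYWORDS)
```

A match does not remove content automatically; it merely raises a flag so that a human reviewer (or law enforcement) can examine the item, which is the design choice at the centre of the privacy debate.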
On the other side of this argument are privacy activists, EU lawmakers and the bloc’s top privacy regulator, all of whom argue that automatic scanning, particularly of text exchanges, is a major infringement of people’s fundamental right to privacy. They go further, suggesting that even if its intent is limited, it still opens the door to abuse because the practice has no clear legal basis.
On 11 November last, the European Data Protection Supervisor blasted a Commission proposal that would allow the scanning as contrary to EU privacy rules. And the Parliament’s rapporteur on the draft law, Birgit Sippel, has voiced concern, saying Parliament is unlikely to meet a December 21 deadline to pass the derogation into law.
Now Ylva Johansson, the EU’s home affairs commissioner who is behind the derogation initiative, is pushing back — with unexpected support from Kutcher, who co-founded an organization called Thorn in the U.S. to combat child sex trafficking and abuse.
In an interview with POLITICO, she said the European Data Protection Supervisor (EDPS) Wojciech Wiewiórowski — who’s in charge of policing EU institutions — had ignored children’s well-being.
“What I’m criticizing is that the EDPS are only talking about the privacy of the users. But there is also the privacy of the children, the abused children who are the subject of illegal content … The EDPS left that whole angle out,” she said.
“I had expected the EDPS to help us with that. Instead, he [Wiewiórowski] acted a bit blind in one eye, not seeing there is a huge infringement of the fundamental rights of those children. You have to realize there is a balance to find, and not only to protect the rights of the users.”
Johansson’s comments come as the clock is ticking down on a deal within EU institutions. If lawmakers can’t agree on Johansson’s draft law, platforms will face new privacy rules without an exemption for child sexual abuse material — rendering the automatic scanning illegal.
But the Swedish commissioner argued that it was urgent to give them a chance to carry on the practice, which she says is already in use to detect copyright-infringing material.
According to the head of Europol, who spoke to POLITICO in March, there has been a substantial increase in examples of child exploitation online during the pandemic because kids are spending more time on their phones and computers during lockdown.
Johansson said that trend hasn’t let up: “There are a lot of signs that child exploitation, especially online, is growing.” She added that her office planned to propose permanent legislation to combat child sexual abuse online next year, but that in the meantime platforms needed a legal means to keep detecting the illegal content.
“That’s what I hope now: that Parliament will not follow the draft from the rapporteur [which watered down Johansson’s proposal] and rather opt for an opinion that is much closer to the [Commission’s] proposal,” she said.
With emotions running high on either side of the debate, the issue of automatic scanning has drawn attention far beyond the bloc.
Australia’s eSafety commissioner, tasked with protecting people online, has written to the Parliament’s civil liberties committee, which has the lead on the file, advocating for Johansson’s proposal.
Ashton Kutcher has thrown himself into the mix, tweeting directly at EU lawmakers on the issue.
“Time is running out to ensure a proactive and voluntary online child abuse detection methods are preserved in the #EU,” he tweeted on Wednesday.
Kutcher’s star power has opened doors. Last week, he secured a videoconference with Commission President Ursula von der Leyen, and Johansson cited him as proof that the scanning issue was one of global importance.
Is this legal?
But the other camp bristles at the outside interventions and time pressure. Defenders of the derogation, they argue, are not only oversimplifying the issue; they also risk creating a precedent that would allow platforms to flag and remove all manner of content, some of it harmless, without any solid legal grounding.
Rather than opposing a clampdown on online child sexual abuse, they favor an approach they say would be more in line with the bloc’s privacy rulebook, the General Data Protection Regulation. Sippel, for instance, objects to the part of Johansson’s draft law that pertains to child grooming — i.e., text or audio communication — not the part that pertains to child pornography, which she wants more clearly defined.
The Commission “does not wish to take a stance on whether current voluntary practices to detect and report child sexual abuse material are in fact legal under EU law,” Sippel said in her draft report on Johansson’s proposal.
The Commission wants its proposal to be finalized by December 21, but some lawmakers dismissed the deadline as artificial, since scanning would not stop overnight without the derogation.
But David Lega, who heads a Parliament group on children’s rights, says a deal is not only necessary but possible within the time limit.
“I think it [the deadline] could be met and I hope that it will be,” he said. “There is time both procedurally and legally to do this now.”
The derogation is meant to apply until the European Commission presents a fully fledged piece of legislation on the fight against sexual abuse online next year.
Ultimately, if police agencies around the world are prevented from using these tools to identify victims of online child sexual abuse and exploitation, then countless children may never be found. This is completely contrary to a child’s basic human rights.
One can only imagine the dire consequences for millions of children, if online sexual predators were to be afforded the sanctuary of a fully encrypted platform. This would fling the ability to investigate such crimes back to the Dark Ages.
Hence why it is so important for this polarised argument to receive the public attention it deserves. From our perspective, privacy, while a right, isn’t absolute, especially when the health, safety and welfare of a child is put in mortal danger with increasing frequency, as witnessed during the Covid-19 lockdown.