EU rules planned to protect children online are an attack on privacy, critics warn

Critics have accused the European Commission of seeking to end encrypted communications after the EU’s executive body unveiled tough regulations for messaging apps intended to tackle the spread of child sexual abuse images.
Under the proposed regulations, messaging services and hosting providers would be required to search for and report any child sexual abuse material, even on encrypted messaging services such as Apple’s iMessage and Facebook’s WhatsApp, which cannot be scanned in this way.
“The detection, reporting and removal of online child sexual abuse is urgently needed to prevent the sharing of child sexual abuse images and videos, which re-traumatizes victims often years after the abuse has ended,” Ylva Johansson, the European commissioner for home affairs, said on Thursday.
“Today’s proposal establishes clear obligations for businesses to detect and report child abuse, with strong safeguards guaranteeing the privacy of everyone, including children,” she added.
Andy Burrows, head of online child safety at the National Society for the Prevention of Cruelty to Children (NSPCC), welcomed the proposals, which he described as an “impressive and ambitious” attempt to “systematically prevent preventable child abuse and grooming, which is taking place at record levels.”
He added: “If approved, it will place a clear mandate on platforms to address abuse wherever it occurs, including in private messaging where children are most at risk.”
But the plans have also been criticized as an invasion of privacy. “Private companies would be responsible not only for finding and stopping the distribution of known child abuse images, but could also be required to take action to prevent ‘grooming’ or alleged future child abuse,” said Joe Mullin of the Electronic Frontier Foundation (EFF), a privacy group.
“It would be a massive new surveillance system, as it would require an infrastructure for detailed analysis of user messages.”
The regulation does not technically require providers to completely disable end-to-end encryption. Instead, it allows them to apply automated techniques that scan for abusive material on users’ devices, a compromise that proponents say preserves privacy while helping to deter child abuse.
It is a similar approach to one proposed, but indefinitely postponed, by Apple, which suggested scanning photos on users’ devices before they were uploaded to its cloud storage.
Apple’s approach, however, would only have checked photos against known, existing child abuse images. The commission’s proposal includes a requirement to detect newly created images, and even to identify active grooming, which is much harder to achieve with existing automated systems without compromising privacy.
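To illustrate the distinction, matching against known material typically means comparing a fingerprint of each photo with a list of hashes of previously identified images, whereas spotting new images or grooming requires a classifier to make judgments about content it has never seen. Below is a minimal sketch of the hash-matching idea only, assuming a hypothetical list of known hashes and using an exact SHA-256 digest for simplicity; deployed systems such as PhotoDNA or Apple’s proposed NeuralHash use perceptual hashes that tolerate resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of previously identified abuse images.
# Real systems hold perceptual hashes in a vetted database, not
# exact SHA-256 digests hard-coded like this.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_material(path: Path) -> bool:
    """True if the photo's digest matches a hash of known material.

    This only catches exact copies of already-identified images;
    detecting newly created images or grooming behaviour would need
    a trained classifier, which is the harder problem raised by the
    commission's proposal.
    """
    return file_digest(path) in KNOWN_HASHES


if __name__ == "__main__":
    # Check every JPEG in a local "photos" folder (illustrative path).
    for photo in Path("photos").glob("*.jpg"):
        if matches_known_material(photo):
            print(f"{photo} matches known material")
```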
It also suggests that app stores, such as those run by Apple and Google, should be involved in enforcing the requirements, removing apps that do not meet the scanning obligations.
Patrick Breyer, a German MEP from the Pirate Party, called the proposals “fundamental rights terrorism against trust, self-determination and security on the internet”.
He added: “The proposed chat control threatens to destroy the digital privacy of correspondence and secure encryption.
“Scanning of personal cloud storage would result in mass surveillance of private photos. Mandatory age verification would put an end to anonymous communication.”
The proposal does not include a duty on enforcement agencies to report and delete known abusive material online without delay, nor does it set European standards for effective prevention measures, victim support and counselling, or effective criminal investigation.
“This Big Brother attack on our cell phones, private messages and photos with the help of error-prone algorithms is a giant leap towards a Chinese-style surveillance state,” Breyer said.