Tech companies should do much more to curb the spread of child abuse material online, if the European Commission has its way. No sane person would dispute that the sharing of texts and images of abuse must be stopped. But critics say the way Brussels wants to achieve this has profound implications for the privacy of all European citizens.
Under the proposal, chat services, hosting providers and other technology platforms would have to analyze user messages to check whether, for example, child pornography is being shared. It is a noble goal, but the plan has drawn heavy criticism from privacy advocates. Many chat services use end-to-end encryption, meaning messages are visible only to the sender and the recipient. Even the makers of the apps cannot read the content of the messages.
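To illustrate what end-to-end encryption means in practice, here is a minimal sketch using the PyNaCl library. The library choice and the key names are illustrative assumptions; real chat apps layer full protocols such as Signal's, with forward secrecy and key verification, on top of this basic idea.

```python
# Minimal sketch of end-to-end encryption with the PyNaCl library.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private key never leaves the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_private, bob_private.public_key).encrypt(
    b"Only Bob can read this."
)

# The chat server relays `ciphertext` but sees only random-looking bytes.
# Decryption requires Bob's private key, which the server never holds.
plaintext = Box(bob_private, alice_private.public_key).decrypt(ciphertext)
assert plaintext == b"Only Bob can read this."
```

The point of the design is in the last two lines: because the server never holds a private key, neither the provider nor anyone compelling the provider can read the message in transit.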
According to Europol, among others, the spread of child abuse material via encrypted chat apps is detected less and less often. Currently, only the victim or the abuser can report the crime. Metadata can still be read, which shows when contact between people took place, but the content of the messages remains hidden.
The European Commission writes in its proposal that encryption is “necessary for cybersecurity and the protection of fundamental human rights such as freedom of expression, privacy and personal data”. At the same time, encryption makes detecting child pornography difficult, if not impossible. Brussels says this creates a haven for criminals, who can freely share illegal images.
There are various ideas for detecting the spread of child abuse material on the internet. With a technique known as client-side scanning, it would still be possible to check conversations for illegal content. Databases of known abuse images are used for this: fingerprints of those images (or texts) can be compared fully automatically with material shared in chats, even when the chats themselves are encrypted, because the check runs on the user’s device before a message is encrypted.
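To make the idea concrete, here is a minimal sketch of hash-based client-side scanning. The database and function names are hypothetical; real systems use perceptual hashes (such as Microsoft’s PhotoDNA) so that resized or recompressed copies still match, whereas the plain SHA-256 used here for simplicity only matches exact copies.

```python
# Minimal sketch of hash-based client-side scanning; names and the
# example database are hypothetical, for illustration only.
import hashlib

# The provider ships hashes of known illegal images, never the images.
KNOWN_HASHES = {
    hashlib.sha256(b"<bytes of a known illegal image>").hexdigest(),
}

def matches_known_material(attachment: bytes) -> bool:
    """Runs on the sender's device, before the message is encrypted."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

# A match would be flagged or reported; everything else is encrypted
# and sent as usual, which is why proponents say encryption is untouched.
if matches_known_material(b"<bytes of a known illegal image>"):
    print("match found on device")
```

Critics’ objection, which Zenger spells out below, is that such a check necessarily runs against every image every user sends, not only suspect ones.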
Searching through everything to find abuse
Rejo Zenger of the digital civil rights movement Bits of Freedom says it is not possible for tech companies to track down only images of abuse. “Compare it to a card catalog. Suppose you are looking for pictures of an apple and a banana. Then you have to look at all the cards in the catalog to see which ones show an apple or a banana.”
Other critics have spoken out as well. WhatsApp CEO Will Cathcart wrote on Twitter that he was disappointed that the bill “does not protect encryption”. Cryptographer Matthew Green of Johns Hopkins University described the proposal as “the scariest thing he had ever seen”. “Let me clarify what that means: algorithms are going to read your messages to understand what you are saying.”
No one doubts the European Commission’s good intentions in combating the spread of child abuse material. “But they are not well informed,” says Zenger. “This proposal is technically unworkable and violates European privacy legislation.”
“The openness of the internet is in danger”
Zenger says the privacy of the internet is at stake. “If this goes through, you can no longer trust that no one is reading your encrypted chats. People will express themselves more cautiously. That open character is very important, precisely for children too.”
There is also the risk that authorities will later want to search for other (criminal) content. “This kind of infrastructure is always set up with lofty goals,” says Zenger. “We have seen it before, for example with license-plate cameras that were supposed to tackle human trafficking at the border. A few years later they turned out to be useful for detecting other types of crime, and now that is what they are used for.”
There are alternatives, Zenger argues. “Sex-offense cases often sit on the shelf because the police are understaffed. And most illegal images are hosted on European servers. Why don’t we put more energy into tackling that?”
Source: NU