Wednesday, February 8, 2023

EU law will require Big Tech to do more to combat child sexual abuse, but a key question remains: How?

The European Commission recently proposed regulations to protect children by requiring technology companies to scan their systems for child sexual abuse material. This is an extraordinarily ambitious effort that would have far-reaching implications beyond the borders of the European Union, including in the U.S.

Unfortunately, the proposed regulations are, for the most part, technologically unworkable. To the extent that they could work, they require breaking end-to-end encryption, which would enable the technology companies – and potentially governments and hackers – to see private communications.

The regulations, proposed on May 11, 2022, would impose several obligations on tech companies that host content and communications, including social media platforms, texting services and direct messaging apps, to detect certain categories of images and text.

Under the proposal, these companies would be required to detect previously identified child sexual abuse material, new child sexual abuse material, and solicitations of children for sexual purposes. Companies would be required to report detected content to the EU Centre, a centralized coordinating entity that would implement the proposed regulations.

Each of these categories presents its own challenges, which together make it impossible to implement the proposed regulations as a package. The tension between protecting children and protecting user privacy underscores how combating online child sexual abuse is a “wicked problem.” This puts technology companies in a difficult position: required to comply with regulations that serve a laudable goal, but without the means to do so.

Digital fingerprints

Researchers have known for more than a decade how to detect previously identified child sexual abuse material. This method, first developed by Microsoft, assigns a “hash value” – a sort of digital fingerprint – to an image, which can then be compared against a database of previously identified and hashed child sexual abuse material. In the U.S., the National Center for Missing & Exploited Children manages several databases of hash values, and some technology companies maintain their own hash sets.

The hash values of images uploaded or shared using a company’s services are compared with these databases to detect previously identified child sexual abuse material. This method has proven to be extremely accurate, reliable and fast, which is critical to making any technical solution scalable.
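The matching step itself can be sketched in a few lines. This is a simplified illustration: production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below matches only exact byte-for-byte copies, and the database contents here are hypothetical placeholders.

```python
import hashlib

# Hypothetical database of hash values for previously identified images.
# Real systems use perceptual hashes robust to edits; SHA-256 is used here
# only to illustrate the compare-against-a-database workflow.
known_hashes = {
    hashlib.sha256(b"previously identified image bytes").hexdigest(),
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears in the hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_material(b"previously identified image bytes"))  # True
print(matches_known_material(b"some new, unseen image"))             # False
```

Because the comparison is a simple set lookup, it stays fast even against databases of millions of known hashes, which is what makes the approach scalable.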

The problem is that many privacy advocates consider it incompatible with end-to-end encryption, which, strictly construed, means that only the sender and the intended recipient can view the content. Because the proposed EU regulations mandate that technology companies report any detected child sexual abuse material to the EU Centre, the scanning would violate end-to-end encryption, forcing a trade-off between effective detection of the harmful material and user privacy.

[Video: How end-to-end encryption works, and which popular messaging apps use it.]

Recognizing new harmful material

In the case of new content – that is, images and videos not included in hash databases – there is no such proven technical solution. Top engineers have been working on this issue, building and training AI tools that can process large amounts of data. Google and the child safety nonprofit Thorn have both had some success using machine-learning classifiers to help companies identify potential new child sexual abuse material.

However, without independently verified data on the tools’ accuracy, it is not possible to assess their usefulness. Even if their accuracy and speed are comparable to hash-matching technology, the mandatory reporting would again break end-to-end encryption.

New content also includes livestreams, but the proposed regulations seem to overlook the unique challenges this technology poses. Livestreaming became ubiquitous during the pandemic, and the production of child sexual abuse material from livestreamed content has increased dramatically.

More and more children are being groomed or coerced into livestreaming sexually explicit acts, which the viewer may record or screen-capture. Child safety organizations have noted that the production of “perceived first-person child sexual abuse material” – that is, material that appears to be self-produced selfies – has grown at exponential rates in recent years. In addition, traffickers may livestream the sexual abuse of children for offenders who pay to watch.

Though the circumstances that lead to recorded and livestreamed child sexual abuse differ greatly, the technology is the same. And there is currently no technical solution that can detect the production of child sexual abuse material as it occurs. Tech safety company SafeToNet is developing a real-time detection tool, but it is not yet ready for deployment.

Detecting solicitations

Detection of the third category, “solicitation language,” is also fraught. The tech industry has made dedicated efforts to develop indicators for identifying solicitation and enticement language, but with mixed results. Microsoft spearheaded Project Artemis, which led to the development of the Anti-Grooming Tool. The tool is designed to detect the grooming and solicitation of a child for sexual purposes.

However, as the proposed regulations point out, the accuracy of this tool is 88%. In 2020, the popular messaging app WhatsApp delivered roughly 100 billion messages daily. If the tool flags even 0.01% of those messages as “positive” for solicitation language, human reviewers would be tasked with reading 10 million messages every day to identify the 12% that are false positives – making the tool simply impractical.
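The scale problem is easy to verify with a back-of-the-envelope calculation using the figures above (100 billion daily messages, a 0.01% flag rate, and 12% false positives at 88% accuracy):

```python
# Back-of-the-envelope math for the human-review burden described above.
daily_messages = 100_000_000_000       # ~100 billion WhatsApp messages per day (2020)
flagged = daily_messages // 10_000     # 0.01% flagged as "positive"
false_positives = flagged * 12 // 100  # 12% of flags are wrong at 88% accuracy

print(f"{flagged:,} messages flagged per day")        # 10,000,000 messages flagged per day
print(f"{false_positives:,} false positives per day") # 1,200,000 false positives per day
```

Even a vanishingly small flag rate at that volume produces millions of messages for human review every day, well over a million of them wrongly flagged.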

As with all of the detection methods above, this too would break end-to-end encryption. But whereas the others may be limited to reviewing a hash value of an image, this tool requires access to all exchanged text.

No path forward

It is possible that the European Commission is taking such an ambitious approach in hopes of spurring technical innovation that would lead to more accurate and reliable detection methods. Without existing tools that can accomplish these mandates, however, the regulations are ineffective.

When there is a mandate to act but no way to act, I believe the disconnect will simply leave the industry without the clear guidance and direction these regulations are intended to provide.
