Europe’s CSAM Scanning Plans May Violate International Law

By Olivia William
Writer, Information Security Buzz | May 09, 2023 07:34 am PST

According to reports, the EU’s own legal experts have warned that plans to force tech companies to scan customers’ private chats for child sexual exploitation and abuse (CSEA) content are likely to be struck down by the courts.

The proposed “chat control” rules resemble Clause 110, a contentious provision of the UK’s Online Safety Bill. Under both, providers of end-to-end encrypted services could be issued “detection orders” mandating pre-encryption scanning of customers’ messages for CSEA content.

For this purpose, “client-side scanning” technology would be used to compare media such as videos, photos, and text against a blacklist of known material before encryption takes place.
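To illustrate the general idea only (not any specific vendor’s system), here is a minimal Python sketch of blacklist matching on the client. Real deployments rely on proprietary perceptual hashes such as PhotoDNA rather than exact cryptographic hashes, and the blacklist contents and function names below are purely hypothetical.

```python
import hashlib

# Hypothetical blacklist of known-bad content fingerprints.
# Real systems distribute perceptual-hash databases (e.g. PhotoDNA);
# SHA-256 is used here only to keep the sketch self-contained.
BLACKLIST = {
    hashlib.sha256(b"example-known-bad-payload").hexdigest(),
}

def scan_before_encrypting(payload: bytes) -> bool:
    """Return True if the payload matches the blacklist.

    In a client-side scanning design, this check runs on the
    user's device *before* the message is end-to-end encrypted,
    which is why critics argue it undermines the encryption itself.
    """
    return hashlib.sha256(payload).hexdigest() in BLACKLIST

if __name__ == "__main__":
    message = b"hello, world"
    if scan_before_encrypting(message):
        print("match: message would be flagged for review")
    else:
        print("no match: message proceeds to encryption")
```

An exact-hash check like this is trivially evaded by changing a single pixel, which is why deployed systems use fuzzy perceptual matching instead; that fuzziness, in turn, is the source of the false-positive concerns discussed below.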

Relatedly, WhatsApp and Signal have said that users’ privacy and security would be compromised by the Online Safety Bill.

However, the legal office of the EU Council has reportedly cautioned in a leaked opinion that the measures represent a “particularly serious limitation to the rights to privacy and personal data,” and that there is a “serious risk” of them being struck down by judges.

The European Court of Justice has previously ruled that even communications metadata may be screened only for national-security purposes, so the far broader scanning of message content for CSEA envisaged by the plans may not pass muster.

The legal opinion, reported by The Guardian, warns that the measures would require screening that is both general and indiscriminate of the data processed by a given service provider. Furthermore, they would apply uniformly and without distinction to everyone who uses that service, with no exceptions allowed.

Privacy advocates see many problems with client-side scanning. Researchers have determined that it may produce an excessive number of false positives and is vulnerable to various forms of attack.

Client-side scanning could also put sensitive information at risk if it were exploited by hostile governments or foreign hackers.
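As a toy illustration of why fuzzy matching invites false positives, consider a simplified “average hash” in Python, a stand-in for real perceptual hashes whose details are proprietary. Quantizing each pixel against the image mean can map visibly different inputs to the same fingerprint; the images and hash function here are purely illustrative.

```python
# Toy "average hash": threshold each pixel against the image mean.
def average_hash(pixels: list[int]) -> tuple[bool, ...]:
    mean = sum(pixels) / len(pixels)
    return tuple(p > mean for p in pixels)

# Two clearly different 2x2 "images" (flattened grayscale values)...
img_a = [10, 200, 10, 200]   # high-contrast stripes
img_b = [90, 130, 90, 130]   # low-contrast stripes

# ...yet their perceptual hashes collide: a false positive.
assert average_hash(img_a) == average_hash(img_b)
print("collision:", average_hash(img_a))
```

Researchers have demonstrated comparable collision attacks against production perceptual-hash systems, meaning innocuous content can be crafted, deliberately or by accident, to match a blacklist entry.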

If client-side scanning becomes mandatory, offenders will likely migrate to unpoliced apps, as happened previously with services like EncroChat.

In the future, the same technique could be repurposed to covertly monitor a much wider variety of user-generated content.

It’s worth noting that the UK’s client-side scanning measures would also reduce security for domestic businesses and customers, and that the heads of several prominent messaging applications have publicly indicated they’d rather leave the country than comply with them.

Lawyers in the EU are reportedly also concerned that the bloc’s proposals would lead to widespread profiling of individuals, including their biometric information, by requiring messaging companies to implement age verification.

Conclusion

A legal opinion on the controversial European Union legislative plan, proposed last May by the Commission to combat child sexual abuse online by requiring platforms to scan for abuse and grooming, suggests the plan is incompatible with EU laws that prohibit general and indiscriminate monitoring of people’s communications. The Council’s legal advice on the proposed Child Sexual Abuse Regulation (also known as “chat control”), which leaked online this week, finds that the regulation as drafted violates fundamental European rights, including privacy and data protection, freedom of expression, and the right to respect for private and family life, as critics have warned.

The Commission argued that the approach is legal since it would apply only “targeted” and “proportionate” restrictions to platforms where online child sexual exploitation is a problem, combined with “robust conditions and safeguards”. The legal opinion demolishes that defense: it finds it “highly probable” that a judicial review of the regulation’s detection orders, which require platforms to scan for child sexual abuse material (CSAM) and related activity such as grooming, would conclude that the screening obligations are “general and indiscriminate” rather than targeted and proportionate, as EU law requires.

The Council’s legal guidance notes that while the Commission’s detection orders are “targeted” at risky platforms, they do not target specific users, and therefore still require “general screening” of everyone who uses the service.
