As an EU citizen I didn't know exactly what that Chat Control thing was, so I web searched it:<p>> The EU wants to oblige providers to search all private chats, messages, and emails automatically for suspicious content – generally and indiscriminately. The stated aim: To prosecute child pornography. [1]<p>Yeah, that will go down well, a central government checking our private conversations for "suspicious content". Of course they would use the "think of the children" trope; they could also have gone with the "think of the bad terrorists" trope, but that would have been too American, too cowboy-ish, we need to feel special, we're Europeans, after all.<p>Short of some street protests I don't think we can actually stop this, and even then I have my very big doubts. It so happens that I live on the EU periphery (I still need to present my ID card if I want to travel to Budapest or anywhere further west), and it sickens me to see that my privacy depends on countries and electorates over which I have no say (like Germany, with all due respect to the Germans who still care about their privacy). Why should my privacy be trifled with because of decisions taken by people halfway across the continent with whom I have no direct connection and no shared past? Did they have a <i>Securitate</i>-like thing? Many of them didn't, and even for those that did (like those same Germans), it seems not to matter at this point; they're all too happy to see their private political conversations scrutinised 24/7.<p>F. that. The only viable solution I see for my country is an exit from the EU, but the money (still) coming in from Bruxelles is too good to give up for pesky political principles, so of course no serious politician from around these parts frames the problem that way.<p>[1] <a href="https://www.patrick-breyer.de/en/posts/messaging-and-chat-control/" rel="nofollow">https://www.patrick-breyer.de/en/posts/messaging-and-chat-co...</a>
The only "good" thing is that the current German government is very skeptical about this and two of the current governing parties (the Greens and the liberal party) have also long been opposed to more surveillance.<p>This was different with the previous government where the "law & order" mentality was much more entrenched, and which did nothing to prevent e.g. upload filters (despite promising to do so).<p>So I try to maintain some hope that at least Germany as a member state could tank this awful bill.<p>edit:<p>Here's the list of 61 questions that the German government sent to the EU concerning the bill (at the end of the article, in English): <a href="https://netzpolitik.org/2022/chatkontrolle-bundesregierung-loechert-eu-kommission-mit-kritischen-fragen/" rel="nofollow">https://netzpolitik.org/2022/chatkontrolle-bundesregierung-l...</a><p>From a cursory reading, it reads to me like the diplomatic equivalent of "what you're proposing doesn't make any sense".
It doesn't matter how low the error rate is. The fact that the European Commission wants to have a third (robotic) participant listening in on every digital communication is absolutely ridiculous.<p>Saying that this is about child protection is a blatant lie. This serves only as a stepping stone to introduce other screening criteria later. And with opaque ML models it will be very tedious to determine what the model is supposed to find.
I think there are some important questions that need to be answered before such a dangerous decision is made.<p>How big is the risk of a child being groomed through these electronic means? Is it comparable to being struck by lightning? What is worse, weighted by probability: being sexually assaulted as a child, or being wrongly suspected and having your life turned upside down for years by these algorithms? We already see this happen with relatively minor matters, like having your Google account closed by an algorithmic mishap.<p>How was this 10% figure for false positives determined? Is it only an expectation of false positives or an actual measured statistic?
What does 10% mean in the context of mass surveillance?<p>It might well be that millions of children are groomed and assaulted every year through chats. I don't have the data, so I cannot say. I was under the impression that most sexual assault cases happen within the family and are not committed by strangers.<p>What's worrying, though, is that these decisions are taken behind closed doors without any oversight, on the hope that they might save a child, while possibly putting our lives in the hands of algorithmic justice.
I have such a strong reaction to news like this that it's hard for me not to think it's appropriate for every member state to consider leaving the EU now. They've succeeded in shifting me to a very anti-EU stance with one proposed measure - brilliant! It wouldn't surprise me if the leaders of most member states are quite keen on such surveillance, though.
No comment on the policy etc, but I think the presentation of the numbers is a bit misleading.<p>That 10% is the percent of flagged images which are actually OK. Whether this represents a large fraction of all legal content depends on how much illegal content there is. It would be better if they quoted the false positive rate and false negative rate as a fraction of legal/illegal images respectively.<p>e.g. if 1/100,000,000 legal images are flagged incorrectly, and 100% of illegal images are flagged correctly, then a corpus of 100,000,000 legal images + 9 illegal images would result in the stats in the headline. That seems like a pretty good system (ignoring any principled objections to the scanning in the first place).
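<p>To make that arithmetic concrete, here is a minimal sketch in Python using the hypothetical corpus above (the counts and rates are illustrative assumptions, not real figures):

    # Hypothetical corpus from the example above -- illustrative only.
    legal_images = 100_000_000
    illegal_images = 9

    false_positive_rate = 1 / 100_000_000  # fraction of legal images wrongly flagged (assumed)
    true_positive_rate = 1.0                # fraction of illegal images correctly flagged (assumed)

    false_positives = legal_images * false_positive_rate  # 1
    true_positives = illegal_images * true_positive_rate  # 9
    flagged = false_positives + true_positives             # 10

    # 10% of flagged images are innocent, yet only 0.000001% of legal images were flagged.
    print(f"Share of flags that are innocent: {false_positives / flagged:.0%}")
    print(f"Share of legal images flagged:    {false_positive_rate:.6%}")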
Due to the volume of messages and the endless need to maximise profits, companies will accept that 10% of flagged content may be false positives but still act against 100% of what gets flagged, meaning that the default will be innocent people having action taken against them with no real recourse to clear their names.<p>I also didn't find anything in there about expectations for reducing the number of false negatives (where the automation fails to flag suspicious activity). Content control is basically just PR if it ignores the majority of the activity it is designed to police.
Disgusting. I wonder whether the people working on these commissions ever entertain the thought that they are "the baddies". In any case, the Gestapo would have _loved_ this service.
Important to note that this is not 10% of all messages being falsely flagged (= 10% false positive rate), but 10% of flagged messages being false positives (= 90% precision). As someone who works with these types of classification problems in a different context, 90% precision is actually quite good - especially assuming there is some sort of manual review process to take care of the 10%.<p>Whether that makes this whole plan a good idea or not is obviously a very different question, but I think it's important to be clear about what this number actually means.
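<p>For anyone unsure of the distinction, here is a toy calculation with made-up confusion-matrix counts (the article only reports the precision figure, so every count below is a pure assumption):

    # Made-up counts for one batch of scanned messages -- purely illustrative.
    true_positives = 90        # flagged and actually abusive
    false_positives = 10       # flagged but innocent
    true_negatives = 999_900   # innocent and not flagged
    false_negatives = 0        # abusive but missed (assumed zero here)

    precision = true_positives / (true_positives + false_positives)
    false_positive_rate = false_positives / (false_positives + true_negatives)

    print(f"Precision:           {precision:.0%}")            # 90% -> '10% of flags are wrong'
    print(f"False positive rate: {false_positive_rate:.4%}")  # ~0.001% of innocent messages flagged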
How many EU bureaucrats got their training in East Germany, I wonder?
Or is it rather that they are too young and do not know about/remember what constant surveillance does to trust in a society?
With 90% accuracy, and assuming the incidence of actual grooming in random conversations is far smaller than the error rate, they are setting themselves up for the base rate fallacy.<p><a href="https://en.wikipedia.org/wiki/Base_rate_fallacy" rel="nofollow">https://en.wikipedia.org/wiki/Base_rate_fallacy</a>
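<p>A back-of-the-envelope sketch of how that plays out; every number below is an assumption chosen for illustration, since none of the real rates are public:

    # All rates below are assumptions for illustration, not published figures.
    prevalence = 1 / 100_000       # assumed share of conversations that involve grooming
    true_positive_rate = 0.90      # the reported "90% accuracy", read as a detection rate
    false_positive_rate = 0.01     # assumed share of innocent conversations that get flagged

    conversations = 1_000_000
    guilty = conversations * prevalence        # 10
    innocent = conversations - guilty          # 999,990

    true_flags = guilty * true_positive_rate       # 9
    false_flags = innocent * false_positive_rate   # ~10,000

    # Even a seemingly accurate classifier drowns in false alarms at a low base rate.
    share_innocent = false_flags / (true_flags + false_flags)
    print(f"Share of flags pointing at innocent people: {share_innocent:.1%}")  # ~99.9%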
I think what makes false positives scary in this context is that accusation is guilt for most practical purposes. There needs to be a tremendous amount of openness around systems like this so that people understand the meaning of their outputs.
This whole debacle reeks of stupidity. The only thing that will happen is that the criminals they are (allegedly) trying to catch will simply move their comms to different channels. What's stopping a sophisticated crime syndicate from simply creating their own app with a small enough footprint that it flies under the radar?<p>From the perspective of tech companies, they are being put between a rock and a hard place by being asked for more privacy and less privacy at the same time.
Is there any estimate for what percentage of chat communication happening within the EU is done by "perpetrators/criminals"? The average crime rate is 40 incidents per 100,000 people per year, which would mean 0.04% of the population is considered a criminal every year. What % of that tiny margin is going to be using online chat to commit their crimes? Is it really worth abandoning the privacy of 450 million other people in the hopes that you might stop a criminal?
It is so tiring to see the constant efforts to erode our rights.
Even if they succeed in creating a surveillance state, do they think it will not blow up in their faces one day? It will make violence and revolutions against the surveilling institutions inevitable, or maybe that is just wishful thinking...
It kinda pushes me toward the stance that every member of the EU Parliament and the EU Commission should make their bank accounts fully transparent, so the public can check whether they are committing some act of corruption.
Which messenger do we switch to if this goes through? I don't think I'd have an issue convincing people.<p>Is Signal subject to this? Telegram? Do we need something "less mainstream"?
Maybe I'm missing something, but it seems to me there is a basic question here: do two or more people have a right to communicate privately via a third party?<p>If the answer is yes, then regardless of the accuracy of the system or the mass nature of the communication network, this is objectionable law making.<p>IMO, it is a fundamental human right to communicate privately.<p>The only real question is what the responsibility of the third party is. If I give a shipper illicit material, are they responsible for inspecting it and reporting it? I'm personally unaware of the law regarding this, but I assume your shipper is not required by law to open every package it ships and report on it. Are they required to do a percentage?<p>If not, then what the state is claiming here is a right by convenience. It happens that digital communication is easier to inspect than crates. Therefore, the state creates an expectation of one third party that it does not create of another.
If this ever gets into production, I hope the tech community can come together and work on a system to generate false positives until this system is no longer viable.<p>Let's hope it's a typical EU project that will take at least a decade to complete, or better, let's just hope it fails outright.
Ah, just in time to monitor fledgling revolts caused by their own policies. Can't rebel against the deliberate reductions in wealth if they start black-bagging anyone who wonders about the current state of the continent.
> The EU Commission is apparently aware of the problem and is consciously accepting it.<p>So they're fine with humans reading the falsely flagged 10% of messages? Probably more, because of the context needed to judge them? Besides this obviously being a massive DDoS on whatever dystopian thought-sanitising spy center they want to build, I wonder how the big EU honchos get their free pass on this. Or did they simply not consider that they're going to get monitored as well?
There is a fundamental problem with making everyone a suspect. I'm no criminal... why do I need to prove it to the state(s) and companies with EVERY message sent?<p>I'm not even tracked all the time when driving a car, and I think driving a car can be dangerous too.
How ironic. On one hand EU forces legitimate businesses to spend billions of dollars to satisfy GDPR in the name of privacy. On the other, they are planning to stream teenagers' private pics directly to designated "investigators".
So the TL;DR is the EU Commission wants to implement surveillance in chat applications to "protect and combat sexual abuse of minors", because nobody is against "combating sexual abuse of minors", right?<p>Probably later they'll extend that to "protect and combat right-wing opinion", because nobody is against "combating right-wing opinion", or even "protect and combat climate change", because who is against that, right?<p>Sounds like a lot of the "paranoid people" were just right, I guess?
What is to stop criminals from using their own form of encryption (i.e. math)? Meanwhile, the average person who follows the law will be at a much greater risk of identity theft, ransomware, etc…
This will never fly in Germany. Those people still have internet cafes without surveillance cameras so they can do their computing anonymously. Paranoia is a way of life there, for good reason.
<i>The commission does not seek to break encryption</i><p>Okay.<p><i>Encryption is not only important to protect private communication, but would also help perpetrators/criminals</i><p>No.<p>It is there to protect us from perpetrators, criminals, and all the people who think they are on the good side. The road to hell is paved with good intentions. Authoritarian regimes on our planet have always thought they were the "good guys". Encryption is actually there to protect us from you!<p>The mothers and fathers of the German Grundgesetz (~ constitution) learned that the hard way.<p><a href="https://www.gesetze-im-internet.de/gg/art_10.html" rel="nofollow">https://www.gesetze-im-internet.de/gg/art_10.html</a>
Actually, a 10% rate of false positives among flagged content is not that bad if the system really has a 90% chance of detecting child abuse. Keep in mind that it only raises a flag; it doesn't automatically result in a false conviction.<p>I don't have the numbers, but I think that during an investigation, way more than 10% of suspects did nothing wrong. In fact, some people estimate that 10% of <i>convictions</i> are wrong (though I think that's an overestimate). A 90% effective system may actually end up preventing false arrests, search warrants, etc... a win for privacy!<p>The real concern is the potential for abuse, not the 90% bar, which is, I think, completely reasonable.
That old adage "<i>if you've nothing to hide, then you've nothing to fear</i>" is oft berated in these sort of discussions, but it really does apply here.<p>This regulation is for the purposes of criminal investigation into serious harms against children, not for spying on whatever innocuous messages you happen to be sharing with friends and family. The privacy fears are being way overblown for us ordinary people.<p>Paedophiles, on the other hand, do not deserve privacy. They need to be scrutinised their entire lives to keep children - the targets of their vile depravity - from harm.<p>I support this regulation; every parent should.