It will be interesting to see whether the whole credit scoring industry will have to disclose their algorithms as well.

Looking at you, Schufa.
If you read the article, they're asking for more than explaining algorithms. Overall, they want the tech providers to be responsible.

Explaining algorithms could, in theory, give away a competitive advantage. However, fairness to users seems to be a priority in this decision.
The new regulation requires a hand-wavy style of explanation, i.e. "we build a retrieval / ranking / matching algorithm that learns from customer clicks and considers blah blah."

There will be no explanation of the actual algorithm.
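To illustrate the level of detail such a disclosure implies, here is a minimal, purely hypothetical sketch of a click-trained ranker described at roughly that granularity; the feature names, weights, and scoring blend are invented for illustration and don't reflect any platform's actual system.

    # Hypothetical sketch of a "learns from customer clicks" ranker.
    # Feature names and weights are invented for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Item:
        item_id: str
        clicks: int       # historical clicks on this item
        impressions: int  # times this item was shown
        recency: float    # freshness score in [0, 1]

    def score(item: Item, w_ctr: float = 0.7, w_recency: float = 0.3) -> float:
        """Blend observed click-through rate with recency."""
        ctr = item.clicks / item.impressions if item.impressions else 0.0
        return w_ctr * ctr + w_recency * item.recency

    def rank(items: list[Item]) -> list[Item]:
        """Order items by descending score."""
        return sorted(items, key=score, reverse=True)

Disclosing something at this level says which signals go in, but nothing about how the real model weighs them, which is the point above.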
Six percent of annual turnover for non-compliance seems too low.

It should be six percent for the first offense, 12% for the second, 25% for the third, etc.

Until the company fixes its compliance or becomes insolvent.
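As a rough worked example of the escalation proposed above (the 6% / 12% / 25% schedule is this comment's suggestion, not anything in the regulation):

    # Worked example of the commenter's proposed escalating fine schedule.
    # Percentages are the comment's suggestion, not the regulation's.
    def fine(annual_turnover: float, offense_number: int) -> float:
        """Fine for the nth offense: 6%, then 12%, then 25% thereafter."""
        schedule = [0.06, 0.12, 0.25]
        rate = schedule[min(offense_number, len(schedule)) - 1]
        return annual_turnover * rate

    # A company with 10 billion EUR annual turnover would pay:
    # fine(10e9, 1) -> 600 million, fine(10e9, 2) -> 1.2 billion,
    # fine(10e9, 3) -> 2.5 billion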
Requiring an alternative to algorithmic sorting (chronological) is good, even though most sites do it already. "Explaining the algorithms" sounds like an impossible-to-implement, feel-good clause.

Requiring transparency for bans and censorship, though, will probably have a major effect if people start asking nosy questions and exposing corporate and government abuses of power. Many EU governments will regret that users can expose them; that will be fun to watch. It will also make it very hard for companies like reddit to function: could reddit be legally liable for the actions of its moderators?

The other clauses are the typical wishful thinking by EU legislators who think that you can legislate the solution to unsolved or unsolvable tech problems.
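For what it's worth, the chronological alternative really is trivial to offer; a minimal sketch, assuming each post carries a timestamp and whatever relevance score the platform's ranking model produces (the field names are hypothetical):

    # Minimal sketch: chronological feed as the alternative to ranked sorting.
    # Post fields are hypothetical; real platforms' data models differ.
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        post_id: str
        created_at: datetime
        relevance_score: float  # output of the platform's ranking model

    def ranked_feed(posts: list[Post]) -> list[Post]:
        return sorted(posts, key=lambda p: p.relevance_score, reverse=True)

    def chronological_feed(posts: list[Post]) -> list[Post]:
        return sorted(posts, key=lambda p: p.created_at, reverse=True)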
> “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, cancelling subscriptions should be as easy as signing up for them.

This is an excellent addition.
IANAL, so happy to be corrected, but my understanding is that EU and US law work in quite different ways. EU law sets general rules, and law courts decide what that means with reference to existing legal precedents. US law is very, very specific about what each clause means and how it should be interpreted.

Every time I see these kinds of discussions, I wonder if quite a few of the disagreements are due to e.g. US commenters being worried by the relative lack of specific details.
Before people get overly excited about this, it will be very important to see how exactly it's worded in the legislation itself.

Anti-discrimination legislation has already made black-box algorithms illegal if they are deciding on anything that a user might take objection to, so for most use cases this is not a big change.

As for recommender systems having to not be based on profiling: unless we're talking about removing data-driven recommender systems altogether, it will be interesting to see what the legislation considers profiling. If I tie your recommendations to the last viewed piece of content (content-contextual recommendation), is that profiling? It's arguably worse for the user and for society than profiling-based recommendation. If the recommendations are based on categories you chose explicitly, is that not profiling? Yet that's the principle news aggregators have used for the last 30 years.

The wording is going to be important here.
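To make that distinction concrete, here is a hypothetical sketch contrasting a recommendation keyed only to the last viewed item (content-contextual) with one keyed to a stored per-user profile; both functions and their inputs are illustrative, not any platform's real code:

    # Illustrative contrast between the two approaches described above.
    def contextual_recommendations(last_viewed_category: str,
                                   catalog: dict[str, list[str]]) -> list[str]:
        """Recommend items from the same category as the last viewed item;
        no per-user history is stored anywhere."""
        return catalog.get(last_viewed_category, [])

    def profiled_recommendations(user_category_weights: dict[str, float],
                                 catalog: dict[str, list[str]]) -> list[str]:
        """Recommend from the categories a stored user profile weights most
        highly, which is where 'profiling' questions arise."""
        top = sorted(user_category_weights,
                     key=user_category_weights.get, reverse=True)[:2]
        return [item for cat in top for item in catalog.get(cat, [])]

Whether only the second of these counts as "profiling" is exactly what the final wording will have to settle.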
> The greater the size, the greater the responsibilities of online platforms

> as a rule, cancelling subscriptions should be as easy as signing up for them

Overall I like these principles, but we'll see in a few years how they're enforced in practice. It's been 4-5 years since we've had GDPR, and I still see sites that require tens of clicks to disable all advertising cookies (the most I've seen was 300+ clicks). Even Google only this week announced they'll add a "reject all" button to their cookie banners.

I expect it'll be similar in this case: companies will do the bare minimum to try to stay compliant with the regulation, and it will take a few years to see real differences, but I hope it's at least a step in the right direction.
What actually constitutes a full explanation of the algorithm? The article doesn't get into this enough; it mentions that a high-level overview is required, but not much else. I can imagine that it's not going to require sharing the codebase or IP, of course.
In principle this looks good, but there is lots of potential for it to go wrong.

I just hope this doesn't backfire. The cookie law was also something the EU created with good intentions, after some politicians decided "omg cookies are bad", and we ended up still using cookies but with pop-ups on every single website basically forcing you to accept the use of cookies.
Do you think the EU will enforce the law against non-US and non-EU companies like TikTok, and that they will actually disclose their algorithms? It will be interesting to see if the law is upheld equally for all.
A good start. However, let's go further and simply ban personal tracking and personalized algorithmic feeds. This would combat the echo chamber effect, and social media could become a broad community experience, like TV and newspapers. It would also cripple tech advertising revenues, thus redressing the balance with traditional media.
The EU is becoming more like an authoritarian state. They put constraints on companies but allow governments to have full control and surveillance over their citizens. It's so hypocritical.