
Dutch scandal serves as a warning for Europe over risks of using algorithms

244 points by Kiala, about 3 years ago

29 comments

jacquesm, about 3 years ago
This scandal and the issues at the Dutch tax office run much deeper than just this one set of problems. They also maintained illegal lists of 'potentially fraudulent people' on which hundreds of thousands of people were placed who had done nothing wrong beyond, say, having the 'wrong' surname.

It's rotten to the core, and to date there have been zero consequences for the perps while the damage continues to pile up. Our government has fallen over this and - surprise - the exact same players were re-elected without taking any responsibility for any of it.
wallaBBB, about 3 years ago
From reading the article, it seems the algorithm is just a scapegoat in the headline. This is racial discrimination dating way back, purposely introduced into the algorithm and sustained for six years.

> Having dual nationality was marked as a big risk indicator, as was a low income.

> In 2020, Trouw and another Dutch news outlet, RTL Nieuws, revealed that the tax authorities also kept secret blacklists of people for two decades, which tracked both credible and unsubstantiated "signals" of potential fraud. Citizens had no way of finding out why they were on the list or defending themselves.

> An audit showed that the tax authorities focused on people with "a non-Western appearance," while having Turkish or Moroccan nationality was a particular focus. Being on the blacklist also led to a higher risk score in the child care benefits system.
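To make concrete how indicators like these can combine, here is a purely hypothetical sketch of an additive risk score. The weights, threshold, and field names are invented for illustration and are not the tax authority's actual model:

```python
# Hypothetical additive risk score (invented weights, NOT the real system):
# each indicator described in the audit adds to the score, and cases above
# a threshold are flagged for fraud handling.
def risk_score(applicant: dict) -> float:
    score = 0.0
    if applicant.get("dual_nationality"):  # reported as a "big risk indicator"
        score += 0.4
    income = applicant.get("income")
    if income is not None and income < 20_000:  # low income also raised risk
        score += 0.3
    if applicant.get("on_blacklist"):  # the secret blacklist fed back into scoring
        score += 0.3
    return score

def flagged(applicant: dict, threshold: float = 0.5) -> bool:
    return risk_score(applicant) >= threshold
```

With a scheme like this, two applicants with identical finances get different outcomes purely on the nationality field: `flagged({"dual_nationality": True, "income": 15_000})` is true while the same applicant without dual nationality stays below the threshold, which is exactly the proxy-discrimination pattern the audit describes.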
logifail, about 3 years ago
> Having dual nationality was marked as a big risk indicator

Dual nationality indeed has its ups and downs. FWIW, our children have dual nationality and, on top of that, are all bilingual.

On enrolling them in their first school, the school insisted on tweaking the enrollment data we'd submitted: it listed the locally-spoken language second and put the non-local language first, as "mother tongue" (and yes, that's exactly how it's described round here <sigh>). This meant the school would get extra money from the government, from a pot intended to support migrant kids with language issues.

The first time this happened we pushed straight back and asked to get it corrected (all our kids were born locally and speak the local language and dialect like everyone else round here), but the school wasn't having it. They couldn't understand why we didn't want to support the school getting extra money.

What did we learn from this? If the incentives and/or system are broken by design, it can get very hard for anyone wanting to get the system corrected later.
sidewndr46, about 3 years ago
> The Dutch tax authorities now face a new €3.7 million fine from the country's privacy regulator.

How does this work? Wouldn't the government be fining the government and paying itself with the citizens' tax revenues?

Also, this doesn't seem to serve as a warning at all. Unless all the politicians and civil servants involved are removed from their jobs, barred from public positions, and given hefty fines, this is actually a clear signal encouraging more of this behavior from governments. Until there are clear consequences enacted by the populace, governments will continue behaving this way.
tomputer, about 3 years ago
In the documentary *Alone Against the State*, five mothers who became victims of the Dutch childcare benefits scandal share the chilling stories of how their lives and those of their children were destroyed by a state system deprived of humanity.

It is a Dutch documentary, unfortunately without English subtitles:

https://www.2doc.nl/documentaires/series/2doc/2021/alleen-tegen-de-staat.html
Tozen, about 3 years ago
This really isn't about "algorithms" so much as it is about xenophobia and ethnic discrimination. The "algorithm" is just another plausible-deniability cover for institutionalized discrimination against various minorities and the poor, who are least able to fight back.

Those in charge know exactly what they are doing when they create their "blacklists" and "targets". They know the names, they have the biases and prejudices, and they want to target those minority groups. The poorer and less able to fight back, the better. They know the damage they are doing to those people's lives, and sadistically enjoy inflicting it.

It's only when they get caught that they start running around pulling excuses out of their orifices, and "it was the algorithm" is just one of them.
nicgrev103, about 3 years ago
Reminds me of this: https://www.bbc.co.uk/news/business-56718036 - 700 Post Office franchisees were made criminals by faulty accounting software.
fallingknife, about 3 years ago
> Authorities penalized families over a mere suspicion of fraud based on the system's risk indicators

This is the real story here. The algorithm is incidental.
rowanG077, about 3 years ago
I'm Dutch, and this really is disgusting. But you have to understand that the Dutch population has no real problem with this: the exact same political parties were elected a few months after this scandal. It's just democracy, really.
BlueTemplar, about 3 years ago
Around here we have a law under which any decision taken by the public sector with the help of a computer program must (upon request) be explained in plain language, with the calculations detailed step by step.

Of course, this indirectly prohibits the use of the kind of programs that seem to have been used in this case, since instead of a transparent algorithm they use a neural network, which is a black box even to its trainer (or architect).

Sadly, I hear that our own tax authorities have started to use them *anyway*, but then they have long had this tendency of considering themselves "special", as if laws didn't apply as much to them...
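The contrast drawn above can be illustrated with a sketch of what a rule-based, step-by-step explainable decision looks like - the kind of plain-language trace such a law demands and a black-box model cannot produce. The benefit rules and amounts below are invented for illustration:

```python
# Minimal sketch of an "explainable by construction" benefit calculation:
# every arithmetic step is recorded as a plain-language sentence that can
# be handed to the citizen on request. All rules/amounts here are invented.
def compute_benefit(income: float, children: int) -> tuple[float, list[str]]:
    steps = []
    base = 1000.0 * children
    steps.append(f"Base allowance: 1000.00 x {children} children = {base:.2f}")
    reduction = max(0.0, (income - 30_000) * 0.05)
    steps.append(f"Reduction: 5% of income above 30000 = {reduction:.2f}")
    amount = max(0.0, base - reduction)
    steps.append(f"Final amount: max(0, {base:.2f} - {reduction:.2f}) = {amount:.2f}")
    return amount, steps

amount, explanation = compute_benefit(income=36_000, children=2)
# amount is 1700.0, and `explanation` lists each step of how it was reached
```

A neural network scoring the same application could output a number, but it has no such trace to offer, which is what makes it incompatible with a step-by-step explanation requirement.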
gsliepen, about 3 years ago
Another issue that contributed to this scandal is that the rules for benefits (not just social benefits) often require you to request them up front at the beginning of the year; if at the end of the year it turns out that your situation changed midway and you no longer had the (full) right to the benefit, you have to pay it back along with a fine. It's really easy not to be aware of this.
einpoklum, about 3 years ago
I don't understand the article.

Suppose they had an algorithm for computing a risk score. Ok, so someone is determined to have a "high risk score". That doesn't mean they've done anything wrong; at most, it means prioritizing them for closer scrutiny. That could still be discriminatory (e.g. a focus on Turkish/Moroccan immigrants), but in itself it's not supposed to lose anyone their benefits.

Also, the article says:

> Citizens had no way of ... defending themselves.

Why could they not sue the tax authorities in court, asking the court to order the tax authorities to withdraw their payment demands? If there was no evidence of them having evaded taxes, I mean.
mint2, about 3 years ago
This is utterly bonkers. The discrimination sounds completely blatant. And is it actually saying that any case flagged by the algorithm was considered and treated as proven fraud, rather than just raised for investigation?!? That's absurd.
kkfx, about 3 years ago
That's not really an algorithm-usage issue but an administrative one. Really. The problem does not arise from an ineffective or wrong system but from the absence of a quick, powerful, and effective human backup.

A small example: automated procedures, if well designed, tend to be really effective *on average*, which means they are good. BUT they tend to have hard-to-spot (or even evident, but nobody cares up front) loopholes and corner cases that make not the procedures themselves but their use a nightmare. In those cases the solution is simple: a human who picks up all the issues not handled automatically.

The *administrative* issue is this: those who choose automation must know it can fail, and so must be prepared with a normal, regular, and effective backup.

Personally, if one day my country's tax agency asks me for 100k euros due to a clear error, I see no particular scandal; anyone can make errors. But I expect to have a phone number to call where someone picks up quickly, and in little time we can sort out the issue. It's not a life-or-death thing, and it's complex; issues normally happen, and that's it. If I can't call because no one answers, or they answer but are powerless, or there is no quick solution, then the scandal is not the erroneous claim but the absence of means to sort it out rapidly.
matheusmoreira, about 3 years ago
The perils of false positives and of making decisions based on probability. These governments are punishing innocents and hoping the guilty are among them. Horrifying.
a_bonobo, about 3 years ago
Sounds a bit like Australia's illegal Robodebt scheme:

https://en.wikipedia.org/wiki/Robodebt_scheme

The Australian system was simpler than the Dutch one: it compared social benefits paid against averaged tax income from the ATO. It was a stupidly wrong calculation - for people with inconsistent income, the averaged fortnightly income comes out much higher than what they actually earned while on benefits - so people were hounded by debt collectors for nonexistent debt. A$1.2 billion of false debt for a nation of 25 million people, and a good number of associated suicides.

Of course, there was no real accountability, the system having been introduced by the previous government, which is now in opposition. Money has been paid back, but everybody involved still has their job.
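The averaging error described above can be shown with a toy calculation. The income-free area, taper rate, and all amounts below are invented for illustration, not the actual Centrelink parameters:

```python
# Toy illustration (invented numbers) of the Robodebt averaging error:
# annual income is smeared evenly across all fortnights, so someone who
# earned nothing while on benefits appears to have had income then.
ANNUAL_INCOME = 26_000     # all earned in the second half of the year
FORTNIGHTS = 26
INCOME_FREE_AREA = 300     # hypothetical: benefit reduces above this per fortnight
TAPER = 0.5                # hypothetical: 50 cents withdrawn per dollar above it

# On benefits (with zero income) in the first 13 fortnights, then working.
actual_by_fortnight = [0] * 13 + [2_000] * 13
# The flawed method: spread the ATO annual figure evenly (1000 per fortnight).
averaged_by_fortnight = [ANNUAL_INCOME / FORTNIGHTS] * FORTNIGHTS

def overpayment(income_by_fortnight, benefit_fortnights=range(13)):
    # "Debt" raised for the fortnights in which the benefit was claimed.
    return sum(max(0, income_by_fortnight[i] - INCOME_FREE_AREA) * TAPER
               for i in benefit_fortnights)

assert overpayment(actual_by_fortnight) == 0      # no real debt existed
assert overpayment(averaged_by_fortnight) == 4550 # phantom debt from averaging
```

The person's real fortnightly income while on benefits was zero, so the true overpayment is zero; averaging manufactures a debt out of income earned after they had already left the benefit system.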
ai_ja_nai, about 3 years ago
> "An audit showed that the tax authorities focused on people with 'a non-Western appearance,' while having Turkish or Moroccan nationality was a particular focus."

This has nothing to do with algorithms, mates. It has to do with how the system wasn't tested and monitored. It appears it was just rolled out live.

> "Fraud prediction and predictive policing based on profiling should just be banned"

And here are the usual Luddites pandering to the confused with wrong arguments. Sigh.
goatcode, about 3 years ago
From what the article spells out, using algorithms isn't precisely the risk; using them in a horrendously unpracticed and uninformed way is.
RcouF1uZ4gsC, about 3 years ago
> Chermaine Leysner's life changed in 2012, when she received a letter from the Dutch tax authority demanding she pay back her child care allowance going back to 2008. Leysner, then a student studying social work, had three children under the age of 6. The tax bill was over €100,000.

A student can get €100,000 for child care?

That is amazing.

For me, that is the most surprising thing in the entire article.
turbo_bean, about 3 years ago
Born and raised in EU South, I grew up looking up to Northern Europe with regard to its efficiency, values, etc.

This is news I'd expect to hear from some backwater country I'd never heard of. But I'm afraid it is just one more data point showing how wrong I was and how bad things have gone in the heart of the EU, and maybe the whole Western world.
jokoon, about 3 years ago
I'm on welfare in France, and sometimes I wonder if the government is going to find some excuse to stop giving me money.

It could be "not applying to enough job offers", while most employers are always complaining about candidates not being skilled enough, which is why unemployment is so high.
puffoflogic, about 3 years ago
When are we going to bite the bullet and require governmental licensing for algorithms? I know HN hates the idea because we're all too busy using algorithms to ruin people's lives for profit, but it needs to happen, and it will happen sooner or later.
mahesh_rm, about 3 years ago
Once upon a time I made the mistake of incorporating a startup in the Netherlands. Never. Again.
drno123, about 3 years ago
If I calculate correctly, the Dutch child care allowance is around $2000 per month, which seems generous. What is the average child care allowance in other EU countries?
kragen, about 3 years ago
"Risks of using algorithms"? Clearly the Enlightenment has ended, because this is a headline that would only make sense in the age of the Inquisition.
smcl, about 3 years ago
Nadine Dorries (UK Secretary of State for Digital, Culture, Media and Sport) was right, we should simply get rid of algorithms. They are clearly harmful /s

[0] - https://www.huffingtonpost.co.uk/entry/nadine-dorries-microsoft-algorithm-twitter_uk_62331aaee4b0d39357c37f9c
raxxorraxor, about 3 years ago
> That's not good enough, argues Renske Leijten, a Socialist member of the Dutch parliament

Exactly, and especially in the public sector at the EU level this could be vastly more damaging. The warnings were loud enough and people didn't listen or care, so it is probably necessary that many will have to feel it before anything changes. This case is probably too small in scale, even if thousands are affected.
liftm, about 3 years ago
> risks of using algorithms

Bubblesort so dangerous. Not exactly the best headline, is it?
chekibreki, about 3 years ago
I hate how the correlation between income/country of origin and fraud probability is dismissed as "racist" by, at the least, Amnesty International.

Maybe we should first do a correlation and causality analysis and see what it yields, instead of closing our eyes because of PC?