A long, long time ago (within the past ten years), I had to verify my age with a site. They didn't ask for my ID, or my facial scan, but instead asked for my credit card number. They issued a refund to the card of a few cents, and I had to tell them (within 24hr) how much the refund was for, after which point they'd issue a charge to claw it back. They made it clear that debit and gift cards would not be accepted; it <i>must</i> be a credit card. So I grabbed my Visa card, punched in the numbers, checked my banking app to see the +$0.24 refund, entered the value, got validated, and had another -$0.24 charge to claw it back.<p>Voila, I was verified as an adult, because I could prove I had a credit card.<p>The whole point of mandating facial recognition or ID checks isn't to make sure you're an adult, but to keep records of who is consuming those services and tie their identities back to specific profiles. Providers can swear up and down that they don't retain that information, but they often use third parties who may or may not abide by those same commitments, especially if the government comes knocking with a secret warrant or subpoena.<p>Biometric validation is surveillance, plain and simple.
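For what it's worth, the server-side logic for that kind of check is tiny. Here's a minimal sketch of the micro-refund flow in Python, assuming a hypothetical `gateway` object with `refund` and `charge` methods (standing in for whatever payment API is actually used) and an in-memory store for pending challenges:

```python
import secrets
import time

CHALLENGE_TTL = 24 * 60 * 60        # the 24-hour window described above
pending = {}                        # card_token -> (amount_cents, issued_at)

def start_verification(gateway, card_token):
    """Refund a random few cents to the card and remember how much."""
    amount_cents = secrets.randbelow(90) + 10      # 10-99 cents
    gateway.refund(card_token, amount_cents)       # hypothetical gateway call
    pending[card_token] = (amount_cents, time.time())

def complete_verification(gateway, card_token, claimed_cents):
    """User reports the amount they saw; claw the refund back either way."""
    if card_token not in pending:
        return False
    amount_cents, issued_at = pending.pop(card_token)
    gateway.charge(card_token, amount_cents)       # hypothetical claw-back
    if time.time() - issued_at > CHALLENGE_TTL:
        return False                               # answered too late
    return claimed_cents == amount_cents           # proof of access to the card
```

The only thing this proves is that the user can see the card's statement, which is exactly the point: it correlates with being an adult without handing the site an identity document or a face.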
This is never about protecting the children.<p>This is always about government overreach.<p>People are less likely to criticize the government, or even participate in political debate, if their online identities are known by the government. Governments like obedient, scared citizens.<p>The only ethical response to laws like this is for websites and apps to terminate operations completely in countries that create them. Citizens who elect politicians without respect for human rights and privacy don't really deserve anything nice anyway.
Aside from the privacy nightmare, what about someone who is 18 and just doesn't have the traditional adult facial features? Same thing for someone who's 15 and hit puberty early? I can imagine that on the edges, it becomes really hard to discern.<p>If they get it wrong, are you locked out? Do you have to send an image of your ID? So many questions. Not a huge fan of these recent UK changes (looking at the Apple E2E situation as well). I understand what they're going for, but I'm not sure this is the best course of action. What do I know though :shrug:.
I don't think the problem is that young people are finding porn on the internet. There is a problem, though, and it has to do with psychological warfare on attention.<p>Formats like shorts or news feeds served to you algorithmically with zero lag are the problem. They make for the zombification of decision making. Endless content breaks people down precisely because it's endless. I think if you add age verification but don't root out the endless nature, you will not really help any young person or adult.<p>When you look at people with unhealthy content addiction, it is always a case of excess and not necessarily type of content. There are pedophiles, but honestly, we have had that throughout all time, with and without the internet. But the endless feeding of the next video robs people of the ability to stop by mentally addicting them to seeing just one more. And because content is not really infinite, endless feeds will invariably feed people with porn, eating disorders, and other "crap" in quantities that slowly erode people.
> The social media company requires users to take a selfie video on their phone and uses AI to estimate the person's age.<p>What I did not see in this article was anything about how AI can tell a 13 year old from a 12.9 year old with confidence. This seems unlikely to me.<p>I agree with the article's implication that websites will now want a scan of everyone's faces forever. Their insistence that they won't store the face scans is like one of those cute lies that kids tell, and adults aren't fooled by. Either you're outright lying, or you're using the loophole of not storing the image, but rather storing a set of numbers, derived from the image, which act as a unique fingerprint. Or you're sending it to a third party for storage. Or something like that. But you're definitely keeping track of everyone's faces, don't try to pull a fast one on me young lady, I've been around the block before.
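To make the "set of numbers derived from the image" loophole concrete, here's a rough sketch. The vectors below are random stand-ins for the embeddings a real face-recognition model would produce; the point is only that a stored embedding, compared by cosine similarity, re-identifies a face just fine even though no image is kept:

```python
import numpy as np

def is_same_face(stored: np.ndarray, new_scan: np.ndarray, threshold: float = 0.9) -> bool:
    """Compare two face embeddings by cosine similarity. The stored
    vector is 'just numbers', not an image, but it still re-identifies
    the same face on every future scan."""
    cos = float(stored @ new_scan / (np.linalg.norm(stored) * np.linalg.norm(new_scan)))
    return cos >= threshold

rng = np.random.default_rng(0)
stored = rng.normal(size=512).astype(np.float32)       # what the vendor keeps "instead of" your image
new_scan = stored + rng.normal(scale=0.05, size=512).astype(np.float32)  # a later, slightly noisy scan
stranger = rng.normal(size=512).astype(np.float32)     # somebody else's face

print(is_same_face(stored, new_scan))   # True: the "numbers" still identify you
print(is_same_face(stored, stranger))   # False: a different face
```

So "we don't store your face scan" can be literally true while the vendor still keeps something that matches your face on every future scan.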
I would like to think that there is a solution that can be engineered, in which a service is able to verify that a user is above an appropriate age threshold, while maintaining privacy safeguards, including, where relevant, for the age-protected service not to be privy to the identity of the user, and for the age verification service not to be privy to the nature of the age-protected service being accessed.<p>In this day and age, of crypto, and certificates, and SSO, and all that gubbins, it's surely only a matter of deciding that this is a problem that needs solving.<p>(Unless the problem really isn't the age of the user at all, but harvesting information...)
I'm in the UK and discord has asked me to complete this check (but I haven't, yet). I can still use discord just fine, it just won't let me view any media it considers "adult".<p>I am an adult but refuse to let them scan my face as a matter of principle, so I've considered using <a href="https://github.com/hacksider/Deep-Live-Cam">https://github.com/hacksider/Deep-Live-Cam</a> to "deepfake" myself and perform the verification while wearing a fake face. If it works, I'll write about it.
I suspect the endgame of this campaign is to have mandatory ID checks for social media. Police would have access to these upon court orders etc and be able to easily prosecute anyone who posts 'harmful' content online.
Like many other people here, I'm wondering what we'll end up having to do at work to deal with this. We don't have the resources to put a full-time person on this, and the UK's not a huge market.<p>For unrelated reasons, we already have to implement geoblocking, and we're also intentionally VPN friendly. I suspect most services are that way, so the easy way out is to add "UK" to the same list as North Korea and Iran.<p>Anyway, if enough services implement this that way, I'd expect the UK to start repealing laws like this (or to start seeing China-level adoption of VPN services). That limits the blast radius to services actually based in the UK. Those are already dropping like flies, sadly.<p>I hope the rest of the international tech community applies this sort of pressure. Strength in numbers is about all we have left these days.
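In case it's useful to anyone in the same position, the "add the UK to the existing blocklist" option really is just a few lines if you already do GeoIP lookups. A rough sketch, assuming a Flask app and a hypothetical `country_for_ip` helper backed by whatever GeoIP database you already have:

```python
from flask import Flask, request

app = Flask(__name__)

# Countries we already refuse to serve for unrelated legal reasons;
# adding "GB" is the low-effort response to verification mandates.
BLOCKED_COUNTRIES = {"KP", "IR", "GB"}

def country_for_ip(ip: str) -> str:
    """Hypothetical lookup against your existing GeoIP database."""
    raise NotImplementedError

@app.before_request
def geoblock():
    country = country_for_ip(request.remote_addr)
    if country in BLOCKED_COUNTRIES:
        # 451: Unavailable For Legal Reasons
        return "Not available in your region.", 451
```

Note there's deliberately no VPN detection in this sketch, which is consistent with staying VPN friendly.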
This feels more like spying on everyone than making the internet safe for kids. Big companies and the government are already tracking what we do online. This just seems like a further reduction of our privacy on the internet.<p>Parents need to be more involved in what their kids do online, just like in real life. Grounding them isn't enough. We wouldn't let them wander into dangerous places, so we shouldn't let them wander online without adult supervision. Also, parents need to prepare for having tough conversations, like what pornography or gambling is.<p>Online companies need to really work to make their sites safe for everyone. They should act like they own a mall. If they let bad stuff in (like pornography, scams, gambling), it hurts their reputation, and people will leave.<p>Instead of banning everything, because some people take pleasure in those activities, maybe there should be separate online spaces for adults who want that kind of content, like how cities have specific areas for adult businesses. This way, it would be easier to restrict children's access to some hardcore stuff.<p>If we all put some effort into figuring out easy and privacy-friendly solutions to safeguard kids, we can rely on simple principles. For example, if you want to sell toys to kids, you shouldn't sell adult toys under the same roof (same domain) or have posters that can affect young minds.
Of all the terrible, dumb-headed ideas. I would <i>not</i> want my kids scanning their face into who-knows-what third party's service.<p>I already decline this technology when finance companies want to use it for e.g. KYC verification ("Sorry, I don't own a smartphone compatible with your tool. If you want my business you'll have to find another way. Happy to provide a notarized declaration if you'd like" has worked in the past).
Frankly I'm scared by governments and corporations going "papers, please" for people to be allowed to access the Internet. On top of endangering privacy by tying pseudonymous online interactions to real-life ID and biometrics, attempts to block under-18 people from finding information or interacting online will only reinforce society's view of them as not having rights. This will isolate people (especially gay and trans teens) living with abusive parents from finding support networks, and prevent them from <i>learning</i> (by talking to friends in different situations) that being beaten or emotionally put down by parents is abusive and traumatizing.<p>I know all too well that when you grow up, you're psychologically wired to <i>assume</i> that the way your parents treated you is normal, and that if they harmed you then you deserved to be hurt. I've made friends with and assisted many teens and young adults in unsafe living situations (and talked to people who grew up in fundamentalist religions and cults), and they're dependent on online support networks to recognize and cope with abuse, get advice, and seek help in dangerous situations.
It's interesting how the "features" which many claim IRC is missing turn out to be a huge liability. Adult content is shared via image hosting, video/audio chat, etc., all things IRC lacks.
I think regulation could be done better...<p>Let's assign one or ideally two adults to each underage child, who are aware of the child's real age and can intervene and prevent the child from installing Discord (and any other social media) in the first place, or confiscate the equipment if the child breaks the rules. They could also regulate many other things in the child's life, not just social network use.
So, what will be the proper technology to apply here? I have no problem with verification of my age (not the date of birth, just the boolean, >18yo), but I <i>do</i> have a problem with sending any party a picture of my face or my passport.
How do we fight back against this? I don't want my face scanned on a smartphone to use goods and services. KYC checks for banks are bad enough.<p>I miss the internet of the early 2000s.
Relevant news article from yesterday:<p><a href="https://www.wired.com/story/new-jersey-sues-discord/" rel="nofollow">https://www.wired.com/story/new-jersey-sues-discord/</a><p>> Platkin says there were two catalysts for the investigation. One is personal: A few years ago, a family friend came to Platkin, astonished that his 10-year-old son was able to sign up for Discord, despite the platform forbidding children under 13 from registering.<p>> The second was the mass-shooting in Buffalo, in neighboring New York. The perpetrator used Discord as his personal diary in the lead-up to the attack.<p>In other words, this is yet another attack on privacy in the name of "protecting the children".
This is how you lose your comfortable market monopoly like Skype did. Recall that Skype had better P2P tech than Discord did and would still be the market leader if MS had chosen to update anything at all besides the logo bi-yearly.
Regulators will never comprehend the internet. They are making it look like they have no idea that on the internet you can: move to another country without a visa in 2 minutes; change your face, voice, and fingerprints to whatever you like; get any passport or any document you want to mock any KYC or impersonate anyone without a trace, all for around $10.<p>Sure, companies have no option but to implement funny policies like these, and I'm sure any kid is much smarter than the government, so he will feel good circumventing it.
Maybe the start of a bigger shift to another platform. I'd wager a large portion of the Discord user-base is underage, and they've got nothing but time.
why wouldn't an identity/age verification scheme that blinds both sides work?<p>e.g. a site wants to have some proof of identity. it generates a token and sends the user with it to a government service. the service verifies the user's identity, signs the token and sends the user back.<p>now the site knows that the government service has verified the identity (and relevant characteristics, like age threshold), but doesn't know the identity. the government service obviously knows the user but doesn't know the online account tied to the identity. this can be further separated by using a middleman authentication provider, so that even the site identity itself doesn't reach the government.<p>am i missing something obvious why that wouldn't work?
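A rough sketch of that flow, using Ed25519 signatures from the `cryptography` package. The token is random, so the site learns only that the issuer vouches the bearer is over 18, and the issuer never sees which site minted the token (the names and message format here are made up for illustration):

```python
import secrets
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- government / identity-provider side ----------------------------------
issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()        # published; sites pin this key

def issuer_sign_if_adult(token: bytes, user_is_over_18: bool) -> bytes:
    """The issuer checks age however it likes (ID, bank record, ...) and
    signs only the opaque token plus the claim -- no name, no date of birth."""
    if not user_is_over_18:
        raise PermissionError("age check failed")
    return issuer_key.sign(token + b"|over18")

# --- website side ----------------------------------------------------------
def site_new_challenge() -> bytes:
    """Random token: carries no identity and means nothing to the issuer."""
    return secrets.token_bytes(32)

def site_accept(token: bytes, signature: bytes) -> bool:
    try:
        issuer_pub.verify(signature, token + b"|over18")
        return True
    except InvalidSignature:
        return False

# --- the round trip, with the user ferrying the token between the parties --
token = site_new_challenge()                   # site -> user
signature = issuer_sign_if_adult(token, True)  # user -> issuer -> user
print(site_accept(token, signature))           # user -> site: True
```

The usual objections aren't cryptographic: the issuer sees the raw token here, so issuer/site collusion could link the two (blind signatures close that gap), and nothing stops an adult handing a signed token to a teenager, which is why real proposals add rate limits or device binding.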
This will definitely just apply to social media and the situation won't be abused by other companies even if they have no legal requirement, absolutely not, no sir.
It's worth noting that Matt Navarra, the sole source of "this is part of a bigger shift", is an ex-member of the UK government who worked in the PM's office <i>and worked for the BBC</i>.<p>This story is a tempest in a teacup. The administration found someone to spread this nonsense so that everyone later goes "well, that was inevitable, the BBC predicted it would be."<p>Yeah, and bank robbers can predict that a bank is going to have less cash after a certain day.<p>This obsession the British have with kids online is so tiresome. You want to stop child sexual assault? Maybe do something about your royalty flying to island getaways organized by a human trafficker and ultra-high-end pimp for underage kids? Or do something about your clergy diddling kids?<p>Maybe the reason the UK government thinks this is such a big issue is because these legislators and officials are so surrounded by people who do it... because politicians are right there next to clergy in terms of this stuff.
I am getting sick and tired of the thinly veiled excuse of "we need to strip away more of your privacy in order to protect the children". We all know they are doing it because they want to surveil/track you more easily.<p>And for those who think they are actually doing this to protect the children, and who are concerned about what their children see online, this might sound a bit harsh, but why don't you actually parent? Stop giving your kids unlimited access to tablets/computers etc. Back in my day there was the option of having a single computer for the child in a public room that could not be moved. You can create whitelist-only setups very easily nowadays, even as a layman.<p>I understand it is a bit harder nowadays because more parents are both working to support the family, but I'd rather not lose what little privacy we have left as a society because it requires more work for you to parent.
I'm on several UK-based soccer message boards and none of this seems to be required there. The forums are running on Xenforo or PhpBB, self-hosted by the admin. Some of those forums have thousands of user accounts registered.<p>Is Discord considered to be different as it's a centralized aggregator platform like Reddit, vs a standalone thing like a message board?
Identity verification remains unsolved and likely will remain that way. Any attempts at improvement are authoritarian. And the status quo leaves massive room for circumvention.<p>Personally, I grew up in an era before there was any expectation of validation, and enjoyed the anonymity of message boards and forums. But when people are posting blatantly illegal activity online, I can see the appeal of more validation. Just makes me sad.
A book recommendation on the topic:<p>> This is the first book to examine the growth and phenomenon of a securitized and criminalized compliance society which relies increasingly on intelligence-led and predictive technologies to control future risks, crimes, and security threats. It articulates the emergence of a ‘compliance-industrial complex’ that synthesizes regulatory capitalism and surveillance capitalism to impose new regimes of power and control, as well as new forms of subjectivity subservient to the ‘operating system’ of a pre-crime society.<p><a href="https://www.amazon.com/Compliance-Industrial-Complex-Operating-Pre-Crime-Society/dp/3031192230" rel="nofollow">https://www.amazon.com/Compliance-Industrial-Complex-Operati...</a>
This is going to do a real number on YouTube drama documentary channels.<p>Where are you gonna get your content if the lolcows can't creep on minors on Discord anymore?<p>I mean, in theory, they could find ways to circumvent it, but if they were that smart, they wouldn't be the subject of YouTube drama documentaries.
Nope. There are better ways to check you're over 18; credit cards have been mentioned above. If a platform I'm using attempts this, with no other option, I'll delete my account and data on that platform.<p>VPNs are really a requirement for UK residents now.
They will do just enough so that they comply with the law, while kids will be able to easily bypass it.<p>Where there is a will, there is a way, and a teenager looking for porn... that's a big will.
Discord is shit: poorly designed software, with all the most obnoxious poor security decisions (like requiring phone calls), with poor political decisions on top, as well as spying. Probably one of the worst pieces of software to ever exist. All it has is momentum. It's like pop music, where a million people make bands and one wins the fame lottery.<p>It's primarily a Windows program, and they can't even make a proper Windows GUI; they embed a website instead, so clicking on anything acts like a link instead of focusing into it. For instance, if you middle-click someone's name, it opens in a browser. Fuck off. Pressing Alt+F4 closes Discord instead of sending it to the tray (despite it being a tray program). It's always updating something and then just says "logging in" instead of saying what it's doing. It gets stuck indefinitely if you log in on a slow connection or unplug the LAN while it's logging in or doing whatever it's doing at any given moment. Absolutely the most frustrating crap to use. It has a billion options for stupid "hardcore" gamers (I am one, too, but I don't need them) with special needs, while not being a basic quality application that conforms to any UI standard.<p>They openly spy on you, not even trying to hide it.<p>Instead of real software, it's a stupid fucking social media "community", so you can't just use it as a MECHANISM NOT POLICY program; instead, every time you do something like log into a different account, you have to check whether this is morally correct or will somehow harm their "community". Say I want to work on my blockchain project, whose community is dumb enough to use Discord as their main communication platform. I obviously would want one account for that, then another, completely separate, account (but perhaps with the same phone number to make it logistically easier, which SHOULDN'T EVEN BE A THING, this is the internet) for playing games (often during work). I can't just log into these simultaneously; I have to go check what their policy is on that. Literally, my first thought is that, like typical incompetent American software devs, they will think I'm trying to scam people or commit some other kind of "abuse". And of course, they appear to have conceded to partially implement this "feature" (by undoing their nonsense about forensically attempting to forbid this).
They're using the databases to go after illegal immigrants right now. Soon they'll be using the porn databases to go after Gay people. They're trying to use the healthcare databases to go after Trans people. All this verification is nothing but a way to commit genocide against minorities. Porn is so far down on the list of harmful things. There's no pearl clutching over alcohol and other drugs like Americans have with porn. Nation of pansies.
I see a lot of comments here arguing age requirements are overreach and these decisions should be left to the parents. To those presenting such arguments, do you think that applies to other activities as well? What about smoking/drinking/firearms? Pornography? Driving?<p>I haven't researched the topic of social media's effect on young people, but the common sentiment I encounter is that it's generally harmful, or at least capable of harm in a way that is difficult to isolate and manage as a parent.<p>The people closest to this issue, that is, parents, school faculty, and those who study the psychology and health of children/teens, seem to be the most alarmed about the effects of social media.<p>If that's true, I can understand the need to, as a society, agree we would like to implement some barrier between kids/teens and the social media companies. How that is practically done seems to be the challenge. Clicking a box that says, in effect, "I totally promise I am old enough" is completely useless for anything other than a thin legal shield.