Crimes are committed all the time using phones, and yet we do not allow phone companies to monitor, record, and report the content of our phone conversations. And they can't just add a line to their terms of service; they have no legal right to do it. (Ignore the issue of metadata; I mean the audio portions of our calls.)<p>So why have we accepted that it is OK for all of our private electronic conversations to be monitored?<p>These are horrific crimes described here, but all the proposed solutions reinforce, and start from, the presumption that every SV company has a legal right to snoop on every conversation you have on their platforms.<p>I don't think you can have it both ways. If you routinely spy on your customers to sell ads, manipulate their behavior, and build sophisticated AI to infer what every keystroke and mouse movement means, then you don't get to turn your head and pretend the crimes you witness on your platform aren't real.
> But the vast majority of the content that Meta reports falls under child sexual abuse materials (CSAM) – which includes photos and videos of pornographic content – rather than sex trafficking. Unlike with child sexual abuse imagery, there is no legal requirement to report child sex trafficking, so NCMEC must rely on all social media companies to be proactive in searching for and reporting it. This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC.<p>That says it all. When legally required, Facebook is aggressive and expeditious about enforcement. Otherwise, it is not.
Not to absolve Facebook or Instagram, but it's clear that the entire model of unrestricted access to the global internet is ripe for abuse of vulnerable demographics. Also see scams targeting the elderly with fake profiles of relatives claiming they've been kidnapped, and so on.<p>I wouldn't claim to know the solution. But our starting point is platforms built to give strangers unsupervised digital time with other strangers. One of those parties may be perverts and predators, and the other may be children or teenagers.<p>You could destroy Facebook and Instagram, and I guarantee this moves to Snapchat, to TikTok, or to whatever other platforms fill the void.
While traveling in a third-world country, I found I could easily buy weed (illegal there) from many Instagram accounts. So I totally believe that Instagram and FB are often used to facilitate illegal activities.<p>Telegram is another – there's probably even more of it on Telegram.
Facebook is a "marketplace" for all sorts of crime by virtue of having three billion users, roughly 37% of the world's population. The fact that some horrible crimes slip through the cracks, as is inevitable in any population that large, is no excuse to treat that 37% as criminals.
> This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC.<p>This seems to be a pretty significant legal oversight.
Facebook has a history of letting crime groups operate even when they are obvious and don't even try to avoid detection: <a href="https://krebsonsecurity.com/2019/04/a-year-later-cybercrime-groups-still-rampant-on-facebook/" rel="nofollow">https://krebsonsecurity.com/2019/04/a-year-later-cybercrime-...</a><p>Is it really surprising, though? Criminals, including pimps and child rapists, still watch ads, "engage" with the platform, and increase the user numbers, and they even encourage their peers to join and engage as well. For something like Facebook, surely it's a no-brainer not to miss out on all this "engagement".
I probably left too many comments in this thread, but I just remembered something. WTF happened to FOSTA (Fight Online Sex Trafficking Act) and SESTA (Stop Enabling Sex Traffickers Act)? The article doesn't make a single mention of either of these acts. Is Meta somehow exempt from these laws?
This is really hard to read. It fits with what we know about Meta's content moderation in other instances, though (see the genocide in Myanmar). If you are an employee of Meta, or even just a user of their products, I would ask you to take a step back and consider whether, in totality, this company is worth working for.<p>- doesn't care that one group of humans is using their platform to plan to kill another group (Myanmar)
- doesn't care about child sex trafficking
- doesn't care that teen girls are committing suicide.<p>Even if you say it isn't Meta's fault, don't they bear more responsibility than they are taking?
It is imperative to enforce the law and to block internet platforms that fail to comply with legal regulations. The internet cannot serve as a sanctuary for neo-Nazi groups and other illegal activity; it must remain subject to legal jurisdiction. All individuals and organizations, online or offline, must be held accountable under the law. It is unacceptable to allow hate speech, homophobia, and the promotion of heinous crimes, such as child murder, to proliferate unchecked. Telegram, for example, was rightfully blocked for refusing to provide authorities with phone numbers. Platforms that violate legal standards must be punished severely enough to ensure compliance.