People who blame "algorithms" are missing the point entirely.

Facebook/IG make some people feel bad because they open the site and see people having fun, people out partying, people looking more attractive, and they feel "worse". You can't blame Facebook for that! Those people will exist regardless, and the "social competition" will happen regardless, simply through other formats. And by blaming Facebook, you're really blaming these "happy" people for making [you/others] feel worse by comparison. You will never be better than everyone alive at everything, and comparing yourself to others will always be pathological, leading to either a superiority or an inferiority complex.

The point people miss: it is simply impossible to have a website where a billion people post pictures of themselves that doesn't "make people feel bad". You can argue the human brain isn't meant for that, but it's clearly non-pathological for the overwhelming majority of people, except those with preexisting low self-esteem and anxieties. People with inferiority complexes "externalise" their feelings onto Instagram when the core problem is their inability to deal with the *awareness* of other people's lives. Seeing a YouTube video of someone backstage at a fancy private concert would have exactly the same impact.

Look at any social media website without these "evil" algorithms: Mastodon, Gab, Parler. All equally toxic, though for very different reasons. First, you can't moderate at billion-user scale without moderation algorithms. Second, "discovery/recommendation algorithms" specifically don't create toxicity; the toxicity is caused by flawed, sometimes pathological humans, and amplified not by algorithms but by the social dynamics among those flawed humans. Fixing that requires more "manipulative algorithms", not fewer!

The Facebook problem is clearly not, as millions believe, that Facebook creates toxicity from the top down and makes people feel bad. The core (and only) problem is *other people*; in other words, Facebook's moderation needs to improve. Algorithms will inevitably recommend eating-disorder content based on common interests; the solution is to ban such content. Simple.

Same with the heavily edited IG modelling photos. Some people like that look, some people hate it. Platforms should require disclosure when a photo is heavily edited, but the overwhelming majority of people will never feel insecure because of it, pity at most. Your own psychological makeup determines how you react, not any Facebook policy, and any such policy categorically *cannot* solve the core problem, which (I say this with respect and sympathy) is that a large number of people have emotional wounds and need help.

People are throwing the baby out with the bathwater, believing that mass-scale social media is fundamentally nefarious. It only is for the same reason that large corporations become less innovative, or that any large social group becomes less cohesive: people's emotional problems get amplified the more complex a social group or organization is. Entropy sets in, the social dynamics become too complex, and conflict increases. The core problem, which everyone misses, is that *this cannot be addressed by the organization*, only mitigated at best. The root cause lies in individuals. Society is blaming Facebook for its own defects, at great cost to its worst-off members.