I did an internship with a very profitable Swedish mobile game company before the pandemic, and it was really interesting to see the doublethink: everyone knew we were making a drug, but people avoided speaking of it in that way. It was always "our most engaged players readily spend on in-game items", and never "the most addicted users spend money so they can continue playing".<p>It was a great workplace, but I don't think it was adding much value to the world.
I'd add that algorithmic feeds are another part of the triad (quartet?), and probably the worst offender. Personalized addictive content is on the rise (TikTok, YouTube, even Facebook and Twitter curate your newsfeed "for you"), and these ML-determined feeds drive engagement to the highest possible levels, leading to an incredible waste of time, a distorted world view (the posts with the highest engagement will mostly be those that call for outrage, which leads to radicalization [1]), and in general a society where our information consumption is determined by black boxes with opaque goals that do not optimize for the common good.<p>I attribute a significant part of the political rift and radicalization that has occurred in the Western world in the past few years specifically to this feature: the algorithmically determined, engagement-maximizing feed.<p>[1] <a href="https://www.nature.com/articles/s41599-020-00550-7" rel="nofollow">https://www.nature.com/articles/s41599-020-00550-7</a>
I still haven't fully grasped that we've reached this point of pervasiveness online. We have brilliant minds and a ridiculous amount of money working on making Alabama's soccer moms click more ads, and we have reached the point of making people addicted to <i>websites</i> to reach that goal.<p>Obviously this is nothing new, and the cigarette industry has been profiting off addiction for decades, but I never expected the same thing (minus the very immediate threat of lung cancer) to happen to tools that could have been optimized for so much good instead.<p>And the worst part is that some of it is just <i>so incompetent</i>. I've been trying to break away from Facebook for a while, and recently (in the past year or so) I started realizing that my Facebook feed always shows me the exact same posts in the exact same order for <i>days</i>, to the point where I know exactly what post will be first when I open the website. Where is all the Facebook money going? What are these people working on? What are they <i>doing</i>?
I always thought it was pretty obvious when YouTube rolled out screen time reminders.<p>It had nothing to do with anything but minimizing liability in case the general public ever catches on to what they are doing to kids' mental health.<p>What’s hilarious is they probably have some “data retention policy” and will get away with it.<p>I guess it’s why SV engineers want to care about social issues so much. When you’re up in your $300k+ a year ivory tower built on something you know is dirty, you are exactly the type of person to be angry and project frustration.<p>I saw on Reddit a screenshot of AOC dogging on old people in Congress for not understanding digital. She listed a bunch of modern-day issues, and this wasn’t one of them. If even she doesn’t know, we really are screwed.
Anything that provides a 'hit' stands a chance of being addictive. Including cheeseburgers and fun video games.<p>>Most alarming is the “internet points”. On Reddit, this is called Karma. On Twitter, it’s likes and retweets. Ostensibly, this simple numeric score displays the community’s overall attitude toward a given piece of content. On its face, this appears to be a radically democratic concept; Everyone can vote! The reality is very different. Reddit, for example, has always obfuscated the true Karma score (“to prevent vote brigading”), and the position of a piece of content within the feed can be purposely decided by the Reddit home office, not by the community. This is incredibly, deeply sinister.<p>Why is it 'deeply sinister'? He just seems to assert it; the Reddit home office isn't putting other content there that isn't sponsored. Astroturfing exists and explains a lot of that, but I don't think it's the Reddit admins. The voting mechanism is pretty good at distinguishing interesting content from uninteresting content.<p>This whole article reads like a conspiracy-theorist rant from some luddite against tech. There are negative behaviors associated with modern technology, but this article just asserts that without really elaborating exactly why. It just blankly gestures "you know -- bad tech, feedback loops, doom scrolling" -- all the keywords!
I'm increasingly finding that I have to actively struggle to take ownership of my time and attention. Without an active, conscious effort to reclaim your mental resources, you WILL be caught up in the never-ending cycle of dopamine dependency on novel information or visual stimulation. Your attention is a valuable resource, and many companies have spent billions figuring out how to farm it.<p>Some of my more effective methods have included: strict site-blocking via OpenDNS, apps to implement time-based blocking of information-novelty sites (Twitter, HN, Reddit, Instagram, CNN), and almost complete disabling of notifications, with the exception of text messages and async work chat during business hours.<p>All of these methods I can undo, but they prevent, or at least slow down, the automatic, reflexive app/site opening when my reptile brain craves a dopamine hit.
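For anyone who wants to try the same thing without a DNS service, a crude local approximation is a script that rewrites the hosts file on a schedule. This is just a sketch of the idea, not the OpenDNS setup the comment describes; the site list, marker comment, and blocking window are invented for illustration:

```python
from datetime import datetime, time
from pathlib import Path

# Hypothetical list of "information-novelty" sites to block during work hours.
BLOCKED_SITES = ["twitter.com", "reddit.com", "instagram.com", "news.ycombinator.com"]
MARKER = "# focus-block"  # tag our own lines so we can remove them later

def blocklist_lines(sites):
    """Hosts-file lines that send each blocked site to localhost."""
    return [f"127.0.0.1 {s} www.{s}  {MARKER}" for s in sites]

def apply_block(hosts_path, sites, now=None, start=time(9, 0), end=time(18, 0)):
    """Inside the blocking window, append our entries; outside it, remove them.

    Idempotent: any lines we previously added (tagged with MARKER) are
    stripped first, so this can safely run from cron every few minutes.
    """
    now = now or datetime.now().time()
    lines = [ln for ln in Path(hosts_path).read_text().splitlines()
             if MARKER not in ln]
    if start <= now < end:
        lines += blocklist_lines(sites)
    Path(hosts_path).write_text("\n".join(lines) + "\n")
```

The point of the marker comment is exactly what the commenter describes: the block is easy to undo, but undoing it takes a deliberate step, which is usually enough friction to interrupt the reflexive visit.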
This is what I think The Social Dilemma missed. I worked at one of these companies and did not see anybody evil looking to enslave users in front of their screen. Instead we built new features and celebrated when time spent in our app went up or a new feature was used because we assumed the user must be getting value. After some time we switched metrics from time spent in app or daily active users to Net Promoter Score but at the end of the day if analytics didn’t show that a feature was used it was dropped and if it was being used we’d double down on making it “better” (used even more).<p>I did not see any evidence of malicious intent towards our users but a genuine belief we were enabling them to be more productive.
Can't remember who this quote came from, but it went something like this: "The only two industries that call their customers 'users' are drugs and software."
Often something made in the name of user engagement makes me less addicted instead<p>Say we have something that shows all expanded comments immediately: keeps me on that site<p>Then they do an "engaging" redesign where only 10% of the comments actually show and requires lots of clicking to expand them, and lots of unrelated animated images show below: makes me want to close that tab and do something else
Yes, all the social media platforms are addiction merchants. People know full well that they sacrifice their data, but they do not care about that or about the ads.
People use them for the gratification of likes and as a platform to be "heard" or to voice their opinions.<p>Society punishes drug users and vendors, which may or may not be the right thing to do; I am not a decision maker.
But addiction should be measured in dopamine hits and withdrawal symptoms. It does not matter whether the product is a recreational drug or a virtual SaaS. Do not let the addiction merchants hide behind marketing terms like "social media".
Corporations use "social" to tick a compliance box and mislead customers, to make it sound harmless or less dangerous. Similar to "vegan" or "bio" labels on food.
Social media targets the most vulnerable and the young.
Sounds drastic? I do not think so.
Lots of good points here. I particularly like the "dark triad" bit, though I think a lot of nuance around different types of likes/upvotes/whatever is missing. It <i>makes a difference</i> whether those are shown to others and/or used to sort others' feeds. That last point is very close to montenegrohugo's about algorithmic feeds, with which I completely agree. <i>Algorithmic feeds</i>, especially those which show content from people/groups I don't even follow, definitely deserve a place in the Dark Tetrad.<p>To illustrate these points, consider HN itself. Yes, HN uses an algorithm to determine story prominence. Yes, it has upvotes. Yes, it uses relative timestamps. OTOH, the upvotes are not shown to others and it doesn't have infinite scroll. I also wouldn't consider it a "feed" like Twitter or Facebook. On the third hand, even the features HN does have arguably contribute to echo chambers and audience pandering. I think this well illustrates that it's not easy to determine good vs. bad social media, and it would be hard to argue here that all social media are bad. ;)
I'm blessed to be an older programmer who started my career back when we were selling products to people who had to make a business case for the purchase of our software systems. We were helping businesses achieve more. Most of the businesses my previous employers were helping were public utilities, communication companies, and manufacturers. You could feel good about your work and that you were contributing to a better world.<p>Today I'm blessed to work for an electric utility. Whenever I'm having a bad day I remind myself that I'm helping to provide the foundation of modern civilization. It's much better for your soul than knowing that in the end you're just working for a drug dealer and contributing to the destruction of people's lives. I'm glad I'm close to retirement and don't have to do that kind of work just so I can put food on the table.
> It might be, for all we know, that the primary reason someone posts on social media is anger. If a proper study was done, I bet it would show exactly that.<p>This is a low quality assertion.<p>> “How could they have just scrolled and scrolled all day? Didn’t they know what it was doing to them?” Social media is the new cigarettes. Everyone does it, it’s addictive, it’s harmful, and you should quit.<p>Scrolling is no different from sitting in front of your TV or consuming other types of information. I think social media is bad, but that's just my opinion; this article sounds like an unsubstantiated opinion, too.
This guy is a consultant giving a talk to a mobile game company and has slides titled "Use of Coercive Monetization". It's pretty much an evil TED talk on dark patterns.<p><a href="https://www.youtube.com/watch?v=zex3b2mDnUw&t=17m16s" rel="nofollow">https://www.youtube.com/watch?v=zex3b2mDnUw&t=17m16s</a>
Discussion of the addictive qualities of social media often strikes me as bizarre.<p>Where is examination of the person doing the scrolling? Isn't the flaw there?<p>(The following applies to adults):<p>Ostensibly, the person should be allowed to spend their time as they like if they're engaging in legal activity. If people thought that engaging in social media was bad for them, they would stop. Are we really saying that people cannot stop using social media? Are we saying that people don't think it's bad for them?<p>If people can't stop doing an activity that is definitely hurting them, don't they need professional help? Do all these social media users need professional help?<p>What amount of time on social media is sufficient for it to have a negative effect? Is it anything besides zero?<p>To be meta: if hackernews is social media and social media is bad, are we all hurting ourselves?
The opening sentence:<p>> <i>There is something about social media that human beings are not psychologically prepared for.</i><p>I would state it: Engineers and business people have distilled what creates craving and then satisfies it by creating more craving.<p>No different than distilling the sugar, heroin, or cocaine from otherwise healthy, innocuous plants. We regulate some of that distillation. Even what we don't regulate, as a society, we generally look down on people that addict others. Why we reward programmers as we did the Sacklers doesn't make sense to me. I would think we would consider employees and investors of antisocial media companies villainous pariahs.
I recently joined TikTok and their algorithm really freaked me out. It took maybe twenty minutes for it to zone in on some incredibly specific interests of mine. I enjoy the app, but I've never felt so pushed into the content I want to see so quickly. I'm quite a bit older than their average user, but the mental impact on a child seems immense.
The author knew that a strong, clickbaity title is itself a play for "addiction", yet still used one and published the piece in various places to get engagement. The companies and their managers are driven by the same sort of metrics as the author.
I don’t understand how people can stand to use Facebook. It was fun a long time ago. Then, when I didn’t use it for a few weeks, I started getting “notifications” that, instead of saying “someone commented on your post” or “someone liked your post” they were “someone commented on someone else’s post”, or “someone posted something”, and I thought “These are not notifications, these have nothing to do with me.”<p>And that kind of turned me off, and I intentionally stayed away from Facebook for a bit, and now whenever I log in, there’s a ton of fake notifications clamouring for my attention, and my “feed” is 30-50% ads, and the content that isn’t ads is 30-50% memes. There’s very little actual content, and you have to work way too hard to find it.<p>In an attempt to further force me to “engage” Facebook has turned their platform into something I can’t stand to use.<p>I wonder if this is what social media looks like to anyone who “gets out” for a bit. Maybe we’re all like frogs in water that’s slowly getting hotter and hotter, not realizing that the water is boiling; Facebook keeps pushing more and more forced “engagement”, and no one who is in it realizes it’s turning into all ads and garbage.
Could we introduce this kind of addiction into healthy things, like exercising or a healthy diet or anything like that?<p>I've never seen it work in practice, so why are we not able to do that?
I recently saw an FB "data engineer" talking about "experiment review sessions", where the presumably engagement-related results of interface changes are discussed (he also expressed hope that all companies work that way).<p>It's very rational. The user needs a good user interface, the best that can be built. What is the best? The one the user likes to use, day in, day out. Is that a problem? Obviously not for the platform.<p>When the FB CEO announced that he wants to make users happy, he acknowledged his real intent: mass, aggregate control of behavior. I cannot deny that it carries a purely instrumental view of humans, but that's again only rational in the context of business.
My smartphone is now a net negative for me. I need a few token apps (maps, Uber), but everything else really is wasting my time. Sure, I've tried a bunch of things, but the addictive apps keep coming back.
You mark my words; this is how we will be taken advantage of:
“Emotional Health” classes will be required in high schools, where many things will be taught. And somewhere around week 3 of the second semester will be an “avoiding addiction” module.
A certain large group of citizens, with a mysteriously large advertisement budget, will vehemently oppose teaching children such “politically correct nonsense”. Thus, the program will be underfunded, and the curriculum poorly developed; like DARE, sex ed, biological evolution…
Somehow, the same group of people is going to push for personal liability when it comes to managing your own addictions. The precedent set by the lawyers and courts, the litigation of which we won’t get to see, is that addiction management is part of the core curriculum in high school and any accredited education facility. Therefore, “the average Joe on the bus“ has already been equipped with the tools necessary to make educated decisions about which services they choose to participate in, regardless of that service’s addiction-enabling behaviors, reinforced by opaquely worded EULAs. All of this is enabled by our cultural virtue of “individual responsibility”.
Oh, and <i>something-something</i> Free Markets... <i>something-something</i> Regulations...
Really some of these techniques are just scaling what is standard at arcades and bringing it online. Anyone who has ever been to an arcade knows how many tokens/quarters you can go through to keep playing and playing.<p>The danger with mobile gaming is that you never really get to leave the arcade and you don’t have to intrinsically budget the way you used to at arcades. At the arcade you usually have a fixed number of tokens to spend and you always have the allure of other games pulling you away.
There are two kinds of capitalism: value creation capitalism (also known as industrial capitalism) and value <i>extraction</i> capitalism. The latter includes quite a lot of financial capitalism, profit-driven warfare, and casinos and other addiction-driven products.<p>The former creates value while the latter extracts and concentrates it while overall creating net negative value. Addicting people to Skinner boxes is destroying hours of what otherwise might be productive, rejuvenating, or enriching time. It's macroeconomically indistinguishable from killing people.<p>One of the central problems of modern Western capitalism is that we fail to distinguish between the two. A businessperson is successful and a genius if they make money; nobody bothers to distinguish between those that make money by creating value and those that make money by merely extracting it and leaving a path of destruction in their wake.<p>Maybe we can figure out a way to re-channel the impulses of "cancel culture" in this direction, cancelling those that promote addictive, net value-destroying products and services. Since the algorithmic timeline and other personalized recommendation engines are by far the largest pushers of fascist and neo-racist ideology, the original goals of "cancel culture" might still be indirectly achieved.
Unfortunately, our economic system guarantees outcomes like this in pursuit of perpetual and ever increasing growth and profit. We either need to add regulations (short term fixes) or change our economic system (long term fixes). We can't continue to do nothing if we want to maintain a civil democratic society. Misinformation and propaganda are too powerful to leave unchecked.
Where is the line between engineering for addiction and engineering for a quality user experience?<p>Everything we are told about building modern services is about optimisation. You have metrics, you experiment, you tweak things, you see what helps and what doesn't. This is as true for business models as it is for interface design.<p>Some of this is good - it is positive to improve your UI to reduce friction and make it more usable by your customers. Some of it is fine - choosing a landing page design that gets more people to sign up. Some of it is bad - things you do that mean people don't close the app as readily.<p>All these changes are the result of the same process, that is built into how modern businesses operate. How do you draw the line between optimising for good UI and optimising for addiction? If you're writing legislation, do you outlaw specific practices? If you're trying to operate companies ethically, do you just avoid certain metrics?
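To make the ambiguity concrete, here is a deliberately naive sketch of the experiment readout that both kinds of optimisation share. The function and metric names are invented, and a real pipeline would add significance testing; the point is that the dashboard only reports whether the number went up, not whether the change was good UX or a dark pattern:

```python
def mean(xs):
    """Plain arithmetic mean of a list of numbers."""
    return sum(xs) / len(xs)

def evaluate_experiment(control_sessions, variant_sessions, metric="minutes_in_app"):
    """Compare session metrics between a control and a variant group.

    The readout is metric-agnostic: 'reduced friction' and 'harder to put
    down' look identical here, because both show up as positive uplift.
    """
    uplift = mean(variant_sessions) / mean(control_sessions) - 1
    return {
        "metric": metric,
        "uplift_pct": round(uplift * 100, 1),
        "ship_it": uplift > 0,  # the decision rule most processes reduce to
    }
```

Whether the line gets drawn in legislation (banning specific practices) or in company ethics (refusing certain metrics), it has to be drawn somewhere outside this loop, because the loop itself cannot tell the two apart.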
I, for one, want HN to have an option to hide my karma score next to my username. It's a distraction. It's okay for them to use that number for moderating purposes, etc., but I don't need to see it every time.
I work at a small company that helps news sites with engagement. Our aim is for our clients to be a place where the community can find as much info as possible for their daily life, trying to offer readers all the info in the most easily digestible way.<p>Every time we get a new customer, we have to explain to them that the real value is in providing the info people need as fast and simply as possible. Unfortunately, we have to fight against some major forces, such as Google ranking sites with longer content just because of that, and not because of metrics related to value.
So many truisms and so much dogma; it's really hard to take this person seriously. Why so many upvotes for something that is conjecture at best?<p>> it harms our brains in a way that we don’t yet fully understand<p>He even admits "we don't yet fully understand", so why keep building on this narrative just to try to prove a point? That should have been the last sentence of the post, but I guess that would have made for a pretty mediocre post, one that doesn't tap into the FUD of the reader.
For me the pandemic has actually allowed me to spend more time outdoors and with my children and as a result less time online. We have been taking walks daily and started gardening. I've cut down social media time to almost nothing on all major platforms (HN is my last "addiction"). The author claims the pandemic forced us inside and on social media--do people find this true for them?
I was working at a mobile gamedev studio in 2013, before the whole thing about Facebook and such, and after talking to one of the game designers I thought to myself: "Drugs. We're making drugs and inventing techniques to milk users for their money." I could never feel proud of what I was doing after that, and I left shortly after to work for a telecom.
>> Relative timestamps (“3 hours ago” instead of “6:56 PM”). This creates IMMEDIACY.<p>TIL... I've always thought this was SO annoying on Twitter, because I want to see the exact time (I'd even be ok if I could hover and see the actual timestamp), and thought they were just 'dumbing it down'.<p>But it makes a lot more sense with that clue.<p>Ugh
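The technique is trivial to replicate, which makes its ubiquity telling. A minimal sketch of the pattern (the thresholds and fallback format are hypothetical, not Twitter's actual rules):

```python
from datetime import datetime

def relative_timestamp(posted, now):
    """Render a post time the way engagement-driven UIs do: vague and 'fresh'.

    Everything under a day collapses into 'N minutes/hours ago', so the feed
    always reads as happening right now; only older posts get an exact time.
    """
    seconds = (now - posted).total_seconds()
    if seconds < 60:
        return "just now"
    if seconds < 3600:
        return f"{int(seconds // 60)} minutes ago"
    if seconds < 86400:
        return f"{int(seconds // 3600)} hours ago"
    return posted.strftime("%Y-%m-%d %H:%M")  # exact time, only for old posts
```

Note that exposing the exact time would cost nothing -- a hover tooltip (an HTML <i>title</i> attribute on the timestamp) would do it -- which is why its absence reads as a deliberate choice rather than "dumbing it down".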
Future of humanity stuff aside, isn’t the doublespeak for addiction actually “retention”? “Engagement” is a classification for a set of actions that drives addiction, not one that measures it.
> Fake internet points (clickable, often animated icons with incrementing numbers. Likes, reactions, upvotes, retweets, etc.). This creates ADDICTION.<p>Sounds like the HN karma system ;-)
>“How could they have just scrolled and scrolled all day?"<p>It would help if your article used more than just the middle 25% of my screen, thus requiring scrolling to read.
So many misconceptions in this text. This author knows very little about mental health and is just repeating the narrative that these recent documentaries have been spreading (which was created, by the way, by the same fellas who built those systems and feel guilty now). Also, as a UX designer, he's most probably getting paid to apply the same stuff to his own projects.
Read 'The BITE Model Of Authoritarian Control: Undue Influence, Thought Reform, Brainwashing, Mind Control, Trafficking And The Law' by Steven Alan Hassan<p>or<p>'Thought Reform and the Psychology of Totalism: A Study of “Brainwashing” in China' by Robert Jay Lifton<p>(both readily available as PDF online)
This type of ad hoc pathologization of behaviour is itself very suspect, and it isn't helping anyone. Not everything that's bad is violence or an addiction; there are plenty of different types and nuances of evil in the world.
I can’t find it now, but DHH wrote somewhere about optimising for engagement vs action. The idea behind Basecamp is to be a service that helps you manage a project in a faster and easier way, not a service that keeps you "engaged" while not pushing the project forward. Having more engagement means it’s doing the opposite of what it's meant to.<p>This can easily apply to many other services. The purpose of an email client is to surface the important information (filtering, and maybe inbox zero?) and let the user act on it (reply and write with the right context). The purpose of a dating website is to find a match and create a connection. The purpose of a social network, on the other hand, is much more debatable.
this shit makes me angry af<p>it's no wonder we have seen a decline in cognitive ability, as evidenced by various world events (election of the 45th president of the USA, "Brexit", climate denial, the rise of neo-nazis, vaccination denial, the Rohingya genocide in Myanmar, ...). we are just endlessly scrolling and spamming each other with running commentary bullshit that masquerades as a modicum of insight.<p>what happened to making the world a better place? the tech industry is no better than the tobacco or fast food industry.