I've been thinking about this a lot. In terms of political economy, there's a lot of reliance on the ideas of information, free will and rationality.<p>To cut a long story short, if people have information about the market, they can use their knowledge of their own desires to act, which creates some sort of efficiency. I'm glossing over a load of stuff here because it's a large subject, but that's roughly the spiel you get when people are talking about all sorts of political/economic decisions. And you can generate both leftie and rightie policies depending on how you think about it.<p>The problem is: if deciding things depends on free will and information, what do we do when there seem to be exceptions? I'll leave aside information problems, since we're really talking about rationality here.<p>But if it turns out people know what WoW does to their studies, and they still end up playing all day, how should we think about that? The easy answer is that their real desire is to play WoW all day, but that doesn't seem very satisfying. If they're making an irrational decision, that has other consequences for our society.<p>For instance, a whole load of behaviours would then need to be controlled. Obvious things like smoking and drugs, but what about overeating? Too much soda? Social media? And what exactly is the correct policy response to irrational behaviour? Keep in mind that solutions like taxes/subsidies also rely on rationality.
Back in the day, I began my career in programming intending to be a game developer. Coming from a family of addicts, I noticed even in the late 1980s that games and chat-style apps could be powerfully addictive. (This awareness has kept me off Twitter, Facebook, and so on.)<p>I moved into compiler work and line-of-business apps, which, surprisingly, I find just as enjoyable. I never really regretted avoiding games, because there are enough things to get me in trouble already.
The individual is basically legacy hardware, and we have refined the hacking of this legacy hardware over the past 20 years. The combined interest in hacking this hardware will always be more effective and cheaper than all the defences, bugfixes and updates the individual and his peers can muster.<p>We should criminalize the act of hacking individuals, since the hardware cannot be changed and applying security updates and bugfixes has proven ineffective or impossible. It should be viewed not as a weakness of the person but as a vile act against a vulnerable creature.
Similar in nature to human rights, in which we grant one another - animals as we are - immutable dignity. This dignity is violated if one is hacked/manipulated against one's own interest to serve the interests of another individual, corporation, nation-state or idea.<p>This sort of behaviour is not new. There have been precedents where gurus, churches and whole nations have overridden the individual's ability to think or act differently.
How do we differentiate a compelling and effective form of entertainment from an addictive activity? Most people wouldn't look down on someone who stays up till 3 AM reading a gripping book, but many would if we swapped the book for a video game. People who can't hold down a job or stay in school due to addiction to video games or social media are definitely a problem, but I suspect that a lot of the concern is due to an overall negative view of games and internet use.<p>What I do think is a problem is that we have much better data collection on how people use video games and internet services. Absent that information, developers had to think about crafting a good overall experience that people were willing to pay for. With that information, people optimize for total time used instead of the quality of the experience. Hours and dollars spent are easy metrics to show to executives; judging the overall quality of the experience is much more subjective.
This is why the advances in AI sometimes look scary to me. "Limbic capitalism" (great term, btw) existed and thrived long before the internet and AI era - tobacco, alcohol, credit cards, etc. Finding and monetizing a human addiction has always been extremely profitable. What worries me is that the power of AI and the ease of reaching millions of people at once through social media and internet advertising will make finding new human addictions and exploiting them easier than ever. There is nothing to stop businesses from basically hacking the human limbic system (aka the lizard brain) and monetizing it using the vast power of AI. Facebook just pioneered it; many will follow.<p>The most dangerous AI is not killer robots scorching the Earth à la the Terminator movie. The most dangerous AI is the one that makes humans fall in love with it.
I think 'capitalism' is a misnomer here, because replacing 'capitalism' with 'human greed' or 'human fallibility' changes nothing.<p>The problem is not caused by markets or property rights. If anything, competition actually makes games less toxic. Look at how pay-to-win games are being replaced by games that sell vanity items.