In the tutorial, the very first card I was presented with was reported for being "illegal", and the description was (from memory; I might not have it quite right):<p>"A post about the movie Cocaine Bear recommends doing cocaine while watching the movie and contains a photograph of a bag of white powder with the hashtag #cocaine"<p>That... seems like an obvious joke to me. And even if it's not, while "doing cocaine" is illegal, photographs of white powders are not, nor is suggesting that people do cocaine, nor is portraying them apparently doing so. We can know this is true because the producers and actors who worked on the film Cocaine Bear were not arrested.<p>Content moderation <i>is</i> hard, but I'm not very impressed that the game got the very first card (in my opinion) totally wrong.
This game was made by a political action group for startups. Here is their agenda: <a href="https://engineis.squarespace.com/s/Startup-Policy-Agenda-2023-sthw.pdf" rel="nofollow">https://engineis.squarespace.com/s/Startup-Policy-Agenda-202...</a>
This is more like a game about shitty management communication. The rules are fairly arbitrary, you're not told about any of them until you screw up, and even clear-cut cases like someone posting someone else's phone number are used to slap your wrist (apparently the second person consented, but nobody explains why that's your problem; surely the second person can post their own phone number). Some advertising is okay, other advertising is forbidden. There's never a policy about which is which; you just have to guess. If this is how social media websites handle content review, it's no wonder the industry is a cesspit.<p>All in all, after playing for about ten minutes, I have to conclude that this is some kind of corporate apologetics for bad content management. The pointless timer really drives home that this problem could be solved by spending more money on training content reviewers and hiring more of them. I came away from the game feeling like it's a sequel to Papers, Please.
Where is my ability to ban anyone who disagrees with my point of view or my agenda? I think this game needs to be modified to autoban anyone who appeals my mod decisions :)
Wow,<p>The game is tense and boring and it illustrates both why paid Facebook "moderators" suffer PTSD and why this so-called "moderation" fails to improve the content of discussion or build community.<p>Actually worthwhile moderation involves someone caring about the whole direction of the discussion and not merely considering "is this content over the line", especially since a large portion of "don't talk about drugs" or whatever is pure CYA. "Don't say that 'cause someone might sue us".
Very interesting; I got to level 6 and then got a bit bored.<p>For educational purposes, it would be cool to explain the rules after the fact.<p>Small feedback: when there is an appeal, I'm not sure whether green means approving the appeal or staying with the initial decision.
Is <a href="https://www.engine.is/" rel="nofollow">https://www.engine.is/</a> an industry lobbying group? That's what it looks like.
I think the cards should just be what the post is, not a description of it. The descriptions (a) are hard to pattern-match quickly and (b) have the creators’ value judgments inserted (for instance, “ethnic slur” is ambiguous in many real cases: e.g. “negro” could be legitimate in a Spanish-speaking or referencing context, and “white trash” could be both a slur and the title of an edgy Vice documentary). In my experience the moderator personality types have actual blind spots in their biases around political topics (the misinformation, harassment, and hate speech categories). They seem entirely unaware that had they lived 10 years earlier they would have had completely different metrics.<p>That said, one important takeaway I got is that moderation needs priorities. While I’m thinking for 15 seconds about whether a review of an herbal health product is “medically misleading”, I have CSAM in the queue right after.
It's probably more of a spectrum from offensive to inoffensive, so the binary choice between permissible and not is going to vary depending on context. Maybe it could be a slider with "acceptable ranges" depending on the platform? For example, I think 4chan would not mind most of these posts, but GlobalSocialNet might have some problems with some.
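Roughly the kind of per-platform "acceptable range" I'm picturing, as a toy sketch; the score scale, the thresholds, and the default are all made up for illustration (only the platform names come from the example above):

    # Hypothetical sketch: each platform sets its own cutoff on a 0-1 "offensiveness" scale.
    # The numbers here are invented, not anyone's real policy.
    PLATFORM_THRESHOLDS = {
        "4chan": 0.95,           # tolerates almost everything
        "GlobalSocialNet": 0.40, # much stricter
    }

    def moderate(post_score: float, platform: str) -> str:
        """Return "leave up" or "take down" based on where the platform's slider sits."""
        threshold = PLATFORM_THRESHOLDS.get(platform, 0.5)  # default to a middle setting
        return "leave up" if post_score <= threshold else "take down"

    print(moderate(0.7, "4chan"))            # leave up
    print(moderate(0.7, "GlobalSocialNet"))  # take down

The same post gets a different outcome purely because the slider sits in a different place for each platform.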
Elsewhere in this thread I and other people talked about how Engine probably made this with the goal of making people more sympathetic to tech companies' challenges with content moderation, by giving people some form of first-hand experience.<p>It turns out that we don't have to speculate about this motivation, because they spelled it out quite explicitly in a blog post:<p><a href="https://www.engine.is/news/category/moderator-mayhem" rel="nofollow">https://www.engine.is/news/category/moderator-mayhem</a>
A neat game, but my enjoyment of it is somewhat dulled since I watch my mom play this in real life on a daily basis - her job is moderating product reviews.
You might find this article from Mike Masnick of TechDirt, one of the contributors to the game, interesting:<p><a href="https://www.techdirt.com/2023/05/11/moderator-mayhem-a-mobile-game-to-see-how-well-you-can-handle-content-moderation/" rel="nofollow">https://www.techdirt.com/2023/05/11/moderator-mayhem-a-mobil...</a><p>He's been writing about the difficulties of site moderation for a long time.
"skip tutorial" was like that onetime i bought mod privilege on twitch using channel points.<p>first few minutes : yeah this is cool this is what i wanted<p>not very long later : this is awful, please take it back
This to me highlights the many delights of freedom of speech. You can choose what level of chaos to permit. To enforce a global ethic this way seems kinda snooty.
meh, it's a bad interface<p>if i wanted to moderate something more effectively i would make a better one<p>i'd rather blame the difficulty of moderation itself on people's inability to innovate
I guess I’m boring somehow: I got 100% in several rounds, then got promoted after round 8 (which ends the game!).<p>The trick seems to be that you have to check the “extra info” frequently, and be reasonably quick to decide as soon as you see it. It doesn’t seem possible to make the right decision on many of the cards without that context. It is kinda fun to see all the attempts to game the system.
I expected the tick to mean "take action on review/appeal", and the cross to mean "reject review/appeal", so that threw me for a few of them.<p>A few of the cards were <i>really</i> frustrating: the correct solution to some of them would be writing a ten-word response, or making an edit, but I only had "take down" and "leave up" buttons. (‘It's just a game!’ I remind myself.) I do hope <i>real</i> content moderators aren't so bereft of tooling.<p>The story is… really boring, actually? There are a couple dozen little things going on in the background, but not many, and there's not enough gameplay time for them to be explored. I'd hoped for something more like Papers, Please. (Judging by the achievements, there are actually several possible stories; I can only seem to get the election one.)<p>The sentiment of the general public doesn't seem to behave realistically, either: not when I made obviously-unpopular decisions, and not when the CEO made obviously-unpopular decisions.
Hey, not bad!<p>I got 4 out of 5 stars with high ratings.<p>One thing I'd add: doxxing of a quasi-legal type, such as posting details about a politician's family member and their private dealings.<p>Certainly captures the feel of going through the mod queue and making snap decisions.<p>Glad I don't have a boss constantly reviewing my decisions, though.
Is the point of the game supposed to be that the only possible way to keep up is to blindly approve 90% of them and then actually look only at the ones that get appealed?
This game is pretty accurate.<p>There's a relatively well-known video of multiple mods at a seminar being asked to decide whether pieces of content should stay up or be taken down.<p>At no point was there consensus.<p>Content that most people would consider heinous was allowed to stay up by at least one mod, with a strong reason for doing so.
I wonder if we could solve content moderation at scale through gamification. Imagine a game where the content you saw was actual content from somewhere and your decisions could influence whether it gets taken down. In exchange, you get points.
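A back-of-the-envelope version of that vote pooling might look like the sketch below; every weighting rule, threshold, and point value here is invented, not how any real platform works:

    # Toy sketch: real posts get routed to players, their verdicts are pooled,
    # and a post comes down once enough accuracy-weighted votes agree.
    from collections import defaultdict

    TAKEDOWN_THRESHOLD = 5.0    # invented: weighted votes needed before acting
    votes = defaultdict(float)  # post_id -> accumulated takedown weight
    points = defaultdict(int)   # player_id -> points earned in the game

    def record_verdict(post_id: str, player_id: str, take_down: bool, accuracy: float) -> bool:
        """Pool one player's verdict, weighted by their past accuracy; True means take the post down."""
        points[player_id] += 1          # reward participation with points
        if take_down:
            votes[post_id] += accuracy  # trusted players count for more
        return votes[post_id] >= TAKEDOWN_THRESHOLD

Weighting by each player's past accuracy is one way to keep the points from just rewarding random clicking.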
This would be better if there were more moral/ethical depth. It also doesn't explore how content moderation policies shape discourse, and how moderators often abuse such policies to manipulate posts that fit their ideology.
this is a bad game, because it doesn't reflect reality<p>reality dictates that what is or isn't acceptable content is not a function of platform policy; it's a function of the legal jurisdiction in which the content is produced and/or consumed<p>this isn't really complicated