科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global technology news and discussion.

© 2025 科技回声 (Tech Echo). All rights reserved.

Decentralised content moderation

113 points · by AstroNoise58 · over 4 years ago

18 comments

colllectorof · over 4 years ago

It's pretty obvious that the tech crowd right now is so intoxicated by its own groupthink that these attempts to come up with "solutions" are going to have awful results. You don't even know what the problem really is.

"I fear that many decentralised web projects are designed for censorship resistance not so much because they deliberately want to become hubs for neo-nazis, but rather out of a kind of naive utopian belief that more speech is always better. But I think we have learnt in the last decade that this is not the case."

What you should have learned in the last decade is that social networks designed around virality, engagement, and "influencing" are awful for society in the long run. But somehow the conversation has now turned away from that and towards "better moderation".

Engage your brain. Read Marshall McLuhan. The design of a medium is far more important than how it is moderated.
kstrauser · over 4 years ago

I'm active with Mastodon and absolutely love its moderation model. In a nutshell:

- It's made up of a bunch of independent servers, or "instances". The common analogy here is to email systems.

- If you want to join the federation, stand up an instance and start using it. Voila! Now you're part of it.

- My instance has a lot of users, and I don't want to run them off, so it's in my own interest to moderate my own instance in a way that my community likes. Allow too much in without doing anything? They leave. Tighten it so that it starts losing its value? They leave. There's a feedback mechanism that guides me to the middle road.

- But my users *can* leave for greener pastures if they think I'm doing a bad job and that another instance is better. They're not stuck with me.

The end result is that there are thousands of instances with widely varied moderation policies. There are some "safe spaces" where people who've been sexually assaulted hang out, with zero tolerance for harassment or trolling. There are others that are very laissez-faire. There's a marketplace of styles to choose from, and no one server has to try to be a perfect fit for everyone.

I realize this is not helpful information for someone who wants to run a single large service. I bring it up just to point out that there's more than one way to skin that cat.

(That final idiom would probably get me banned on some servers. And that's great! More power to that community for being willing and able to set policies, even if I wouldn't agree with them.)
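The federation model described above (independent instances, each with its own policy, and users free to move between them) can be sketched as a toy in a few lines. This is a hypothetical illustration only; the class names, word-list policy, and `migrate` helper are invented for the sketch and are not how Mastodon is actually implemented:

```python
class Instance:
    """One server in a federation; each sets its own moderation policy."""
    def __init__(self, name, banned_words):
        self.name = name
        self.banned_words = set(banned_words)  # stand-in for a real policy
        self.users = set()

    def allows(self, post):
        """Policy check: reject posts containing any banned word."""
        return not (set(post.lower().split()) & self.banned_words)

def migrate(user, src, dst):
    """The feedback mechanism: unhappy users exit to another instance."""
    src.users.discard(user)
    dst.users.add(user)

# Two instances with very different policies coexist in one federation.
strict = Instance("strict.example", banned_words={"slur"})
laissez = Instance("laissez.example", banned_words=set())

assert not strict.allows("some slur here")   # rejected on the strict server
assert laissez.allows("some slur here")      # accepted on the laissez one

strict.users.add("kay")
migrate("kay", strict, laissez)              # "not stuck with me"
assert "kay" in laissez.users and "kay" not in strict.users
```

The point of the sketch is the structure, not the word filter: moderation lives per-instance, and the only global mechanism is the ability to leave.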
halfmatthalfcat · over 4 years ago

I've been thinking hard about decentralized content moderation, especially around chatrooms, for years, specifically because I'm building a large, chatroom-like service for TV.

I think it's evident from Facebook, Twitter, et al. that human moderation of very dynamic situations is incredibly hard, maybe even impossible.

I've been brewing up strategies for letting the community itself moderate, because a machine really cannot "see" whether content is good or bad in context.

While I think that community moderation will inevitably lead to bubbles, it's a better and more organic tradeoff than letting a centralized service dictate what is and isn't "good".
AstroNoise58 · over 4 years ago

I find it pretty interesting that Martin does not mention the kind of community-member-driven up/downvote mechanism found on this site (and elsewhere) as an example of decentralised content moderation.

Edit: now I see Slashdot and Reddit mentioned at the end, in the updates section (I don't remember seeing them on my first read, but that might just be me).
gfodor · over 4 years ago

You can cut off a large part of abuse just by creating a financial incentive. Pay to gain access, and access can be revoked, at which point you need to pay again (perhaps on a progressive scale: the more often you are banned, the harder it is to get back in). Your identity confers a reputation level that influences filters, so what you post is seen more often; there is value in your account, and you don't want to lose it. The SA forums did this, and it helped immensely with keeping out spam (though it's not a silver bullet).

Any system where any rando can post any random thing with no gates is going to be much more of a slog to moderate than one with several gates that imply the person is acting in good faith.
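The progressive re-entry fee described above could look roughly like this. A minimal sketch, assuming an invented `PaidGate` class and a base fee of 10 units; the doubling schedule and the reputation reset on ban are assumptions for illustration, not the actual Something Awful mechanism:

```python
class Account:
    """One account in a hypothetical pay-to-post community."""
    def __init__(self, name):
        self.name = name
        self.ban_count = 0
        self.reputation = 0
        self.active = False

class PaidGate:
    """Entry fee doubles with each prior ban, so repeat offenders
    pay progressively more to get back in."""
    BASE_FEE = 10  # currency units; an assumed figure

    def entry_fee(self, account):
        return self.BASE_FEE * (2 ** account.ban_count)

    def join(self, account, payment):
        if payment >= self.entry_fee(account):
            account.active = True
            return True
        return False

    def ban(self, account):
        account.active = False
        account.ban_count += 1
        account.reputation = 0  # banned accounts forfeit their standing

gate = PaidGate()
alice = Account("alice")
assert gate.entry_fee(alice) == 10  # first entry at the base fee
gate.join(alice, 10)
gate.ban(alice)
assert gate.entry_fee(alice) == 20  # re-entry after a ban costs double
```

The key property is that each ban makes the account strictly more expensive to replace, which is what gives good-faith users something to lose.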
emaro · over 4 years ago

Matrix published an interesting concept for decentralised content moderation [0]. I think this is the way to go.

Edit: discussed here [1] and here [2].

[0]: https://matrix.org/blog/2020/10/19/combating-abuse-in-matrix-without-backdoors

[1]: https://news.ycombinator.com/item?id=24826951

[2]: https://news.ycombinator.com/item?id=24836987
JulianMorrison · over 4 years ago

It's worth remembering that content moderation is an activity that causes people mental illness right now, because it's so unrelentingly awful. Attempts to decentralize it are going to run into the problem that people *don't want* to be exposed to a stream of pedophilia, animal abuse, murderous racism, and terrorist content and be asked to score it for awfulness.
neiman · over 4 years ago

We (at Almonit) are working on a self-governing publication system that would bring democratic control to content moderation.

We wrote about its philosophy earlier this week:

https://almonit.com/blog/2021-01-08/self-governing_internet_organizations_part_I.html
jancsika · over 4 years ago

> Censorship resistance means that anybody can say anything, without suffering consequences.

I can't even get to the heart of the poster's argument, because the shitty state of all current social media software defines "anybody" as:

* a single user making statements in earnest

* a contractor tacitly working on behalf of some company

* an employee or contractor working on behalf of a nation state

* a botnet controlled by a company or nation state

It's so bad that you can witness the failure in realtime on, say, Reddit. I'm sure I'm not the only one who has skimmed comments and thought, "Gee, that's a surprising reaction from lots of respondents," then gone back even 30 minutes later to find that the overwhelming reaction is now the opposite, with many comments in the interim about new or suspicious accounts and lots of moderation of the initial astroturfing effort.

Those of us who have some idea of the scope of the problem (hopefully) become skeptical enough to resist rabbit holes. But if you have no idea of the scope (or even of the problem itself), you can easily get caught in a vicious cycle of being fed a diet of propaganda that is perhaps 80% outright fake news.

As long as the state of the art remains this shitty (and there are *plenty* of monetary incentives for it to remain this way), what's the point of smearing that mendacity across a federated system?
iamsb · over 4 years ago

My suggestion is to not moderate any content that is not illegal. Only take down content when required by a court order. Limit automated content moderation to easy-to-solve cases like child pornography.

Why? It is fairly clear at this point that content moderation at internet scale is not possible:

A. Using other users to flag dangerous content is not working. Which users do you trust to bestow this power on? How do you remove this power from them? How do you keep it from becoming a digital lynch mob? Can you have users across political, gender, and other dimensions? All mostly unsolvable problems.

B. Is it possible to use machine learning? To some extent. But any machine learning algorithm will have inherent bias, because its training data will also be produced by biased individuals. And people will eventually figure out how to get around those algorithms as well.

The causality between content published on the internet and action in the real world is not immediate. It is not like someone sitting in a crowded place and shouting "fire", causing a stampede. As there is a sufficient delay between speech and action, we can say that the medium the speech is published in is not the primary cause of the action, even if there is a link. Cases of direct linkage are fairly rare, and the police and the law should be able to deal with those.

Content moderation, at least the way Twitter has been trying to do it, has not been effective, has created a lot of ways for mobs to enforce censorship, and has had no real-world positive impact. The only use of this moderation and censorship has been for the right to claim victimhood and gain more viewership, to be honest.
jonathanstrange · over 4 years ago

Since I'm using libp2p in Go for a side project, may I take this opportunity to ask how this could work in principle for a decentralized network? The way I see it, this seems to be impossible, but maybe I'm missing something.

For example, in my network anyone can start a node, and the user has full control over it. So how would you censor this node? The following ideas don't seem to work:

1. Voting or another social-choice consensus mechanism. Problems:

- Allows a colluding majority to mount DOS attacks against anyone.

- Can easily be circumvented by changing host keys / creating a new identity.

2. The equivalent of a killfile: users decide to blacklist a node, dropping all connections to it. Problem:

- Easy to circumvent by creating new host keys / creating a new identity.

3. Karma system: this is just the same as voting / social-choice aggregation and has the same problems.

4. IP banning by distributing the blocked IPs with the binaries in frequent updates. Problem:

- Does not work well with dynamic IPs and VPNs.

Basically, I can't see a way to prevent users from creating new identities / key pairs for themselves whenever the old one has been banned. Other than security-by-obscurity nonsense (a "rootkit" on the user's machine, hidden keys embedded in binaries, etc.) or a centralized server as a gateway, how would you solve that problem?
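The circumvention problem behind options 1-3 can be made concrete with a toy model (plain Python rather than Go/libp2p; the random-token "peer ID" is a stand-in for a real keypair-derived identity). Because identities are self-generated and free, a killfile blocks a key, not a person:

```python
import secrets

def new_identity():
    """A fresh self-generated peer ID. Free to mint, which is exactly
    the Sybil problem described above."""
    return secrets.token_hex(8)

class Node:
    def __init__(self):
        self.peer_id = new_identity()
        self.killfile = set()  # locally blocked peer IDs

    def block(self, peer_id):
        self.killfile.add(peer_id)

    def accepts(self, peer_id):
        return peer_id not in self.killfile

me = Node()
troll = Node()

me.block(troll.peer_id)
assert not me.accepts(troll.peer_id)  # blocked...

troll.peer_id = new_identity()        # ...until they re-key,
assert me.accepts(troll.peer_id)      # and the ban is circumvented
```

Any blocking scheme keyed on a free-to-create identifier has this shape; that is why the usual answers make identities costly (proof of work, payment, vouching by existing members) rather than trying to make bans sticky.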
jrexilius · over 4 years ago

Good piece. This line articulates the problem well: "without objectivity and consistency, moderation can easily degenerate into a situation where one group of people forces their opinions on everyone else, like them or not." It gets to the core of the problem: objectivity and consistency are extremely difficult to scale and maintain over time. They require constant reinforcement from environment, context, and culture.
mikeveilleux · over 4 years ago

Having been part of the email ecosystem, where well-established spam filtering and reputation management systems have been in place for years, I've found it interesting how close these more recent conversations are to the ones we've had around email messaging abuse.

> Thus, as soon as a censorship-resistant social network becomes sufficiently popular, I expect that it will be filled with messages from spammers, neo-nazis, and child pornographers (or any other type of content that you consider despicable).

Unfortunately, I agree this is likely the case, and I also agree with many of the other points: there's unlikely to be an agreed-upon approach at scale.

I feel the two most important aspects of any moderation are transparency and consistency. I'd always like to know what community I'm joining.

We'll likely see more niche communities continue to pop up on centralized and decentralized networks, where the moderation, content, and community can be more tailored to their own expectations.
_gmnw · over 4 years ago

Obviously upvoting and downvoting are not enough for adequate moderation. There's still the matter of people trolling and generally posting horrible things online. There's a reason Facebook had to pay $52 million to content moderators for the trauma/PTSD they suffered.
MrXOR · over 4 years ago

The problem of censorship on Twitter and other social media could be solved with a good moderator like dang (thank you!).
EGreg · over 4 years ago

Let's take one step back. Just like in the Title I vs. Title II debate, let's go one step earlier: WHY do we have these issues in the first place?

It's because our entire society is permeated with the idea that capitalism and competition are the best way to organize anything, almost as part of the moral fabric of the country. Someone "built it", so now they ought to "own" the platform. Then they get all this responsibility to moderate, not moderate, or whatever.

Compare this with science, Wikipedia, open source projects, etc., where things are peer-reviewed before the wider public sees them, and there is collaboration instead of competition. People contribute to a growing snowball. There is no profit motive or market competition. There is no private ownership of ideas. There are no celebrities, no heroes. No one can tweet to 5 million people at 3 am.

Somehow, this has mistakenly become a "freedom of speech" issue instead of an issue of capitalism and private ownership of the means of distribution. In this perverse sense, "freedom of speech" even means corporations should have the right to buy local news stations and tell news anchors the exact talking points to say, word for word, replacing the human mouthpieces if they don't comply.

Really, this is just capitalism, where the capital consists of audience and followers instead of dollars. Top-down control by a corporation is normal in capitalism. You just saw a landlord (Parler) crying about a higher landlord, and, ironically, crying to the even higher landlord, the US government, to use force and "punish" Facebook.

Going further, it means corporations (considered by some to have the same rights as people) using their infrastructure and distribution agreements to push messages and agendas crafted by a small group of people out to millions. Celebrity culture is the result. Ashton Kutcher was the first to reach 1 million Twitter followers because kingmakers in the movie industry chose him early on to star in movies, and so on down the line.

Many companies themselves employ social media managers to regularly moderate their own Facebook pages and comments, deleting even off-topic comments. Why should they have an inalienable right to be on a platform? So these private companies can moderate inside their own websites and pages and can choose not to partner with someone, but the private companies Facebook and Twitter should be prevented from making decisions about content on THEIR own platforms? You want a platform that can't kick you off? It's called open source software and decentralized networks. You know what they don't have?

Private ownership of the whole network. "But I built it, so I get to own it" is the capitalist attitude that leads to exactly this situation. The only way we will get there is if people build the platform and then DON'T own the whole thing. Think about it!
eternalban · over 4 years ago

"I fear that many decentralised web projects are designed for censorship resistance not so much because they deliberately want to become hubs for neo-nazis, but rather out of a kind of naive utopian belief that more speech is always better. But I think we have learnt in the last decade that this is not the case. If we want technologies to help build the type of society that we want to live in, then certain abusive types of behaviour must be restricted. Thus, content moderation is needed."

Let's unpack this:

    Axiom: a kind of naive utopian belief [exists that asserts] that
    more speech is always better. But I think we have learnt in the
    last decade that this is not the case.

False premise. The "naive belief", based on the empirical evidence of history, is that prioritizing the suppression of speech to address social issues is the hallmark of authoritarian systems.

Martin also claims "we have learned" something that he is simply asserting as fact. My lesson from the last three decades has been that it was a huge mistake to let media ownership be concentrated in the hands of a few. We used to have laws against this in the 90s.

    Axiom: By "we", as in "we want", Martin means the community of
    like-minded people, aka the dreaded "filter bubble" or
    "community value system".

Who is this "we", Martin?

    Theorem: If we want technologies to help build the type of society
    that we want to live in, then certain abusive types of behaviour
    must be restricted.

We already see that Martin's "we" is a restricted subset of "we the Humanity". There are "we" communities that disagree with Martin's on issues ranging from the fundamental necessity of freedom of thought and conscience, to the positive value of diversity of thought, to the positive value of unorthodox ("radical") thought, to the fundamental identity of the concept of "community" with "shared values", etc.

    Q.E.D.: Thus, content moderation is needed.

Give the man a PhD.

--

So here is a parable of a man named Donald Knuth. This Donald, while a highly respected and productive contributing member of the 'Community of Computer Scientists of America' [ACM, etc.], also sadly entertains irrational beliefs that "we" "know" to be superstitious nonsense.

The reason this otherwise sane man entertains these nonsensical thoughts is the "filter bubble" of the community he was raised in.

Of course, to this day, Donald Knuth has never tried to force his views on other ACM members, many of whom are devout atheists. And should Donald Knuth ever try to preach his religion in the ACM, we would expect respectful but firm "community filter bubble" action from the ACM, telling Mr. Knuth to keep his religious views for his religious community.

But, "[i]f we want technologies to help build the type of society that we want to live in" -- and my fellow "we", do "we" not agree that there is no room for Donald Knuth's religious nonsense in "our type of society"? -- would it not be wise to ensure that the tragedy that befell the otherwise thoughtful and rational Donald Knuth cannot happen to other poor unsuspecting people who happen to be born and raised in some "fringe" community?

"Thus, content moderation is needed."
nathias · over 4 years ago

There is absolutely no difficulty here. If you don't want censorship, just have a button that hides content you personally don't want to see, and leave that decision to individual users. What others wish to see or not see is not your decision to make, and if some content is illegal, that should be a job for the police, not some ego-tripping janny.
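The "hide button" proposal above amounts to purely client-side filtering: the service delivers everything, and each user's client applies that user's own hide list. A minimal hypothetical sketch (the `Feed` class and its per-author hide list are invented for illustration):

```python
class Feed:
    """Per-user client-side filtering: the service hides nothing;
    each user maintains their own local hide list."""
    def __init__(self, posts):
        self.posts = posts          # list of (author, text) tuples
        self.hidden_authors = set() # this user's personal choices only

    def hide(self, author):
        """The 'hide' button: a purely local, per-user decision."""
        self.hidden_authors.add(author)

    def visible(self):
        """What this particular user sees; everyone else is unaffected."""
        return [p for p in self.posts if p[0] not in self.hidden_authors]

feed = Feed([("ann", "hello"), ("troll", "spam"), ("bob", "hi")])
feed.hide("troll")
assert feed.visible() == [("ann", "hello"), ("bob", "hi")]
```

The design choice this encodes is that moderation state lives on the reader's side, so no two users need to agree on a policy; the tradeoff, as other commenters note, is that it does nothing about content that is illegal or that users are harmed by before they can hide it.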