
How would you add 'reputation' to what people post?

5 points by gw666 over 5 years ago
SF often talks about reputation systems as part of a society's culture. True, it's a big hairy problem, but how could some person or group make an initial attempt at doing this? It'd be great to have web pages, tweets, and various posts tied somehow to a measure of the author's current trustworthiness (based on previous behavior). Even a high/medium/low/unknown rating would help.

No matter how imperfect the implementation might be, it'd be great to have anything that exposes to the general public the idea that people need to consider the source of *anything* that's posted on the Internet.

What would you try doing?
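As a rough illustration of the coarse rating the question asks about, here is a minimal sketch that buckets an author's prior-behavior record into high/medium/low/unknown. The field names and thresholds are invented for illustration only; a real system would need far richer signals.

```python
from dataclasses import dataclass

@dataclass
class AuthorHistory:
    confirmed_accurate: int    # past posts later verified as accurate (hypothetical signal)
    confirmed_misleading: int  # past posts later flagged or retracted (hypothetical signal)

def trust_rating(history: AuthorHistory) -> str:
    total = history.confirmed_accurate + history.confirmed_misleading
    if total < 5:              # too little history to judge
        return "unknown"
    accuracy = history.confirmed_accurate / total
    if accuracy >= 0.9:
        return "high"
    if accuracy >= 0.6:
        return "medium"
    return "low"

print(trust_rating(AuthorHistory(confirmed_accurate=18, confirmed_misleading=1)))  # high
print(trust_rating(AuthorHistory(confirmed_accurate=2, confirmed_misleading=1)))   # unknown
```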

2 comments

samkater over 5 years ago
Most systems seem to ask others to rate what they consume - up/down, like/dislike, etc. If we're talking about content that competes for eyeballs, like a news feed, you might add the concept of having a person "stake" some of their reputation to move something to the top. If it is universally pilloried, they lose reputation, and vice versa. Bonus points for a system that makes it so my reputation to you doesn't have to be the same as my reputation to anyone else (if you have been "up-voting" my content for a while, my rep with you is higher than with somebody seeing my post for the first time).

Not sure if or how this solves an echo-chamber problem though.

Edit: just also had the thought to help tackle bias problems - the platform itself could produce biased content on either side of an issue from time to time to deduce people's positions based on their votes. Then the reputation algorithm has a chance to adjust for ideas that polarize vs. ones with general agreement and scale rankings accordingly.
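A loose sketch of the staking idea above, assuming a simple ledger: authors lock up reputation to boost a post, forfeit it if the post is pilloried, and each viewer additionally tracks a pair-specific bonus. The class name, starting values, and payout rule are all made up for illustration, not a proposal for exact numbers.

```python
from collections import defaultdict

class ReputationLedger:
    def __init__(self, starting_rep: float = 10.0):
        self.global_rep = defaultdict(lambda: starting_rep)  # author -> reputation
        self.pair_rep = defaultdict(float)                   # (viewer, author) -> bonus

    def stake(self, author: str, amount: float) -> float:
        """Author locks up reputation to boost a post; returns the stake actually taken."""
        amount = min(amount, self.global_rep[author])
        self.global_rep[author] -= amount
        return amount

    def settle(self, author: str, stake: float, up_votes: int, down_votes: int) -> None:
        """After voting closes: refund plus a reward if well received; forfeit if pilloried."""
        if up_votes >= down_votes:
            self.global_rep[author] += stake * 1.5  # refund + reward (illustrative ratio)
        # else: the stake is simply lost

    def record_vote(self, viewer: str, author: str, is_upvote: bool) -> None:
        """Per-viewer reputation: repeated upvotes raise my rep with *you* specifically."""
        self.pair_rep[(viewer, author)] += 1.0 if is_upvote else -1.0

    def rep_as_seen_by(self, viewer: str, author: str) -> float:
        return self.global_rep[author] + max(self.pair_rep[(viewer, author)], 0.0)
```

The pair-specific term is what keeps my reputation to you from being the same as my reputation to a first-time reader; a feed ranker could sort by rep_as_seen_by plus the active stake.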
ben509 over 5 years ago
The biggest problem of any moderation system is a majority downvoting things they don't like into oblivion. You get a majority with a high reputation taking control of your site. We complain about cancel culture, but it's nothing new, because people naturally want to kick out people they don't like. After all, the people they don't like are heartless evil assholes.

I'd add a system to let people identify aspects of the post, and then use reputation to verify that they're a good indicator of that aspect.

Liberals know what is "liberal" and conservatives know what's "conservative" quite reliably (in aggregate), even though those concepts are *very* fuzzy. And the problem most systems see is that politics being politics, those are frequently gamed. (If you've ever listened to C-SPAN's radio program, you know that half the "Republican" callers are Democrats...)

Then you have to determine which keywords to use. I think some basic guidance, like "don't label a thing 'spam' unless it's someone selling crap," would go a long way toward incentivizing a critical mass of users to label things honestly.

And then you let readers decide what they want to read.

My other notion (building on another comment [1]) is that discussion should include a team-based element. I think offering a way for small teams to come together and present ideas is more useful than individual commentary. A team allows less personal investment because you are now motivated by a desire for admiration from known peers.

But I've put away a good number of beers, so I might just be rambling like a drunk idiot.

[1]: https://news.ycombinator.com/item?id=21289078
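One way to read the aspect-labeling idea, sketched very loosely: readers tag posts with aspects ("liberal", "conservative", "spam", ...), each labeler carries a per-aspect reliability score, a label only sticks when reliability-weighted agreement passes a threshold, and reliability is adjusted when labels are later checked. The class, weights, and threshold here are assumptions for illustration.

```python
from collections import defaultdict

class AspectLabels:
    def __init__(self, threshold: float = 3.0):
        self.reliability = defaultdict(lambda: 1.0)  # (user, aspect) -> labeling weight
        self.votes = defaultdict(float)              # (post, aspect) -> weighted support
        self.threshold = threshold

    def label(self, user: str, post: str, aspect: str) -> None:
        """A reader tags a post; their vote counts in proportion to their reliability."""
        self.votes[(post, aspect)] += self.reliability[(user, aspect)]

    def accepted_aspects(self, post: str) -> list[str]:
        """Aspects with enough reliability-weighted agreement; readers then filter on these."""
        return [aspect for (p, aspect), weight in self.votes.items()
                if p == post and weight >= self.threshold]

    def verify(self, user: str, aspect: str, was_good_indicator: bool) -> None:
        """Once a label is later judged, nudge the labeler's reliability for that aspect."""
        key = (user, aspect)
        self.reliability[key] *= 1.1 if was_good_indicator else 0.8
```

Keeping reliability per aspect rather than global is what lets someone be a trusted judge of "spam" without that reputation transferring to politically gameable labels.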