SF often talks about reputation systems as part of a society's culture. True, it's a big, hairy problem, but how could a person or group make an initial attempt at building one? It'd be great to have web pages, tweets, and other posts tied to some measure of the author's current trustworthiness, based on their previous behavior. Even a high/medium/low/unknown rating would help.

No matter how imperfect the implementation might be, it'd be great to have anything that exposes the general public to the idea that people need to consider the source of *anything* posted on the Internet.

What would you try doing?
Most systems seem to ask people to rate what they consume: up/down, like/dislike, etc. For content that competes for eyeballs, like a news feed, you might add the concept of having a person "stake" some of their reputation to move something to the top. If the post is universally pilloried, they lose reputation; if it's well received, they gain it. Bonus points for a system where my reputation with you doesn't have to be the same as my reputation with anyone else (if you've been upvoting my content for a while, my rep with you is higher than with somebody seeing my posts for the first time). There's a rough sketch of what I mean below.

Not sure if or how this solves the echo-chamber problem, though.

Edit: just had another thought for tackling bias problems: the platform itself could occasionally publish content biased toward either side of an issue and infer people's positions from their votes. The reputation algorithm then has a chance to distinguish ideas that polarize from ones with general agreement and scale rankings accordingly.
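A rough Python sketch of the staking plus per-viewer reputation idea. All of the names here (ReputationLedger, stake, settle) are invented, and the payoff formula is just one arbitrary choice, not a worked-out design:

    # Sketch: reputation "staking" plus per-viewer (pairwise) reputation.
    from collections import defaultdict

    class ReputationLedger:
        def __init__(self, starting_rep=10.0):
            self.global_rep = defaultdict(lambda: starting_rep)  # author -> reputation
            self.pairwise_rep = defaultdict(float)               # (viewer, author) -> drift
            self.stakes = {}                                      # post_id -> (author, amount)

        def stake(self, author, post_id, amount):
            """Author risks some reputation to boost a post's ranking."""
            amount = min(amount, self.global_rep[author])
            self.global_rep[author] -= amount
            self.stakes[post_id] = (author, amount)
            return amount  # a ranking layer could use this as the boost

        def settle(self, post_id, upvotes, downvotes):
            """Return more than the stake if the post was well received,
            burn part of it if it was pilloried."""
            author, amount = self.stakes.pop(post_id)
            total = upvotes + downvotes
            if total == 0:
                self.global_rep[author] += amount  # no verdict, refund
                return
            approval = upvotes / total
            self.global_rep[author] += amount * (2 * approval)  # gain if liked, loss if not

        def record_vote(self, viewer, author, liked):
            """My rep with *you* drifts with how you've voted on my stuff."""
            self.pairwise_rep[(viewer, author)] += 1.0 if liked else -1.0

        def rep_as_seen_by(self, viewer, author):
            return self.global_rep[author] + self.pairwise_rep[(viewer, author)]

The interesting knob is the settle formula: how harshly a pilloried post should burn the stake, and whether the pairwise reputation should feed back into global ranking at all or only into what each individual viewer sees.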
The biggest problem with any moderation system is a majority downvoting things they don't like into oblivion: you end up with a high-reputation majority taking control of your site. We complain about cancel culture, but it's nothing new, because people naturally want to kick out people they don't like. After all, the people they don't like are heartless, evil assholes.

I'd add a system that lets people label aspects of a post, and then use reputation to track whether a given labeler is a good indicator of that aspect (sketch after this comment).

Liberals know what is "liberal" and conservatives know what's "conservative" quite reliably (in aggregate), even though those concepts are *very* fuzzy. The problem most systems run into is that, politics being politics, those labels are frequently gamed. (If you've ever listened to C-SPAN's radio program, you know that half the "Republican" callers are Democrats...)

Then you have to determine which labels to offer. I think some basic guidance, like "don't label a thing 'spam' unless it's someone selling crap," would go a long way toward getting a critical mass of users to label things honestly.

And then you let readers decide what they want to read.

My other notion (building on another comment [1]) is that discussion should include a team-based element. I think offering a way for small teams to come together and present ideas is more useful than individual commentary. A team allows less personal investment, because you're now motivated by a desire for admiration from known peers.

But I've put away a good number of beers, so I might just be rambling like a drunk idiot.

[1]: https://news.ycombinator.com/item?id=21289078
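To make the aspect-labeling idea concrete, here's a toy Python sketch. AspectTracker and the 1.1/0.9 nudge factors are made up, and a real system would need something sturdier than a weighted majority as its notion of consensus:

    # Sketch: aspect labels with per-aspect labeler reliability.
    from collections import defaultdict

    class AspectTracker:
        def __init__(self):
            self.labels = defaultdict(list)              # (post_id, aspect) -> [(user, applies)]
            self.reliability = defaultdict(lambda: 1.0)  # (user, aspect) -> weight

        def label(self, user, post_id, aspect, applies):
            """A user asserts that an aspect does (or doesn't) apply to a post."""
            self.labels[(post_id, aspect)].append((user, applies))

        def score(self, post_id, aspect):
            """Reliability-weighted vote on whether the aspect applies."""
            votes = self.labels[(post_id, aspect)]
            if not votes:
                return None
            weight_yes = sum(self.reliability[(u, aspect)] for u, v in votes if v)
            weight_all = sum(self.reliability[(u, aspect)] for u, _ in votes)
            return weight_yes / weight_all

        def settle(self, post_id, aspect):
            """Once a consensus emerges, nudge each labeler's reliability for
            this one aspect up or down based on whether they agreed with it."""
            consensus = self.score(post_id, aspect)
            if consensus is None:
                return
            majority = consensus >= 0.5
            for user, applies in self.labels[(post_id, aspect)]:
                self.reliability[(user, aspect)] *= 1.1 if applies == majority else 0.9

Readers could then filter or sort by the aspects they care about, and gaming the labels means consistently disagreeing with consensus, which quietly erodes your weight for that one aspect without touching your overall reputation.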