
科技回声

A tech news platform built with Next.js, providing global tech news and discussion.

© 2025 科技回声. All rights reserved.

Should Facebook, Google be liable for user posts? asks U.S. Attorney General

133 points | by jhatax | about 5 years ago

44 comments

danShumway about 5 years ago
There are 3 options for moderation:

1. Platforms with no moderation (8Chan -- except probably even worse, because even 8Chan moderates some content)

2. Publishers that pre-vet all posted content (the NYT with no comment section)

3. Platforms that retroactively moderate content only after it's been posted, in whatever way they see fit (Twitter, Facebook, Twitch, Youtube, Reddit, Hackernews, and every public forum, IRC channel, and bug tracker ever built)

Revoking section 230 just gets rid of option 3. It's not magic, it just means that we have one less moderation strategy. And option 3 is my favorite.

Option 2 takes voices away from the powerless and would be a major step backwards for freedom of expression. It would entrench powerful, traditional media companies and allow them greater control over public narratives and public conversations. Option 1 effectively forces anyone who doesn't want to live on 8Chan off of the Internet. Moderation is a requirement for any online community to remain stable and healthy.

Even taking the premise that Twitter is an existential threat to democracy (which I am at least mildly skeptical of), it's still mind-boggling to me that people are debating how to regulate giant Internet companies instead of implementing the sensible fix, which is just to break those companies up and increase competition. All of the "they control the media and shape public opinion" arguments people are making about Facebook/Twitter boil down to the fact that ~5 companies have become so large that getting kicked off of their services can be at least somewhat reasonably argued to have an effect on speech. None of this would be a problem if the companies weren't big enough to control so much of the discourse.

So we could get rid of section 230 and implement a complicated solution that will have negative knock-on effects and unintended consequences for the entire Internet. Or, we could enforce and expand the antitrust laws that are already on the books and break up 5 companies, with almost no risk to the rest of the Internet.

What problem does revoking section 230 solve that antitrust law doesn't?
protomyth about 5 years ago
It really seems like this article is a bit off on the reasoning it ascribes to people. The biggest objection I have heard is that Facebook / YouTube / Twitter should now be classed as "publishers" and not "providers" because of the perceived bias in their removal of individuals and content.
cassalian about 5 years ago
Can anyone explain why William Barr seems so intent on trying to change technology in such major ways as this? Does he not understand the implications of the things he proposes? Or worse, does he understand the implications and propose them nonetheless? I just don't get this guy's motivation.

As far as I can tell, revoking section 230 would just result in people putting up fake content themselves and then suing the platform they posted to. Is there a reason why this wouldn't be possible?

Also, I see a lot of people focusing on major platforms, but why wouldn't such changes also impact tiny sites? In particular, it seems that anyone casually hosting their own site (not something they focus on often) will be forced to remove all user-generated content or quit their day jobs to manage their site - am I misinterpreting the implications here?
throw7 about 5 years ago
I have no idea what exact problem Barr is trying to solve. And if I run a web forum/mailing list/etc., am I now liable for anything users say on these services?

It seems like he's unhappy that Facebook/Google/et al. are shaping (or trying to shape) a narrative... I mean, he's not wrong. But everyone is: businesses, politicians, the CIA, Hacker News.

Opening people up to easier liability for running a web forum just means fewer will be able to provide this type of service; not to mention, this favors those with lots of money and time to spend on a lawsuit of such nature, e.g. the government and large businesses... hmmm, maybe that's the point: only the government and monied interests should shape the narrative.
ogre_codes about 5 years ago
I don't think it's a good precedent to make companies liable for the posts of users. I do think it's reasonable to examine the ways Google and Facebook profit off of extremist views and surface extremist views algorithmically.

As soon as Google and Facebook moved to having an opinionated queue of content (YouTube's suggested videos and Facebook's timeline) based on things like engagement, I could see the argument that they have both ceased being mere conduits of information and become publishers themselves.
bearcobra about 5 years ago
The question I have for people who advocate for "platform liability" is: at what size should platforms become liable for user-generated content? Facebook & Google definitely seem big enough to most people, and maybe Twitter & Reddit. But what about Y Combinator?
gonational about 5 years ago
I think a couple of big concepts are being conflated. There are two really important questions:

1. Should Google, Facebook, etc. be responsible for user-generated content hosted on their websites (i.e., should they *not* be treated as a "public square")?

2. Should the government have any hand in telling any company or any person what they can or cannot say, as long as they are not making threats or publishing illegal materials?

I am of the personal opinion that most (all?) of the major tech companies have engaged in censorship and even politically driven enforcement of their content policies, and therefore should have lost their "public square" status a long time ago, making them responsible for illegal content posted by their users.

As for the second question, there simply is no question; the government does not and should never have any authority here, because the Constitution protects free speech, regardless of what kind of Ministry of Truth they would like to implement.
eyeinthepyramid about 5 years ago
Will every post need to be pre-moderated to ensure that nothing objectionable is published? I wonder how this would affect sites like Hacker News and Reddit, or any forum sites really.
kryogen1c about 5 years ago
> escape punishment for harboring misinformation and extremist content

It's so bananas that statements like this are glossed over and unqualified. This is not a solved problem, and it's not even being treated like it's a problem at all.

This perception that there is a set of correct facts and incorrect facts is just so meaningless. What does it even mean to be true? $Person is on $Video saying $Statement. True or false? Well, it depends. It ALWAYS depends. Are you asking if $Video.Words = $Statement.Words? Almost never. You are not investigating $Video.Soundwaves and $Person.VocalCords; you are making a case for $Person.Beliefs. What if $Person.Beliefs @ $Video.TimeStamp != $Person.Beliefs @ $Today? Is it true but meaningless, or are you trying to imply conclusions contextually - but guess what, different people interpret the same context differently!

An example suitable for HN is talking about security. Is your company secure? You can't answer the question because _the question is bad_. The answer to security is ALWAYS "it depends". Are you talking about physically secure against a wandering drunk trying to pee on your server, or physically secure against a disgruntled employee building a killdozer and driving through your building? Are you talking about secure from some kid who finds LOIC and tries to DoS you, or from a long-term campaign by a nation-state APT? The discussion _requires_ framing, and so does discussing "misinformation and extremist content".
kilo_bravo_3 about 5 years ago
My favorite thing about the "publisher" vs. "platform" rabbit hole that people of a certain political persuasion seem to be burrowing through (as a not-so-veiled threat towards service providers that "censor" posts consisting of pictures of Michelle Obama photoshopped to look like a gorilla) is the delusional alternate-reality plane of existence on which they seem to reside, where they think that going through with the threat will make their preferred content more likely to be hosted.
kelnos about 5 years ago
I think we need to stop trying to fit these things into old laws that weren't written with them in mind.

Twitter isn't a telephone company *or* a newspaper. I think for the most part they should have the liability protection that a telephone company has. But *users* do want moderation. They often want to restrict what they see to posts by people in their own echo chamber. They want the ability to flag things as spam or abuse. They want to be able to block people. They want posts taken down if enough people complain about them. And to extend that further, they often won't mind if there's a system in place to automatically do the above without their prior action.

The problem ends up being bias, even if it's just perceived and not real. If a certain group thinks "the algorithms" are suppressing their speech, then the algorithms are either bad, or aren't transparent enough to prove that they're unbiased.

At the end of the day, people believe that these companies have an agenda that they push by shaping discussion in certain ways. Whether true or not, the best way to combat that is complete transparency, or just no filtering or reordering at all.
tasty_freeze about 5 years ago
It is easy to ascribe bad motives to a person of a different party affiliation and to assume this is just selective application of the law to advance political goals.

However, there is another way to look at this, apart from my own political leanings. As little effort as Democrats put into antitrust prosecutions, Republicans (of the past 30 years) have been anti-antitrust. In the late 90s, when the DOJ had Microsoft on the rack, nominee Bush said he'd stop the antitrust effort. In fact, even though MS had been found in violation of antitrust laws, then-President Bush stopped the effort to break up MS, and instead they were told to make relatively minor changes in their behavior.

https://slashdot.org/story/01/09/06/157258/Bush-Administration-Stops-Microsoft-Breakup

So is it that the current administration finally believes there is a place for antitrust, or is it using the law as a political tool?
acd about 5 years ago
In Sweden, platforms are probably liable for moderating user content under a law called BBS-lagen, the bulletin board system law. Yes, the law is a bit old, but it regulates content published by users and gives the host of the content liability for the data published on the platform.
sneak about 5 years ago
Reading the following:

> *"No longer are tech companies the underdog upstarts. They have become titans," Barr said at a public meeting held by the Justice Department to examine the future of Section 230 of the Communications Decency Act.*

> *"Given this changing technological landscape, valid questions have been raised about whether Section 230's broad immunity is necessary at least in its current form," he said.*

...all I can think of is "well, here comes the state-sponsored moat."

If they weaken these protections, the big four will just hire a few more entire buildings of minimum-wage content moderators (like most of them already have running), and it's curtains for small entrants.

It makes me really sad to see the US thinking about shooting its only real growth industry in the foot.

Edit:

> *while a few Democratic leaders have said the law allows the services to escape punishment for harboring misinformation and extremist content.*

It's also terrifying to think that parts of our government want to explicitly punish people for hosting legal content that they don't like to read.
WaitWaitWha about 5 years ago
When we used to run BBSes, we were repeatedly warned by lawyers and courts that if we started actively managing the content of other people's posts, we would become publishers and our legal protection would vanish. Why is this not the case for large social media orgs that do exactly that?
nnq about 5 years ago
Look... *if platforms become responsible for content published on them, it is the end of free speech. Period. You want THIS?!*

The point would be to limit/regulate targeting: either (a) they're a no-login and no-user-personalization place, doing no targeting, and everyone gets a random sample from the same content (I'd *really prefer this!*), or (b) it needs to be very clear what kind of targeting is allowed... and the line gets very blurry here; amplifying hate speech for clicks and eyeballs can probably pay well, and there need to be ways to solve this problem...
OrgNet about 5 years ago
Of course they should, because they moderate content. (You can't have it both ways... you either moderate or you don't... but if you do moderate, you are responsible for what you let through.)
aSplash0fDerp about 5 years ago
> with any alterations to one of the internet's key legal frameworks likely to draw unexpected consequences. "It's hard to know exactly what the ramifications might be."

Since there is no direct bridge to the digital money, power, and influence, analog types will wreck the whole thing trying to implement legislation to give them any kind of foothold on all of that easy profit.

The lack of influence/sway will eventually drive the traditional powers to contrive the shortest-term solutions to destabilize the ecosystem. It's more than a "war of words" at play.
baby about 5 years ago
BTW this would probably include reddit and HN as well.
AnimalMuppet about 5 years ago
Well, just to put the shoe on the other foot:

Should US AG Barr be liable for (or bound by) comments/tweets by President Trump?

This isn't as tight a parallel as I would like. But when I make a post on HN, say, it's *my* words and *my* opinion, and it does not represent the opinion of HN (even though they moderate). I don't speak for HN; they don't speak for me.

In the same way, when Trump sends his tweets-of-the-day, that doesn't speak for AG Barr or the DOJ (despite Trump's idea that he is the chief law-enforcement official).

As I said, that isn't quite as tight as I would like it to be. But it's something that Barr should be able to understand at both an intellectual and an emotional level.
dfischer about 5 years ago
The solution we need is a p2p social network with user opt-in moderation lists. The government should be nowhere close to this.
013a about 5 years ago
I tend to believe that the only path forward is for these "global platforms" to become more sharded, allowing smaller, more focused communities to thrive and self-moderate.

Platforms like Reddit, Discord, etc. have "tiers" of moderation, whereby community leaders handle the day-to-day moderation of individual content posted within the community, yet there is still Big Company Inc. at the top, capable of moderating entire communities (you can't create a subreddit focused on school shootings, stuff like that). These platforms have problems; there are problems intrinsic to any situation where Speech and Social Interaction is involved. But their problems are far less in both magnitude and quantity than those of the global platforms.

It seems to me that holding any organization or moderator liable for what people post on their platform would make for a Supreme Court-level case concerning the first amendment. Who would win, I don't know; I'm not a lawyer, but that feels like the ground we're treading on.
m463 about 5 years ago
There's another facet to this story.

If companies are to moderate, they must have the ability to view the content.

Say there's a requirement to moderate an encrypted chat client.

See where this is going? Even light moderation means they keep the data collection going.
KorematsuFred about 5 years ago
Tech, aviation, and agriculture are some of the areas where Americans are the world leaders by far, and yet the American government is totally set on hurting these very industries ("we'll break the evil Google" and so on).

This is beyond idiotic.
tboyd47 about 5 years ago
The Section 230 saga just shows how dangerous it is for the government to interfere in industry.

The CDA was passed when people were scared of the internet and looked to government to protect them from its evils. Section 230 was added to save "the little guy" from becoming collateral damage of this legislation.

Fast forward 30 years, and these "little guys" have grown into the scary forces that everyone wants the government to protect them from!

Imagine if ordinary people had been allowed to sue Google and Facebook over this time. There's good reason to think that no one would have been able to monetize the internet the way Google, Facebook, etc. have if not for Section 230.

I don't think anyone in Congress is interested in repealing Section 230, but I'm glad people in Washington are at least talking about it.
vinniejames about 5 years ago
No. Full stop.
acd about 5 years ago
In Sweden they most probably are, under a law called BBS-lagen, the bulletin board system law, under which the provider of content is to some extent liable for the content.
carapace about 5 years ago
Should Facebook, Google shield users from the legal consequences of posting illegal posts?

If we were using e.g. Ted Nelson's Xanadu (instead of the WWW), every post and link would have provenance information, and it would be *technologically feasible* to make the original source of a given piece of illegal content liable for the legal consequences of publishing it, as well as each and every person/entity that promulgated it across the network.

As it is now, these platforms omit or delete provenance information, making it technically impossible to moderate *at scale*.
jeffdavis about 5 years ago
The protections designed for phone companies, etc., make perfect sense: the phone company is just facilitating communication in a content-neutral way. Phone companies should not be responsible for knowing or caring what content is shared, even if it's some kind of slander or treasonous plot being discussed.

But does that apply to web platforms that aren't content-neutral? I think probably not. There is such a huge volume of communication that they should have some protections built in, but not blanket protection.
评论 #22376865 未加载
admiral33 about 5 years ago
Should International Paper be liable if an extremist writes down their ideas?

Should the US Postal Service be liable if they mail it to their friends?

The US Postal Service uses dogs to find drugs in the mail, and yet we don't charge the Postmaster General with drug smuggling.

Any attempt to get rid of undesirable content should not then make you liable for the content you miss. The platform vs. publisher debate is silly.
rayvd about 5 years ago
Obviously, whoever has money that lawyers can go after should be liable...
shmerl about 5 years ago
It's the same Barr who wages war on encryption.
mikedilger about 5 years ago
Conservatives want platforms to moderate in a politically neutral way. Passing a law requiring such would be unconstitutional, as it would violate the free speech of those companies. Making section 230 conditional upon political neutrality might not be unconstitutional. No Internet platform would ever risk operating without section 230 protections, so they would essentially be forced into political neutrality. So the same effect would be achieved.

Nobody is seriously considering simply removing section 230; that would be devastating to the economy and to free speech both. Any such assertions are no more than saber rattling and idle threats.

Neither is anybody seriously talking about ceasing all moderation entirely. Platforms would become flooded with spam, among other things making them virtually unusable.

Where this all gets very complicated very fast, IMHO, is in how you define political neutrality. And I'll stop here, because that's much too long of a discussion to have in an HN comment.
lanternslight about 5 years ago
If they are censoring, then yes.
notamanager about 5 years ago
This is such a disingenuous framing from the AG, as well as from media outlets who keep misrepresenting section 230.

That law isn't about protecting Facebook or Google; it's about ensuring that anyone can express themselves online without needing a highly paid lawyer and a protracted trial to do so.

It also isn't about publisher vs. platform; section 230 protects the Times from being sued for comments on its website, same as for any bigger or smaller operation.

It's tragic how the powers that be in this country are trying to insert a lawyer into every transaction like it's a jobs program, and the infuriating part is that they are trying to convince people that it's for their own benefit.
fragsworth about 5 years ago
I don't know how Barr expects to have a civil discussion about any topic in the midst of what he did with the Roger Stone prosecution, and in the midst of this presidency.

The public and his own Justice Department cannot have a reasonable discussion with him when his behavior and actions up to this point have almost all appeared to be for one purpose - to help the President and his supporters in criminal issues.

The question we all find ourselves asking is: "So how is this going to benefit the President at everyone else's expense?" And even if it doesn't benefit him, it colors the entire discussion in a bad light.
cs702 about 5 years ago
Sacha Baron Cohen proposed this in his widely seen/read keynote speech at the ADL's annual summit:

https://www.adl.org/news/article/sacha-baron-cohens-keynote-address-at-adls-2019-never-is-now-summit-on-anti-semitism

If you haven't seen it before, I would highly recommend it - regardless of whether you agree with him or not.
m0zg about 5 years ago
The sentiment expressed by many in this thread would flip 180 degrees if Zuck, e.g., one day woke up and decided he doesn't like commies (which would be a very reasonable, and amply justified, opinion in my view), and had his underlings at Facebook censor the entirety of Bernie Sanders' presidential campaign from the network.

My position on the issue is simple: if a site owner censors/throttles/shadowbans/detrends/etc. *any* legal speech, they're a publisher, and they should be liable for the stuff that remains on their site. Don't want that? Be a carrier and don't censor legal speech. Nothing could be easier.
cletus about 5 years ago
The Trump administration complaining about "harboring misinformation". The ironing [sic] is delicious [1].

There is no universal objective truth. Specifically, there are things that reasonable people can disagree about, and the same set of facts can be used to argue different positions. This fact is abused by the mentally challenged to argue ridiculous positions (e.g. anti-vaxxers, the Moon landings are fake, that sort of thing).

Likewise, as seen here, one side will argue that those who disagree are engaging in misinformation (and in the Trump administration's case, from the President down, there are multiple claims per day that are demonstrably false, such that no one can really keep up). The agenda is to silence the opposition and undermine confidence in any sort of news.

ISPs were given safe harbor from liability for traffic on their networks, for good reason. They just need to comply with certain standards. Tech companies really are no different, and to argue otherwise would set an incredibly dangerous precedent (IMHO).

[1]: https://www.youtube.com/watch?v=7p23mA2VV0A
RustyBucket about 5 years ago
If porn sites are liable for their content, FB should be too. Porn sites managed to survive and thrive, and so will FB.
goatinaboat about 5 years ago
Yes, absolutely. They exercise editorial control even if they attempt to disguise it behind "algorithms". Everything posted on Facebook should be treated as if it were a newspaper article, for all legal purposes.
psychlops about 5 years ago
Since Facebook and Google shape the information that is seen using proprietary algorithms, they have become publishers. Perhaps if their algorithms were open and available, they might have an argument in their defense.

Until then, it is entirely possible they are shaping a narrative based on whatever model they want.

I don't buy the argument made by Barr that the scale of the platform reaches a point where it therefore requires regulation. This seems to be a simple money grab, where large tech companies need to tithe to lawmakers.
fareesh about 5 years ago
If these companies are treated as a "public square", then the first amendment ought to apply. It's disappointing to see enlightened ideas like free speech being taken apart by these large corporations to push what seems to be a political agenda.

Recent example: a female NASCAR driver shares a selfie with the President of the United States, and Twitter's algorithms flag it as sensitive content. When algorithms make mistakes that lead to race-based discrimination, it's treated extremely seriously. When this sort of thing happens, it seems like everyone shakes their head and chuckles, "oh those silly algorithms". Outcomes that marginalize folks based on political views are dangerous for your country. The shoe will be on the other foot someday.
drannex about 5 years ago
Companies are liable for their employees.

Employees produce content/products/sales/projects for the company.

Social media users create the content that gives value to social networks; thus social media users are, in a way, employees of the company. The company then has the responsibility of limiting, and being liable for, the content that exists on its platform.