If Facebook decides which political ads are truthful, then either they reject <i>all</i> political ads (because when was the last time someone ran a political ad that was wholly truthful?), or else they have to decide where to draw the line. Wherever they draw it, they're going to reject someone's ad and face screams about how they're biased for the other side. That's not going to end well for them.<p>I mean, just running them all is <i>also</i> getting Zuckerberg raked over the coals[1], but I think picking and choosing which ads are "truthful enough" is going to be worse.<p>And that's if Facebook is actually completely unbiased. If this becomes a vehicle for the biases of those at FB charged with judging the ads, that's even worse.<p>[1] Zuck was getting raked over the coals in the name of truth, but I suspect it was at least partly because those who did so thought they would benefit politically if FB censored their opponents' ads.
One thing to note: this only applies to politicians. It seems normal people and political groups can't post false political ads. <a href="https://www.washingtonpost.com/news/powerpost/paloma/the-technology-202/2019/10/28/the-technology-202-facebook-takes-down-false-ad-from-political-group-but-it-still-won-t-police-politicians-directly/5db5bf61602ff10cf14f97e5/" rel="nofollow">https://www.washingtonpost.com/news/powerpost/paloma/the-tec...</a>
250 signatures on a letter at a company with 35,000 employees is an eruption of dissent?<p>I'm reminded of a quote about urban warfare, something like "In a city of 10 million, if 1% of the population opposes you, you have 100,000 adversaries." That seems to apply here.<p>0.7% of the company is writing to complain? Okay - how many would you expect to complain? How many would complain if the policy went in the opposite direction?
I don't want a privately owned <i>platform</i> that, as of today, plays a huge role as a venue for public discourse to regulate speech.<p>Of course (and it pains me to have to state the obvious for fear of strawmanning) save the applicable boundaries like distribution of child porn, terrorist groups' recruiting pages and so on.<p>Take Twitter as an example: it does an abhorrent job on that front; they have no standard other than liked/disliked people.<p>- They have no transparency: people you follow will get banned and you won't be notified.
- People will get banned for tweeting exactly what "liked" people tweeted.
- They made the utterly stupid decision of changing the interpretation of their Verified mark from "This is the real person", which is perfect, to "we kind of support whatever this verified person says", which makes no freaking sense. Can you imagine a Bell PR guy saying at a press conference "We are sorry for what one of our landline customers said on the phone, we turned their line off"?<p>It is clear to me that the push to make platforms like Facebook, Twitter and so on take on the role of speech regulators isn't coming from regular people; it works to the detriment of common folk like you and me.
It would give Facebook way too much power if they could decide what is true and what is not. Better to allow lies than to block free speech.<p>I wonder if this open letter by employees advocating for more control over content combined with Mark Zuckerberg's 'hands-off stance on political ads' are just a coordinated act of 'good cop, bad cop' designed to manipulate the public. Also, my cynical side thinks that maybe some of these government authorities are in on this charade.<p>It seems like a show to make people think that the good employees of Facebook are on the public's side. Whatever the big mean Zuckerberg wants must be bad for everyone.<p>Facebook must have a PR team the size of a small country working for them by now. Of course everything they do is orchestrated. We have to be really cynical to see through the BS.<p>The government is completely under the thumb of these big corporations. Many of the regulations that are coming out of Washington are carefully crafted by corporate lobbyists to superficially look like they're bad for corporations and good for the public, but in reality they're intended to give corporations more power and to create a moat around their monopolies. The government and corporations are on the same team; their common objective is to fool the public into slowly accepting the erosion of their most basic rights so that corporations can have more money and governments can have more power for themselves.
I have a lot of issues with the framing of this article (it's hard to imagine <i>any</i> major strategic decision of Facebook that you couldn't find 250 employees to sign their name opposing it, and nearly everything that happens at Facebook has a corresponding, public, Workplace post).<p>Moving past that, the ideas mentioned in the final paragraphs did have one interesting suggestion: change the visual display for political ads. Zuck has consistently made the good point that it's very difficult to set a clear boundary for what constitutes a political issue, but it is not difficult to determine whether or not an ad is being run by or in service of <i>a given politician</i>. Changing the visual display (even something as draconian as a persistent disclaimer stating that this is an advertisement with claims made by a politician and that everyone should do their own research) would at least remind people of the policy.
Why should Facebook now be responsible for fact-checking political ads? Watching the testimony where a bunch of politicians basically berated Mark Zuckerberg about how it’s now Facebook’s job to police politicians and keep them honest (because, you know, ALL politicians lie and cannot be trusted) is very telling about the state of our government and democracy. Our leaders cannot police themselves or their peers, so they are looking to an outside entity to do it, and moreover casting blame for their own failures.<p>Political attack ads have always been on cable TV, spouting bald-faced lies and half-truths for as long as I can remember. It now seems that politicians have found a new medium. And they want that service to bear the brunt of their operational status quo. Why not address the real problem in politics that leads to the symptoms of the disease at hand instead of shifting the work and burden of honesty to someone like Facebook? Has it even been proven they are equipped and capable of the task?
> For the past two weeks, the text of the letter has been publicly visible on Facebook Workplace, a software program that the Silicon Valley company uses to communicate internally.<p>For those who have never used Workplace, this literally just means "someone posted it to Workplace." It's not an abnormal or unique thing. It also wouldn't surprise me if "250 people signed it" means "250 people commented in agreement". I wish the reporting gave more details on <i>who posted</i> the petitions and what it means to "sign the petition". I understand protecting sources, but unless Workplace has added new features, anything posted <i>has</i> to come from someone with a profile.<p>That said, it's still (arguably, at least) news to cover internal divisions over a policy, but unfortunately the authors don't seem to realize how common it is at Facebook for employees to openly push back on leadership decisions while concurrently working as hard as they can to deliver impact downstream of them (it may sound odd, but it's entirely possible to disagree with a strategy and <i>vocally advocate for your preferred course</i> but also trust that your leadership may be better equipped to set said strategy and work to implement a strategy that is not what you would have chosen).
Facebook has no business saying which ads are wrong/lies. If they do this it opens up such a can of worms. Their stance is the only logical stance. I imagine if AOC's ads were blocked on Facebook she would suddenly want answers and claim she was censored.
This has come up in my conversations with friends who work at Facebook, and they always seem to use some internal talking points about creating a "Ministry of Truth" type of situation. They argue that Facebook cannot (or should not) be the arbiter of truth. My answer to them is very simple: if you want to be a (social) media company, then you have to take some (social) responsibility and not amplify falsehoods in an already charged environment. Corporate profits at the cost of ruining society by spreading falsehoods should not be an acceptable norm.
I don't understand why Zuckerberg doesn't just cut his losses and remove political ads. It does not seem worth it financially or non-financially.
Honestly, I’ll believe that Facebook employees are sincerely concerned when I see them walking out or quitting in large numbers. “Open letters” will do zilch in a company known for lies, dishonesty and deception. Only if the earnings take a big hit will Mark Zuckerberg or Sheryl Sandberg do anything.<p>Edit: Where were these employees when fake news and misinformation resulted in the killing of thousands of people in other countries?
He should not have allowed political ads in the US election. I don't remember what his excuse was for allowing them, but it sounded like a bad decision. There's just no winning in that game. He allows himself to be used as a scapegoat.<p>Of course then they'll go on and ask for Facebook to censor all <i>user posts</i>, but that will probably hit free speech protections.
I can't blame Zuck for working so hard and trying to execute the balancing act to get that political ad money, because it's targeting what is now Facebook's core demographic.<p>Young people aren't using Facebook anymore. This doesn't mean young people don't have an account, but I suspect no one under 35-40 is really engaging with the platform meaningfully. Facebook is the new TV and is going to go out like TV - in a slow, overly drawn out whimper chock-full of pharmaceutical, lawyer, and mesothelioma ads aimed at the aging demographic.<p>Facebook has a stranglehold over older people, but younger people are not falling into the trap. Facebook's ability to give Zuckerberg power is going to fade over time.
Silicon Valley's propensity to introduce externalities into the world yet never want to deal with the negative ones because "you guys have no idea how hard this is" will never cease to amaze me. But hey, I guess this is why that book is named "Chaos Monkeys".<p>You know, if it's too hard to run a political ads business that doesn't enable mass-scale targeted disinformation and wreak havoc on democracies, then maybe the responsible thing to say isn't "sorry our platform has enabled 2 major election fuck-ups in the Western world in 2016, but it's not our role to be an arbiter of truth so we'll do nothing" but rather: "ok, we haven't yet found a way to operate this that's not harmful to society, so we've decided not to run political ads until we do"?<p>Because at the end of the day, if you don't take this into your own hands and instead make it look like a choice between preserving a 15-year-old private company's bottom line and keeping centuries-old democracies functioning, that's going to be a <i>really</i> easy one to make for lawmakers around the world.<p>The hands-off stance is a recipe for being regulated into oblivion eventually, which isn't good for shareholders either.
The problem to me can be summarized pretty simply: unfortunately, the USA doesn't have any law on the books requiring political advertisements to be truthful (contrary to normal advertising, where truthfulness is enforced aggressively).<p>Considering how effective Facebook is at targeting individuals, you can do a lot of damage spreading lies on the platform. The question is moral: even if there's no law forbidding Facebook from spreading lies, should the company hold itself to a higher standard?<p>IMHO Facebook should do that, because it risks creating a lifelong enemy in the political side that's likely to win the next elections and, as the Romans would say, Vae Victis.<p><a href="https://www.factcheck.org/2004/06/false-ads-there-oughta-be-a-law-or-maybe-not/" rel="nofollow">https://www.factcheck.org/2004/06/false-ads-there-oughta-be-...</a>
Facts and truth are two different things. A set of facts can be chosen to say something untruthful.<p>And there can be different 'truths' depending on the values people bring to the analysis of facts.<p>Having Facebook, or their designates, arbitrate 'truth' will only create a privatized ministry of truth.
Title:<p>> Dissent Erupts at Facebook Over Hands-Off Stance on Political Ads<p>From the article:<p>> More than 250 employees have signed the message<p>Facebook has >35,000 employees. 250 signees is roughly 0.7% of employees. Hardly seems like an "eruption" of dissent.<p>The article does acknowledge this:<p>> While the number of signatures on the letter was a fraction of Facebook’s 35,000-plus work force...<p>So why use such a misleading title? "A tiny fraction of company employees does not like company policies" is a statement you can make about every sizable company.
Maybe it is just me, as I didn't see it in the comments. But why on earth should Facebook have to run political ads at all?<p>This should be regulated. Provide the same exposure to all the candidates. No targeted ads (how did targeted + political ever seem like a good idea?). Only link to their program if there's a need at all.<p>But I bet there's plenty of people in queue for ads on FB's platform, so I don't think that not running political ads would hurt them much.
I don’t know why Zuckerberg has so colossally failed to convince the world that Facebook, Inc. should not be an arbiter of what is true and what is false.
The Correct Answer is to restore the Fairness Doctrine, updated to include cable, social, etc.<p>Media companies rejoiced when Reagan sabotaged political discourse. Political ads are huge money and are almost pure profit.<p>Why would Facebook, Twitter, etc. behave any differently?<p><a href="https://wikipedia.org/wiki/FCC_fairness_doctrine" rel="nofollow">https://wikipedia.org/wiki/FCC_fairness_doctrine</a>
The responsibility of handling and interpreting misinformation needs to be shifted to the consumer. People will lie to you almost every day, and you must figure out how to deal with it.<p>The validity of information should be vetted by those consuming it, not an entity who is in any kind of power. If enough people think someone is lying or untruthful, with enough evidence, then the content should be flagged, labeled, or potentially taken down, because every consumer had the opportunity to contribute their perspective leading up to handling said content.<p>We need to move away from the idea that certain authorities in our lives (governments, companies, organizations, or any entity with significant power) can determine what's true or not, because it's highly likely to be biased in either direction.<p>It's incredibly easy for a collective body to double cross their word—to say one thing and intend another at the expense of those who aren't in power.<p>The problem is, when an organization makes the decision to censor content, it is usually a very small few who make that biased decision on behalf of the—seemingly big—company. Effectively, it is a small team, or even one or two people, unless it's done by a dedicated team of moderators driven by policies, procedures—or worse: bribery—that may or may not be something those individuals believe in.<p>When it's left to the people interacting with that content, it's their choice in how to deal with it individually or collectively. That is maximum freedom. To enforce censorship, as a government or organization, is to assume that consumers are idiots, and that's not an assumption they should be making.
Why are the only two options to let ads through or reject them? How about, fact check them and visibly mark them as being potentially false and a link to more details. This should make both sides happy: Zuck who believes the public should decide for themselves, and the rest.
Maybe it would be preferable to provide an immutable log of political ads that have been run, who ran them and with <i>all</i> targeting information.<p>This would be open and transparent and allow politicians to police the turf instead of facebook.
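A minimal sketch of what such an append-only, hash-chained ad log could look like. This is purely illustrative (the `AdLog` class, field names, and targeting format are all assumptions, not anything Facebook actually exposes); each entry commits to the previous entry's hash, so after-the-fact edits to any record are detectable by anyone re-verifying the chain:

```python
import hashlib
import json
import time


class AdLog:
    """Append-only, hash-chained log of political ads (illustrative sketch).

    Each entry records the advertiser, the ad content, and the full
    targeting criteria, plus the hash of the previous entry. Tampering
    with any past entry breaks every subsequent link in the chain.
    """

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []

    def append(self, advertiser, ad_text, targeting):
        """Record one ad run; returns the new entry's hash."""
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        record = {
            "advertiser": advertiser,
            "ad_text": ad_text,
            "targeting": targeting,  # all targeting info, in the open
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        # Canonical JSON (sorted keys) so the hash is reproducible.
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(record)
        return record["hash"]

    def verify(self):
        """Recompute every hash and link; False if anything was altered."""
        prev = self.GENESIS
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != prev:
                return False
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if expected != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

A real deployment would publish the chain head (or a Merkle root, as in Certificate Transparency) so outside observers can audit it, but even this toy version makes the key property concrete: anyone can check that nothing was quietly edited or deleted.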
This is not a rhetorical question:<p>If it's ok to lie in a political ad, if the entire responsibility for determining its truthfulness lies on the shoulders of the people viewing the ad, is it also ok for an administration to lie to citizens?
These people are going to be constructively terminated.<p>Constructive termination is where they want to fire you for 'x' but can't legally so they construct 'y' as the real reason for firing you.
Why would people be willing to give up their fundamental rights so easily? Isn't free speech mainly about invalidating what is false or immoral through discourse?
The mainstream media has lost control of the narrative because of places like FB. Everything that covers politics is a form of political ad, and EVERYONE has an agenda. So how will you control that?<p>What NYT, WaPo and others offered was a brand and certain network effects (subscriptions). They cannot compete with the network effects of FB and have been trying to rein in FB.<p>These entities are desperate to regain control of the narrative or they'll lose their value.<p>The reality is, NYT or WaPo can run false news or "political ads" under the name of op-eds. On their own platforms they can highlight these op-eds on their homepage, or they can just boost them on FB. If NYT is fine with treating op-eds that talk about anything political as "political ads", then they have standing here.<p>It no longer has to be op-eds. Even their news coverage is turning into political propaganda. You know how bad NYT's own editorial practice is? Just watch this recent re-writing of history [0].<p>And let's not forget it wasn't the political ads that gave us Donald Trump, but the $5 billion in free advertising that Trump got from the mainstream media [1]; watch Bannon talk about how Trump got his initial boost in the polls [2].<p>[0] <a href="https://www.youtube.com/watch?v=78CE8eiWItY" rel="nofollow">https://www.youtube.com/watch?v=78CE8eiWItY</a><p>[1] <a href="https://www.thestreet.com/story/13896916/1/donald-trump-rode-5-billion-in-free-media-to-the-white-house.html" rel="nofollow">https://www.thestreet.com/story/13896916/1/donald-trump-rode...</a><p>[2] <a href="https://www.youtube.com/watch?v=CKuPYArH0Gs" rel="nofollow">https://www.youtube.com/watch?v=CKuPYArH0Gs</a> (this is an interesting interview and Bannon talks about how Trump got his boost in the polls from mainstream media)
A few links that may indicate in which direction Facebook is biased:<p><a href="https://twitter.com/donie/status/1188593050546855937" rel="nofollow">https://twitter.com/donie/status/1188593050546855937</a><p><a href="https://mashable.com/article/facebook-false-green-new-deal-ad-removed/" rel="nofollow">https://mashable.com/article/facebook-false-green-new-deal-a...</a><p><a href="https://popular.info/p/the-republican-political-operatives" rel="nofollow">https://popular.info/p/the-republican-political-operatives</a><p><a href="https://popular.info/p/facebook-allows-prominent-right-wing" rel="nofollow">https://popular.info/p/facebook-allows-prominent-right-wing</a>
I posted this in the other thread on the topic,<p>"The Facebook workers called for specific changes including holding political ads to the same standards as other advertising, stronger design measures to better distinguish political ads from other content, and restricting targeting for political ads. The employees also recommended imposing a silence period ahead of elections and imposing spend caps for politicians."<p>In the U.S., political speech is often afforded the highest level of protection from govt. censorship (cf. the FB-is-a-private-platform/publisher debate). One of the reasons articulated by some First Amendment commentators is that political speech is important to self-government in a democratic society. To quote Brandeis, "Political discussion is a political duty." Further, "Implied here is the notion of civic virtue - the duty to participate in politics, the importance of deliberation, and the notion that the end of the state is not neutrality but active assistance in providing conditions of freedom . . . ." [1]<p>Public political speech should not be censored based on perceived truth or falsehood. In fact, political speech that promulgates false or misleading messages should be exposed to criticism. Again quoting Brandeis, "Sunlight is said to be the best of disinfectants . . . ."<p>Political speech is, however, regulated to an extent by the F.E.C., e.g. requiring disclosure notices, etc. Still, the political speech issues presented on FB can be more complex than those of traditional 20th-century print and broadcast media. For example, micro-targeting political speech to certain demographics may cross the line from public political speech to private speech, and perhaps should be afforded fewer protections.
See Alexander Meiklejohn [2].<p>Also, content-based prohibitions of speech tend to be more troubling than content-neutral restrictions, such as time, place or manner restrictions on political ads or spending caps as mentioned in the employee statement above.<p>[1] Lahav, Holmes and Brandeis: Libertarian and Republican Justifications for Free Speech, 4 J.L. & Pol. 451 (1987).<p>[2] Meiklejohn, Free Speech and Its Relation to Self-Government (1948).
The number of people making weak "both sides" arguments (nobody runs political ads without lies in them) in this thread is alarming. Facebook is easily capable of fact-checking every ad on their platform, and if they can't, they shouldn't run them at all. We should be prepared to demand that all political advertising be free of outright falsehoods.
The problem here is not so much Facebook (a company doing what companies do) as it is the regulatory system they fit in. This situation is unprecedented, as no single company had ever concentrated the media power Facebook has. Our legislators are barely starting to understand the problem, the ball is in their court really. In the meantime, Facebook will sit at the intersection of what's best for the company and what the law allows.
The problem with Facebook is that it's too big. Different online communities have different standards of what sort of behaviour is acceptable. Facebook is effectively splintered; there is no one community, and so there is disagreement on the community standards, to a degree that I don't think can realistically be resolved. Splintering may very well be the result.<p>If social media were more decentralized, the responsibility would also be decentralized. Standards would be set by the communities. And as for overall standards, those would be dealt with by the legislature and courts, which would be a huge improvement, as those are way more transparent and fair than Facebook et al.<p>Abuse of power by Facebook (or advertisers pressuring them) would be much less of a problem if people could move more easily between social media platforms.<p>I think a more decentralized model of social media would be good all around. Add some interoperability so you can still communicate when you're not on the same platform; this should alleviate some of the tendencies for these platforms to become so big and centralized.
Buying a political ad is kind of like buying a new car, or a firearm.<p>If you leave the dealership with your new vehicle and decide to go run over 10 people, the dealer is not on the hook for your actions.<p>Same with gun stores not being held liable for gun owners.<p>There may be background checks in place to ensure they aren't selling a car to someone who can't drive (driver's license) or to make sure someone can own a gun (background check), but once you pass the initial screening you are on your own for liability.<p>Political ads should be the same: basic KYC to verify the person buying the ad is who they say they are or is allowed to represent an entity, but beyond that, anything they want to say, let them say it, let the public scrutinize it, and let their ideas be debated.<p>I could see a world of hurt if this was completely unregulated, as in anyone could pretend to be anyone and buy an ad any which way without verification. This would lead to an insane amount of slander/mudslinging.<p>Just my 2 cents, probably not worth a penny.