This says more about the journalists than about Facebook. Facebook has always been the same; it's the journalists who sold their souls to make money.

Furthermore, it's funny hearing this hypocrisy from journalists, since they are the ones throwing away their "journalistic integrity" to make money. It's impossible to come across a mainstream publication with an unbiased point of view nowadays.

They have to write polarizing articles to gain more eyeballs, since otherwise no one will pay enough attention to read them. They should realize there's a bigger opportunity here, but instead all they're focused on is short-term revenue and complaining about how Facebook (which focuses obsessively on long-term value) is being unfair.
Why don’t these publishers band together and launch their own news aggregator if they think that’s something people want? I don’t see Facebook owing them a platform here, and from a personal perspective, politics can’t be removed from my _social_ feed fast enough.
I assume it's probably a response to decreased engagement metrics. My 'sample of one': I don't look at FB anymore, having gone from a daily user to maybe once a month, if that. I would likely re-engage if only 'human social' posts were in my main feed. It sounds like I could still find public page posts and meme video shit in the second feed if I wanted. I could manually try to manipulate this outcome, but the time and effort would pale in comparison to what FB could do fairly simply.

As an advertiser, I would love this change if it does in fact bring daily users back to engaging on FB. I already pay for reach, but much of it is mobile and off the FB news feed.

I wonder what their metrics for ad engagement would show, though: does ad engagement increase if the ad is the only non-'human social' post in view, or does mixed content that 'blends' an ad into similar-looking posts produce more engagement (perhaps through engagements users don't realize are on an ad)? And if engagement is higher because of this blending 'trick', do I actually want to pay for it, or would I rather pay a higher price for more 'authentic' engagement? Probably very hard to measure, which is why in my business we use polling to look beyond the metrics FB reports...
This reads a lot like the cries that come out whenever Google updates its search algorithm. It's as if people believe there is some code big internet companies must live by that ensures they never change.

These companies live by the code of the market: they will happily make changes that make them more profitable. It does not make them evil if a small customer gets caught in the crossfire; it just means that small customers need to be resilient to change.
Have no love for Facebook, but: The Guardian would publish this, wouldn't they?

This is a good idea from former editor (1975-95 [1]) Peter Preston: move The Guardian back to Manchester from London and trim the wage bill in the process. [2]

Perhaps then they might slow the cash burn of their Scott "Trust" endowment (£838.3m in 2015)... [3]

Also, if they weren't quite so London-centric they might be less surprised by "events" outside their Facebook bubbles, like Brexit and Trump.

I write the above as someone born in London, but now an expat (Japan) who voted against Brexit, yet wasn't surprised or offended that others didn't.

[1] https://www.theguardian.com/profile/peterpreston

[2] https://www.theguardian.com/media/2017/apr/09/guardian-manchester-move-rumour-london-media

[3] https://en.wikipedia.org/wiki/Talk:Scott_Trust_Limited
> The experiment, which began 19 October and is still ongoing, involves limiting the core element of Facebook’s social network to only personal posts and paid adverts.

This is the Orwellian outcry? I'm much more concerned about the idea of everyone getting their news (especially passively) from Facebook than about making Facebook some sort of "social network" instead.
My visits to Zuckerberg-land are few and getting fewer, mainly to pick up the occasional personal message and redirect it to a more civilized channel. And *when* I visit, I am heavily insulated by various sorts of filters: I never see ads or silly recommendations.

So clearly I'm in a bubble, and my ignorance is self-wrought, but honestly, I never get it when there's talk about people reading *news* - fake or otherwise - on Facebook. How do you even do that? Why would anyone? What am I missing? If I ever *did* see a news item there, my first instinct would be distrust, my first reaction would be to seek out some external source, and if none could be found, absolute dismissal.

Two billion users *do* realise FB is nothing but a cheap, tacky marketing ploy, yes? They do, don't they? Please!
Maybe the problem is trying to centrally control vast and varied resources to reach some kind of "fair" outcome.

Wait, Silicon Valley techies wouldn't be in favor of anything like that, would they?

Ohhh....
It's the old Facebook honeypot: so easy to share your content with users, until FB decides what will and will not be shared.

http://theoatmeal.com/comics/reaching_people
I don't get it. In the olden days when forums were popular, everyone knew that the final say over a forum's content belonged to its administrators. Now, however, we join a website we do not own, run by people we do not know, and expect them to what, do as we say? Yeah, it doesn't work like that. I'm not supporting what Facebook does, but I'm not against it either. Like it, use it; don't like it, don't use it.
FB is not obligated to listen to every media outlet in the world when it decides to change its product, but it's certainly ethical to estimate the impact a change will have on society, however small that impact may be. Remember: they are not testing the new feed in established and healthy democracies, but in places where independent journalism needs to gain traction.
"Media does not spread FREE opinion; It GENERATES opinion" --Oswald,1918 <a href="https://en.wikipedia.org/wiki/Decline_of_the_West" rel="nofollow">https://en.wikipedia.org/wiki/Decline_of_the_West</a>
In David Kirkpatrick's book *The Facebook Effect*, which was basically the positive counterpart at the time to Ben Mezrich's *The Accidental Billionaires: The Founding of Facebook*, Kirkpatrick describes in the first few pages how Facebook's viral and collaborative social network allowed protests, and eventually political revolutions, to be organized in various countries. He says one man was so thankful for Facebook that he named his own child Facebook.

If Facebook back then was lauded for enabling online communication in a way that wasn't possible before, and that led to political revolution in certain nations, it is ironic that these journalists are complaining about Facebook killing their click count by taking their official pages out of the main feed. Taking those pages out of the main feed perhaps brings Facebook closer to the Facebook that Kirkpatrick wrote about. This leads me to wonder whether the journalists are worried about political freedoms, or really about their income while raising the banner of political freedoms. The fact of the matter is that if Facebook indiscriminately removes these pages from the main feed, they'll remove "good" pages, but they'll also remove "bad" pages (and let's not even get into which news media are "good" and which are "bad").

Today, we can see that Facebook is like any other technological product: it is amoral and can be used for "good" and "bad". Even though Zuckerberg now concedes that Facebook may have had a part to play in Trump's election, an idea he at first dismissed as nonsense, and even though he now makes overtures about wanting to help ensure elections around the world are fair (where a private American organization may or may not even be the right party to do this), Zuckerberg in fact does not have much ability to control what happens on his network. At a fundamental level, Facebook is about online social activity. Even if he manages to ban politically manipulative ads (and which ads aren't?), bans fake accounts with amazing accuracy, and gets reviewed by government committees for political ads the way television and radio are, Facebook cannot survive without the lifeblood of its members engaging in online social activity. It would not be easy to ban all the "bad" people if there are a lot of "bad" people who still "deserve" their freedom of political speech. So there will likely always be an attack surface for some smart strategist to take advantage of Facebook and spread ideas maliciously and manipulatively.

Pandora's box has been opened, and it's not going to be closed again. And if people mute those they don't like, all that does is deepen individual echo chambers, which in turn increases political polarization.

If technology is amoral and can be used for both good and bad depending on the user, we should have seen this future coming from a mile away. But we didn't; I certainly didn't. It's funny how the lessons from *1984*, *Brave New World*, and all the other classics have come true in so many ways. I'm not sure there was a book that warned about *this type* of future discussed in these comment threads, but anyway. I should have seen it coming, and I'm not sure why I didn't. So we'll live in an era where information and misinformation seem to become one.
There was a time when patio11 was going through a phase of calling out Zynga as one of the shadiest companies around.

I remember thinking, "If Zynga is that bad, then what about Facebook?"

Facebook's ability and willingness to manipulate just about everything in sight in line with corporate profits - WhatsApp ad policies, privacy policy changes, arbitrary censorship of content, providing clear misinformation to legal entities (e.g. the promise to the EU that they cannot infer/merge user profiles), the absolute shitshow that is shadow profiles - is starting to make Zynga look angelic in comparison.
This was recently posted here: https://news.ycombinator.com/item?id=15552252
I think there will be increasing scrutiny of Facebook's ability to wash its hands of this while claiming "it's free speech."

There's a reason the media were information gatekeepers: there's a responsibility to be accurate and unbiased.

Facebook's product was manipulated and helped Donald Trump get elected by swaying public opinion. There have to be consequences - sorry, but in my opinion not every voice deserves to be equal.
FB is like a big, dumb, blind elephant. FB users are people expecting to get on the elephant for their daily commute... and to get from A to B in a timely fashion.

Fail.
Facebook has gotten worse and worse to use. I’m still surprised it hasn’t gone the way of MySpace. Most of my friends don’t post nearly as much. I’m sure that’s partly getting older, but it also just doesn’t seem fun anymore. Reading about all the scummy stuff Facebook has done doesn’t thrill me either.