
YouTube Recommendations Lead to More Extremist Content for Right-Leaning Users

29 points | by grammers | over 1 year ago

19 comments

wlesieutre, over 1 year ago
Apparently this is a thing on Audible as well, as I learned recently from Penny Arcade:

https://www.penny-arcade.com/comic/2023/12/01/algo-rhythms

> Gurb turned me on to a kick-ass book called "The Mysterious Case Of Rudolph Diesel," and I think you should read it if you're interested at all in the world, but you should buy it with cash in a town you don't live in and read it in a dimly lit cavern. Because if you don't, if The System finds out you read a book about a fascinating historical character and his mysterious disappearance, you'll be clocked immediately by their tendrils as… whoever this is.
Comment #38661815 not loaded
twisteriffic, over 1 year ago
This is not at all surprising. Make a brand new account and watch a few welding or machining videos. You'll be getting PragerU, Daily Wire and Tucker clips in no time. It goes downhill quickly from there. The targeting is pretty explicit.
Comment #38659507 not loaded
Comment #38659543 not loaded
foota, over 1 year ago
In some ways I think this is a tricky problem, since you want users to get deeper into some topics, but not into ones considered "problematic", and defining those is inherently political.

It seems like you could define some idea of "depth" into a topic (based on how far outside a normal viewer's patterns it is) and only generate recommendations for items that aren't far outside the norm, but this would lead to a lack of depth for recommendations in niches.

Maybe a middle ground would be to treat sensitive topics differently in terms of "vertical" recommendations: explicitly mark some categories as safe and let recommendations go deeper there, allow only "horizontal" recommendations for unknown topics, and perhaps prevent recommendations "into" those topics from the outside.

So... if you're watching train videos you might get to see even more niche ones, but welding won't recommend Fox News to you, and watching Fox News won't show you Alex Jones recommendations.

I pick on the right here since it's the topic (and I'm left-leaning myself), but I think radicalization is an issue on the left as well (though frankly my political opinions make me think it's less impactful there, mostly because the way people radicalize on the left tends to affect less marginalized people, or to play out in policy terms rather than hurting people who are already beaten down).
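The gating idea in foota's comment can be made concrete. Below is a minimal sketch; the topic labels, per-video depth scores, SAFE_TOPICS list, and the 0.3 threshold are all invented for illustration and come from neither the study nor any real recommender:

```python
# Toy sketch of "vertical vs. horizontal" recommendation gating.
# All names, scores, and thresholds are illustrative assumptions.

from dataclasses import dataclass

SAFE_TOPICS = {"trains", "cooking"}  # categories vetted for deep recommendations

@dataclass
class Video:
    title: str
    topic: str
    depth: float  # 0.0 = mainstream, 1.0 = far outside typical viewing patterns

def allowed(current: Video, candidate: Video) -> bool:
    if candidate.topic == current.topic:
        # "Vertical" move: going deeper is only allowed in vetted topics.
        return candidate.topic in SAFE_TOPICS or candidate.depth <= current.depth
    # "Horizontal" move: crossing topics is fine, but never *into* an
    # unvetted topic at above-mainstream depth.
    return candidate.topic in SAFE_TOPICS or candidate.depth < 0.3

watching = Video("welding basics", "welding", depth=0.2)
candidates = [
    Video("rare train switchyards", "trains", 0.9),  # allowed: vetted topic
    Video("advanced TIG welding", "welding", 0.6),   # blocked: deeper, unvetted
    Video("cable news clips", "politics", 0.7),      # blocked: deep and unvetted
]
print([v.title for v in candidates if allowed(watching, v)])
# -> ['rare train switchyards']
```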
dudul, over 1 year ago
While left-leaning users are only presented with a healthy selection of diverse and well-argued videos expressing a range of perfectly reasonable viewpoints.
Comment #38661160 not loaded
roenxi, over 1 year ago
This seems a bit suspicious because there is a trend of *defining* right-wing content as extremist. And I'm not interested in whether something is classified as a conspiracy so much as whether it is true.

I'll pick on Ivermectin during COVID as an interesting case. Now, obviously, if you have two groups and one has parasites but the other doesn't, then the parasite-free group will get better COVID results. So, as expected, people treated with Ivermectin got better COVID outcomes.

It took a long time to get the message out explaining that effect, because in the spheres I listened to, everyone who pointed out the statistically significant result got shut down with logical fallacies. "Conspiracy theorist" was definitely one.

I'd rather be completely correct, but I'm happy to fall for the occasional conspiracy that is backed by statistically significant evidence. People who fall for that sort of mistake are going to get better results long term than people who ignore evidence. But this study would classify that sort of evidence-based reasoning as a right-winger being led into extremist conspiracy content. I mean, I dunno. A branch of the right wing believes in looking at primary evidence. That means they get things wrong, and sometimes right, in ways out of sync with the mainstream conversation.
Comment #38659835 not loaded
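The parasite confounder roenxi describes is easy to demonstrate with a toy simulation. All rates below are invented for illustration, and the simulated "drug" has zero direct antiviral effect, yet the treated group still shows markedly better outcomes simply because it clears parasites:

```python
# Toy Monte Carlo of the confounder: treatment clears parasites, parasites
# worsen outcomes, so the treated group looks better even though the drug
# does nothing antiviral. All probabilities are illustrative assumptions.

import random

random.seed(0)

def trial(n=100_000, parasite_rate=0.4):
    outcomes = {"treated": [], "untreated": []}
    for _ in range(n):
        has_parasites = random.random() < parasite_rate
        treated = random.random() < 0.5
        if treated:
            has_parasites = False  # the drug clears the parasite co-infection
        # A bad outcome is more likely with a parasite co-infection.
        p_bad = 0.15 if has_parasites else 0.05
        outcomes["treated" if treated else "untreated"].append(random.random() < p_bad)
    return {k: sum(v) / len(v) for k, v in outcomes.items()}

print(trial())  # treated ~5% bad outcomes vs. untreated ~9%, with no antiviral effect
```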
turing_complete, over 1 year ago
True for left-leaning users too, of course.
Comment #38659612 not loaded
thr_cust, over 1 year ago
Excellent, the algorithms provide what the customer wants. It's not YouTube's place to be a nanny.
Comment #38659475 not loaded
Comment #38659700 not loaded
Comment #38659510 not loaded
Comment #38659527 not loaded
foota, over 1 year ago
I realize this is a contentious topic, but I don&#x27;t think it necessarily deserves to be flagged.
malshe, over 1 year ago
True for Twitter as well.
29athrowaway, over 1 year ago
The journey to the dark side starts with some interesting Joe Rogan video, which takes you to Jordan Peterson, then Matt Walsh, and ends with Stefan Molyneux, Lauren Southern and Alex Jones. That is the bottom of the YouTube iceberg. Below that point, videos get mass-reported and taken down.
Comment #38659524 not loaded
lemoncookiechip, over 1 year ago
This is true for most algorithms on sites with user-created content. It also isn't exclusive to right-leaning material; it's the same for left-leaning content, and for other types of content all the same. It's just how algorithms work.

The real question should be: should we prevent this type of content from getting recommended, and where are the lines?

As a side note, I'd love to see a Twitter-style Community Notes implemented on YT. It's the one good feature Twitter has implemented in a long while. And yes, YT has notes, but they're done by YT themselves (the COVID ones, for example).
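As a rough illustration of "how algorithms work" in the sense the comment means: a recommender that ranks purely by similarity to watch history narrows toward more of the same, whatever the direction. A toy sketch with invented titles and tags, unrelated to any real platform's ranking:

```python
# Minimal feedback loop: each recommendation is folded back into the
# history, pulling later recommendations further along shared tags.
# Catalog, tags, and similarity choice are all illustrative assumptions.

def similarity(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b)  # Jaccard similarity over tags

catalog = {
    "welding basics": {"welding", "diy"},
    "machining 101": {"machining", "diy"},
    "political commentary": {"politics", "diy"},  # shares a tag, so it surfaces
    "deep politics rabbit hole": {"politics", "outrage"},
}

history_tags: set[str] = {"welding", "diy"}
for _ in range(3):
    best = max(catalog, key=lambda title: similarity(history_tags, catalog[title]))
    print("recommended:", best)
    history_tags |= catalog.pop(best)  # watching it feeds the loop
```

Run it and the recommendations drift from welding to machining to political commentary in three steps, purely through tag overlap.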
thelastknowngod, over 1 year ago
I'd genuinely like to see what moderate right-leaning content is even available for consumption. The only thing anyone seems to talk about anymore is the grifters and lunatics.
Comment #38661017 not loaded
autoexec, over 1 year ago
I suspect that the overall premise of the paper is correct, but it's interesting that they repeatedly reference lists of what they call "problematic" right-wing categories, such as "IDW," "Alt-right," "Alt-lite," "AntiSJW," "Conspiracy," "MRA," "ReligiousConservative," "QAnon," and "WhiteIdentitarian," while they seem to recognize only a single category of extremist left content: "Socialist."

If you're specifically looking out for a long list of right-wing extremist content categories but only one category of left-wing extremist content, is there any wonder you'd find that YouTube pushes people toward extremist right-wing material to a greater extent than toward the extremely limited left-wing extremist content being considered?
thiago_fm, over 1 year ago
No shit, Sherlock
butterNaN, over 1 year ago
The study considers the following "Very Left" [1]:

- MSNBC
- Senator Bernie Sanders
- Elizabeth Warren
- Vox

I mean, I suppose it is understandable if your political experience is solely American. But I do wonder: if one considers these "very left," what will happen when they come across political concepts such as anarchism? If they read Malatesta's writings, for example, would their minds just explode?

[1]: https://www.pnas.org/doi/10.1073/pnas.2213020120#supplementary-materials
legitster, over 1 year ago
> In this study, the research team defined problematic channels as those that shared videos promoting extremist ideas, such as white nationalism, the alt-right, QAnon and other conspiracy theories. More than 36% of all sock puppet users in the experiment received video recommendations from problematic channels. For centrist and left-leaning users, that number was 32%. For the most right-leaning accounts, that number was 40%.

They defined problematic channels as anything specifically espousing far-right ideas, and found that right-wing users were only *slightly* more likely to be recommended content from them.

It's kind of disappointing they couldn't find *something* problematic or conspiratorial from the left, even just for the sake of comparison.
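For scale, the gap the comment calls "only slightly" works out as follows; this is plain arithmetic on the figures quoted above, nothing more:

```python
# Comparing the recommendation rates quoted from the study.
right, centrist_left = 0.40, 0.32
print(f"absolute gap: {(right - centrist_left) * 100:.0f} percentage points")  # 8
print(f"relative increase: {right / centrist_left - 1:.0%}")                   # 25%
```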
realjhol, over 1 year ago
It must be election season. Like clockwork, the pro-censorship hit pieces start rolling out.
nailer, over 1 year ago
> For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content. Recommendations for left-leaning users on YouTube were markedly fewer, researchers said.

This depends on the researchers' definitions of "extremism" and "conspiracy theories".

- Recently we've seen many left-wing people state that disassembling people in front of their families (surely an "extreme" act) is a "beautiful act of resistance", and that calls for genocide against Jewish people (surely also "extreme") may not constitute hate speech in some contexts.

- For the last 7 years we've had many people believe in the Russiagate conspiracy theory.

- I'm not sure "problematic" has any real meaning.
monsecchris, over 1 year ago
How do I get my YouTube feed to do this? All I get is a stream of incoherent leftist nonsense.