TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.
YouTube’s Product Chief on Online Radicalization and Algorithmic Rabbit Holes

169 points | by Bhilai | about 6 years ago

25 comments

cameldrv | about 6 years ago

* Let me configure the recommender.
* Let me turn it off entirely.
* Let me adjust the relative weight that the current video gets vs. my past views (i.e. do I want to see videos related to this one, or ones that YouTube generally thinks I'll click on).
* Let me adjust the weighting of thumbs up vs. watch time.
* Let me configure the homepage.
* Let me make the entire recommender panel be just subscriptions or items from a particular list.
* Let me replace the recommender entirely with something of my own devising, called back through a webhook.

These things would make YouTube much more useful for me. I'm not going to YouTube just to kill time, and I pay them $13/mo. They're not getting any ad revenue from me, so why am I stuck with their recommender, which only cares about what it thinks will cause me to spend the most time on YouTube? I am not interested in spending the most time on YouTube. I'm interested in getting the most out of it in the minimum time.
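A user-configurable blend like the one this wish list describes could be sketched as follows. Everything here is hypothetical: YouTube exposes no such configuration, and all names and weights are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class RecommenderConfig:
    """Hypothetical knobs mirroring the commenter's wish list."""
    enabled: bool = True               # "let me turn it off entirely"
    current_video_weight: float = 0.5  # related-to-this-video vs. my history
    thumbs_up_weight: float = 0.7      # explicit ratings vs. watch time


def score(cfg: RecommenderConfig, related: float, history: float,
          rating: float, watch_time: float) -> float:
    """Blend a candidate video's signals according to user-chosen weights.

    All four inputs are assumed to be normalized to [0, 1].
    """
    if not cfg.enabled:
        return 0.0
    relevance = (cfg.current_video_weight * related
                 + (1 - cfg.current_video_weight) * history)
    feedback = (cfg.thumbs_up_weight * rating
                + (1 - cfg.thumbs_up_weight) * watch_time)
    return 0.5 * (relevance + feedback)
```

With `enabled=False` every candidate scores zero, which is the "turn it off entirely" case; the two weight knobs cover the relative-weight and thumbs-up-vs.-watch-time requests.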

rikkus | about 6 years ago

Whenever I went to YouTube I would get suggested videos that were either interesting and related to stuff I had watched, or not interesting to me, but no problem. The algorithm just didn't quite judge what I might be interested in as well as it could.

Occasionally I would try dismissing one of the suggestions in the hope I wouldn't see similar again, but then I'd still see its kind repeated, and was just disappointed it wasn't taking my request into account.

Suggestions are hard, though. No problem.

One day, though, I made the mistake of following a link to YouTube which turned out to be a video of an American political figure (not even sure if he is a politician) whose views were of the sort that court controversy - on purpose, I guessed.

What I didn't expect was that YouTube would then offer me videos featuring the same person - or people with similar ultra-polarising messages - every time I returned to YouTube.

I hadn't realised that not only does YT take into account what you have watched a _lot_ of, it also seems to massively favour what you have watched once - if that is something especially controversial or perhaps otherwise 'special'.

While I don't mind pointless suggestions I will ignore, it's very concerning that without 'liking' or repeatedly watching such material, people are being suggested it time after time, even when they ask to have it taken away.

minimaxir | about 6 years ago

> Sorry, can I just interrupt you there for a second? Just let me be clear: You're saying that there is no rabbit hole effect on YouTube?

> I'm trying to describe to you the nature of the problem. So what I'm saying is that when a video is watched, you will see a number of videos that are then recommended. Some of those videos might have the perception of skewing in one direction or, you know, call it more extreme. There are other videos that skew in the opposite direction.

I assisted with data research on YouTube recommended videos (https://www.buzzfeednews.com/article/carolineodonovan/down-youtubes-recommendation-rabbithole), and this claim is misleading at best. There may be *some* videos that aren't extreme, but that doesn't matter if they are a) in the extreme minority and/or b) appear very far down in the queue of 20+ recommended videos, when users typically either hit "Up Next" or look at and act on only the top few recommendations.

This isn't a "both sides" issue.

mberning | about 6 years ago

Why is it acceptable to expect users to self-moderate some information but not other kinds? For example, if I watch a video about modding a car, that links me to another video of more extreme modification, and so on, up to the point where I am no longer interested and stop watching. That is OK for the people at YouTube. But when it comes to politics, they feel that people cannot decide what they do and don't agree with and self-moderate. I get that you could argue politics is different, but is it that much different from other topics that could lead to undesirable outcomes?

annadane | about 6 years ago

Radicalization aside, is anyone else just annoyed with how much worse the recommendations are compared to how they used to be? It's damn near impossible to discover anything, given how biased they are towards "popular" content as opposed to, y'know, content related to what you're watching.

css | about 6 years ago

> The first is that using a combination of those tools of authoritative content and promoting authoritative content is something that can apply to other information verticals, not just breaking news.

How is this not a clear admission that they are a publisher and not just a platform?

yuliyp | about 6 years ago

It feels like YouTube's suggested-video algorithm leans heavily on "what other videos did the people who watched this video watch?". Now imagine a video that expresses some conspiracy theory. That video is watched by a bunch of people who are into conspiracy theories. Then you come along, and YouTube recommends you the videos watched by the people who watched the same video you just did. And bam, you get pulled toward those others who were interested in that video.

This does work in the other direction too: someone with more normal viewing habits watching the video will steer the conspiracy theorists toward the types of videos they watch. But given the relative sizes of the two pools, it's the polarizing direction that has the greater effect.
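The "people who watched X also watched Y" mechanism this comment describes is item-item co-occurrence. A minimal sketch over toy data (the video ids and watch histories are made up; this illustrates the mechanism, not YouTube's actual system):

```python
from collections import Counter
from itertools import combinations

# Toy watch histories: one set of video ids per user.
histories = [
    {"conspiracy_1", "conspiracy_2", "conspiracy_3"},
    {"conspiracy_1", "conspiracy_2"},
    {"conspiracy_1", "cooking_1"},   # the one "normal" viewer
    {"cooking_1", "cooking_2"},
]

# Count how often each ordered pair of videos was watched by the same user.
co_counts = Counter()
for h in histories:
    for a, b in combinations(sorted(h), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1


def recommend(video, k=2):
    """Return the videos most often co-watched with `video`."""
    scores = Counter({b: n for (a, b), n in co_counts.items() if a == video})
    return [v for v, _ in scores.most_common(k)]
```

Because the co-watching pool for `conspiracy_1` is dominated by conspiracy viewers, its top recommendation is another conspiracy video; the single "normal" viewer's influence is drowned out, which is exactly the pool-size asymmetry the comment points at.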

api | about 6 years ago

The thing is, this is how algorithmic timelines, lists, etc. work. They're programmed to drive engagement, and fear, hate, outrage, sensationalism, and controversy are what does that, as has been known since the time of P. T. Barnum.

canthonytucci | about 6 years ago

At one point, if you watched an Asmongold video, you'd be bombarded with suggestions for more of them. I've never seen it happen as strongly with any other topic.

I don't know about increasing levels of extremity, but I can definitely see how suggestions could get people to a place where they feel like an idea or a thing is more popular than it really is.

He talks about it briefly here (NSFW): https://youtu.be/Rstb4IRXZLQ?t=625

ve55 | about 6 years ago

Seems to be a complaint that the Internet's Overton window is difficult to easily control. Good.

charlesism | about 6 years ago

"There's more work to be done" is the understatement of the year.

sfashset | about 6 years ago

Just out of curiosity: we have people on this thread and on other parts of the internet claiming they'll watch one "extremist", typically right-wing, video, and then have their recommendations blitzed with even more extreme/right-wing content. This is the "rabbit hole" effect being referred to in the article.

Can anyone say that they have distinctly *not* encountered this effect? I can recall having watched maybe a couple of Joe Rogan clips, a Jordan Peterson interview, and even some stuff I would label alt-right content. I'm personally fairly left-ish/liberal, so I find these videos boring/offensive and eventually go back to my normal YouTube consumption: music, sports, some tech videos.

My recommendations are all of the latter categories, not of the extreme/political right-wing category. I guess due to selection bias, most people who don't have a problem with their recommendations won't report anything, while most people who do will leave comments that they too experienced the rabbit hole effect. I wonder if that leads to the problem being overstated. Or am I really the only person who's managed to watch some fairly extreme right-wing content and had my recommended videos stay intact?

dmitryminkovsky | about 6 years ago

The big question in my life is whether an app can succeed in 2019 without manipulating user psychology like this.

I built Pony [0], an email platform that delivers once a day, to see if this is possible. In the UI I eschewed every traditional user-manipulation technique I could think of: there are no notifications, there are no unread message counts. There isn't even read/unread message state.

I truly wonder whether people can adapt to a totally unstructured online platform, an unguided, unprompted experience that they create for themselves. My bet is they can.

[0]: https://www.pony.gg

IronWolve | about 6 years ago

If only someone took the RSS feeds from YouTube channels and made their own front end. Ditch their portal.

Turn YouTube into a kind of podcast DB; it's just a media storage site. Then you could create your own portal, parse the views/ratings, and provide real statistics.

I don't watch YouTube for YouTube, I watch YouTube for user-created content. And some of that content has already moved off onto third-party sites. Floatplane, anyone? Patreon, anyone? The list goes on.

I just wish finding video content were as easy as searching for podcasts. YouTube provides RSS links; I use them to add channels to my podcast player.
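YouTube does publish per-channel Atom feeds at `https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID>`, which is what makes the front-end idea above workable. A minimal parser using only the standard library; the sample feed below is invented for illustration but follows the Atom shape those feeds use:

```python
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

# Trimmed sample in the shape of a YouTube channel feed. Real feeds live at
# https://www.youtube.com/feeds/videos.xml?channel_id=<CHANNEL_ID>; this
# entry (title, video URL) is made up.
sample = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Channel</title>
  <entry>
    <title>First upload</title>
    <link rel="alternate" href="https://www.youtube.com/watch?v=abc123"/>
    <published>2019-03-29T00:00:00+00:00</published>
  </entry>
</feed>"""


def parse_feed(xml_text):
    """Return (video title, video url) pairs from a channel Atom feed."""
    root = ET.fromstring(xml_text)
    videos = []
    for entry in root.findall(f"{ATOM}entry"):
        title = entry.findtext(f"{ATOM}title")
        link = entry.find(f"{ATOM}link").get("href")
        videos.append((title, link))
    return videos
```

Fetching the real feed and feeding the response body into `parse_feed` would give you the channel's latest uploads without ever touching the recommender.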

_uhtu | about 6 years ago

I've noticed an interesting thing in terms of "YouTube rabbit holes" towards extreme content. I'm what American society would generally consider "liberal", and I watch a lot of videos about fixing climate change, Medicare for All, etc. Interestingly, I don't usually get "liberal" recommendations on my home page.

However, if I watch a video from a conservative angle, even just one or two videos, I almost immediately get extreme right-wing content in my recommendations. Stuff like PragerU, ReasonTV, NRATV, etc. Even watching videos that I wouldn't consider right-wing, just critical of certain left-wing sects, like h3h3 for example, tends to almost immediately lead me into videos like "DUMB FEMINISTS GET OWNED - COMPILATION".

It's strange that the rabbit holes almost always take me deep into right-wing territory, but never really into left-wing territory.

username223 | about 6 years ago

Thank $deity for https://youtube-dl.org. I can't be bothered to deal with Google's "up next" algorithm.

swamy_g | about 6 years ago

If you call Jordan Peterson videos or Joe Rogan clips extremist videos, then I think this is clearly a step in the wrong direction. Their videos thrive on YouTube not because they hold extreme views, but because their videos are very engaging and fun, and you learn something from them.

If this is a ploy to push more mainstream narratives through YouTube, akin to watching CNN, CBS, or ABC, then I'll be looking for other platforms.

BEEdwards | about 6 years ago

> Yeah, so I've heard this before, and I think that there are some myths that go into that description that I think it would be useful for me to debunk.

This is no myth, bro. Go to YouTube, pick a political video, leave it on autoplay, and watch it go to shit.

mindgam3 | about 6 years ago

It's almost like the YouTube guy is taking his talking points directly from Zuckerberg's script.

-----------

1. "of course we take this seriously"

- Mohan: "Having said that, we do take this notion of dissemination of harmful misinformation, hate-filled content, content that in some cases is inciting violence, extremely seriously."

- Zuckerberg 2016: "we take misinformation seriously. We've been working on this problem for a long time and we take this responsibility seriously."

2. "we're simply trying to help people get accurate information"

- Mohan: "when users are looking for information, YouTube is putting its best foot forward in terms of serving that information to them."

- Zuckerberg 2016: "Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information."

3. "fear not, it's our users who are in power/in control"

- Mohan: "But YouTube is also still keeping users in power, in terms of their intent and the information that they're looking for."

- Zuckerberg 2004: "People have very good control over who can see their information."

- Zuckerberg 2017: "Our full mission statement is: give people the power to build community and bring the world closer together. That reflects that we can't do this ourselves, but only by empowering people to build communities and bring people together."

And last but definitely not least,

4. "we're proud of our progress, but there's more work to do"

- Mohan: "It's an ongoing effort. I think we've made great strides here. But clearly there's more work to be done."

- Zuckerberg 2016: "We've made significant progress, but there is more work to be done."

- Zuckerberg 2018: "I've learned a lot from focusing on these issues and we still have a lot of work ahead… I'm proud of the progress we've made in 2018… I'm committed to continuing to make progress on these important issues as we enter the new year."

Why anyone would copy Zuckerberg's script at this point is beyond me.

-----------

Sources:

- Zuck 2004: https://www.thecrimson.com/article/2004/2/9/hundreds-register-for-new-facebook-website/?page=single

- Zuck 2016: https://www.facebook.com/zuck/posts/10103269806149061

- Zuck 2017: https://www.facebook.com/notes/mark-zuckerberg/bringing-the-world-closer-together/10154944663901634/

- Zuck 2018: https://www.facebook.com/4/posts/10105865715850211/

habosa | about 6 years ago

I work at Google, but not on anything related to YouTube. This opinion is my own: that was very disappointing to read.

I *use* YouTube, I don't browse it. I search for what I want, and then I watch it. Yet YouTube seems absolutely determined to send me down rabbit holes for some reason. For instance, my friend sent me a video by some cable-news neo-philosopher (a Jordan Peterson type) and since then I can't get rid of low-quality vids in my recommendations that are trying to make me angry. "{Person} totally EVISCERATES {other person}" being the calmest of them.

The recommendation algorithm is also just not good at its job. I am Jewish, and the number of videos recommended to me that are thinly-veiled anti-semitism is pathetic. Do they really think I am going to watch those?

The one thing from the article I do believe is that these extremist videos (mostly) don't monetize. So why do they dominate? Honestly, I think we just suck at building recommendation systems.

I am not a big fan of Facebook, but I applaud their recent ban of white supremacist content. Freedom of speech is one thing; freedom to use someone else's site to speak to millions is another. YouTube should just take a harder stance on all of this dangerous crap. The Sandy Hook deniers and the antivaxxers will have a hard time finding a new home.

</rant>

lvs | about 6 years ago

This is nothing more than a very long-winded PR denial.

whotheffknows | about 6 years ago

This is what google okay is for. Trust me.

wbronitsky | about 6 years ago

I'm so tired of people calling the regulation of hate speech and violent speech on private platforms "censorship." These people are more than free to express themselves on any public platform or their own platform. They are not censored in any way. They are being removed from a private platform for not complying with the rules of that platform.

You wouldn't say that a drunk who walked into a nice restaurant yelling hateful things was being censored when asked to leave. Neither would you say that a man trying to convince kids to get into his van outside a McDonald's was being censored for being kicked out of the McDonald's. Both of those are private businesses regulating the behavior of their customers, and not in any way censorship.

> Randomly clicking on videos

The entire point is that people don't click on random videos; they click on videos they think are interesting. Call it "rage clicks", call it curiosity, or say some people are just bad people who want to see bad things; just don't call it random, because that's obviously wrong.

jkha0 | about 6 years ago

Giving people more of what they want is the optimal user experience.

People are ultimately choosing what to watch.

It's a clear conflict of interest for traditional media to blame their younger competitors for all the world's problems.

ousta | about 6 years ago

It's a stupid idea that the algorithm is what drives people to extreme content. It is not. People are drawn to it themselves; our whole societies are driven by violence, fear, and disruptive stuff.