It's another sock puppet study: <a href="https://arxiv.org/abs/2501.17831" rel="nofollow">https://arxiv.org/abs/2501.17831</a><p>Very similar methodology to an earlier study the government cited in their case against TikTok: <a href="https://networkcontagion.us/wp-content/uploads/NCRI-Report_-The-CCPs-Digital-Charm-Offensive.pdf" rel="nofollow">https://networkcontagion.us/wp-content/uploads/NCRI-Report_-...</a><p>There are a number of issues with these studies, one being that the way the sock puppet bots interact with content is not exactly organic. Typically they search for content in a conditioning phase, followed by random scrolling during which the recommended videos are collected and classified by an LLM. Modern recommendation algorithms famously work by examining how long users watch and how they engage with content, and there's none of that going on here. Still, the methodology itself and the use of LLMs to classify content is clever and probably about the best we can get.<p>Also, even if there _is_ a bias, it doesn't tell us why. Are the recommendations intentionally spiked, or is this simply the recommendation strategy that maximizes profit? (Or that the recommendation model thinks will maximize profit?) It's very difficult to tell, which is part of what makes these models dangerous and also part of what makes them difficult to regulate.<p>On a sidenote, TikTok (and presumably other content platforms) _really_ does not like these studies, as demonstrated by them nerfing search functionality after the second study above was released to prevent researchers from using these techniques in the future. I haven't read the study in detail yet, but it will be interesting to see how the team at NYU Abu Dhabi adapted their methodology.
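For anyone curious what these audits boil down to, the conditioning-then-scroll loop can be sketched roughly like this (everything here is a hypothetical stand-in — the feed driver, the classifier, and the tallying are my assumptions, not the paper's actual code):

```python
from collections import Counter

def run_sock_puppet(feed, condition_queries, classify, n_scrolls=100):
    """Simulate one sock-puppet audit: a conditioning phase (searches that
    establish the account's apparent interests) followed by passive scrolling,
    tallying the lean of each recommended video.

    `feed` is any object with .search(q) and .next_video() methods;
    `classify` maps a video to a label (standing in for the LLM judge).
    """
    for q in condition_queries:      # conditioning phase: seed the account
        feed.search(q)
    tally = Counter()
    for _ in range(n_scrolls):       # collection phase: scroll, never engage
        video = feed.next_video()
        tally[classify(video)] += 1
    total = sum(tally.values())
    return {label: count / total for label, count in tally.items()}

# Toy stand-ins so the sketch runs end to end.
class FakeFeed:
    def __init__(self, videos):
        self.videos, self.i = videos, 0
    def search(self, query):
        pass                         # a real driver would issue the search
    def next_video(self):
        v = self.videos[self.i % len(self.videos)]
        self.i += 1
        return v

feed = FakeFeed(["pro_a", "anti_a", "pro_b", "anti_a"])
shares = run_sock_puppet(feed, ["election"], classify=lambda v: v, n_scrolls=4)
# e.g. shares == {"pro_a": 0.25, "anti_a": 0.5, "pro_b": 0.25}
```

Note that the weakness described above is visible right in the sketch: the puppet never likes, rewatches, or lingers on anything, so the watch-time and engagement signals a real recommender leans on are entirely absent.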
Weird to say, but this could be a relevant anecdote: as a trans person who avoids politics on my socials but watches trans creators, I've never used a platform that doesn't show me trans-negative politics in my feeds after engaging with trans creators. I find I only get the positive politics AFTER I ignore or dislike the negative ones.
Not that I think TikTok is particularly spotless, but this sounds like the result of an algorithm learning what groups will engage with, i.e., Democratic-leaning accounts were more likely to do _something_ with Republican content than vice versa.
The title here is deceptive and I'm not sure the study authors did a good job of characterizing their results. They found that Democrats were likely to see less Democrat-aligned content than Republicans were to see Republican-aligned content. But Republican-aligned as a category included both pro-Republican and anti-Democrat content, and the big difference was that Democrats saw fewer anti-Republican and more anti-Democrat videos. Which may seem like a conspiracy until you imagine the amount of anti-Harris/"uncommitted" content that was on TikTok during the election cycle. Even if you think that gave an edge to the Republican ticket, I don't think you could reasonably describe a video by a DSA guy decrying the bombing of Gaza as a pro-Republican video.<p>Perhaps more succinctly, there was more anti-Democrat material from both sides (and the binary classification system is reductive in a way that obscures what's going on).
That was certainly my experience with it. Overwhelmingly so, even. Not just “Republican” per se, but all sorts of reality-denying nonsense, propaganda, and clearly foreign-influenced trolling. Depressing enough that I removed the app that I once loved to use.
And many platforms had a pro-Democrat lean, especially in the 2020 election. Blaming TikTok seems like scapegoating instead of looking introspectively at the Dems' failings, like having a weak leader and pushing various insane ideologies.
Of course it did. This is why so many want it out of the US. It's a propaganda vehicle for the CCP. Our citizens have an IV drip of bullshit feeding them 24x7.<p>This is why Trump wants to save it. Back in his first term he tried to destroy it[0]. Flash forward to now: he openly acknowledges that it helped him win[1]. He doesn't want to lose that influence if it gets dismantled, so of course he's now talking about introducing a sovereign wealth fund and using it to buy TikTok[2].<p>- [0] <a href="https://www.nytimes.com/2024/12/28/us/politics/trump-tik-tok-ban.html" rel="nofollow">https://www.nytimes.com/2024/12/28/us/politics/trump-tik-tok...</a><p>- [1] <a href="https://thehill.com/policy/technology/5042241-president-elect-trump-warm-spot-tiktok/" rel="nofollow">https://thehill.com/policy/technology/5042241-president-elec...</a><p>- [2] <a href="https://www.whitehouse.gov/fact-sheets/2025/02/fact-sheet-president-donald-j-trump-orders-plan-for-a-united-states-sovereign-wealth-fund/" rel="nofollow">https://www.whitehouse.gov/fact-sheets/2025/02/fact-sheet-pr...</a>
That's not algorithmic bias, that's a large delta in content quality.<p>Videos coming out of the Harris camp were milquetoast in comparison to the Trump camp. Of course those videos get more attention, from people who like him and people who hate him, and therefore get pushed more by the algorithm.
The main use case of technology and AI is to sway public opinion. Elon Musk is one of the richest men in the world and that's where he spends most of his focus. If you're building technology, know that this is the endgame, and proceed with caution.<p>The amount of deceit per dollar you can generate today is profoundly inexpensive when compared to the entirety of human history. Sadly, our cowardly tech giants are the antithesis of the morals espoused in comic books they grew up on, seeking to abdicate all responsibility as they achieve unprecedented power.
This makes sense. As Paul Graham mentioned in his essay from January [1], "outrage means traffic."<p>No one gets more outraged than left-leaning viewers watching Trump videos.<p>[1] <a href="https://paulgraham.com/woke.html" rel="nofollow">https://paulgraham.com/woke.html</a>
Curious, as when I was still using TikTok I was getting pro-Trump content in between my usual communist/workers' rights or pro-Palestinian content.
Why is it called bias, as if the results should be even? Would we say that there is a bias in TikTok to show interesting videos instead of boring videos? I guess you <i>could</i> describe it as such, but the term carries an implication of impropriety, as if what displays bias should rightfully not display bias.<p>Liberals <i>love</i> consuming Trump content, maybe even more than conservatives. Sort of like the Howard Stern effect. Because after they do, they can go post on Twitter and Reddit about how stupid Trump is.<p>Conservatives didn't have that with Harris. Maybe because all of her content was so scripted, her entire public-facing personality was designed by committee, and frankly there was just much less content. Trump would go on a four-hour tirade about some random thing like Greenland. Meanwhile, Harris was tightly controlled by her handlers, only giving "safe" interviews, probably thinking that playing it safe and not fumbling somewhere would be the path to the White House.