TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, each time changing their location to a random city in the US.
Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.
- Houston: 88 shorts
- Chicago: 98 shorts
- Atlanta: 109 shorts
- NYC: 247 shorts
- San Francisco: never (Benaminute stopped after 250 shorts)
There was, however, a certain pattern to this. First, non-political shorts were recommended. After that, AI Jesus shorts started to be recommended (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that, non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started to be recommended. Finally, explicitly alt-right shorts started to be recommended.
What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.
In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and therefore rank higher in the algorithm. They say the algorithm isn’t necessarily left-wing or right-wing, but that alt-right wingers have better understood how to capture and grow an audience within it.
Do these companies put their fingers on the scale? Almost certainly.
But it’s exactly what he said that brought us here. They have not particularly given a shit about politics (aside from no taxes and let me do whatever I want all the time). However, the algorithms will consistently reward engagement. Engagement doesn’t care about “good” or “bad”; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.
This is a natural extension of human behavior. Behavior occurs because it serves a function: I do X because it gets reinforced. The function is attention, access to something, escape, or automatic reinforcement.
Attention-maintained behaviors are tricky because people are shitty at removing attention, and attention is a powerful reinforcer. You tell everyone involved “this person feeds off of your attention, ignore them.” Everyone agrees. The problematic person pulls their bullshit and then someone goes “stop it.” People call that negative reinforcement (it isn’t; it’s probably positive reinforcement, or arguably positive punishment, though it’s questionable how aversive it really is).
You get people to finally shut up and they still make eye contact, or nonverbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it, and then someone new enters the mix who doesn’t get what’s going on.
This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls.” You can teach every single person on Lemmy or Reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register who will fuck it all up.
The complexity behind the algorithms is similar. The algorithms aren’t people, but they work in a similar way. If bad behavior is given attention, the content is weighted and given more importance. The more we, as a society, can’t resist commenting on, clicking, and sharing Trump, Rogan, Peterson, transphobic, misogynist, racist, homophobic, etc. content, the more the algorithms will weight it as “meaningful”.
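To make that concrete, here’s a minimal sketch of what engagement weighting could look like. Everything here (the weights, the counts, the scoring function) is made up for illustration; it’s not YouTube’s actual system, just the shape of the incentive.

```python
# Hypothetical engagement-weighted ranking (illustrative numbers only).
# The score only sees interaction volume; whether the engagement is
# approval or outrage never enters the calculation.

def engagement_score(views, clicks, comments, shares, watch_seconds):
    return (
        0.1 * views
        + 1.0 * clicks
        + 3.0 * comments       # angry arguing counts the same as praise
        + 5.0 * shares         # hate-sharing boosts a video like endorsing it
        + 0.01 * watch_seconds
    )

videos = {
    "calm explainer": engagement_score(10_000, 400, 20, 5, 90_000),
    "outrage bait":   engagement_score(10_000, 900, 600, 300, 150_000),
}
print(sorted(videos, key=videos.get, reverse=True))
# ['outrage bait', 'calm explainer']: same audience size, higher rank
```

Under any scoring like this, “can’t resist commenting” is indistinguishable from “loves this content”, which is the whole problem.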
This of course doesn’t mean these companies are without fault. This is where content moderation comes into play. This is where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could potentially have led us to regulate these monsters into doing something to protect their users against the negative effects of their products.
If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state, with zero transparency as to its inner workings.
So… in the US then ?
It’s 100% not just the US where the algorithm favours this stuff.
I noticed my feed almost immediately changed after Trump was elected. I didn’t change my viewing habits. I’m positive YouTube tweaked the algorithm to lean more right.
I’ll get downvoted for this, with no explanation, because it’s happened here and on reddit.
I’m a liberal gun nut. Most of my limited YouTube time is watching gun-related news and such. You would think I’d be overrun with right-wing bullshit, but I am not. I have no idea why this is. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?
The only thing I’ve seen start to push me to the right was watching survival videos. Not some “dems gonna kill us all” bullshit, simply normal, factual stuff about how to survive without society. That got weird fast.
I’ve noticed most firearms channels steer well clear of politics, unless it’s directly related to the topic at hand, I think partly to appeal to an international audience.
I do think the algorithm puts firearms and politics into very separate categories; someone watching Forgotten Weapons probably isn’t going to be interested in political content.
Their algorithms are probably good enough to know you’re interested in guns but not right wing stuff. Simple as that.
Yeah, I don’t think I’ve ever seen alt-right nonsense without actively looking for it. Occasionally I’ll get recommended some Joe Rogan or Ben Shapiro nonsense, but that’s about it.
I consider myself libertarian and a lot of my watch time is on Mental Outlaw (cyber security and dark web stuff), Reason (love Remy and Andrew Heaton videos), and John Stossel, but other than that, I largely avoid political channels. I watch a fair amount of gun content as well.
If I get recommended political stuff, it’s usually pretty mainstream news entertainment, like CNN or Fox News. Even the crypto nonsense is pretty rare, even though I’m pretty crypto-positive (not interested in speculation though, only in its use as a currency and the technical details).
If you’re seeing alt-right crap, it’s probably because you’ve watched a lot of other alt-right crap.
I have had the opposite experience. I watch a few left-leaning commentary channels. Sam Seder, my boy Jesse Dollomore. If I watch a single video about guns (with no apparent ideological divide), within a single refresh I’m getting Shapiro and Jordan Peterson videos. I’m in a red Western state. My subscriptions are mostly mental health, tech, and woodworking. I have to delete history if I stray even a little bit.
Or the people around you do. I’m shocked occasionally after going home from work lol.
My watch history would peg me as NOT a Republican. YouTube’s shorts feed will serve me:
- excerpt from a YouTuber’s longer video
- TikTok repost from like, the truck astrology guy or the “rate yer hack, here we go” guy, etc.
- Artificial voice reading something scraped from Reddit with Sewer Jump or Minecraft playing in the background
- Chris Boden
- Clip from The West Wing
- Clip from Top Gear or Jeremy Clarkson’s Farm
- “And that’s why the Bible tells us that Jesus wants you to hate filthy fucking liberals.”
“Do not recommend channel.” “The downvote button doesn’t even seem to be a button anymore but I clicked it anyway.” “Report video for misinformation and/or supporting terrorism.” But the algorithm keeps churning it up.
Guy you replied to is trying to pretend his individual experience is representative of the whole.
I’m not sure there is a “representative of the whole” here; I think the Youtube algorithm is modal.
I think it’s an evolution of the old spam bots: if you had an email address that in any way indicated you were male, you’d get “v1agra” and “c1alis” ads nonstop. I’m sure a woman’s inbox would get makeup and breast-enlargement spam or some shit, whatever they can make you feel insecure enough to buy.
This is basically the central thesis of The Social Dilemma.
I keep getting recommendations for content like “this woke person got DESTROYED by logic” on YouTube. Even though I click “not interested”, and even “don’t recommend channel”, I keep getting the same channel, AND video recommendation(s). It’s pretty obvious bullshit.
You’d think a recommendation algorithm should take your preferences into account - that’s the whole justification for tracking your usage in the first place: recommending relevant content for you…
It is. But who said that **you** get to decide what’s relevant for you? Welcome, and learn to trust your algorithmic overlords.
Thanks, I hate it
YOU’D THINK THAT YES. [caps intended]
Anything but the subscriptions page is absolute garbage on that site. Ideally, get an app to track your subs without having to have an account: NewPipe, FreeTube, etc.
Are those available on PC/Linux? On my TV? 😭 I have them on my phone but I feel like there’s too much hassle to do on my main viewing devices.
Instagram is probably notably worse. I have a very established account that should be very anti that sort of thing, and it keeps serving up idiotic guru garbage.
TikTok is by far the best in this respect, at least before recent weeks.
For now.
A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control over what content I see, and when. One was mostly for combat sports, other sports, and fitness. The second one was just food.
The first one, right off the bat, showed me girls with OnlyFans accounts on the discovery page. Then after a few days, it began showing me right-wing content and alpha-male garbage.
The second one, the food account, showed alternative holistic solutions. Stuff like showing me 10 different accounts of people suggesting I consume raw milk. It started sending me a mix of people who only eat meat, and vegans.
It’s really wild what these companies show you to complete your profile.
I saw a TikTok video talking about how Instagram starts the redpill/incel stuff early for young people; then, once they become failures in life, it pushes the guru stuff for “guidance”.
The EU and even China have at least made an attempt at holding these companies accountable for their algorithms, but the US and Canadian governments just sat there and did nothing.
Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up by that type.
The existence of proportionately more “right” leaning content than “left” leaning content could adequately explain the outcomes.
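A toy calculation (made-up numbers, purely to illustrate the base-rate point) shows why the composition of the pool matters:

```python
# Hypothetical counts: if 70% of the available political content leans right
# and 70% of recommendations lean right, the skew alone proves nothing.

pool = {"right": 700, "left": 300}        # uploads by leaning (invented)
recommended = {"right": 70, "left": 30}   # recommendations served (invented)

for leaning in pool:
    pool_share = pool[leaning] / sum(pool.values())
    rec_share = recommended[leaning] / sum(recommended.values())
    print(f"{leaning}: {pool_share:.0%} of pool, {rec_share:.0%} of recommendations")
```

Disproportionate promotion would only show up if the recommendation share clearly exceeded the pool share, and nobody outside these companies has the pool numbers.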
Adding to this: YouTube’s audience seems male-dominated, and males are “on average” more right-leaning.
Approximately 54.3 percent of YouTube’s audience is male.
Yeah, I wonder why they’re right-leaning? It’s not as if something is pushing men to the right.
Human males are also more violent on average. They commit murder more often. I wonder why? https://www.journals.uchicago.edu/doi/10.1086/711705
Across all cultures, men are more physically aggressive than women.
You mean cultures (& women) that demand that men be more aggressive or they’ll be booted out of society?
Yeah totally not something pushing men to the right & the right wingers taking advantage of it
Yes, I mean all cultures created by humans; no exception exists, despite what you seem to imply.
[…] such that men comprise 95% of those convicted for homicide worldwide (United Nations Office on Drugs and Crime 2013). https://www.journals.uchicago.edu/doi/10.1086/711705
In my view, males are to an extent biologically programmed to be right-leaning and more violent, and always have been. This is why they so often die in wars and in male-on-male violence, and why on average they stand at the top of hierarchies of physical power. This risk-taking behavior might have positive side effects in the case of victory as well. And let’s not forget that YouTube was founded by an all-male team.
This is my thought: many young men are (rightfully) frustrated, but they don’t know what they’re frustrated about. It’s hard to get a job, especially without higher education. It’s hard to buy a home and build a family. Many young men are increasingly alone.
At the same time, there’s a lot of talk about “white male privilege”. “What privilege?”, they might think. They don’t feel particularly privileged about their situation.
And then they find people like Jordan Peterson, who seem to speak to their struggles. For the first time, they hear someone who seems to understand them. And that person points them to a (very wrong) diagnosis of the situation: it’s woke identity politics’ fault! But that’s good enough for them, and that’s where the alt-right pipeline starts.
Don’t let the algorithm feed you!
You get what you usually click?
I didn’t watch the video, but it’s YouTube Shorts; you just swipe like on TikTok. The few ways to curate the algorithm are to swipe away quickly, click the “not interested” button, downvote, or delete watched shorts from your history. If you don’t interact with any of those and watch the full length of a video, the algorithm is going to assume you like that kind of content. It will also introduce content you’ve never watched before to gauge your interest; a lot of the time it’s not even related to what you currently watch, and if you don’t do any curation, it will keep feeding you that exact type for a while. I don’t know how they manage the curation, but that’s the gist of it from my experience. My feed has zero politics, mostly cats. I control the feed strictly, so I get what I demand.
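As a rough sketch of the feedback loop described above (my guess at the logic, not YouTube’s real system), something like this would produce the same behavior: finishing a short nudges a topic’s score up, while swiping away early, hitting “not interested”, or downvoting pushes it down.

```python
# Hypothetical per-topic interest scores, updated by how each short is handled.

interest = {"cats": 0.5, "politics": 0.5}   # invented starting values

def update(topic, watched_fraction, not_interested=False, downvoted=False):
    delta = watched_fraction - 0.5          # full watch: +0.5, instant swipe: -0.5
    if not_interested or downvoted:
        delta -= 1.0                        # explicit curation weighs heavier
    interest[topic] = max(0.0, interest[topic] + 0.1 * delta)

update("cats", watched_fraction=1.0)                       # watched to the end
update("politics", watched_fraction=0.1, downvoted=True)   # swiped away and downvoted
print(interest)   # cats climb, politics sinks, and the feed drifts accordingly
```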
I bet these right-wing shorts are proposed and shoehorned in everywhere because someone pays for the visibility. Simple as that.
Filter bubbles are the strongest form of propaganda.
Does shadow-banning create filter bubbles? In a way it demonstrates the power these platforms hold over their users. https://en.wikipedia.org/wiki/Shadow_ban
Commenting on stuff definitely strengthens it, but I wouldn’t know if a shadow ban changes that. I don’t think it makes much difference whether you’re shadowbanned or not; you’re still interacting with the content.
In my view, Instagram blocking the search term “democrat” for short periods is kind of a shadowban… on all Democrats.
With Milo (Miniminuteman) in the thumbnail, I thought the video was going to insinuate that his content was part of the alt-right stuff. I was confused and terrified. Happily, that was not the case.
From my anecdotal experiences, it’s “manly” videos that seem to lead directly to right wing nonsense.
Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like “how DEI DESTROYED Dragon Age Veilguard!”
Or “how to make ANY woman OBEY you!”
Check out a short about knife sharpening or just some cringe shit and you’re all polluted.
If I see any alt-right content, I immediately block the account and report it. I don’t see any now. I go to YouTube for entertainment only. I don’t want that trash propaganda.
Same. I watched one Rogan video in, like, 2019, and it was like opening a floodgate. Almost immediately, almost every other recommendation was some right-wing personality’s opinion about “cancel culture” or “political correctness.” It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.