this post was submitted on 23 Jan 2025
1131 points (97.2% liked)

Technology


TLDR if you don't wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this 5 times, changing their location to a random US city each time.

Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts started appearing (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed the same in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, one appeared in which an AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and therefore rank higher in the algorithm. They say the algorithm isn't necessarily left wing or right wing, but that alt-right creators have better understood how to capture and grow their audience.
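The hypothesis can be illustrated with a toy model (purely hypothetical; YouTube's real ranker is proprietary, and all field names and weights below are made up): if the ranking score is dominated by predicted engagement signals, content that provokes stronger reactions floats up regardless of its politics.

```python
# Toy sketch of an engagement-driven ranker. Illustrative only:
# the fields and weights are invented, not YouTube's actual system.

def engagement_score(video):
    """Rank purely by predicted engagement, with no notion of politics."""
    return (
        2.0 * video["avg_watch_fraction"]  # how much of the short gets watched
        + 1.0 * video["comment_rate"]      # comments per view (outrage drives replies)
        + 0.5 * video["share_rate"]        # shares per view
    )

videos = [
    {"title": "calm explainer", "avg_watch_fraction": 0.55,
     "comment_rate": 0.01, "share_rate": 0.02},
    {"title": "outrage bait", "avg_watch_fraction": 0.80,
     "comment_rate": 0.09, "share_rate": 0.05},
]

ranked = sorted(videos, key=engagement_score, reverse=True)
print(ranked[0]["title"])  # the outrage-bait short ranks first
```

The point of the sketch is that nothing in the scoring function mentions ideology; whichever content reliably produces longer watch times and more reactions wins the ranking.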

top 50 comments
[–] [email protected] 3 points 1 day ago (1 children)

Just anecdotal but I only ever watch duck videos or funny animal videos with occasional other funnies or crazy science things, and that's still all I ever get. Other days I get plenty of cool music like tesla coils making music or other piano music.

Am I youtubing wrong?

[–] [email protected] 1 points 1 day ago

Nah good for you. Maybe it's because of your geographical location/you just being lucky? I have experienced what the video above says quite a lot though.

I'm not American, so I didn't exactly see a lot of Trump (although there was some amount of it). I largely saw a lot of Hindu nationalist content (cuz of my geographical location). The more I disliked the videos, the more they got recommended to me. It was absolutely pathetic.

[–] [email protected] 45 points 6 days ago (2 children)

So you're saying we need to start pumping out low quality left wing brainrot?

[–] [email protected] 20 points 5 days ago (1 children)

Insanely, that seems to be the play. Not logic or reason, but brainrot and low blows. Which is a bit at odds with the actual desire.

[–] [email protected] 8 points 5 days ago

fight fire with fire i guess….
maybe people get on board quicker if they feel the emotions first, and then learn the logic….
one good example is Noam Chomsky: everything he says is gold, but he says it so slowly and dispassionately that even people who agree with him find it hard to watch.

[–] [email protected] 7 points 5 days ago* (last edited 5 days ago)

It just has to be extremely simplified and evoke emotional reactions. Those are basic propaganda rules. The brainrot quality of the content is a consequence of its sheer quantity: you can't produce that volume of content without resorting to fully automated AI slop.

What the experiment overlooks is that there are PR companies being paid to flood YouTube with right-wing content, actively trying to game its algorithm. There simply isn't a left wing with the capital to manufacture that much content. No soros-bucks for AI minions in keffiyehs talking about Medicare.

[–] [email protected] 73 points 6 days ago (2 children)

I keep getting recommendations for content like "this woke person got DESTROYED by logic" on YouTube. Even though I click "not interested", and even "don't recommend channel", I keep getting the same channel, AND video recommendation(s). It's pretty obvious bullshit.

[–] [email protected] 22 points 6 days ago (6 children)

Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.

[–] [email protected] 19 points 6 days ago (5 children)

You'd think a recommendation algorithm should take your preferences into account - that's the whole justification for tracking your usage in the first place: recommending relevant content for you...

[–] [email protected] 11 points 6 days ago* (last edited 5 days ago) (2 children)

it is. But who said that you get to decide what's relevant for you? Welcome and learn to trust your algorithmic overlords

[–] [email protected] 10 points 6 days ago

YOU'D THINK THAT YES. [caps intended]

[–] [email protected] 3 points 5 days ago (1 children)

Wrong, the whole purpose of tracking your usage is to identify what kind of consumer you are so they can sell your views to advertisers. Recommendations are based on what category of consumer you've been identified as. Maintaining your viewership is secondary to the process of selling your views.

[–] [email protected] 2 points 4 days ago

I said justification, not purpose. They claim they want to track usage to tailor your experience to you.

They don't actually believe that, of course, but respecting your explicit expression of interest ought to be the minimum perfunctory concession to that pretense. By this we can see just how thin a pretense it is.

[–] [email protected] 49 points 6 days ago* (last edited 6 days ago) (3 children)

I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.

This is something governments should be looking at controlling. Propaganda created for the sole purpose of making money is still propaganda. I think at this point that any site feeding each user an algorithmically personalized feed is compromised.

[–] [email protected] 13 points 6 days ago (1 children)

The problem is education. It's a fool's game to try to control human nature; with everything commodified, you will always have commercials and propaganda.

What is in our means is to strengthen education on how to think critically and understanding your environment. This is where we have failed and I'll argue there are people actively destroying this for their own gain.

Educated people are dangerous people.

It's not 1984. It's Brave New World. Aldous Huxley was right.

[–] [email protected] 11 points 6 days ago (6 children)

I think we need to do better than just say "get an education."

There are educated people that still vote for Trump. Making it sound like liberalism is some result of going to college is part of why so many colleges are under attack.

From their perspective I get it, many of the Trump voters didn't go, they hear that and they just assume brainwashing.

We need to find a way to teach people to sort out information, to put their immediate emotions on pause and go looking for facts, and so on; not just the kind of "education" where you regurgitate talking points from teachers, the TV, or the radio as if they're matter of fact. The whole education system is pretty tuned around regurgitation, even at the college level. A lot of the culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we'd be ill advised to assume the right can't destroy that.

[–] [email protected] 7 points 6 days ago

This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

[–] [email protected] 60 points 6 days ago (3 children)

I hate the double standards

On a true crime video: "This PDF-File game ended himself after he was caught SAing this individual.... Sorry Youtube forces me to talk like that or I might get demonetized" Flagged for discussing Suicide

On PragerU: "The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!" Completely fucking acceptable!

[–] [email protected] 42 points 6 days ago (8 children)

Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more (due to potential danger) than positive videos spreading unity and love. It's all about getting as many eyeballs on the video as possible to make money, and this is the most effective way.

[–] [email protected] 4 points 5 days ago

There's also an entire industry around mass producing this content and deliberately gaming the algorithm.

[–] [email protected] 29 points 6 days ago

I don't think it makes me feel better to know that our descent into fascism is because gru promised 1MM rizz for it

[–] [email protected] 36 points 6 days ago (1 children)

From my anecdotal experiences, it's "manly" videos that seem to lead directly to right wing nonsense.

Watch something about how a trebuchet is the superior siege machine, and the next video recommended is like "how DEI DESTROYED Dragon Age Veilguard!"

[–] [email protected] 22 points 6 days ago

Or "how to make ANY woman OBEY you!"

Check out a short about knife sharpening or just some cringe shit and you're all polluted.

[–] [email protected] 7 points 5 days ago

Yeah it sure does. There is no way that garbage should be showing up for me, but yet...

[–] [email protected] 7 points 5 days ago (2 children)

There's a Firefox extension to hide Shorts and another to default to your subscriptions feed. Along with uBlock, those are the only things that make YouTube usable.

[–] [email protected] 2 points 4 days ago (1 children)

That doesn't fix the out-of-the-box experience of the platform for millions, if not billions, of people. Yes, it's a good step to take individually, but it's insufficient to deal with the broader issue raised here of latent alt-right propagandizing.

[–] [email protected] 24 points 6 days ago (4 children)

Filter bubbles are the strongest form of propaganda.

[–] [email protected] 19 points 6 days ago (1 children)

I don't know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content over the last few years and is now infested with alt-right and even blatantly racist "memes" and comment sections. Feels to me like astroturfing to push viewers and posters in a certain political direction. As an example: during the US election, the war on Palestine suddenly became a recurring theme, depicting the Biden admin and Jews as "bad actors" and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn't intervene in Palestine...

[–] [email protected] 21 points 6 days ago* (last edited 6 days ago) (3 children)

The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.

It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.

[–] [email protected] 10 points 6 days ago* (last edited 6 days ago) (5 children)

I found youtube shorts very annoying, because I have an attention span and can focus on something for more than 30 seconds. But if you right-click the three dots on a few Shorts sections and click Not Interested, youtube gets the hint and stops offering them to you. Win-win!

[–] [email protected] 9 points 6 days ago (4 children)

It doesn't seem to work for me. I also keep reporting and blocking ads, and they keep serving them up to me, so I have decided to disable YouTube on my phone.

[–] [email protected] 12 points 6 days ago (2 children)

Same happened to me (live in WA) but not only do I get pro-tyranny ads and Broprah (Rogan) shorts, I also get antivax propaganda.

I always use the “show less of this” option or outright remove it from my feed. Seems better now.

[–] [email protected] 14 points 6 days ago

I bet those right-wing shorts are proposed and shoehorned in everywhere because someone pays for the visibility. Simple as that.

[–] [email protected] 11 points 6 days ago* (last edited 6 days ago)

The people where I live are -- I guess -- complete morons because whenever I try to check out Youtube without being logged in, I get the dumbest of dumb content.

But as another weird data point, I once suggested my son check out a Contrapoints video which I found interesting and about 1 year later she told me she wanted to get a surgery -- I don't exactly remember which kind as I obviously turned immediately into a catatonic far right zombie.

[–] [email protected] 15 points 6 days ago (10 children)

Saying it disproportionately promotes any type of content is hard to prove without first establishing how much of the whole is made up by that type.

The existence of proportionately more "right" leaning content than "left" leaning content could adequately explain the outcomes.

[–] [email protected] 1 points 4 days ago* (last edited 4 days ago)

I have encountered this too. Around 2 months ago I visited YouTube to watch some video, rejected all cookies and there were "woke" videos everywhere in recommendations. I used it without an account.

[–] [email protected] 7 points 6 days ago (1 children)

Instagram does the same thing with "dark jokes" and really weird UFO and conspiracy videos. It really sucks.

[–] [email protected] 8 points 6 days ago

Yup they have been showing up for me today. Fuck YouTube

[–] [email protected] 3 points 5 days ago (1 children)

Does this mean youtube preferentially selects alt-right shorts, or alt-right people make more shorts? Or some other thing entirely? Jump to your own conclusion.

[–] [email protected] 3 points 5 days ago

YouTube selects what gives YouTube the most views for the longest time. If that's right wing shorts, they don't care.

[–] [email protected] 9 points 6 days ago (1 children)

Yeah, I've gotten more right wing video recommendations on YouTube, even though I have turned off my history. And even if I turned on my history, I typically watch left wing videos.
