What an excellent precedent to set; I can't possibly see how this is going to become authoritarian. Oh, you didn't report someone? You're also guilty. Can't see any problems with this.
That's... not what this is about, though?
This isn't about mandated reporting, it's about funneling impressionable people towards extremist content.
And they profit from it. That’s mentioned there too, and it makes it that much more infuriating. They know exactly what they’re doing, and they do it on purpose, for money.
And at the end of the day, they’ll settle (who are the plaintiffs? Article doesn’t say) or pay some relatively inconsequential amount, and they’ll still have gained a net benefit from it. Another case of cost-of-doing-business.
Would’ve been free without the lawsuit even. Lives lost certainly aren’t factored in otherwise.
YouTube Shorts is the absolute worst for this. Just recently it's been massively trying to push transphobic BS at me, and I cannot figure out why. I dislike, report, and "do not recommend this channel" every time, and it just keeps shoving more at me. I got a fucking racist church sermon this morning. It's broken!
Don't dislike it, just hit "do not recommend", and don't open the comments either. Honestly the best way is just to skip past as fast as you can when you see one; the less time it's on your screen, the less the algo thinks you want it.
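For what it's worth, here's a minimal, purely hypothetical sketch of why that advice makes sense, assuming an engagement-driven ranker where dwell time and any interaction (including dislikes and comments) count as positive signals, and only the explicit "not interested" feedback counts strongly against. The names and weights below are made up for illustration; none of this is YouTube's actual code.

```python
# Hypothetical sketch of engagement-driven ranking; field names and weights
# are illustrative, not any platform's real system.
from dataclasses import dataclass

@dataclass
class WatchEvent:
    topic: str
    watch_seconds: float   # dwell time on the short
    interacted: bool       # like, dislike, comment, share -- all count as engagement
    not_interested: bool   # explicit "don't recommend this channel" feedback

def topic_affinity(events: list[WatchEvent]) -> dict[str, float]:
    """Estimate per-topic interest from watch history.

    Any interaction (even a dislike) and long dwell time push a topic up;
    only the explicit 'not interested' signal and fast skips keep it down.
    """
    scores: dict[str, float] = {}
    for e in events:
        score = scores.get(e.topic, 0.0)
        score += min(e.watch_seconds, 60) / 60   # dwell time, capped at one minute
        if e.interacted:
            score += 0.5                          # engagement of any kind is positive
        if e.not_interested:
            score -= 2.0                          # the strongest negative signal
        scores[e.topic] = score
    return scores

if __name__ == "__main__":
    history = [
        WatchEvent("ragebait", 45, interacted=True, not_interested=False),
        WatchEvent("ragebait", 2, interacted=False, not_interested=True),
        WatchEvent("cooking", 30, interacted=False, not_interested=False),
    ]
    print(topic_affinity(history))
```

Under these assumptions, disliking or commenting still nudges the topic's score up, which is why fast skips plus "do not recommend" are the only inputs that reliably pull it down.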
I never really see that on YouTube unless I've been on related topics recently, and it goes away pretty quickly when you don't interact. Yes, it's shifty, but they're working on a much better system using natural language with an LLM. It's a complex problem, though.
I am not discounting anyone's experience. I am not saying this isn't happening. But I don't see it.
LiberalGunNut™ here! You would think watching gun related videos would lead me down a far-right rabbit hole. Here's my feed ATM.
Meh. History, gun comparisons, chemistry, movies, whatever. Nothing crazy. (I don't watch Brandon any longer; he got to leaning too far right, too political. The video's about his bid for a Congressional seat in Texas. Not an election conspiracy thing. Don't care.)
If anyone can help me understand, I'm listening. Maybe I shy away from the nutcase shit so hard that YouTube "gets" me? Honestly don't get it.
So that looks like mainly long-form content. I'm specifically talking about YouTube Shorts, which is Google's version of TikTok.
Imagine watching Shorts, let alone even having the option for them. Get NewPipe; there's a SponsorBlock version on F-Droid. No Shorts, no Google tracking, no nonsense. You don't get comments, though, but whatever. It also supports PeerTube, which is nice.
Report for what? Sure, disagree with them about their bullshit, but I don't see why you need to report someone just because you disagree with their opinions.
I like shorts for the most part
Misinformation and hate speech, mostly. They have some crazy, false pseudoscience to back their "opinions", and they express them violently. Like it or not, these videos "promote hatred against a protected group" and are expressly against YouTube's TOS. Reporting them is 100% appropriate.
I can strongly recommend you stop watching short-form content; it has been shown to cause all sorts of mental issues.
Fair. Also, what is a "protected group", and what makes it any different from any other grouping?
You can make any common practice and pillar of capitalism sound bad by using the words "impressionable" and "extremist".
If we remove those words it becomes: funnelling a market towards further consumption of your product, i.e. marketing.
And yes, of course the platforms are designed to be addictive and are effective at indoctrination, but why is that only a problem for certain ideologies? Shouldn't we be stopping all ideologies from practicing indoctrination of impressionable people? Shouldn't we be guiding people to as many viewpoints as possible, to teach them to think, not to swallow someone else's ideas and spew them back out?
I blame Henry Ford for this whole clusterfuck; he lobbied the education system to manufacture an obedient consumer market and a working class that doesn't think for itself but simply swallows what it's told. The education system is the problem; anything else is treating the symptoms, not the disease.
And if a company's marketing campaign is found to be indirectly responsible for a kid shooting up a grocery store, I'm sure we'll see a repeat of this, with that company being the one the court case is brought against. What even is this argument?
Isn't the entire gun market indirectly responsible? What about the food the shooters ate? Can't we use the same logic to prosecute anyone of any religion, since most religious texts support the killing of some group of people?
It's convenient to ask what the argument is when you ignore 60% of it.
Did you even read the article we're discussing, or are you just reading the comments and getting mad?
That means that the government is injecting itself into deciding what "extremist" is. I do not trust them to do that wisely. And even if I did trust them, it is immoral for the state to start categorizing and policing ideologies.
Do you understand you're arguing for violent groups instigating a race war?
Like, even if you're ok with white people doing it, you're also saying ISIS, MS13, any fucking group can't be labeled violent extremists...
Some "ideologies" need to be fucking policed
anarchists have had to deal with this for over a century. the state can go fuck itself.
You're missing the point: violence should absolutely be policed. Words, ideas, ideology? Hell no. Let ISIS, MS13, the communists, the Nazis, the vegans, etc. say what they want. They are all extremists by some definition; let them discuss, let them argue, and the second someone does something violent, lock them up for the rest of their lives. Simple.
What you are suggesting is the policing of ideology to prevent future crime. There is an entire book about where that leads; said book simply calls this concept thoughtcrime.
Someone wants to start with yours, and they have more support than you know. Be careful what you wish for.
Guess we shouldn't ever do anything about anything, ever.
Big difference between policing actions and policing thoughts. Declaring some thoughts as verboten and subject to punishment or liability is bad.
It's insane you're being downvoted by people who would be the first ones silenced.
You really think they're going to use this on homophobes and racists instead of anyone calling for positive social change?
Have you not seen any of history?
That is generally what governments do. They write laws that say: you can do this, but not that. If you do this, that's illegal and you will be convicted. Otherwise you wouldn't be able to police things like the Mafia and drug cartels. Even in the US, using freedom of speech to conspire to commit crimes is criminalised. There is no difference between that and politically motivated "extremists" who conspire to commit crimes. The ideology is not criminalised; the acts that groups plan or conduct are. You are totally fine saying "I don't like group X."
What's not OK to say is "Let's go out and kill people from group X."
The problem is that social media sites use automated processes to decide which messages to put in front of users, in fundamentally the same way that a newspaper publisher decides which letters to the editor to print.
Somehow, though, tech companies have argued that because there is no limit on how many posts they can carry, and hence they theoretically aren't deciding what goes in and what stays out, their act of putting some posts at the top of people's feeds so they are seen is somehow different from the publisher's act of including a particular letter or not. But the outcome is the same: the letter or post is either seen by people or not.
Tech companies argue they are just a communication network, but I never saw a telephone, postal or other network that decided which order you got your phone calls, letters or SMS messages. They just deliver what is sent, in the order it was sent.
Commercial social media networks are publishers with editorial control; editorial control is not only inclusion/exclusion but also prominence.
There is a fundamental difference with Lemmy or Mastodon, in that they (aside from any moderation by individual server admins) don't promote or demote any post, and therefore play no role in whether a user sees a post or not.
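To make that distinction concrete, here's a toy sketch, with hypothetical names and not any platform's real code, of the two delivery models being compared: a chronological feed that just delivers what was sent in the order it was sent, versus a ranked feed where a predicted-engagement score decides prominence.

```python
# Illustrative only: chronological delivery (roughly how a phone network or a
# Mastodon/Lemmy timeline orders items) vs. engagement-ranked delivery
# (roughly what commercial platforms do). Names are made up for the example.
from dataclasses import dataclass

@dataclass
class Post:
    id: int
    timestamp: float             # seconds since epoch
    predicted_engagement: float  # the platform's guess at clicks/watch time

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Deliver everything, newest first -- no editorial weighting."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def ranked_feed(posts: list[Post]) -> list[Post]:
    """Order by predicted engagement -- an editorial choice about prominence."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

if __name__ == "__main__":
    posts = [
        Post(1, timestamp=100, predicted_engagement=0.2),
        Post(2, timestamp=200, predicted_engagement=0.9),
        Post(3, timestamp=300, predicted_engagement=0.1),
    ]
    print([p.id for p in chronological_feed(posts)])  # [3, 2, 1]
    print([p.id for p in ranked_feed(posts)])         # [2, 1, 3]
```

Same inputs, different ordering: the second function is where the "prominence" kind of editorial control lives, which is the whole point of the comparison above.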
The government is already the one who makes that decision. The only thing new here is a line being drawn with regards to social media's push towards addiction and echo-chamberism.
Umm... isn't the government, or rather the judiciary, already deciding what "extremist" is?
How specifically would this be different?
I can understand the problems this causes for the platforms, but the government injecting its decisions is what you focus on?
Not to forget the many other places they inject themselves... one could say into your daily life, because... careful now... you live in a country with a government. Whaaat?