this post was submitted on 22 Dec 2023
849 points (96.4% liked)

Technology


More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

50 comments
[–] [email protected] -3 points 11 months ago (2 children)

I can ALMOST see his point... If you push them underground, you push them to find a space where nobody will challenge them, and they can grow stronger in that echo chamber.

Allowing them to be exposed to the light of day and fresh air makes their evil apparent to all and searchable.

And besides, "Punch a Nazi Day" just isn't the same without Nazis. :)

[–] [email protected] -3 points 11 months ago (6 children)

Kicking them off the platform just sends them to other echo chambers like False social where they just circle jerk each other all day unchallenged.

[–] [email protected] -4 points 11 months ago* (last edited 11 months ago) (3 children)

OK? But I'm going to think Substack is a hardened Nazi supporter when I all of a sudden don't see Antifa openly talking about their plans for disposing of their Nazi opposition on the platform, which would be appropriate discussion in said situation. I'm also guessing that their coffers are now open to any and all well-known terrorist organizations. Maybe we shouldn't have given corporations any power at all; they have proven time and time again to have absolutely no morals.

And I'm going right back to sleep, so if anyone wants to argue about free speech I'll give my opinion on that now. I draw the line at helping sick individuals try to organize the genocide of most of the people on this planet. I'm fine with a mentally ill person (Nazi) yelling their propaganda from their soapbox in the town square, but letting and even helping the Nazis openly spread their well-documented genocidal hate is too far for me.

Edit: I'm a little confused about the fast downvotes?

Maybe mentioning that Antifa (you know, the opposite of Nazis) should be equally represented if your platform supports Nazis is considered a bad thing here on Lemmy, but that doesn't make much sense.

Maybe it's just the corporates paving the way for Facebook's infiltration and organized downfall of all their competition?

Maybe it's just the Nazis.

Then again, it's probably just me having ASD and speaking directly without a filter.

Don't worry though, the Nazis have plans for people like me.

[–] [email protected] -4 points 11 months ago

Nazis gotta work too

[–] [email protected] -4 points 11 months ago (17 children)

Honestly? Unless I'm missing something, this sounds fine.

The internet I grew up on had Nazis, racists, Art Bell, UFO people, software pirates, and pornographers. The ACLU defended KKK rallies. Some of the people who were allowed a platform, that "everyone hated" and a lot of people wanted to censor, were people like Noam Chomsky who I liked hearing from.

I think there's a difference between "moderation" meaning "we're going to prevent Nazis from ruining our platform for people who don't want to hear from them" -- which, to me, sounds fine and in fact necessary in the current political climate -- and "moderation" meaning "if you hold the wrong sort of views you're not allowed to express them on my platform." The Nazi bar analogy, and defederating with toxic Lemmy instances, refers to the first situation. If I understand Substack's platform properly, it's the second: Only the people who want to follow the Nazis can see the Nazis. No? Am I wrong in that?

I'm fully in agreement with McKenzie that not allowing "wrong" views to be expressed and legitimately debated makes it harder to combat them, not easier. They're not gonna just evaporate because "everyone agrees they're bad" except the people who don't.

I realize this is probably a pretty unpopular view.

[–] [email protected] -5 points 11 months ago (22 children)

I actually prefer this type of hands-off approach. I find it offensive that people would refuse to let me see things because they deem it too "bad" for me to deal with. I find it insulting that anyone would stop me reading how to make meth or reading Mein Kampf. I'm 40yo, it's pretty fucking difficult to offend me, and the idea that I'm going to be driven to commit crime just by reading something is offensive.

I don't need protecting from speech/information. I'm perfectly capable and confident in my own views to deal with bullshit of all types.

If you're incapable of dealing with it - then don't fucking read it.

Fact is the more you clamp down on stuff like this the more you drive people into the shadows. 4chan and the darkweb become havens of 'victimhood' where they can spout their bullshit and create terrorists. When you prohibit information/speech you give it power.

In high school it was common for everyone to hunt for the Anarchist Cookbook / Jolly Roger's Cookbook. I imagine there are kids now who see it as a challenge to get hold of it and terrorist manuals — not because they want to blow shit up, but because it's taboo!

Same with drugs - don't pick and eat that mushroom. Don't burn that plant. Anyone with 0.1% of curiosity will ask "why?" and do it because they want to know why it's prohibited.

Porn is another example. The more you lock it down the more people will thirst for it.

Open it all up to the bright light of day. Show it up for all its naked stupidity.
