this post was submitted on 05 Nov 2023
Part of the problem is who decides what is misinformation. As soon as the state gets to decide what is and isn't true, and thus what can and cannot be said, you no longer have free speech.
Don't worry, the person you responded to is conservative so they're doing their damnedest to finish off education
The state deciding on speech is a red line, yes, but that's not even on the table here. This is about social media moderation. It actually seems suspiciously disingenuous to bring that up here.
OP: Thread about social media moderation
You: The state deciding what’s true is the death of free speech!
Actually, your comment is one of the big problems in this debate. People can't tell the difference between a private social media firm moderating hate content and the government taking away their freedom of speech. You just conflated the two yourself by bringing this up here.
Centralized for-profit companies policing speech doesn’t really solve free speech concerns. It doesn’t violate the US first amendment, but corporate-approved speech isn’t really free speech either. No person or organization is really suitable to be the arbiter of truth, but at the same time unmoderated misinformation presents its own problems.
Yes it solves it. Companies are not required to carry your voice around the world, which is what their platforms do. Stop equating guaranteed amplification with your freedom of speech. It’s wrong and dumb. I’ve lived in countries that actually restrict speech and whatever the Facebook mod did to you is NOTHING. The only reason Americans even fall into this stupid way of thinking is because their speech is so free. When your speech has never truly been restricted you have no idea what that freedom even means.
You don't have free speech.
Courtrooms are arbiters of truth literally all the time. There are plenty of laws for which truth is a defence, and dishonesty is punished.
When battling misinformation, the problem is not that lying on the internet is legal - it is still actionable. Fraud is still illegal. False or misleading advertisements are still illegal. Defamation is still illegal. Perjury is illegal in the criminal law sense, not just torts. Ask Martha Stewart who the "arbiter of truth" is.
The problem is that it's functionally impossible to enforce on the scale of social media. If 50,000 people call you a pedophile because it became a meme even though it was completely untrue, and this costs you your job and you start getting death threats, what are you going to do about that? Sue them all?
So we throw up our hands and let corporations handle it through abuse policies, because the actual law is unworkable - it's "this is illegal but enforcing it is so impractical that it's legal". Twitter and Facebook don't have to deal with that crap so we let them do a vague implementation of the law but without the whole "due process" thing and all the justice they can mete out is bans.
If you disagree, then I've got a Nigerian prince who'd like to get your banking info, and also you're all cannibals.
You do not have free speech on social media today; private platforms decide what they want to host.
The state does not have to be the one to decide these things, nor is it a case of "deciding" what is true; we have a long history of using evidence to establish something as fact, or propaganda, or somewhere in between. This is functionally what the study of history is about.
That brings up another thing. At what point does it become a "public space"?
There's an old Supreme Court case about a company town that claimed someone was trespassing on a sidewalk. The Supreme Court ruled the sidewalk was a public space, so the person was free to pass out leaflets there.
https://firstamendment.mtsu.edu/article/marsh-v-alabama-1946/
Imo, a lot of big sites have gotten to that stage, and should be treated as such.
I think this is an underrated point. A lot of people are quick to say "private companies aren't covered by free speech", but I'm sure everyone agrees legal ≠ moral. We rely on these platforms so much that they've effectively become our public squares. Our government even uses them in official capacities, e.g. the president announcing things on Twitter.
When being censored on a private platform is effectively social and informational murder, I think it's time for us to revisit our centuries-old definitions. Whether you agree or disagree that these instances should be covered by free speech laws, this is becoming an important discussion that I never see brought up, but instead I keep seeing the same bad faith argument that companies are allowed to do this because they're allowed to do it.
This is an argument for a publicly-funded “digital public square”, not an argument for stripping private companies of their rights.
Why not both?
While I agree that punishing companies for success isn't a good idea, we aren't talking about small startups or local businesses run by individual entrepreneurs or members of the community here. We're talking about absurdly huge corporations with reach and influence the likes of which few businesses ever achieve. I don't think it's unreasonable to apply a different set of rules to them, as they are distinctly different situations.
Because one is violating the first amendment rights of a private company, the other isn’t. Punishing a private company for how an individual uses their platform isn’t constitutional. It would be like holding car manufacturers liable for drunk drivers.
It's different because the company built and maintains the space. Same goes for a concert hall, a pub, etc...
Nobody believes that someone being thrown out of a pub for spouting Nazistic hate speech is their "free speech being trampled". Why should it be any different if it's a website?
You rarely see the discussion, because there's rarely a good argument here. It boils down to "it's a big website, so I should be allowed to post whatever I want there", which makes little to no sense and opens up a massive quagmire of legal issues.
There is a key difference here. Social media companies have some liability with what gets shared on the platform. They also have a financial interest in what gets said and how it gets promoted by algorithms. The fact is, these are not public spaces. These are not streets. They're more akin to newspapers, or really the people printing and publishing leaflets. The Internet itself is the street in your analogy.
Your analogy about newspapers isn't accurate either. The writers of a newspaper are paid by the company, and everyone knows that writers execute the newspaper's agenda. Nothing gets published without review, and everything aligns with the company's vision. Information flows one way, and readers buy it to consume information. They don't expect their voice to be heard, and the newspaper doesn't pretend that readers have that ability either. This isn't comparable to a social media site at all.
Companies probably shouldn't be liable for what individuals share or post; the individuals should be. Social media platforms already control the push and promotion of posts, using algorithms to decide what gets shown or shared and when.
I hate this so much. I want real, linear feeds from all the friends I'm following, not a personally curated, sanitized feed tuned to my supposed interests and sensibilities.
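The linear-feed versus curated-feed distinction above is just a choice of sort key. Here's a minimal illustrative sketch (the `Post` fields and the engagement signal are hypothetical, not any platform's real ranking model):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int     # seconds since epoch
    engagement: float  # hypothetical likes/shares signal the platform tracks

def chronological_feed(posts):
    """Linear feed: newest first, no curation."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def curated_feed(posts):
    """Engagement-ranked feed: the platform's signal decides what surfaces."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

posts = [
    Post("alice", timestamp=100, engagement=2.0),
    Post("bob", timestamp=200, engagement=9.5),
    Post("carol", timestamp=300, engagement=1.0),
]

print([p.author for p in chronological_feed(posts)])  # ['carol', 'bob', 'alice']
print([p.author for p in curated_feed(posts)])        # ['bob', 'alice', 'carol']
```

Note how the same posts come out in a different order: the curated feed buries the newest post because it scores low on engagement, which is exactly the editorial control being debated here.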
The Supreme Court is taking up such issues this month. See the second block of text here:
https://www.scotusblog.com/2023/10/major-second-and-first-amendment-cases-headline-november-sitting/
Nobody (besides maybe extreme conservatives) is advocating for "the state" to decide what "is and isn't true". That's not what this is about.
Furthermore, "misinformation" and "disinformation" can refer to things that are true! Propagandists don't always need to invent false facts to use facts in deceptive ways. To suggest that the government should stay out of the matter unless it has a perfectly foolproof fact-o-meter is, IMO, shortsighted. "The state" makes policy decisions with imperfect facts all the time.
Except there have always been limits on speech, centered mainly on truth. Your freedom of speech doesn't extend to yelling "Fire" in a crowded theater when there is no fire, for instance.
But we live in an age of alternative facts now, where science isn't trusted if it comes up with conclusions that conflict with your world view. Do you get a pass if you are yelling "Fire" because you are certain there are cell phone jammers in the theater that are setting your brain on fire because you got the COVID shot and now the 5G nanoparticles can't transmit back to Fauci's mind control lair?
Yes. Anyone in good faith attempting to warn others of any potential harm that they believe to be true to the best of their abilities should have their speech protected.
But what if their beliefs are verifiably false? I don't mean that in a sense of a religious belief, which cannot be proven and must be taken on faith. I mean that the facts are clear that there are no 5G nanoparticles in the vaccine for cell phone jammers to interfere with in the first place. That isn't even a thing.
It's one thing to allow for tolerance of different opinions in public. It's another thing entirely to misrepresent things that can be objectively disproven as true, just because you've tied them to a political movement. Can that really still be considered good faith?
I wrote a comment about this earlier today. People who have been brainwashed to believe total nonsense often act in ways that are rational to them, but irrational to people who see the world through different eyes.
That's fine until it's violent action.
The alcoholic who thinks he's "fine to drive" believes he's perfectly rational. He's driven drunk plenty of times with no accidents. That's wonderful until he kills a family some night.
Uh, you know that happens regularly in courtrooms right? Like, almost every court battle hinges on what's true and what's not. And courts are an arm of the state.
In some cases it's directly about the truth of speech. Fraud, defamation, perjury, filing a false report, etc. are all cases where a court will be deciding whether a statement made publicly is true and punishing a party if it was not. Ask a CEO involved in a merger how much "free speech" they have.
Oh weird, you coincidentally are a conservative mod lol
Gee so surprising you're mad about cEnSoRsHiP
Well, here’s how that was framed for participants of this study:
And even with this, Republicans didn’t care if it was true or not.
We’re actually past the point of anyone being able to be considered truthful by Republicans. It either tickles their feelings right or it doesn’t and that is all.
Section 230 gets the state involved from the get-go. Remove the state-granted liability protections and everything else will shake out. Make little tweaks from there as necessary. The broad protection of 230 is causing this issue.
Isn't a grand jury enough to deal with this kind of thing? It's a mechanism that works before damage is done, and I don't see why it couldn't be useful here too.