this post was submitted on 29 Aug 2023
790 points (96.9% liked)

Technology

[–] [email protected] -2 points 1 year ago* (last edited 1 year ago) (9 children)

I'll agree that ISPs should not be in the business of policing speech, buuuut

I really think it's about time platforms and publishers were held responsible for the content on their platforms, particularly when, in their quest to monetize that content, they promote antisocial outcomes like the spread of conspiracy theories, hate, and straight-up crime.

For example, Meta is not modding down outright advertising and sales of stolen credit cards at the moment. Meta is also selling information with which to target voters... to foreign entities.

[–] [email protected] 2 points 1 year ago (5 children)

The issue with this is that holding tech companies liable for every possible infraction will mean platforms like Lemmy and Mastodon can't exist, because they could be sued out of existence.

[–] [email protected] 2 points 1 year ago* (last edited 11 months ago) (4 children)

> The issue with this is that holding tech companies liable for every possible infraction

That concern was the basis for Section 230 of the 1996 Communications Decency Act, which is in effect in the USA but is not the law in places like, say, the EU. It made sense at the time, but today it is desperately out of date.

Today we understand what absolving platforms like Meta of their duty of care to take reasonable steps not to harm their customers actually does: their profit motive guides them to look the other way when their platform is used to disseminate disinformation about vaccines that gets people killed, the money has them protecting Nazis, and algorithms intended to promote engagement become a tool not just for advertisers but for propagandists and information-warfare operators.

I'm not particularly persuaded that reforming Section 230 in the US would doom nonprofit social media like most of the fediverse. If you look around at all, most of it already follows a well-considered duty-of-care standard that gives its operators substantial legal protection from liability for what third parties post to their platforms. And if you consider even briefly, that is the standard in effect in much of Europe, and social media still exists there; it's just less profitable and has fewer Nazis.

[–] [email protected] 2 points 1 year ago

I think we generally need to regulate how algorithms work if this is the case. We need actual legislation, not just lawsuit buttons. Also, Meta can slither its way out of any lawsuit; this would really only affect small Mastodon instances.
