[–] [email protected] 1 points 10 months ago (1 children)

I didn't realize at first, my bad. I know that makes a lot of my post redundant, but I think my point still stands.

There's so much hypocrisy in the fact that a massive corporation can actually steal like this and it's more socially acceptable than torrenting.

[–] [email protected] 5 points 10 months ago (1 children)

And that's the issue I have in particular. It's a double standard, and not only that, they're using it to generate money for their own tools.

It's not the same as some kid pirating Photoshop to play around with, or a couple who are curious about GOT and want to watch it without paying HBO.

This is a separate issue, and I hate that this place is so Reddit-like that trying to talk about it gets "hurr durr, I guess you're mad because AI and Meta are just the current hate-train circlejerk, hurr, I form my own opinions, hurr."

Like, no, I'm upset because this is a whole new category of piracy.

[–] [email protected] -1 points 10 months ago (1 children)

I'm not upset, because I think it's totally irrelevant: training AI doesn't reproduce any works, and it's no different from a person who reads or sees those works and then talks about them or creates in their style.

At its core, if this is given legal precedent, the issue distills down to thought policing. It would be a massive regression of fundamental human rights with terrible long-term implications. It's no different from how allowing companies to own your data and manipulate you has directly led to a massive regression of human rights over the last 25 years. Reacting like foolish Luddites to a massive change that merely seems novel in the moment will have far-reaching consequences that most people lack the fundamental logic skills to put together in their minds.

In practice, offline AI is like having most of the knowledge of the internet readily available for your own private use, custom-tailored to each individual. I'm actually running large models on my own computer daily. This is not hypothetical or hyperbole; it's empirical.
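
For anyone curious what that looks like in practice, here's a minimal sketch of running a quantized model fully offline with the llama-cpp-python bindings; the model file name, context size, thread count, and prompt are illustrative assumptions, not the commenter's actual setup.

```python
# Minimal offline inference sketch using llama-cpp-python.
# The model path, context size, and thread count are assumptions for
# illustration; point it at whatever GGUF model you actually have on disk.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,    # context window
    n_threads=8,   # CPU threads
)

# One completion, generated entirely on local hardware with no network calls.
output = llm(
    "Explain the difference between reproducing a work and learning from it.",
    max_tokens=256,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```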