That is how LLMs work: they don't store the data as data, but as weight values.
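To illustrate the point (a minimal sketch, not any particular vendor's model): what gets "open sourced" when weights are released is just a set of numeric tensors, with no copies of the training text inside. The tiny toy model below is purely hypothetical.

```python
# Minimal sketch: an LLM's learned state is nothing but weight tensors.
import torch.nn as nn

# Hypothetical tiny language model: token embedding + next-token projection.
model = nn.Sequential(
    nn.Embedding(num_embeddings=50_000, embedding_dim=64),  # token id -> vector
    nn.Linear(64, 50_000),                                   # vector -> next-token logits
)

# Everything the model "knows" is in these float tensors; no training
# documents are stored anywhere in the model itself.
for name, param in model.named_parameters():
    print(name, tuple(param.shape), param.dtype)
```

Releasing those tensors (plus the architecture and tokenizer) is what "open sourcing the weights" would mean in practice.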
So then why, if it were all open sourced, including the weights, would the AI be worthless? Surely an identical but open-source version would strip profitability from the original paid product.
It wouldn't be worthless. It would still work. It just wouldn't be exclusively available to the group that created it, so any competitive advantage is lost.
But all of this ignores the real issue: you're not really punishing the use of unauthorized data. Those who owned that data are still harmed by this.
It does discourage the use of unauthorised data. If stealing doesn't give you a competitive advantage, it's not really worth the risk and cost of stealing it in the first place.