Technology
This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.
Ask in a DM before posting product reviews or ads; otherwise such posts are subject to removal.
Rules:
1: All Lemmy rules apply
2: Do not make low-effort posts
3: NEVER post naziped*gore stuff
4: Always post article URLs or their archived-version URLs as sources, NOT screenshots. This helps blind users.
5: Personal rants about Big Tech CEOs like Elon Musk are unwelcome (this does not include posts about their companies affecting a wide range of people)
6: no advertisement posts unless verified as legitimate and non-exploitative/non-consumerist
7: Crypto-related posts, unless essential, are disallowed
https://crsreports.congress.gov/product/pdf/LSB/LSB10922
I’ve glanced at these a few times now, and there are a lot of ifs, ands, and buts in there.
I’m not understanding how an AI itself infringes on copyright, since at this point it has to be directed in what it creates (GPT specifically). How is that any different from me using a program that finds a specific piece of text and copies it into my own document? In that case the document would be presented by me, so I would be the one infringing, not the software. AI (for the time being) is simply software and is incapable of infringement. And suing the company that makes the AI simply because it used data to train its software is not infringement either, since the works are not copied verbatim from their original source unless a user specifically requests that. That would put the infringement on the user.
There's a bit more nuance to your example. The company is liable for building a tool that allows plagiarism to happen. That's not down to how people are using it, that's just what the tool does.
So a company that makes lock-picking tools is liable when a burglar uses them to steal? Or a car manufacturer is liable when someone uses their car to kill? How about knives, guns, tools, chemicals, restraints, belts, rope? I could go on and use nearly every word in the English language, yet none of those manufacturers can be sued for someone misusing their products. They’d have to show malicious intent, which I just don’t see being possible in the context they’re seeking.
The reason GPT is different from those examples (not all of them, but I'm not going into that) is that with those tools the malicious action is on the part of the user. With GPT, the tool itself gives you output that it has plagiarised. The user can then take that output and submit it as their own, which is further plagiarism, but that doesn't absolve GPT. The problem is that GPT doesn't cite its sources, which would be very helpful for understanding where its information comes from and for fact-checking it.
While GPT was trained on the material, it does not produce plagiarised results. It can reuse phrases, but only because those phrases recur across many examples, not because they come from one specific work. It learns that b comes after a, c comes after b, d comes after c, and then will sometimes reproduce "abcd" because that sequence is normal in the context. That is not plagiarism; it's more akin to a human guessing from probability, without intent. If it's plagiarising, it's doing so by coincidence, due to context.
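The "b comes after a" idea can be sketched as a toy next-word counter. To be clear, this is a made-up corpus for illustration only, and GPT itself uses a neural network over huge datasets, not raw bigram counts like this:

```python
from collections import defaultdict, Counter

# Toy next-word model: count which word follows which across a few
# "training" sentences, then predict the most common follower.
# Hypothetical corpus; real models like GPT don't store text this way.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased a mouse",
]

follows = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a][b] += 1

def next_word(word):
    # Return the most frequent continuation seen in the corpus
    return follows[word].most_common(1)[0][0]

print(next_word("the"))  # -> "cat": it follows "the" most often here
```

The point of the sketch: the model never looks up a source document at prediction time, it only has aggregated counts, which is why identical phrases can come out without any one work being "the" source.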
How it goes about constructing sentences doesn't mean the phrases it reproduces aren't plagiarism. Plagiarism doesn't care about probability of occurrence; it looks at how closely one work resembles another, and the more similar they are, the more likely it is to be plagiarised.
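For what it's worth, that resemblance test is roughly what automated plagiarism checkers compute. A minimal sketch using word-trigram overlap (Jaccard similarity; the example strings are invented):

```python
def ngrams(text, n=3):
    # Set of word n-grams (here trigrams) appearing in a passage
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=3):
    # Jaccard overlap of trigram sets: 0.0 means no shared three-word
    # runs, 1.0 means identical sets. Crude, but it's the basic idea
    # behind resemblance-based plagiarism detection.
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

original = "the quick brown fox jumps over the lazy dog"
suspect = "a quick brown fox jumps over a sleepy dog"
print(round(similarity(original, suspect), 2))  # -> 0.27
```

Note it scores only resemblance, exactly as described: it knows nothing about how the suspect text was generated or whether the copying was intentional.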
You can only escape plagiarism by proving that you didn't copy intentionally or by citing your sources.
GPT has no defence, because it has to learn from the sources in order to learn the probabilities of phrases being constructed together. It also doesn't cite its sources, so in my eyes, if it's found to be plagiarising, it has no defence.