This genie is probably impossible to get back in the bottle.
People are going to just direct the imitative so-called AI program to make the face just different enough to have plausible deniability that it's a fake of this person or that person. Or use existing tech to age them to 18+ (or 30+ or whatever). Or darken or lighten their skin, or change their eye or hair color. Or add tattoos or piercings or scars...
I'm not saying we should be happy about it, but it is here, and I don't think it's going anywhere. Like, if you tell your so-called AI to give you a completely fictional nude image or animation of someone who looks similar to Taylor Swift but isn't Taylor Swift, what's the privacy (or other) violation, exactly?
Does Taylor Swift own every likeness that looks somewhat like hers?
It's also not a new thing. It's just suddenly much easier for the layman to do. Previously, you needed some really good Photoshop skills to pull it off. But you could make fake nudes if you really wanted to and were willing to put in the time and effort.
This does give prosecutors a new angle though. So it's not for nothing.
If the prompt includes “Taylor Swift” or an image of her, then it doesn't matter that the AI slightly changed it: it used her likeness to generate the image, so she should have rights to the image and the ability to claim damages.
The same thing should apply to using deepfake porn AIs to make nonconsensual nudes of a private person. Heck, manually creating nonconsensual deepfake nudes should fall under the same definition.
This is not how it works. Paparazzi who take her image own the rights to the image, not Taylor Swift. They make the money on the image when they sell it; Taylor Swift gets nothing out of the sale and has no rights in that transaction. If you're in public, you can be photographed. If a photographer takes an image and releases it to the public domain, the subjects of the image will have no say in it unless the photographer broke some other law (e.g., peeping Tom laws or stalking).
I believe your statements are only true for public figures. I'm pretty sure non-public figures retain rights to photos of themselves (unless they aren't the main subject of the photograph).
Stack Exchange conversation about this.
Negative. Go take headshots at a photo place. You don't have the right to make copies of your own headshot without permission from that photo place. Your own headshot would literally be you as the primary subject. Yet you still don't have rights to it unless your contract with that photographer says otherwise.
https://www.avvo.com/legal-answers/who-owns-the-copyrights-for-headshots--1029175.html
In your own link, the first answer even states it...
Subjects having any rights to the photo is rare, short of other laws being broken.
Edit: Hell, even my own kids' school pictures. I have to purchase a digital copy of the photo to get a release from the company to make my own prints. EVEN ON MY OWN PRINTER.
Sorry, my bad; I was speaking about pictures taken in a public setting but didn't clarify. When you get headshots done, you are giving the photographer the rights.
Still negative.
https://legalbeagle.com/8581945-illegal-pictures-people-permission.html
[...]
So even though that couple is the direct foreground subject of the image, the photographer is NOT liable for taking the picture or publishing it, nor for any damages the picture caused by being published. This is why the paparazzi are also protected.
In the previous post, the photographer has the rights because it's their photo, not because you're giving them any rights.
Edit: Typo
Taking photos and the right to use the photos commercially are two different things. The reason film crews and photographers generally ask people to sign releases is that it's not clear cut. While the US is generally more forgiving, it's not a guarantee.
More details
Right... So back to the topic at hand rather than adding extra shit... Someone taking pictures and putting them through AI... There's no problem. They own the rights to that photo and all derivative works (except in cases where it outright violates a law: peeping Tom, stalking, etc...). Public figure or not.
After that it can get gray (but I never brought up sale or commercial AI use as a thing... not sure why people assume I did). Even so, it's quite rare for a sold picture to cause a photographer problems, even if the subjects didn't necessarily consent.
Some other countries might have problems with that and have different laws on the books. But at this point it's really not hard to set up a shell company in a territory/country that doesn't have such laws... then it no longer matters again. Good luck finding the photographer to sue.