this post was submitted on 06 Sep 2024
240 points (96.9% liked)
Technology
It's messy legislation all around. When does it become porn vs. art vs. merely erotic or satirical? How do you prove it was a deepfake and not a lookalike? If I use a porn actress to make a deepfake, is that also illegal, or does it hinge on how the original source content was intended to be used/consumed?
I'm not saying that we should just ignore these issues, but I don't think any of this will be handled well by any government.
Actually I was thinking about this some more and I think there is a much deeper issue.
With the advent of generative AI, photographs can no longer be relied upon as documentary evidence.
There's the old saying, 'pics or it didn't happen'; flipped around, it means that sharing pics proves it did happen.
But if anyone can generate a photo realistic image from a few lines of text, then pictures don't actually prove anything unless you have some bulletproof way to tell which pictures are real and which are generated by AI.
And that's the real point of a lot of these laws: to try to shove the genie back in the bottle. You can ban deepfake porn and order anyone who makes it to be drawn and quartered, you can make an AI watermark its output, but at the end of the day the genie is out of the bottle, because someone somewhere will write an AI that omits the watermark and passes its photos off as real.
I'm open to any possible solution, but I'm not sure there is one. I think this genie may be out of the bottle for good, or at least I'm not seeing any way that it isn't. And if that's the case, perhaps the only response that doesn't shred civil liberties is to preemptively declare defeat, acknowledge that photographs are no longer proof of anything, and deal with that as a society.
One solution that's been proposed is to cryptographically sign content. That way someone can prove they "made" the content. It doesn't prove the content is real, but it means you can verify the originator.
However, at the end of the day, you're still stuck with needing to decide who you trust.
Probably the best idea yet, though it's definitely not foolproof. The best you could do is put a security chip in the camera that digitally signs the pictures, but that's imperfect, because eventually someone will extract the key, or figure out how to get the camera to sign pictures of their choosing that it never actually took.
A creator level key is more likely, so you choose who you trust.
But most of the pictures that would be offered as proof of anything probably won't be signed by one of those keys.