this post was submitted on 08 Feb 2024
306 points (99.0% liked)
Technology
Fundamentally, we as a species have lost the ability to use a face and voice in a video to establish authenticity.
A person can spoof an email, and we have cryptographic signatures as a means of authentication.
So if I record myself saying something, I could sign the video, I guess (implementation TBD lol).
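A minimal sketch of what "sign the video" could mean, assuming a real scheme would use an asymmetric signature (e.g. Ed25519) so anyone holding your public key could verify it. This hypothetical uses HMAC from the Python standard library purely as a stand-in, since the stdlib has no public-key crypto:

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: HMAC stands in for an asymmetric signature.
# With a real scheme, signing uses your private key and anyone can
# verify with your public key; here both sides share one secret key.

def sign_video(video_bytes: bytes, key: bytes) -> str:
    """Hash the footage, then produce a tag binding that hash to the key."""
    digest = hashlib.sha256(video_bytes).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, key: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_video(video_bytes, key), tag)

key = secrets.token_bytes(32)        # the speaker's signing key
video = b"...raw video bytes..."     # placeholder for real footage
tag = sign_video(video, key)

print(verify_video(video, key, tag))          # unmodified footage: True
print(verify_video(video + b"x", key, tag))   # tampered footage: False
```

The point of hashing first is that the signature stays small no matter how large the footage is, and any single-bit edit to the video changes the hash and breaks verification.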
But what if someone else (a news agency, say) takes a video of someone else? How do we authenticate that?
If it's a news agency they could sign it. Great.
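Sketching what an agency signature might look like (hypothetical names; HMAC again stands in for the agency's real asymmetric key pair): the agency binds the footage hash to a manifest of provenance claims, so altering either the video or the claims breaks verification.

```python
import hashlib
import hmac
import json
import secrets

# Hypothetical sketch: in a real deployment the agency would sign
# with a private key and publish the public half for viewers.

def sign_manifest(video: bytes, claims: dict, key: bytes) -> dict:
    """Bind the footage hash and provenance claims under one signature."""
    manifest = {
        "video_sha256": hashlib.sha256(video).hexdigest(),
        "claims": claims,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["sig"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_manifest(video: bytes, manifest: dict, key: bytes) -> bool:
    """Check the footage hash, then the signature over the whole manifest."""
    body = {k: v for k, v in manifest.items() if k != "sig"}
    if body["video_sha256"] != hashlib.sha256(video).hexdigest():
        return False
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["sig"])

agency_key = secrets.token_bytes(32)
footage = b"...raw footage..."
m = sign_manifest(footage, {"recorded_by": "field reporter"}, agency_key)
print(verify_manifest(footage, m, agency_key))          # True
print(verify_manifest(footage + b"x", m, agency_key))   # False
```

This is roughly the shape of the problem that content-provenance efforts like C2PA are trying to standardize: a signed manifest travels with the media and attests to who published it and what they claim about it.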
But then we have the problem of incentives, too. Does the benefit of a fake outweigh the detrimental effects for said news agency?
The most damage would be to the person being videoed (reputation, loss of election, whatever). There would be less damage to the media company ("oops so sorry please stay subscribed"). You could add fines but corporate oversight is weak. And the benefit of releasing a fake would be clicks and money so a news company would be a lot more likely to pass along a fake as real.
So I guess I have no idea what we do. At the moment we are fucked. Yay.
Vocal cord implants which sign whatever you're saying and emit the signature with ultrasound.