this post was submitted on 27 Oct 2024
176 points (97.8% liked)


If AI and deepfakes can take video or audio of a person and then convincingly reproduce that person, what does this mean for trials?

It used to be that audio or video recordings carried strong evidentiary weight, often more than witness testimony, but soon enough perfect forgeries could enter the courtroom just as they are entering social media (where you're not sworn to tell the truth, though the consequences are real).

I know fake information is a problem everywhere, but I started wondering what will happen when it creeps into testimony.

How will we defend ourselves while still using real video or audio as evidence? Or are we just doomed?

[–] [email protected] -3 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

For the longest time now, since before AI and before NFTs were a thing, I had an idea for incorporating blockchain tech into real-life media footage to combat the rise of misinformation.

The metadata and original author would be stored on this chain the moment footage is recorded. The biggest challenge is that the recording devices themselves would need to be connected.

Adoption would be slow, but I imagined news and official channels making use of this tech first. Eventually all footage outside of this system would be seen as untrustworthy.

Then the NFT bros came along and people have shit on this idea ever since. Some days I feel that was a conspiracy to ruin our perception of its potential, but more likely humans were just greedy.

I still believe this could work. Detailed example below:

The system works with a fair amount of transparency: verifiable digital signatures for recording devices and their owners. Professional cameras and organizations would have publicly known IDs, while individuals could choose to remain pseudonymous authors but would need to build credibility over time.

Let's say the BBC records an interview. When viewers watch this content on any platform, they can access blockchain verification through an embedded interface (perhaps a small icon in the corner). This shows the complete chain of custody from recording to broadcast.

The system verifies content through computational comparisons. When a raw interview is edited into a final piece (see the sketch after this list):

  • Each original clip has a unique blockchain signature
  • The final edited version's signature can be compared against source material
  • Automated analysis shows what percentage of original footage matches
  • Modifications like color correction or audio adjustments are detected through signature differences
  • Additional elements like station logos or intro sequences have their own verified identifiers
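To make the comparison step concrete, here's a rough Python sketch of the idea, not an existing system: the names (`Ledger`, `register_clip`, `verify_edit`) are made up for illustration, and plain SHA-256 hashes stand in for real content fingerprints. It registers raw clips and reports what fraction of an edited piece's segments match registered footage. Note that exact hashes only catch byte-identical segments; detecting things like colour correction would need perceptual fingerprints instead.

```python
# Hypothetical sketch of the per-clip verification idea described above.
import hashlib
from dataclasses import dataclass, field

def fingerprint(data: bytes) -> str:
    """Content fingerprint for a clip or segment (plain SHA-256 for simplicity)."""
    return hashlib.sha256(data).hexdigest()

@dataclass
class Ledger:
    """Stands in for the shared, append-only ledger of clip signatures."""
    entries: dict = field(default_factory=dict)  # fingerprint -> author/device ID

    def register_clip(self, clip: bytes, author: str) -> str:
        fp = fingerprint(clip)
        self.entries[fp] = author  # in the real proposal this write would be replicated across nodes
        return fp

    def verify_edit(self, segments: list[bytes]) -> float:
        """Fraction of an edited piece's segments that match registered source clips."""
        if not segments:
            return 0.0
        matched = sum(1 for s in segments if fingerprint(s) in self.entries)
        return matched / len(segments)

# Register raw interview clips, then check an edited broadcast cut.
ledger = Ledger()
ledger.register_clip(b"raw-interview-part-1", author="BBC-camera-42")
ledger.register_clip(b"raw-interview-part-2", author="BBC-camera-42")

edited = [b"raw-interview-part-1", b"station-logo-intro"]  # one sourced segment, one new element
print(f"{ledger.verify_edit(edited):.0%} of segments trace back to registered footage")
```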
[–] [email protected] 6 points 2 weeks ago (5 children)

Because it's insanely idiotic. Signing videos is one thing.

Hooking it into blockchain bullshit is entirely deranged. It adds a bunch of complexity to provide literally zero benefit in any possible context.

[–] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I am not sure what you think blockchain actually is, but in essence it's a decentralized ledger of signatures.

Not coins, not sellable goods. Just that: computers connected in a network verifying the correctness of a shared ledger.

So if you say signing footage is one thing, how do you propose a layman can verify that signature without a centralized databank?

I understand some people may not mind a centralized authority, but I'd prefer to avoid one.

I am willing to hear people's thoughts on this. I am not for or against blockchain or any other form of technology. With the information I have, this just seems like a reasonable and practical solution.
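To illustrate what I mean by "a decentralized ledger of signatures", here's a toy sketch (the entry fields are illustrative assumptions, not any real chain format): each entry commits to the hash of the previous one, so anyone holding a replica of the ledger can re-check the whole history themselves without trusting a centralized databank.

```python
# Toy hash-chained ledger of footage signatures; tampering breaks the chain.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append(chain: list[dict], footage_fingerprint: str, author: str) -> None:
    prev = entry_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "footage": footage_fingerprint, "author": author})

def verify_chain(chain: list[dict]) -> bool:
    """Anyone with a replica can re-run this check; no central authority needed."""
    prev = "0" * 64
    for entry in chain:
        if entry["prev"] != prev:
            return False
        prev = entry_hash(entry)
    return True

chain: list[dict] = []
append(chain, footage_fingerprint="sha256-of-clip", author="BBC-camera-42")
append(chain, footage_fingerprint="sha256-of-other-clip", author="freelancer-7")
print(verify_chain(chain))           # True
chain[0]["author"] = "someone-else"  # tamper with history...
print(verify_chain(chain))           # ...and every replica can see the break: False
```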

[–] [email protected] 2 points 2 weeks ago (1 children)

I am well aware of what it is. It serves no purpose and provides no benefit.

Ignoring the fact that hardware signing doesn't validate inputs as "real", because it's entirely possible to replicate the actual signals entering the camera, and the fact that the entire premise would by definition be a terrible power grab by big hardware/software vendors, the obvious way to implement such an approach would be the exact same system as certificate authorities. You would have to have actual root certificate signers.

Blockchain is horseshit and serves no purpose.
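For comparison, here's a toy version of the certificate-authority model I'm pointing at, with raw Ed25519 keys standing in for real X.509 certificates (all names are illustrative, and it assumes the `cryptography` package). Viewers only need to trust the root's public key: the root signs the device's key, and the device signs the footage.

```python
# Toy chain of trust: root authority -> recording device -> footage.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def raw(pub) -> bytes:
    return pub.public_bytes(serialization.Encoding.Raw, serialization.PublicFormat.Raw)

# Root authority: its public key is the only thing viewers need to trust.
root_key = Ed25519PrivateKey.generate()
root_pub = root_key.public_key()

# "Certificate": the root signs the camera's public key.
device_key = Ed25519PrivateKey.generate()
device_cert = root_key.sign(raw(device_key.public_key()))

# The camera signs the footage (in practice a hash of the file; raw bytes here).
footage = b"raw interview clip"
footage_sig = device_key.sign(footage)

def verify(footage: bytes, footage_sig: bytes, device_pub, device_cert: bytes, root_pub) -> bool:
    """Check the chain: the root vouches for the device, the device vouches for the footage."""
    try:
        root_pub.verify(device_cert, raw(device_pub))  # is the device endorsed by the root?
        device_pub.verify(footage_sig, footage)        # did that device sign this footage?
        return True
    except InvalidSignature:
        return False

print(verify(footage, footage_sig, device_key.public_key(), device_cert, root_pub))         # True
print(verify(b"tampered clip", footage_sig, device_key.public_key(), device_cert, root_pub))  # False
```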

[–] [email protected] 0 points 2 weeks ago (1 children)

The fact that hardware inputs can be faked is part of my reasoning here, because there would be transparency about the source of the footage.

If reputable journalists faked their own footage and it was found out, their credibility would be gone.

If they often rely on borrowed footage and don't fact-check it, their credibility will degrade as well.

Journalistic outlets that do the work and only use credible sources will thrive.

My solution isn't about who creates the signature or how; it's about how ordinary people can check for themselves where a clip within a piece of footage originates.

I am fine with inventing a new system that does this and calling it something other than blockchain, but my understanding is that blockchain already provides this functionality in a robust manner.

Also, typing these comments on the go caused me to lose something dear to me on public transport. I am very sad now and probably won't engage further.

[–] [email protected] 2 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Again, you have to completely ignore that the core premise is evil, intended to give big players even stronger monopoly control. It's anti-free in every sense, and as an added bonus, it would almost certainly make possession of specific hardware sufficient to get you executed in some countries, because everything it had ever captured would be traced back to it.

But if you do that, there is already a system that does exactly what you're asking. You don't need to invent anything. It's certificate authorities.

I'm not actually trying to be an asshole, though I'm sure I'm coming off as one. But the only thing blockchain actually does is validate transactions. It's a shared ledger.

[–] [email protected] 0 points 2 weeks ago

Sure, I'll have a look at decentralized certificate authority options.

It's very possible to adapt my idea to whatever technology provides those functions, honestly.

The only real connection I have to blockchain is that reading about it when it was new is what directly inspired this possible way to combat fake news.
