this post was submitted on 09 Feb 2024
144 points (95.0% liked)

Technology

The AI Deepfakes Problem Is Going to Get Unstoppably Worse::Deepfakes are blurring the lines of reality more than ever before, and they're likely going to get a lot worse this year.

[–] [email protected] 10 points 9 months ago (5 children)

People need to be more media literate and more skeptical of news stories instead of taking them at face value, regardless of deepfakery. So many articles that pass as "news" are filled with opinion and adjectives designed to elicit an emotional response.

People need to learn to look at a piece of information and ask questions.

  • Who wants me to be reading this?
  • What emotions (if any) is this trying to elicit?
  • What objective information can be taken from this story?
  • What are the sources for that objective information? Are they reliable?

Etc. Etc. Etc.

Even a Fox News article can offer some insight into current events if you can parse the information from the spin. Deepfakes are just going to be another level of spin, but if people are informed enough, they'll be able to logically differentiate between a real news story and a damning fake video.

However, that doesn't solve the age-old problem of willfully ignorant people and confirmation bias...
