WolfLink

joined 3 months ago
[–] [email protected] 1 points 2 hours ago* (last edited 2 hours ago)

We aren’t talking about current cameras. We’re talking about the proposal for cameras that do cryptographically sign the images they take.

Here’s the link from the start of the thread:

https://arstechnica.com/information-technology/2024/09/google-seeks-authenticity-in-the-age-of-ai-with-new-content-labeling-system

This system is specifically mentioned in the original post: https://www.seroundtable.com/google-search-image-labels-ai-edited-38082.html when they say “C2PA”.

[–] [email protected] 0 points 6 hours ago* (last edited 5 hours ago) (2 children)

It’s not that simple. It’s not just a “this is or isn’t AI” boolean in the metadata. The camera hashes the image, then signs the hash with its private signing key. The signature becomes invalid if the image is tampered with, and you can’t forge a new signature without the signing key.

Once the image is signed, you can’t tamper with it and get away with it.

The real vulnerability is: how do you ensure an image isn’t faked before it reaches the signing step? On some level, I think this is a fundamentally unsolvable problem, but there may be ways to make faking it practically impossible, at least for the average user without highly advanced resources.
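
For anyone curious, here’s a minimal sketch of the hash-then-sign idea in Python using the third-party `cryptography` package (the key handling and file names are made up for illustration; a real C2PA manifest is far more involved):

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical: in a real camera, the private key would live in a secure
# hardware element, not in application code.
camera_key = Ed25519PrivateKey.generate()

image_bytes = open("photo.jpg", "rb").read()  # hypothetical image file

# Hash the image, then sign the hash with the private signing key.
digest = hashlib.sha256(image_bytes).digest()
signature = camera_key.sign(digest)

# Anyone holding the matching public key can check the signature.
public_key = camera_key.public_key()
try:
    public_key.verify(signature, digest)
    print("valid: image matches what was signed")
except InvalidSignature:
    print("invalid: image changed after signing")

# Flip even one byte and verification fails.
tampered_digest = hashlib.sha256(image_bytes + b"\x00").digest()
try:
    public_key.verify(signature, tampered_digest)
except InvalidSignature:
    print("tampered image detected")
```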

[–] [email protected] 1 points 7 hours ago

Take a high-quality AI image, add some noise, blur, and compress it a few times.

Or, even better, print it and take a picture of the printout, making sure your photo of the printout is just blurry enough to hide the details that would give it away.
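
A rough sketch of that laundering step with Pillow (the filenames and parameters here are arbitrary; the point is just that noise, blur, and repeated recompression destroy the telltale details):

```python
import numpy as np
from PIL import Image, ImageFilter

# Hypothetical input file.
img = Image.open("ai_generated.png").convert("RGB")

# Add mild Gaussian noise.
arr = np.asarray(img).astype(np.float32)
arr += np.random.normal(0.0, 8.0, arr.shape)
img = Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))

# Blur away the fine detail that tends to give AI images away.
img = img.filter(ImageFilter.GaussianBlur(radius=1.5))

# Recompress a few times; each JPEG pass throws away more information.
for _ in range(3):
    img.save("laundered.jpg", format="JPEG", quality=60)
    img = Image.open("laundered.jpg")
```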

[–] [email protected] 6 points 7 hours ago* (last edited 7 hours ago) (4 children)

Even if you assume the images you care about have this metadata, all it takes is a hacked camera (which could be as simple as carefully taking a photo of your AI-generated image) to fake authenticity.

And the vast majority of images you see online are heavily compressed, so nobody is serving the 6 MB+ digitally signed originals anyway.
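
To see why compression alone breaks the scheme: re-encoding changes the file’s bytes, so a signature over the original can’t verify the copy. A sketch, reusing the hypothetical files from above:

```python
import hashlib
from PIL import Image

# Hypothetical files: "photo.jpg" stands in for a signed original.
original_bytes = open("photo.jpg", "rb").read()
Image.open("photo.jpg").save("recompressed.jpg", format="JPEG", quality=60)
recompressed_bytes = open("recompressed.jpg", "rb").read()

# Different bytes, different hashes: the original signature
# no longer matches the recompressed copy.
print(hashlib.sha256(original_bytes).hexdigest())
print(hashlib.sha256(recompressed_bytes).hexdigest())
```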

[–] [email protected] 12 points 5 days ago (1 children)

I genuinely love PlexAmp. I’m curious about the photos thing and might give it a try.

[–] [email protected] 1 points 5 days ago

Sure, but also I literally have a whole box of cables, and if/when I actually need a new one I can buy the $5 Amazon Basics cable.

Alternatively, if you really care about having the Brand Name Cable, consider this a $20 price hike.

Seriously, this is such a petty issue; there are much bigger things to complain about.

[–] [email protected] 42 points 6 days ago (1 children)

I’ve watched the whole thing. It’s so close to something I’d really like, at least in concept. But the ball is dropped so hard in crucial areas :(

[–] [email protected] 7 points 1 week ago (1 children)

At least be accurate.

The “Pro” model iPhone has a lot of the features you’re calling out the non-Pro one for lacking. Also, “no non-proprietary lossless audio streaming” would be more accurate.

[–] [email protected] 44 points 1 week ago* (last edited 1 week ago) (3 children)

Almost everything about it needs to be optional, because sometimes USB is used to charge some cheap battery-powered thing, sometimes it’s used to back up a hard drive, and sometimes it’s charging my laptop with enough power that it can render video and still see a net charge increase, all while providing Ethernet, video output, and keyboard/mouse input over that same one port.

EDIT: To make it clearer why the variability of USB standards is what it is, compare a modern laptop to one from 10 years ago.

The older laptop has:

  • for video, an HDMI port (or the less common Mini HDMI port), and perhaps a Mini DisplayPort
  • an Ethernet port
  • a charging plug
  • possibly some FireWire or Thunderbolt ports (Thunderbolt 1/2 used the Mini DisplayPort connector)
  • USB A ports for keyboard/mouse and other random devices

The newer laptop has:

  • USB-C ports that can do all of the above

The peripherals, however, don’t support all of the features; they only support the ones they actually use. As long as the laptop supports all of the optional features, you don’t need to worry about it.

This is especially helpful for less technical users who may not want to know the difference between HDMI and DisplayPort. With a fully USB-C based laptop and USB-C peripherals, you can just plug things in and they will work.

Of course, this all depends on the laptop implementing the extra features, which is still only really true of more expensive laptops.

[–] [email protected] 8 points 1 week ago (2 children)

“Regular” instead of medium should be ok.

If there are only 2 sizes you can pick any 2 of the labels.

[–] [email protected] 5 points 1 week ago (7 children)

You don’t actually need internet for the VR streaming part, so you could just set up a router that isn’t plugged into the wall (no internet uplink needed)

[–] [email protected] 3 points 2 weeks ago

Xubuntu is more than fine. Tbh it doesn’t hugely matter which distro you use for this type of thing
