Mirodir

joined 1 year ago
[–] [email protected] 4 points 5 months ago

Same. I had PayPal do an automated chargeback because their system thought I was doing something fraudulent when I wasn't. Steam blocked my account.

Talking to support and re-buying said game did fix the issue for me.

[–] [email protected] 30 points 9 months ago (2 children)

I think the humor is meant to be in the juxtaposition between "reference" in media contexts (e.g. "I am your father") and "reference" in programming contexts and applying the latter context to the former one.

What does “I’m your father” mean if the movie is Jaws?

I think the absurdity of that question is part of said humor. That being said, I didn't find it funny either.

[–] [email protected] 34 points 9 months ago* (last edited 9 months ago)

It's not as accurate as you'd like it to be. Some issues are:

  • It's quite lossy.
  • It'll do better on images containing common objects vs rare or even novel objects.
  • You won't know how much the result deviates from the original if all you're given is the prompt/conditioning vector and what model to use it on.
  • You cannot easily "compress" new images; instead, you would have to either finetune the model (at which point you'd also mess with everyone else's decompression) or mount an adversarial attack on the model with another model to find the prompt/conditioning vector most likely to produce something as close as possible to the original image you have.
  • It's rather slow.
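
The "find the conditioning vector" idea from the list above can be sketched with a toy, hypothetical decoder: freeze the decoder and gradient-descend on the latent vector until the decoded output matches the target image. Everything here (the linear decoder, the sizes, the learning rate) is made up for illustration; a real diffusion decoder is vastly more complex.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy frozen "decoder": a fixed linear map from a 4-dim latent vector
# to a 16-pixel "image". Stands in for a real (nonlinear) decoder.
W = rng.normal(size=(16, 4))

def decode(z):
    return W @ z

# The image we'd like to "compress" into a latent/conditioning vector.
target = rng.normal(size=16)

z = np.zeros(4)
initial_loss = 0.5 * np.sum(target ** 2)

# Gradient descent on the latent only; the decoder weights stay fixed.
for _ in range(1000):
    residual = decode(z) - target
    z -= 0.01 * (W.T @ residual)  # gradient of 0.5*||decode(z) - target||^2

loss = 0.5 * np.sum((decode(z) - target) ** 2)
```

The point of the toy is visible in the result: the loss drops but does not reach zero, i.e. the best latent you can find is still a lossy stand-in for the original image.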

Also, it's not all that novel: people have been doing this with (variational) autoencoders, another class of generative model. Those don't have the flaw that there's no easy way to compress new images, since an autoencoder is a trained encoder/decoder pair. Decoding is also quite a bit faster than with diffusion models, though often with a greater loss in quality.

Most widespread diffusion models even use an autoencoder-adjacent architecture to "compress" the input. The actual diffusion model then works in that "compressed data space", called latent space. The generated images are then decompressed before being shown to users. Last time I checked, iirc, that compression rate was around 1/4 to 1/8, but it's been a while, so don't quote me on this number.
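
For illustration, the encode → work-in-latent-space → decode flow can be mimicked with a minimal linear autoencoder, implemented here as PCA (which is what a linear autoencoder converges to). The dataset, sizes, and factor-4 compression below are made-up toy values, not the actual architecture of any real model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake "dataset": 100 flattened 8x8 images (64 pixels each).
images = rng.normal(size=(100, 64))

# The optimal linear autoencoder is PCA: keep the top-k principal
# directions. k=16 compresses 64 pixels to 16 latent dims (factor 4),
# loosely analogous to the latent space a diffusion model works in.
mean = images.mean(axis=0)
_, _, Vt = np.linalg.svd(images - mean, full_matrices=False)
components = Vt[:16]

def encode(x):
    return (x - mean) @ components.T  # 64 pixels -> 16 latent dims

def decode(z):
    return z @ components + mean      # 16 latent dims -> 64 pixels

latent = encode(images[0])
reconstruction = decode(latent)       # lossy, as described above
```

A latent diffusion model would do its denoising entirely on `latent`-sized vectors and only call `decode` once at the end, which is where most of the speed win comes from.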

edit: fixed some ambiguous wordings.

[–] [email protected] 5 points 9 months ago

If someone wants to read one of those papers, I can recommend Extracting Training Data from Diffusion Models. It shouldn't be too hard for someone with little experience in the field to be able to follow along.

[–] [email protected] 7 points 9 months ago (1 children)

Understanding the math behind it doesn't immediately mean understanding the decision process during forward propagation. Of course you can mathematically follow it, but you'll quickly lose the overview with that many weights. There's a reason XAI is an entire subfield of Machine Learning.

[–] [email protected] 59 points 9 months ago* (last edited 9 months ago) (1 children)

I think it's much more likely that whatever scraper they used to get the training data snatched a screenshot of the movie some random internet user posted somewhere. (To confirm, I typed "joaquin phoenix joker" into Google, and this very image was very high up in the image results.) And of course not only this one but many, many more too.

Now, I'm not saying scraping copyrighted material is morally right either, but I doubt they'd just feed in an entire movie frame by frame (or randomly spaced screenshots from throughout a movie), especially because it would make generating good labels for each frame very difficult.

[–] [email protected] 3 points 9 months ago

I haven't personally used it but from what I can find: if you're using torrents with Stremio (e.g. the ones found with torrentio) you are totally uploading parts of what you're watching to others.

[–] [email protected] 7 points 9 months ago

Or "watch". That way they don't have to make it obvious that their customers won't own it but still don't straight up lie.

[–] [email protected] 21 points 10 months ago (4 children)

no where near Reddit yet on niche subjects

I'm always saddened by how inactive some of those subjects are. For example, many large games struggle to have dedicated, active communities on Lemmy (assuming I'm not terrible at finding them, which is sadly also possible); even some of the largest games have only completely dead communities here. A huge draw of Reddit for me was being able to talk about the games I play with other people who do too. And mostly, the games I'd love to talk about aren't in the top 10 most played games list.

Now, I could try to (re)vitalize the communities I'd love to see around, and I did so shortly after the exodus (on my previous account, which died with the instance it was on). However, there's only so much talking into the void I can do before it gets boring.

I also feel like that might be a big issue for people coming over. After I manage to explain to my friends how federation works, they ask me to help them find the [topic of their interest] community, and all I can show them is a community with 10 threads, all over 3 months old and with 0 comments. Sadly, it shouldn't surprise anyone that they're not sticking around after that.

[–] [email protected] 29 points 10 months ago

I was curious too and checked the article, but skimming it, instead of a total, I found this:

A new analysis from MUSO, a U.K.-based anti-piracy analyst [...]

With the study being done by a clearly biased group and that large omission, I think it's fair to assume that the share of total web traffic going to piracy sites might not have gone up all that much; maybe it even went down.

[–] [email protected] 16 points 11 months ago

I'm guessing they just generate a bunch of pictures, pick the closest, and fix the rest in Photoshop.

Not like real models aren't already often photoshopped to (near) unrecognizability.

[–] [email protected] 14 points 1 year ago

I interpreted it more as "I'm willing to sacrifice all Americans' right to anonymous free speech, which I do value, in order to take away that of foreigners too," which is a typical braindead racist take.
