this post was submitted on 11 Jun 2024
174 points (95.8% liked)

A group of hackers that says it believes “AI-generated artwork is detrimental to the creative industry and should be discouraged” is attacking people who use a popular interface for the AI image generation software Stable Diffusion, via a malicious extension for that interface shared on Github.

ComfyUI is an extremely popular graphical user interface for Stable Diffusion, shared freely on Github, that makes it easier for users to generate images and modify their image generation models. ComfyUI_LLMVISION, the extension that was compromised to hack users, let users integrate the large language models GPT-4 and Claude 3 into the same interface.

The ComfyUI_LLMVISION Github page is currently down, but a Wayback Machine archive of it from June 9 states that it was “COMPROMISED BY NULLBULGE GROUP.”

[–] [email protected] 42 points 5 months ago* (last edited 5 months ago) (1 children)

Based on the discussion that I've seen, it looks like the "anti-AI" motive was an excuse: all the hack actually did was steal API keys, presumably to sell them. Here's a discussion thread on reddit that goes into this in more detail.

[–] [email protected] 19 points 5 months ago

Looting API keys makes way more sense. They must have been stuck using GPT-2 to write that incoherent statement.