this post was submitted on 11 Jun 2024
174 points (95.8% liked)

Technology

A group of hackers that says it believes “AI-generated artwork is detrimental to the creative industry and should be discouraged” is hacking people who try to use a popular interface for the AI image generation software Stable Diffusion, via a malicious extension for that interface shared on GitHub.

ComfyUI is an extremely popular graphical user interface for Stable Diffusion that’s shared freely on GitHub, making it easier for users to generate images and modify their image generation models. ComfyUI_LLMVISION, the extension that was compromised to hack users, allowed users to integrate the large language models GPT-4 and Claude 3 into the same interface.

The ComfyUI_LLMVISION GitHub page is currently down, but a Wayback Machine archive of it from June 9 states that it was “COMPROMISED BY NULLBULGE GROUP.”

[–] [email protected] 6 points 5 months ago (1 children)

For me the funniest moment of this whole saga was when the AI bros were claiming that they weren't stealing anyone's art, but then flipped shit when a FOSS tool was released that let people reformat their art pieces specifically so that they'd be harmful to AI art generators that copied them.
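The "reformatting" these tools do is adversarial perturbation: nudge the pixels of an artwork so a model misreads it while the change stays visually small. As a toy illustration only (this is not the actual Glaze or Nightshade algorithm, and the linear "model" below is entirely made up), an FGSM-style step against a differentiable score looks like:

```python
import numpy as np

# Toy sketch of the adversarial-perturbation idea: push an image's "style
# score" under a model away from its true value, while keeping every pixel
# change within a small, visually negligible budget.
rng = np.random.default_rng(0)
image = rng.random((8, 8))             # stand-in for an artwork, pixels in [0, 1]
weights = rng.standard_normal((8, 8))  # made-up linear model: score = sum(w * img)

def score(img):
    """Scalar score the perturbation tries to drive down."""
    return float(np.sum(weights * img))

# For a linear model, the gradient of the score w.r.t. the image is `weights`.
# FGSM-style step: move each pixel by epsilon against the gradient's sign.
epsilon = 0.02  # perturbation budget per pixel
perturbed = np.clip(image - epsilon * np.sign(weights), 0.0, 1.0)

print(score(image), score(perturbed))              # the model's reading shifts
print(float(np.abs(perturbed - image).max()))      # but pixels barely move
```

Real tools target deep feature extractors rather than a linear score, and optimize the perturbation over many steps, but the tension is the same: a change small enough to preserve the art for humans, large enough to mislead the model.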

[–] [email protected] 6 points 5 months ago

You've got it backwards. Glaze and Nightshade aren't FOSS, and Ben Zhao, the University of Chicago professor behind them, stole GPLv3 code for Glaze. GPLv3 is a copyleft license that requires you to share your source code and license your project under the same terms as the code you used; you also can't distribute your project as binary-only or proprietary software. When pressed, they only released the code for their front end, remaining in violation of the terms of the GPLv3 license.

Moreover, Nightshade and Glaze only work against open source models, because the only open models are Stable Diffusion's; companies like Midjourney and OpenAI, whose models are closed source, aren't affected by this. Attacking a tool that the public can inspect, collaborate on, and use free of cost isn't something that should be celebrated.