[–] [email protected] 7 points 7 months ago* (last edited 7 months ago)

So Taylor Swift is going to buy it next?

[–] [email protected] 22 points 7 months ago (1 children)

I have a subscription only because it was added on to another service. I wouldn't be surprised if they padded their numbers by giving out accounts as part of bundled deals.

[–] [email protected] 51 points 9 months ago (4 children)

Why is everyone in a race to the bottom nowadays? Almost every action a consumer takes is becoming monetized.

[–] [email protected] 4 points 9 months ago

Thanks for sharing this

[–] [email protected] 6 points 9 months ago

Sadly you're right. We at least need a basic rule requiring content made with AI to be labeled.

[–] [email protected] 12 points 9 months ago (8 children)

AI can be such a great tool, but they just use it for this crap... We need regulation.

[–] [email protected] 7 points 9 months ago* (last edited 9 months ago)

Why use any human-like image then? A lot of amateur fashion designers on Instagram use mannequins or busts. The models are serving a purpose, and removing them means someone loses a job.

If you look at this from the top down you're right, because the company is saving a cost. But from the bottom up, you've just become more expendable. This leads into the argument others have been making: what happens when people eventually can't work? And why should we use technology to serve the few and not the many?

[–] [email protected] 222 points 9 months ago (25 children)

They frame this article in such a weird way, as if replacing the models and their jobs was justified because they had egos, etc...

I can see similar framing being used to replace other workers because they want to be paid fairly or do something drastic like take bathroom breaks... :D

[–] [email protected] 17 points 10 months ago

Sadly all that crazy seems totally possible with Microsoft involved.

[–] [email protected] 9 points 11 months ago (2 children)

Just... yay!

[–] [email protected] 3 points 1 year ago

It seems we just need to let this all run longer and see what happens. Currently we have no real way to detect AI in media besides disclosures and silly mistakes like 20 fingers, and all of that relies on the creator (it's not hard to edit a photo to clean up those hands, etc.).

I think a lot of creatives are struggling so they just feel shut out of the conversation. Copyright is probably the one thing most people can understand as a talking point.

I think we still have some time before we see which way will work. Ideally we could always augment the laws... but yeah, America and stuff.

[–] [email protected] 1 point 1 year ago

Sadly no matter what, the big media companies are going to have a huge advantage in everything because of decades of lobbying etc.

I think people should still be able to profit from selling the image themselves; however, I don't think we have enough knowledge of how AI will truly impact things. If it becomes a minor fad and is just a tool to help speed up a process, I think the law doesn't need to change much.

If AI becomes the majority creator on projects then we have to have this conversation about who owns what.

Closed models will probably be the future, much like stock photos, and people will have to pay to access them.

In the end big business will always fuck us over, copyright or not.
