I agree. Well, that is assuming there's no human editing of the results of the AI tool afterwards. There was heaps of it in the piece referenced in the article, and there usually is if you want to get something actually good. The piece referenced was entered into a photomanipulation and editing category too, which seems very much in keeping with the spirit of the competition. But the reason I said that was because the comment I was replying to wasn't about who has the copyright of the tool's output; it was about the value of the output and of these tools in general.
Are not typically held directly against your head
There's a reason I said "they should be made to be more ethical" and not just "they should be more ethical". I know that they aren't going to do it themselves and I'll support well-written regulations on them.
but it doesn’t matter for this discussion.
Isn't that what almost your entire comment was about?
My point is that this description literally applies just as much to humans. Humans are also trained on vast quantities of things they've seen before and the meanings associated with them.
it’s a collage of other art
This is genuinely a misunderstanding of how these programs work.
when AI is used for art it takes jobs from artists and prevents the craft from advancing.
Because the only art anyone has ever made is art that someone else paid them for? There are a lot of art forms that generally aren't commercially viable, and it's very odd to insist that commercial viability is what advances an art form.
I do actually get regularly paid for a kind of work that is threatened by these things (although in my case it's LLMs, not images). For the time being I can out-perform ChatGPT and the like, but I don't expect that to last forever. Either I'll end up incorporating it or I'll need to find something else to do. But I'm not going to stop doing my hobby versions of it.
Technology kills jobs all the time. We don't have many human calculators these days. If the work has value beyond the financial, people will keep doing it.
I mean, I agree that the developers of these AI tools need to be made to be more ethical in how they use stuff for training, but it is worth noting that that's kind of also how humans learn. Every human artist learns, in part, by absorbing the wealth of prior art that they experience. Copying existing pieces is even a common way to practice.
There's gyro as in gyroscope, where the Y vowel sound is like the word "eye", and there's gyro as in the Greek sandwich, where the Y vowel is more like the vowel sound in "sea". The latter is often written as just "gyros" because that's what the actual Greek word is, but because that looks like a plural in English the S is sometimes dropped.
It's not like "they" are some unknown quantity though, it's the Facebook people. It's not weird or unreasonable for people to not want the company that got fined literally a billion euros for data privacy violations just a couple of months ago to get involved in a thing they like
He did? This article mentions it only briefly, but he talked about it more when it was first getting attention for winning the competition. Is this something he did in the court case that you've read elsewhere?
But also, if you used Midjourney at the time the image was made, you'll know that you did not get an image like that straight out of it.