Stupid sharks lose their teeth, not their fins, which actually do the work.
Errr...wat?!?!
The shark dies either way.
Under what law?
UK currently holds the people that post things liable for their own words. X, the platform, just relays what is said. Same as Lemmy. Same as Mastodon.
If you ban X I don't see why those other platforms wouldn't be next.
Now should people/organisations/companies leave X? Absolutely! Evacuate like it's a house on fire. Should it be shut down by legal means? No.
Oh, "incident post-mortem" was ambiguous. I read it as "incident that happened after death", not "analysis after an incident".
I thought OP had a necrophiliac blowjob fantasy.
With batteries that would have a multi-day cycle like these ones, you're going to be trying to flatten out the demand curve (and supply, but the two are related).
The US generates 4.2 PWh a year, and so averages a consumption rate of about 480GW. So, in an ideal system we'd only need this level of generation capacity and if it was higher sometimes and lower others the batteries would smooth it all out.
I'm going to take your 560GW figure as representative of normal demand above the 480GW average. I'll say half of every day is 80GW above average (when we'd be draining batteries) and half is 80GW below (when we'd be charging). The real curves are much more nuanced, but we're establishing context. 80GW for 12 hours is 960GWh, so let's call it 1TWh of battery capacity needed for the whole USA to smooth out a day.
That's 117 of these installations, which frankly I find amazingly low.
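The arithmetic above can be sanity-checked in a few lines. This is a rough sketch: the ~8.55 GWh per-installation capacity is an assumption back-derived from the 117 figure, not a number from the comment itself.

```python
# Back-of-envelope check of the grid-smoothing numbers above.

ANNUAL_GENERATION_TWH = 4200   # 4.2 PWh, expressed in TWh
HOURS_PER_YEAR = 8760

# Average US consumption rate implied by annual generation.
avg_demand_gw = ANNUAL_GENERATION_TWH * 1000 / HOURS_PER_YEAR  # TWh -> GWh
print(f"Average demand: {avg_demand_gw:.0f} GW")  # ~479 GW

# Simplified daily cycle: 12 hours at 80 GW above average (draining),
# 12 hours at 80 GW below average (charging).
surplus_gw = 560 - 480
hours_draining = 12
storage_needed_gwh = surplus_gw * hours_draining
print(f"Storage to smooth one day: {storage_needed_gwh} GWh")  # 960 GWh

# Assumed per-installation capacity (hypothetical, implied by "117").
installation_gwh = 8.55
print(f"Installations for 1 TWh: {1000 / installation_gwh:.0f}")  # ~117
```

So the rounded 1 TWh target works out to roughly a hundred-odd sites of that assumed size, matching the comment's estimate.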
Right, so it makes no sense anywhere else.
Protect it, sure, but don't remove it. Its location is part of the art.
Training data is the source. Not the 20 lines of python that get supplied with a model.
A generative AI's only purpose is to generate "works". So its only purpose in consuming "works" is to use them as reference. It exists to produce derivative works. Therefore the person feeding the original work into the machine is the one making the choice about how that work will be used.
A human can consume a "work" for no reason other than to admire it, be entertained by it, be educated by it, or have an emotion evoked, and, finally, to produce another work based on it. Here the consumer of the work is the one deciding how it will be used. They are the ones responsible.
I would disagree, because I don't see the research into AI as something of value to preserve.
People talk about open source models, but there's no such thing. They are all black boxes where you have no idea what went into them.
They don't do it because they claim there isn't enough public domain data... but let's be honest, nobody has tried, because nobody wants a machine that can't reference anything from the last 100 years.
It's certainly arguable that the algorithm constitutes an editorial process and so that opens them up to libel laws and to liability.
Fair point.