SkyeStarfall

joined 1 year ago
[–] [email protected] 4 points 1 year ago

It's not splitting hairs to say that two things are different. It's just a wrong statement.

[–] [email protected] 2 points 1 year ago

Having been near an industrial robot arm, I can say it makes me nervous even when it's powered off. It's a giant hunk of steel that could probably lift a car or more, never mind crush your bones.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

Not to mention stuff like patents and being secretive for profit, hiding knowledge instead of letting it be used for the good of all and iterated upon. Also monopolizing, undermining others, intentionally sabotaging innovations for profit, and wasting a ton of resources and effort on things that are useless in the grand scheme of things, such as figuring out how to manipulate people into buying more of your product... etc.

Also not pursuing the most logical option or investing in the most sensible technologies, such as intentionally sabotaging climate change mitigation and renewable energy efforts in order to earn more from oil sales, and so on, and so forth. Capitalism/competition does not breed innovation. Cooperation does. And it shows, because research is very much based on cooperation... at least researchers love to cooperate in the quest for more knowledge.

[–] [email protected] 15 points 1 year ago* (last edited 1 year ago) (1 children)

Imagine the US deploying AI bots to deal with "anti-US hate". Wild.

I guess that was kind of the Cold War, though.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (1 children)

So, I think I found the original study: https://www.nature.com/articles/s41557-023-01346-3, because for some reason these articles never cite the actual studies. What happened to citing sources??

Anyway, apparently they grew octyltrichlorosilane on silicon wafers. Now, I have no idea what octyltrichlorosilane is, but here is some information I found about it: https://www.chemicalbook.com/msds/Octyltrichlorosilane.htm

It seems to be a chemical used purely for research, so this study is more of a proof of concept; you would replace that chemical with something else for production.

[–] [email protected] 8 points 1 year ago (1 children)

And in turn, artists are gradually losing more and more jobs.

Like it or not, generative AI is already replacing jobs, and we as a society aren't ready for it, even though automation should be a good thing.

[–] [email protected] 8 points 1 year ago (2 children)

Moral agents and moral subjects are two vastly different things.

[–] [email protected] 13 points 1 year ago (1 children)

It is, and it won't get better. Windows is very much in the process of enshittification.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

I get that you are being intentionally obtuse for some fucking reason, probably to absolve yourself of the moral harm you are causing, but yeah, veganism is about reducing harm as much as is individually possible. Is it really that hard to understand?

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (3 children)

Can an animal understand the moral implications of the exploitative logistics chain behind creating a smartphone, and your part in it as a consumer? Or what alternatives could be used in its stead? From the environmental costs, to the exploitation of workers, to the health issues of resource extraction or factory work. Or the ethics and consequences of fast fashion, or political policies and phenomena such as universal healthcare or gentrification?

If you are incapable of understanding the structural reasons for why someone does something, I think it's fair to say you cannot be a moral agent. Stealing is bad, yes, but is stealing bread because your children are starving bad? Is stealing still bad when the laws and moral framework of the society are determined by those who get rich off of exploiting the very people who steal? I think it's fair to say you need a lot of abstract thinking to fully comprehend these scenarios.

[–] [email protected] 8 points 1 year ago (5 children)

Literally in the abstract:

In this paper, we do not challenge this claim. Instead, we presuppose its plausibility in order to explore what ethical consequences follow from it.

And further in the introduction:

He has argued that, while animals probably lack the sorts of concepts and metacognitive capacities necessary to be held morally responsible for their behaviour, this only excludes them from the possibility of counting as moral agents. There are, however, certain moral motivations that, in his view, may be reasonably thought to fall within the reach of (at least some) animal species, namely, moral emotions such as “sympathy and compassion, kindness, tolerance, and patience, and also their negative counterparts such as anger, indignation, malice, and spite”, as well as “a sense of what is fair and what is not” (Rowlands 2012, 32). If animals do indeed behave on the basis of moral emotions, they should, he argues, be considered moral subjects, even if their lack of sophisticated cognitive capacities prevents us from holding them morally responsible.

But yes, I am fairly certain that no non-human animal has the mental faculties to be a true moral agent. Especially because this is something a significant chunk of humans struggle with, and no animal comes close to us in terms of abstract thinking and that kind of thing.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (12 children)

Non-human animals aren't moral agents. This is like basic stuff, c'mon.
