kogasa

joined 1 year ago
[–] [email protected] 1 points 1 year ago

Your broader point would be stronger if it weren't framed around what seems like a misunderstanding of modern AI. To be clear, you don't need to believe that AI is "just" a "coded algorithm" to believe it's wrong for humans to exploit other humans with it. But to say that modern AI is "just an advanced algorithm" is technically correct in exactly the same way that a blender is "just a deterministic shuffling algorithm." We understand that the blender chops up food by spinning a blade, and we understand that it turns solid food into liquid. The precise way in which it rearranges the matter of the food is both incomprehensible and irrelevant. In the same way, we understand the basic algorithms of model training and evaluation, and we understand the basic domain task that a model performs. The "rules" governing this behavior at a fine level are incomprehensible and irrelevant, and certainly not dictated by humans. They are an emergent property of a simple algorithm applied to billions to trillions of numerical parameters, in which all the interesting behavior is encoded in some incomprehensible way.

[–] [email protected] 1 points 1 year ago

in math, if you have a real and you round it, it's always a real not an integer.

No, that's made up. Outside of very specific niche contexts the concept of a number having a single well-defined type isn't relevant in math like it is in programming. The number 1 is almost always considered both an integer and a real number.

If we follow your mind with abs(-1) of an integer it should return a unsigned and that makes no sense.

How does that not make sense? The absolute value of an integer is always a nonnegative integer, so why couldn't abs return an unsigned int?

[–] [email protected] 1 points 1 year ago

What a genuinely unhinged take.

[–] [email protected] 2 points 1 year ago (1 children)

Complaining about a $20 purchase you don't have to make qualifies for "cheapskate" I think. Simply not purchasing it, or not wanting to purchase it, is fine. The difference is entitlement.

[–] [email protected] 1 points 1 year ago (3 children)

God forbid you ~~have to~~ can pay for stuff if you want.

It's a third party app. One of many. With an optional purchase to support the dev. Honestly...

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago)

What you just said is at best irrelevant and at worst meaningless. No, the fact that multiplication is defined in terms of addition does not mean that it is required or natural to evaluate multiplication before addition when parsing a mathematical expression. The latter is a purely syntactic convention. It is arbitrary. It isn't "accounting."

[–] [email protected] -1 points 1 year ago* (last edited 1 year ago) (2 children)

It is, in fact, completely arbitrary. There is no reason why we should read 1+2*3 as 1 + (2*3) instead of (1 + 2) * 3 except that it is conventional and having a convention facilitates communication. No, it has nothing to do with set theory or mathematical foundations. It is literally just a notational convention, and not the only one that is still currently used.

Edit: I literally have an MSc in math, but good to see Lemmy is just as much on board with the Dunning-Kruger effect as Reddit.
