ech

joined 1 year ago
[–] [email protected] 1 points 5 months ago

Unless I'm misunderstanding it myself, fixed term means they had a set period of time that's not up for renewal. Ergo, when the term runs out, that's it. There's no chance of delinquency because there are no additional payments.

[–] [email protected] 3 points 5 months ago (2 children)

"Fixed Term" does not mean "delinquent". It just means there's a hard cut off. All I know is the snippet you posted, though, so maybe there's more to the situation.

[–] [email protected] 10 points 5 months ago (1 children)

But their question wasn't "Do humans deserve to go extinct?", it was "Can we survive?" Your (valid) issues with human-driven climate change don't really have anything to do with what they brought up.

[–] [email protected] 5 points 5 months ago (5 children)

That's not what I got out of that at all. It looks more like an errant setting made the account expire automatically after some amount of time, triggering the wipe.

[–] [email protected] 5 points 5 months ago

It's all serving the same base design, so the distinction seems a bit moot, imo.

[–] [email protected] 0 points 5 months ago (1 children)

So you don't see. Got it.

[–] [email protected] 0 points 5 months ago (3 children)

So people who don't want accounts should be forced to get one because they might be a bot (which, notably, can't do anything without an account), just so they can be filtered out and denied an account they didn't want in the first place? Do you really not see the issue with what you're saying?

[–] [email protected] 14 points 5 months ago (3 children)

What do you think asshole design means, exactly?

[–] [email protected] 5 points 5 months ago (5 children)

Not sure how forced account creation would help with either of those. Both are only problems when they already have accounts.

[–] [email protected] 5 points 5 months ago

Even saying they're guessing is wrong, as that implies intention. LLMs aren't trying to give an answer, let alone a correct answer. They just put words together.
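
To make that concrete, here's a minimal toy sketch of what "putting words together" means - weighted next-word sampling over a made-up probability table. Real LLMs are vastly larger and work over learned token probabilities, but the generation loop has the same shape: pick a likely next token, append, repeat. Note there's no truth-checking step anywhere:

```python
import random

# Made-up next-word probabilities, purely for illustration.
NEXT = {
    "the":  {"cat": 0.5, "moon": 0.3, "answer": 0.2},
    "cat":  {"sat": 0.7, "slept": 0.3},
    "moon": {"rose": 0.6, "set": 0.4},
}

def generate(prompt: str, steps: int = 5) -> str:
    tokens = prompt.split()
    for _ in range(steps):
        probs = NEXT.get(tokens[-1])
        if not probs:
            break  # no continuation in the toy table; stop
        words, weights = zip(*probs.items())
        # Pick the next word by likelihood alone - nowhere in this loop
        # is there an "is this true?" or "does this answer the question?"
        # check.
        tokens.append(random.choices(words, weights=weights)[0])
    return " ".join(tokens)

print(generate("the"))  # e.g. "the cat sat" or "the answer"
```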

[–] [email protected] 4 points 5 months ago

It wouldn't be anything specific. The disclaimers would just be overbroad stuff like "Please verify this answer. Google is not responsible for anything. Blah blah blah."

[–] [email protected] 13 points 5 months ago (4 children)

For the umpteenth time - an LLM just puts words together, it isn't a magic answer machine.
