Programmer Humor

[–] [email protected] 1 points 9 months ago (3 children)

This is all funny and stuff, but ChatGPT knows how long the German-Italian border is, and I'm sure most of you don't.

[–] [email protected] 10 points 9 months ago (1 children)

Nobody knows how long any border is if it follows natural boundaries: the measured length keeps growing as your ruler gets shorter (the coastline paradox). The only borders we know precisely are the post-colonial, perfectly straight ones.
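
(A minimal sketch of the effect, if you want to see it in code: measure the same made-up wiggly path with rulers of different lengths. The path is just a random walk, purely illustrative.)

```python
import math
import random

random.seed(42)

# A wiggly "border": a 10,000-step random walk in the plane.
points = [(0.0, 0.0)]
for _ in range(10_000):
    x, y = points[-1]
    angle = random.uniform(0, 2 * math.pi)
    points.append((x + math.cos(angle), y + math.sin(angle)))

def measured_length(path, ruler):
    """Divider method: walk the path, laying down one ruler at a time."""
    total, anchor = 0.0, path[0]
    for p in path[1:]:
        if math.dist(anchor, p) >= ruler:
            total += ruler
            anchor = p
    return total

for ruler in (100, 10, 1):
    print(f"ruler {ruler:>3}: measured length ~ {measured_length(points, ruler):,.0f}")
# The shorter the ruler, the longer the "border" comes out.
```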

[–] [email protected] 3 points 9 months ago* (last edited 9 months ago) (1 children)

Well, for non-adjacent countries, the answer is still straightforward.

[–] [email protected] 3 points 9 months ago (1 children)

Yes, I too can confidently state the precise length of the infamous Laotian-Canadian border.

[–] [email protected] 4 points 9 months ago (1 children)

I've tried, but ChatGPT won't give me an answer. So far, my personal record is Serbia - Iraq. If you find two countries that are even further apart but ChatGPT still gives you a border length, feel free to share a screenshot!
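
(If you want to automate the hunt, a rough sketch with the OpenAI Python client could look like this. The model name, prompt wording, and country pairs are assumptions for illustration, not what I actually ran.)

```python
# Hypothetical sketch: probe the model for border lengths between
# non-adjacent country pairs and see which ones it answers anyway.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

pairs = [("Serbia", "Iraq"), ("Laos", "Canada"), ("Germany", "Italy")]
for a, b in pairs:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[{"role": "user",
                   "content": f"How long is the {a}-{b} border, in km?"}],
    )
    print(f"{a} - {b}: {reply.choices[0].message.content}")
```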

[–] [email protected] 3 points 9 months ago

Thank you for your service, Sir - that made my day.

[–] [email protected] 9 points 9 months ago (3 children)

Apparently I have too much free time and wanted to check, so I asked ChatGPT exactly how long the border was. It could only come up with an approximate guess, and it had to search with Bing to confirm.

[–] [email protected] 8 points 9 months ago (2 children)

Here I am wondering why no one made the joke that the answer was not found (404), but ChatGPT assumed that was the answer 😂

[–] [email protected] 2 points 9 months ago

Presumably, we can expect 404 to eventually replace 42 as the answer to everything.

[–] [email protected] 2 points 9 months ago

lol, holy shit... I can't believe I didn't notice that.

[–] [email protected] 3 points 9 months ago

That's a number I never got. I got either 700-something km or 1000-something. Only sometimes does ChatGPT realize that Austria and Switzerland are in between, so there is no direct border.

[–] [email protected] 3 points 9 months ago

Google's AI gives it as:

The length of the German-Italian border depends on how you define the border. Here are two ways to consider it:

Total land border: This includes the main border between the two countries, as well as the borders of enclaves and exclaves. This length is approximately 811 kilometers (504 miles).

Land border excluding exclaves and enclaves: This only considers the main border between the two countries, neglecting the complicated enclaves and exclaves within each country's territory. This length is approximately 756 kilometers (470 miles).

It's important to note that the presence of exclaves and enclaves creates some interesting situations where the border crosses back and forth within the same territory. Therefore, the definition of "border" can influence the total length reported.

[–] [email protected] 2 points 9 months ago (1 children)

Make sure you ask the AI not to hallucinate because it will sometimes straight up lie. It’s also incapable of counting.

[–] [email protected] 2 points 9 months ago (1 children)

But where's the fun in it if I can't make it hallucinate?

[–] [email protected] 1 points 9 months ago (1 children)

I do feel bad when I have to tell it not to. Hallucinating is fun!

[–] [email protected] 1 points 9 months ago (1 children)

But does it work to tell it not to hallucinate? And does it work the other way around too?

[–] [email protected] 2 points 9 months ago (1 children)

It’s honestly a gamble based on my experience. Instructions that I’ve given ChatGPT have worked for a while, only to be mysteriously abandoned for no valid reason. Telling AI not to hallucinate is apparently common practice from the research I’ve done.
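
(In API terms, "telling it not to hallucinate" usually just means prepending a system message. A minimal sketch, with the instruction wording and model name as my own assumptions; whether the model obeys is, as noted, a gamble.)

```python
# Sketch: anti-hallucination instruction as a system message.
from openai import OpenAI

client = OpenAI()

reply = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[
        {"role": "system",
         "content": "Do not hallucinate. If you don't know, say you don't know."},
        {"role": "user",
         "content": "How long is the German-Italian border?"},
    ],
)
print(reply.choices[0].message.content)
```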

[–] [email protected] 1 points 9 months ago (1 children)

Makes me wonder: can I just ask it to hallucinate?

[–] [email protected] 2 points 9 months ago

Yep. Tell it to lie and it will.