this post was submitted on 15 May 2025
475 points (98.0% liked)

Programmer Humor


(Please don't lob rocks at me. I love Python.)

[–] [email protected] 1 points 2 days ago

dusts off a commodore 64 well time to make my own chatgpt

[–] [email protected] 173 points 1 week ago (2 children)

To be fair, Python is just glue for code written in lower level languages when it comes to AI

[–] [email protected] 95 points 1 week ago (2 children)

A lot of it is C in a Python raincoat

[–] [email protected] 19 points 1 week ago (1 children)

Which can be ASM in a C raincoat

[–] [email protected] 23 points 1 week ago (1 children)

Which can be ASMR depending on pronunciation and tone of voice.

[–] [email protected] 15 points 1 week ago (2 children)

The underlying linear algebra routines are written in… FORTRAN.

[–] [email protected] 5 points 1 week ago (2 children)

I've never played with FORTRAN, but I've done some linear algebra with Matlab. Matlab was interesting for its native handling of matrices. What makes FORTRAN so good at linear algebra?

[–] [email protected] 7 points 6 days ago

Matlab's syntax for matrices actually derives from Fortran. There's a lot of flexibility in Fortran's array features for

  • multidimensional arrays
  • arrays of indeterminate and flexible length
  • vectorized operations on arrays without explicitly writing loops.

Because Fortran does not have pointers in the sense of C, the Fortran compiler is free to make several optimizations that a C compiler can't. Compiled Fortran is often faster than C code that does the same thing.
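
Most Python users meet this Fortran-style whole-array programming through NumPy, which borrows the same vectorized idiom and dispatches the heavy lifting to compiled BLAS/LAPACK routines (historically written in Fortran). A minimal sketch:

```python
import numpy as np

# Whole-array arithmetic with no explicit loops -- the same shape as
# Fortran's vectorized array syntax (e.g. `c = a * b + 1` on arrays).
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
c = a * b + 1.0             # elementwise, executed in compiled code

# Linear algebra operators ultimately call BLAS/LAPACK routines.
m = np.array([[2.0, 0.0], [0.0, 3.0]])
v = np.array([1.0, 1.0])
print(c.tolist())           # [5.0, 11.0, 19.0]
print((m @ v).tolist())     # [2.0, 3.0]
```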

[–] [email protected] 11 points 1 week ago (1 children)

the main thing that makes fortran preferable to C is the way it handles arrays and vectors. due to different pointer semantics, they can be laid out more efficiently in memory, meaning fewer operations need to be done for a given calculation.

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago) (1 children)

Interesting. Is this a fundamental limitation of C, or is it just preferable and easier to use FORTRAN when implementing it?

Meaning, could the same performance be achieved in C, but the most optimized libraries are already written, so why bother? Or can C not achieve the memory optimization at all?

[–] [email protected] 8 points 1 week ago* (last edited 1 week ago)

you can get the same performance by using the restrict keyword in C.

basically, C allows pointer aliasing while fortran does not, which means C programs need to be able to handle cases where a value might be accessed through multiple pointers. fortran programs don't, so a lot of accesses can be optimized into immediates, or unrolled without guards.

restrict is a pinky-promise to the compiler that no overlapping takes place, e.g. that a value will only be accessed from one place. it's basically rust ownership semantics without enforcement.
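
`restrict` itself can't be written in Python, but the aliasing hazard the comment describes can be sketched with two names for the same list (an illustrative analogy, not the C mechanism; the function and values are made up):

```python
def bump(dst, src):
    # dst[i] = src[i-1] + 1. A C compiler may only reorder or vectorize
    # this loop freely if it knows dst and src don't overlap -- exactly
    # the promise `restrict` makes.
    for i in range(1, len(dst)):
        dst[i] = src[i - 1] + 1
    return dst

a = [1, 1, 1, 1]
print(bump(list(a), a))  # no aliasing: [1, 2, 2, 2]
print(bump(a, a))        # aliased: reads its own writes -> [1, 2, 3, 4]
```

Same source text, two different results depending on whether the buffers overlap, which is why the compiler must emit the cautious version unless told otherwise.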

[–] [email protected] 5 points 1 week ago (1 children)

Does one even have to actually write Python code, except for frontends? I'd assume you just load the model, weights and maybe training data into pytorch/tensorflow.

[–] [email protected] 6 points 1 week ago

Doesn't seem to be the case, some popular servers:

And then of course talking to these servers can be in any language that has a library for it or even just handles network requests, although Python is a nice choice. Possibly the process of training models is more heavy on the Python dependencies than inference is, haven't actually done anything with that though.

[–] [email protected] 57 points 1 week ago (2 children)

Python-wrapped C, for the most part.
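
A minimal illustration of the raincoat: calling straight into libc from Python with `ctypes` (this sketch assumes a Unix-like system, where `CDLL(None)` exposes the symbols already linked into the interpreter; on Windows you'd load `msvcrt` instead):

```python
import ctypes

# On Linux/macOS, CDLL(None) hands back symbols already linked into
# the interpreter process, which includes libc.
libc = ctypes.CDLL(None)

# Declare the C signature: size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"raincoat"))  # 8
```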

[–] [email protected] 5 points 1 week ago

There's also a whole lot that's just C/C++ exposing a Python interface, without any wrapping.

[–] [email protected] 29 points 1 week ago (1 children)

It sure made sense forty years ago. And I'd bet that the examples in that book are more AI than today's LLMs.

[–] [email protected] 2 points 2 days ago* (last edited 2 days ago) (1 children)

The dominant approach at the time was expert systems. These used a lot of carefully crafted data and manually curated facts that the inference engine could use. They also fit in a MUCH smaller footprint compared to today's neural networks. But you also don't get real language processing, reasoning beyond the target problem domain, and stuff like that - they're laser focused and built on very small amounts of data. Much of the research from back then centers on Lisp and Prolog of all things, so BASIC isn't a big stretch.
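
The shape of such a system can be sketched in a few lines: curated facts, hand-written rules, and a forward-chaining inference engine that fires rules until nothing new can be derived. (Illustrative toy in Python rather than the era's Lisp/Prolog/BASIC; the facts and rules are made up.)

```python
# A toy forward-chaining inference engine in the expert-system style.
facts = {"has_fur", "gives_milk"}
rules = [
    ({"has_fur"}, "mammal"),
    ({"gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
]

def infer(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                  # keep firing rules until a fixpoint
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(sorted(infer(facts, rules)))  # "mammal" is derived; "carnivore" is not
```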

[–] [email protected] 1 points 2 days ago (1 children)

Prolog is even better suited for such applications.

[–] [email protected] 1 points 2 days ago* (last edited 2 days ago)

who tf even uses prolog anymore (said the one still using old basic, from when it still had line numbers and everything was goto all the way down)

this is very clearly a self deprecating joke btw

[–] [email protected] 28 points 1 week ago (2 children)

Every old timer knows AI is supposed to be written in Prolog.

[–] [email protected] 2 points 6 days ago

Love a language that doesn't care if you're using inputs to get outputs or using outputs to get inputs

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago) (2 children)
[–] [email protected] 1 points 4 days ago (1 children)

But there is one other, probably even more important advantage: Prolog is a programmer's and software engineer's dream. It is compact, highly readable, and arguably the "most structured" language of them all. Not only has it done away with virtually all control flow statements, but even explicit variable assignment too! These virtues are certainly reason enough to base not only systems but textbooks on this language.

The 90s certainly were a different time...

[–] [email protected] 1 points 4 days ago

I highly recommend learning the language. You learn to think about problems from an entirely different perspective, effectively working backwards from the solution, and once you wrap your head around it, it becomes the clear choice for certain applications such as expert systems.

[–] [email protected] 3 points 1 week ago

That book opening image is indeed telling

[–] [email protected] 27 points 1 week ago (1 children)

I am lobbing rocks at you because of that admission.

[–] [email protected] 2 points 6 days ago

Ikr. Python does kinda suck as a language.

[–] [email protected] 17 points 1 week ago (1 children)

don't blame me i voted for turbo pascal

[–] [email protected] 4 points 6 days ago

I loved turbo pascal. Anyone using that FOSS delphi equivalent thing?

[–] [email protected] 17 points 1 week ago

I have this one! It's probably at my folks' place. I'll definitely put it behind my chair so people can see it during video calls.

[–] [email protected] 9 points 1 week ago* (last edited 1 week ago)

Python hatched out of the egg on the cover.

[–] [email protected] 6 points 1 week ago (9 children)

Is Python not considered to be any good?

[–] [email protected] 3 points 5 days ago* (last edited 5 days ago) (1 children)

One tremendous strength of Python no one has mentioned is its vast ecosystem of high quality packages. It's not just the language features that speed up development, that ecosystem makes a huge difference.

Another (far more subjective) advantage is readability - when written according to Python's (actually quite opinionated!) style guidelines and general software engineering best practices, Python is also extremely readable, which really facilitates teamwork. FWIW, my software shop has transitioned away from JS to Python for most things these days for that reason, after seeing my work and code reviews.

I'm not some wizardly dev, to be clear, but I'm this shop's first senior dev specializing in Python. I write deliberately clean and readable Python and folks are really enjoying it - enough to voluntarily switch.

Performance is always listed as a Python drawback, and it's not untrue, it's just so overblown as a problem. It basically never causes me issues. Crucially, saving dev time is almost always the better choice compared to saving compute cycles. And I'd take that farther and say anyone junior enough to be wondering about Python and performance...is almost certainly working on tasks that Python is well suited to - better suited, than most other languages.

(Hopefully this was not too controversial, but I accept the risk of a flame war, as is tradition lol)

Edit: clarity

[–] [email protected] 2 points 5 days ago (1 children)

Very good explanation, thank you.

[–] [email protected] 1 points 5 days ago

Cheers friend!

[–] [email protected] 17 points 1 week ago* (last edited 1 week ago) (2 children)

...It's okay. I've programmed in far, far worse languages. ...It's got its advantages. It's got its problems. 🤷🏻‍♀️

Edit: If you need a serious answer: Much like BASIC, it's a language often used in teaching programming. In that sense, I guess it's much better than BASIC. You can, like, actually use it on real world applications. If you're using BASIC for real world applications in this day and age something has gone really wrong.

[–] [email protected] 10 points 1 week ago* (last edited 1 week ago)

Python is great, but it's so forgiving that it's easy to write garbage code if you're not very proficient and don't use the right tools with it.

The only objectively bad (major) thing against it is speed. Not that it matters much for most applications, especially considering that most number-crunching tasks will use libraries that have their critical paths written in a systems language:

numpy, pandas, polars, scikit-learn, pytorch, tf, spacy; all of them use another language to do the cpu intensive tasks, so it really doesn't matter much that you're using python at the surface.
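
The pattern those libraries share: express the computation once, let the compiled layer run the loop. A small sketch of the same calculation both ways (NumPy assumed installed, as in the list above):

```python
import numpy as np

data = list(range(1_000))

# Pure-Python loop: every iteration runs in the interpreter.
slow = sum(x * x for x in data)

# NumPy version: one expression, the loop runs in compiled C underneath.
arr = np.arange(1_000)
fast = int((arr * arr).sum())

print(slow == fast)  # True -- same result, very different cost at scale
```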

[–] [email protected] 7 points 1 week ago (1 children)

It's certainly not very fast

[–] [email protected] 1 points 6 days ago

Python itself might not be, but all the AI shit runs on GPUs so it's CUDA or OpenCL or whatever underneath

[–] [email protected] 7 points 1 week ago (2 children)

Python is phenomenal for prototyping IMO.

Once you need performance, its best to use another language (even partially).

But quickly banging out a concept, to me, is the big win for python.

[–] [email protected] 9 points 1 week ago

Once you need performance

If you need more performance. Many things just don't.

[–] [email protected] 7 points 1 week ago

But quickly banging out a concept, to me, is the big win for python.

For me the best language for quickly banging out a concept has always been the one I'm most familiar with at the moment.

[–] [email protected] 6 points 1 week ago (3 children)

It's okay, but it's a bit slow and dynamic typing in general isn't that great IMO.

[–] [email protected] 4 points 1 week ago

Dynamic typing is shit. But type annotation plus CI checkers can give you the same benefits in most cases.
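
A minimal sketch of that workflow: annotations are ignored at runtime, but a checker such as mypy (run locally or in CI) rejects code that violates them before it ever executes.

```python
# Type annotations don't change runtime behavior, but give a static
# checker something to verify.
def mean(values: list[float]) -> float:
    return sum(values) / len(values)

print(mean([1.0, 2.0, 3.0]))  # 2.0

# mypy would flag this call before it ever runs:
#   mean("oops")  # error: incompatible type "str"; expected "list[float]"
```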

[–] [email protected] 6 points 1 week ago

good is subjective; it depends on the opinions of the group.

objectively, Python is a smoldering pile of trash waiting to completely ignite. it does have one thing going for it though.

it's not JavaScript.

[–] [email protected] 6 points 1 week ago* (last edited 1 week ago)

Python is a tradeoff between ease of development and performance. If you do things the "normal" way (aka no Cython), your programs will often severely underperform compared with something written in a relatively lower-level language. Even Java outperforms it.

But, you can shit out a program in no time. Or so I've been told. Python is pretty far from the things I'm interested in programming so I haven't touched it much.

[–] [email protected] 4 points 1 week ago

Would it have been any less shitty if it had instead been written in assembly?
