BatmanAoD

joined 1 year ago
[–] [email protected] 9 points 4 months ago

I'm not totally clear on why signals are used here in the first place. Arguably most C code doesn't "need" to use signals in complex ways, either.

[–] [email protected] 28 points 4 months ago (2 children)

The trope will be "old" once the mainstream view is no longer that C-style memory management is "good enough".

That said, this particular vulnerability was primarily due to how signals work, which I understand to be kind of unavoidably terrible in any language.

[–] [email protected] 1 points 4 months ago (1 children)

Indeed, I had no idea there are multiple languages referred to as "APL".

I feel like most people defending C++ resort to "people shouldn't use those features that way". 😅

As far as I can tell, pointer arithmetic was not originally part of PASCAL; it's just included as an extension in many implementations, but not all. Delphi, the most common modern dialect, only has optional pointer arithmetic, and only in certain regions of the code, kind of like unsafe in Rust. There are also optional bounds checks in many (possibly most) dialects. And in any case, there are other ways in which C is unsafe.

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago) (3 children)

> True, but AFAIK they all sucked really bad.

That's pure assumption and, as far as I can tell, not actually true. PASCAL was a strong contender. No language was competitive with handwritten assembly for several decades after C's invention, and there's no fundamental reason why PASCAL couldn't benefit from intense compiler optimizations just as C has.

Here are some papers from before C "won", a more recent article about how PASCAL "lost", and a forum thread about what using PASCAL was actually like. None of them indicate a strong performance advantage for C.

[–] [email protected] 2 points 5 months ago (1 children)

I'm honestly not convinced JavaScript is good even for the front-end, because it's intentionally designed to swallow, ignore, and otherwise minimize errors, which isn't helpful for developing any kind of software. My point is that the only reason JavaScript dominates front-end development is that, prior to WASM, it was literally the only option; if that hadn't been the case, I doubt it would have become nearly so widely used.

[–] [email protected] 3 points 5 months ago* (last edited 5 months ago)

C++11 also introduced new problems, such as the strange interaction between brace-initialization and initializer-lists (though that was partially fixed several years later), and the fairly arcane rules around move semantics, which get minimal compiler support (for example, it would be great if the standard required compilers to emit an error when a moved-from object is accessed).

I know Lisp is minimal, I'm just saying that I expect there are Lisp fans who won't acknowledge (or would excuse) any shortcomings in the language, just as there are C++ fans who do the same for C++.

[–] [email protected] 1 points 5 months ago

Sounds like we're actually in agreement about most of this.

I'm okay with languages limiting their "expressive" power in order to provide stronger correctness guarantees or just limit how "weird" code looks, but that's largely because I've worked on many projects where someone had written a heap of difficult-to-understand code, and I doubt such limitations would be appealing if I were working strictly on my own.

I also don't really see the appeal of Java-style inheritance, but to be honest I didn't use Scala for long enough to know whether or not I agree that Scala does inheritance "right".

It does make sense that Rust provides mutability in some cases where Scala doesn't. Rust's superpower, enabled by the borrow checker, is effectively "safe mutability." I hope other, simpler languages build on this invention.

[–] [email protected] 2 points 5 months ago (1 children)

I don't really like the title either, but the article does demonstrate how unfortunate it is that we're effectively locked in to using the ABI at some level of nearly every piece of software.

That said, there definitely were languages with better type systems prior to the invention of C. Pascal is a frequently-cited example.

[–] [email protected] 3 points 5 months ago (3 children)

Sorry, I'm not sure what your point is. I realize that you can almost completely avoid JavaScript, but my point is simply that a real technical limitation constrains the choices developers can make for front-end code. WASM is making great strides in breaking down that barrier (something I've been thrilled to see, though it's going much more slowly than I had hoped), but the limitation is still there. Conversely, no such barrier has ever existed on the backend, except in the sense that C limits what all other languages can do.

[–] [email protected] 5 points 5 months ago (2 children)

Ehhh, I mean this more strongly. I've never met people more in denial about language design problems than C++ adherents. (Though admittedly I haven't spent much time talking to Lisp fans about language design.)

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago)

I see where you're coming from, but no matter how many null pointer exceptions there are in Java code, you're almost always protected from actually wrecking your system in an unrecoverable way; usually the program will just crash, and even provide a relatively helpful error message. The JVM is effectively a safety net, albeit an imperfect one. Whereas in C++, the closest thing you have to a safety net, i.e. something to guarantee that invalid memory usage crashes your program rather than corrupting its own or another process's memory, is segfaults, which are merely a nicety provided by common hardware, not required by the language or provided by the compiler. Even then, with modern compiler implementations, undefined behavior can cause an effectively unlimited amount of "bad stuff" even on hardware that supports segfaults.

Additionally, most languages with managed runtimes that existed when Java was introduced didn't actually have a static type system. In particular, Perl was very popular, and its type system is...uh...well, let's just say it gives JavaScript some serious competition.

That said, despite this grain of truth in the statement, I think the perception that Java is comparatively robust is primarily due to Java's intense marketing (particularly in its early years), which strongly pushed the idea that Java is an "enterprise" language, whatever that means.

[–] [email protected] 3 points 5 months ago* (last edited 5 months ago) (3 children)

This is a really good post about why C is so difficult to seriously consider replacing, or even to avoid by using a different language for certain projects: https://faultlore.com/blah/c-isnt-a-language/
