this post was submitted on 09 Aug 2024
89 points (96.8% liked)

Technology

[–] [email protected] 19 points 3 months ago (2 children)

How in the fuck do you even coax software into using a key like that? Did someone just say "yeah just use the smallest size possible, that'll be okay" and then just like not care?

[–] [email protected] 21 points 3 months ago* (last edited 3 months ago) (2 children)

From the article:

In an email, a GivEnergy representative reinforced Castellucci’s assessment, writing:

In this case, the problematic encryption approach was picked up via a 3rd party library many years ago, when we were a tiny startup company with only 2, fairly junior software developers & limited experience. Their assumption at the time was that because this encryption was available within the library, it was safe to use. This approach was passed through the intervening years and this part of the codebase was not changed significantly since implementation (so hadn't passed through the review of the more experienced team we now have in place).
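For context on why an undersized key is game over (this sketch is mine, not from the article): an RSA-style modulus built from small primes can be factored directly, which hands the attacker the private key. A toy sketch in Python using Pollard's rho on a deliberately tiny semiprime, just to illustrate the failure mode at miniature scale:

```python
import math
import random

def pollard_rho(n: int) -> int:
    """Return a non-trivial factor of a composite n (Pollard's rho)."""
    if n % 2 == 0:
        return 2
    while True:
        x = random.randrange(2, n - 1)
        c = random.randrange(1, n - 1)
        y, d = x, 1
        while d == 1:
            x = (x * x + c) % n           # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n           # hare: two steps
            d = math.gcd(abs(x - y), n)
        if d != n:                        # cycle failed; retry with new c
            return d

# Deliberately tiny "RSA-style" modulus: two 20-bit primes.
p, q = 1_000_003, 1_000_033
n = p * q
factor = pollard_rho(n)
assert factor in (p, q)  # the private key falls out immediately
```

Real attacks on short real-world moduli use heavier machinery (e.g. the general number field sieve), but the principle is the same: below a safe modulus size, factoring is a matter of modest compute, not a barrier.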
[–] [email protected] 15 points 3 months ago (1 children)

So, it sounds like they don't have regular security audits, because that's something that would absolutely get flagged by any halfway competent sec team.

[–] [email protected] 4 points 3 months ago

No need for audits. It's only critical infrastructure embedded into tens of thousands of homes, lol.

[–] [email protected] 10 points 3 months ago

Yet another reminder that trust should be earned.

[–] [email protected] 9 points 3 months ago* (last edited 3 months ago) (1 children)

Because cryptography is specialized knowledge. Most curricula don't even include cryptography as a core topic in a Computer Science degree. Have a look at MIT's computer science curriculum: cryptography only appears in the elective Fundamentals of Computer Security (6.1600). That's also part of why DevOps became DevSecOps. It simply boils down to this: teaching and learning cryptography is hard, and it's still too much to expect a typical dev to implement cryptography correctly, even with a good library. Most don't know that compression and encryption don't mix well. Nor do they understand the importance of randomness, or that you must never use the same nonce twice. They don't even know that built-in string comparison (==) can't be used to verify password hashes, since it can leak timing information. Crypto lib devs who understand crypto add big scary warnings, and yet someone will still mess something up.
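The == point is concrete in Python's standard library: a minimal sketch (function and variable names are mine, for illustration only), assuming PBKDF2-hashed passwords:

```python
import hashlib
import hmac

def verify_password(stored_hash: bytes, salt: bytes, attempt: str) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", attempt.encode(), salt, 100_000)
    # == short-circuits at the first differing byte, leaking a timing
    # signal; compare_digest's runtime is independent of where (or
    # whether) the inputs differ.
    return hmac.compare_digest(stored_hash, candidate)

salt = b"per-user-random-salt"  # in practice: os.urandom(16), stored per user
stored = hashlib.pbkdf2_hmac("sha256", "hunter2".encode(), salt, 100_000)
assert verify_password(stored, salt, "hunter2")
assert not verify_password(stored, salt, "letmein")
```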

Still, I strongly support academics adding basic cryptography knowledge to the curriculum, like common algorithms, key lengths, future threats, and how fast the security landscape moves, for the sake of the future of cybersecurity.

[–] [email protected] 6 points 3 months ago (1 children)

Eh, I disagree. Cryptography really isn't something your average software engineer needs to know, as long as they understand that you should never roll your own crypto. If you teach it in school, most students will forget the details and potentially remember only some now-insecure practices from their classes.

Instead, we should be pushing for more frequent security audits. Any halfway decent security audit would catch this, and probably a bunch of other issues they have as well. Expect that from any org with revenue above some level.

[–] [email protected] 5 points 3 months ago* (last edited 3 months ago) (1 children)

At least have a few lessons so they remember never to roll their own crypto and to respect those scary warnings. That needs to be engraved into their minds.

I agree a security audit would catch this, but only after the fact. There is a need for a more preventative solution.

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago) (1 children)

Security audits should be preventative. Have them before any significant change in infrastructure is released, and have them periodically as a backup.

I had a cryptography and security class in college (I took the elective), and honestly, we didn't cover much that's actually relevant to the industry, and everything that was relevant was quickly outdated. That's not going to be the solution; we need a greater appreciation for security audits.

[–] [email protected] 1 points 3 months ago* (last edited 3 months ago)

At least teaching the concept of "don't ever do it yourself" won't hurt, and it won't get outdated anytime soon.

However, leaning on libraries alone can hurt security in the long term, as it puts the burden on the lib devs to maintain a foolproof design. They can burn out, quit, and leave a big vulnerability behind, since most devs won't touch the code again as long as it's still "working."

Cybersecurity is very important in today's digital landscape, and cryptography is one of its pillars. I believe it's essential for devs to learn the core principles of cryptography.

Again, audits are nice, and you can apply them at various points, but they're not a silver bullet. An audit is just a tool, and it can't replace proper education. People are often ignorant: an audit can generate any number of warnings, but it's people who have to take corrective action, and they can ignore it, or be pressured to ignore it, unless it's part of a compliance certification whose failure would put them out of business. Otherwise, most managers are "Why would I care? That costs more."