Note: Unfortunately, the research paper linked in the article is a dead/broken link. Perhaps the author will update it later.

From the limited coverage, it doesn't sound like there's an actual optical drive that uses this yet; it seems to be theoretical, based on the properties of the material the researchers developed.

I'm not holding my breath, but I would absolutely love to be able to back up my storage system to a single optical disc (even if tens of TBs go unused).

If they could make an R/W version of that, holy crap.

[–] [email protected] 3 points 8 months ago (3 children)

I never knew the whole thing was considered part of the metric system; makes sense, though.
I love the metric system to death because it's so simple and easy, and it links different measurements together (1 L of water = 1 kg, etc.).

That said, a computer works differently: because we work in powers of 2, 1000 bytes being a kilobyte makes no sense once you start working with bits and low-level stuff (sketched below). Other than that, I can see why the prefixes were redefined.

Also, I think Linux also works in units of 1024, but I'd need to check.
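
To put numbers on the 1000-vs-1024 gap, here's a minimal sketch (plain Python, nothing assumed beyond the standard prefix definitions) showing how far the decimal and binary units drift apart as they grow:

```python
# How far the decimal (SI) and binary prefixes drift apart as the unit grows.
# Purely illustrative; the values are exact by definition.
prefixes = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"]

for power, name in enumerate(prefixes, start=1):
    decimal = 1000 ** power          # kB, MB, GB, TB
    binary = 1024 ** power           # KiB, MiB, GiB, TiB
    gap = (binary - decimal) / decimal * 100
    print(f"{name:>9}: {binary:>16,} vs {decimal:>16,} bytes  (+{gap:.1f}%)")
```

The mismatch is only 2.4% at the kilo level but grows to roughly 10% by tera, which is why the discrepancy gets more noticeable as drives get bigger.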

[–] [email protected] 1 points 8 months ago (2 children)

There is nothing to keep you from using factors of 1024 (except the slightly ludicrous prefixes "kibi" and "mebi"), but other than low-level stuff like disk sectors or BIOS code, where you might want to use bit logic instead of division (sketched below), it's rather rare. I too started in the days when a division op was more costly than bit-level logic.

I'd argue that any user-facing application is better off with base 1000, except where convention dictates otherwise. The majority of users don't know, care, or need to care what bits or bytes do. It's programmers who like the beauty of the bit logic, not users. @[email protected]
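
A rough sketch of the two styles being contrasted here, assuming plain Python and made-up helper names: a power-of-1024 conversion that is just a bit shift, and a user-facing formatter that sticks to base 1000 as the SI prefixes promise.

```python
def to_kib(size_bytes: int) -> int:
    """Low-level style: dividing by 1024 is a bit shift, since 1024 == 2**10."""
    return size_bytes >> 10

def human_readable(size_bytes: float) -> str:
    """User-facing style: plain base-1000 units, matching the SI prefixes."""
    for unit in ("B", "kB", "MB", "GB"):
        if size_bytes < 1000:
            return f"{size_bytes:.1f} {unit}"
        size_bytes /= 1000
    return f"{size_bytes:.1f} TB"

print(to_kib(1_048_576))           # 1024 (KiB)
print(human_readable(1_048_576))   # 1.0 MB
```

The shift only works because 1024 is a power of two; for display, base 1000 keeps the number consistent with what the prefix literally means.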

[–] [email protected] 2 points 8 months ago (1 children)

I agree with what you said, and it's IMO why the debate between factors of 1000 and 1024 will always rage on. I'm a developer and do embedded stuff in my free time, so everything around me is factor 1024, and I hate factor 1000. But from a generic user's standpoint, I agree it's a lot more user-friendly, since they're used to the metric system and its factors of 10.

[–] [email protected] 1 points 8 months ago

It is user-friendly, and technically incorrect, since nothing ever lines up with reality when you use 1000, because the underlying system is binary (base 2).

Or you get the weird nonsense all over the place, like "my computer has 18.8gb of memory"...
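
For what it's worth, that kind of mismatch usually comes from a tool counting in base-1024 units while still labelling them "GB". A minimal sketch of the arithmetic (the 20 GB figure is just an example; the exact number a real machine reports also depends on reserved memory and what the tool actually measures):

```python
# Spec sheets count capacity in base-1000 GB; many tools count in base-1024
# GiB but still print "GB", so the number looks smaller than advertised.
advertised_gb = 20                      # nominal capacity, base 1000
size_bytes = advertised_gb * 1000**3

reported = size_bytes / 1024**3         # what a GiB-counting tool displays
print(f"{reported:.1f} 'GB'")           # 18.6 'GB'
```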