this post was submitted on 28 Mar 2024

Programmer Humor

[–] [email protected] 52 points 7 months ago* (last edited 7 months ago) (26 children)

KiB, MiB, GiB, etc. are clearer. It makes a big difference, especially 1 TB vs 1 TiB.
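For concreteness, the gap between the decimal and binary prefixes grows with each step up the scale; at tera it's already about 10% (a quick sketch):

```python
TB = 10**12   # 1 TB  (metric/SI: tera = 10^12 bytes)
TiB = 2**40   # 1 TiB (binary: tebi = 2^40 bytes)

gap = TiB - TB
print(gap)                 # 99511627776 bytes, i.e. ~99.5 GB "missing"
print(f"{gap / TB:.1%}")   # 10.0% difference at the tera scale
```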

The American way would probably be to keep using the units you listed while still meaning 1024, just to be confusing.

Either that, or maybe something that uses a physical measurement of a hard drive (or CD?) by length. Like, that new game is 24.0854 inches of data (maybe it could be 1.467 miles of CD?).

[–] [email protected] 37 points 7 months ago (19 children)

The American way would probably be to keep using the units you listed while still meaning 1024, just to be confusing.

American here. This is actually the proper way. KB is 1024 bytes. MB is 1024 KB. The terms were invented and used like that for decades.

Moving to 'proper metric' where KB is 1000 bytes was a scam invented by storage manufacturers to pretend to have bigger hard drives.

And then inventing the KiB prefixes was a soft-bellied capitulation by Europeans to those storage manufacturers.

Real hackers still use Kilo/Mega/Giga/Tera prefixes while still thinking in powers of 2. If we accept XiB, we admit that the scummy storage vendors have won.

Note: I'll also accept that I'm an idiot American and therefore my opinion is stupid and invalid, but I stand by it.

[–] [email protected] 5 points 7 months ago (14 children)

No, the correct way is to use the proper fucking metric standard. Use Mi or Gi if you need it. We have computers that can divide large numbers now; we don't need bit shifting.
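For instance, formatting a byte count in metric (SI) units is just repeated division by 1000; nothing about it needs powers of two. A minimal sketch (the helper name `si_size` is made up for illustration):

```python
def si_size(n: int) -> str:
    """Format a byte count using metric (SI) prefixes: 1 kB = 1000 B."""
    for unit in ("B", "kB", "MB", "GB", "TB"):
        if n < 1000:
            return f"{n:.0f} {unit}" if unit == "B" else f"{n:.1f} {unit}"
        n /= 1000  # plain base-10 division, no bit tricks
    return f"{n:.1f} PB"

print(si_size(1_500_000))    # 1.5 MB
print(si_size(10**12))       # 1.0 TB
```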

[–] [email protected] 1 points 7 months ago (1 children)

Hey, how is "bit shifting" different than division? (The answer may surprise you.)

[–] [email protected] 5 points 7 months ago (1 children)

Bit shifting only works if you want to divide by a power of 2.
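A quick illustration: shifting right by n is integer division by 2**n, which is exactly why it can't give you round metric units:

```python
x = 1_000_000

assert x >> 1 == x // 2        # shift by 1  = divide by 2
assert x >> 10 == x // 1024    # shift by 10 = divide by 2**10

# It does NOT help with non-power-of-2 divisors:
print(x >> 10)    # 976   (1 MB is 976 KiB, rounded down)
print(x // 1000)  # 1000  (1 MB is exactly 1000 kB)
```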

[–] [email protected] 2 points 7 months ago (1 children)

Interesting, so does the computer have a special "base 10" ALU that somehow implements division without bit shifting?

[–] [email protected] 4 points 7 months ago (1 children)

In general integer division is implemented using a form of long division, in binary. There is no base-10 arithmetic involved. It's a relatively expensive operation which usually requires multiple clock cycles to complete, whereas dividing by a power of two ("bit shifting") is trivial and can be done in hardware simply by routing the signals appropriately, without any logic gates.

[–] [email protected] 1 points 7 months ago

In general integer division is implemented using a form of long division, in binary.

The point of my comment is that division in binary IS bit shifting. There is no other way to do it if you want the exact answer. You can estimate, you can round, but the computational method of division is done via bit shifting of binary expansions of numbers in the ALU.
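That shift-and-subtract ("restoring") long division can be sketched like this for non-negative integers; this is a rough model of what a hardware divider iterates, not any particular CPU's implementation:

```python
def divide(dividend: int, divisor: int) -> tuple[int, int]:
    """Binary long division via shift-and-subtract.

    Returns (quotient, remainder) for non-negative dividend, positive divisor.
    """
    assert dividend >= 0 and divisor > 0
    quotient, remainder = 0, 0
    # Walk the dividend's bits from most significant to least significant.
    for i in range(dividend.bit_length() - 1, -1, -1):
        remainder = (remainder << 1) | ((dividend >> i) & 1)  # bring down next bit
        quotient <<= 1
        if remainder >= divisor:        # the "subtract" step of long division
            remainder -= divisor
            quotient |= 1
    return quotient, remainder

print(divide(1_000_000, 1000))  # (1000, 0)
print(divide(1_000_000, 1024))  # (976, 576)
```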
