• nybble41@programming.dev · 9 months ago

    The metric standard is to measure information in bits.

    Bytes are a non-metric unit: a byte is not a power-of-ten multiple of the metric base unit for information, the bit.

    If you’re writing “1 million bytes” and not “8 million bits” then you’re not using metric.

    If you aren’t using metric then the metric prefix definitions don’t apply.
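
    For concreteness, here's a minimal sketch in C (purely illustrative, not tied to any particular system) of the arithmetic behind that: a metric megabit, a decimal million bytes, and the binary 2^20-byte unit, all expressed in bits.

    ```c
    #include <inttypes.h>
    #include <stdio.h>

    /* Illustrative only: compare the metric megabit, a decimal
     * "million bytes", and the binary 2^20-byte unit, all in bits. */
    int main(void)
    {
        uint64_t megabit_bits       = 1000000ULL;           /* 10^6 bits        */
        uint64_t million_bytes_bits = 1000000ULL * 8ULL;    /* 10^6 bytes       */
        uint64_t mebibyte_bits      = (1ULL << 20) * 8ULL;  /* 2^20 bytes (MiB) */

        printf("1 Mbit     = %" PRIu64 " bits\n", megabit_bits);
        printf("10^6 bytes = %" PRIu64 " bits\n", million_bytes_bits);
        printf("2^20 bytes = %" PRIu64 " bits\n", mebibyte_bits);
        return 0;
    }
    ```

    It prints 1000000, 8000000, and 8388608 bits respectively; only the first is a metric quantity of the base unit.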

    There is plenty of precedent for the prefixes used in metric to refer to something other than an exact power of 1000 when not combined with a metric base unit. A microcomputer is not one-millionth of a computer. One million microscopes do not add up to one scope. Megastructures are not exactly one million times the size of ordinary structures. Etc.

    Finally: this isn’t primarily about bit shifting. It’s about computers being based on binary representation and the fact that memory addresses are stored and communicated using whole numbers of bits, which naturally leads to memory sizes (for entire memory devices and for smaller structures) that are powers of two. The fact that no one is going to do something as idiotic as adding an expensive, completely unnecessary division by a power of ten to every memory access just to get 1000-byte MMU pages instead of 4096-byte ones also plays a part.
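
    As a rough sketch of that trade-off (the 4096-byte page and the constants here are just illustrative, not any specific MMU's layout): splitting an address into a page number and an offset is a shift and a mask when the page size is a power of two, but a hypothetical 1000-byte page would need an integer division and a modulo.

    ```c
    #include <inttypes.h>
    #include <stdio.h>

    #define PAGE_SIZE_POW2 4096u                 /* 2^12 bytes */
    #define PAGE_SHIFT     12u
    #define PAGE_MASK      (PAGE_SIZE_POW2 - 1)  /* 0xFFF */

    int main(void)
    {
        uint32_t addr = 0x12345;

        /* Power-of-two page: page number and offset fall out of the
         * address bits directly, via a shift and a mask. */
        uint32_t page_pow2   = addr >> PAGE_SHIFT;
        uint32_t offset_pow2 = addr & PAGE_MASK;

        /* Hypothetical 1000-byte page: needs division and modulo. */
        uint32_t page_dec   = addr / 1000u;
        uint32_t offset_dec = addr % 1000u;

        printf("4096-byte page: page=%" PRIu32 " offset=%" PRIu32 "\n",
               page_pow2, offset_pow2);
        printf("1000-byte page: page=%" PRIu32 " offset=%" PRIu32 "\n",
               page_dec, offset_dec);
        return 0;
    }
    ```

    With a power-of-two page size the split is just selecting bit ranges of the address, which costs nothing in hardware; the decimal version would put a divider in the address path of every memory access.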