Most of the time when people say they have an unpopular opinion, it turns out it’s actually pretty popular.

Do you have one that’s really unpopular and will most likely get you downvoted?

  • Hamartiogonic
    1 year ago

    I can see the convenience in the Wh unit, because hours are so common and people prefer to think in terms of hours.

    Joules and watts look nicer and are naturally fully compatible with each other. Non-technical people usually get very confused with kW and kWh, so switching to joules would create a clearer distinction between energy and power. Many energy production facilities also give their annual energy production in MWh per year, which mixes two time units in the same package!
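    The kW/kWh confusion above comes down to power vs. energy, and "MWh per year" is just average power in disguise. A minimal sketch of both conversions (function names are made up for illustration):

```python
# Energy vs. power, as plain SI units (illustrative numbers only).
KWH_TO_J = 3.6e6  # 1 kW sustained for 3600 s

def kwh_to_joules(kwh: float) -> float:
    """Convert kilowatt-hours (energy) to joules (energy)."""
    return kwh * KWH_TO_J

def annual_mwh_to_avg_watts(mwh_per_year: float) -> float:
    """Collapse the mixed 'MWh per year' figure into plain average watts."""
    seconds_per_year = 365 * 24 * 3600  # ignoring leap years
    joules = mwh_per_year * 1e3 * KWH_TO_J  # MWh -> kWh -> J
    return joules / seconds_per_year

print(kwh_to_joules(1))               # 3600000.0 J in one kWh
print(annual_mwh_to_avg_watts(8760))  # 8760 MWh/year = 1000000.0 W average
```

    Expressed this way, a plant's output is a single number in watts, with no hours-per-year mixing.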

    • PeterPoopshit@lemmy.ml
      1 year ago

      Let’s make a shitty unit trade. We’ll change all watt-hour units to joules in exchange for completely banning bits per second as a unit of bandwidth. Converting megabits per second to the actually useful unit of megabytes per second in my head is far more infuriating than any amount of joule shenanigans. Any takers?
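      The conversion being complained about is just a factor of 8, before protocol overhead. A quick sketch (helper name is hypothetical):

```python
def mbit_s_to_mbyte_s(mbit_per_s: float) -> float:
    """Convert link speed in megabits/s to megabytes/s (8 bits per byte)."""
    return mbit_per_s / 8

# A "100 Mbps" connection carries at most 12.5 MB/s of payload,
# and real-world throughput is lower once protocol overhead is counted.
print(mbit_s_to_mbyte_s(100))  # 12.5
```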

      • Hamartiogonic
        1 year ago

        Oh, I totally forgot about the chaos surrounding bits and bytes. Personally, I don’t really care which one we use, as long as it’s unified. Mixing and matching Mb/s for transfer and MiB for storage is truly infuriating. Yes, even the prefixes need to be unified.

        I don’t make programs in low-level languages, such as assembly, so I don’t really see the benefit of using base-1024 prefixes. If anyone here can convince me why binary prefixes are great, I’m listening. As far as I’m concerned, prefixes should be based on 1000, just like in SI.
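        The gap between the two prefix systems is what makes a drive "shrink" when the OS reports it. A small sketch of the decimal (SI) vs. binary (IEC) interpretations, with illustrative numbers:

```python
# Decimal (SI) vs. binary (IEC) prefixes for the same byte count.
SI = {"kB": 1000, "MB": 1000**2, "GB": 1000**3}
IEC = {"KiB": 1024, "MiB": 1024**2, "GiB": 1024**3}

size = 500 * 10**9  # a "500 GB" drive as marketed (decimal prefixes)
print(size / SI["GB"])    # 500.0 in decimal gigabytes
print(size / IEC["GiB"])  # ~465.66 GiB -- what some OSes then label "GB"
```

        The mismatch grows with the prefix: about 2% at kilo, 5% at mega, 7% at giga.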

        Having 8 bits in a byte is just another historical relic, and I see no reason to keep it. Systems have changed many times since the adoption of that term, and back in the early days a single character of text took exactly 8 bits. I guess that was important at the time, but why would it be today? Maybe the programmers out there can tell us why we still need two units for data.