Having the freezing point of water be at 0 instead of 32 just makes infinitely more sense.
Only if you're measuring water temps. In general it makes more sense to put the zero of your scale at absolute zero.
Celsius is also kinda arbitrary, but at least it pins its 0 and 100 to very fundamental, observable temperatures, namely the points where water freezes and boils. There are more constraints of course, in particular atmospheric pressure, and the modern definition of Celsius is actually purely based on the kelvin (which in turn is fixed via the Boltzmann constant), but as long as you're not high up in the Andes, anyone can observe a pretty good approximation of it.
Its prevalence is also the outcome of a long process involving many different scales. In 19th century Europe, before Celsius completely took over, Réaumur was also very popular. It set 0° at the freezing point of water and 80° at the boiling point under normal atmospheric conditions. Thinking about it, that's a pretty wonky choice, but at least it makes converting to and from Celsius easy (quick sketch at the end of this comment). On the other hand, the similarity of the two scales makes plausibility checks slightly harder.
I ran into this when researching the history of some stuff where the specific scale was not always given, and the temperatures in the particular context made sense as both Celsius and Réaumur. That's when you have to start digging through a whole 500-page early 19th century book printed in a German Gothic font just to see whether the specific temperature scale is mentioned anywhere.
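If it helps, here's a minimal Python sketch of that conversion. Since both scales put 0° at the freezing point of water, it's a pure ratio with no offset:

```python
def celsius_to_reaumur(deg_c):
    # Same zero point (freezing water); a 100 °C span equals
    # an 80 °Ré span, so the conversion is a pure 4/5 scaling.
    return deg_c * 4 / 5

def reaumur_to_celsius(deg_re):
    return deg_re * 5 / 4

# Water boils at 100 °C = 80 °Ré under normal atmospheric pressure.
assert celsius_to_reaumur(100) == 80
assert reaumur_to_celsius(80) == 100
```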
Fahrenheit’s 0 is the freezing point of water - salt water that is. Not that I think it’s better, just that there was some thought put into it.
It… isn't. That would vary wildly depending on which sea or ocean you got your saltwater from (more salt = lower freezing point).
It really is defined relative to a very specific brine mixture of ice, water, and ammonium chloride (in the most scientifically generous origin story; some say he literally just measured the coldest winter day he could). Well, except it isn't anymore, because like all US customary units nowadays it's defined against metric units (namely the kelvin, just as 0 °C is defined to be exactly 273.15 K).
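For concreteness, a minimal Python sketch of how that modern chain of definitions hangs together: 0 °C is exactly 273.15 K, and Fahrenheit is pinned to Celsius via 32 °F = 0 °C with a 9/5 ratio:

```python
def kelvin_to_celsius(k):
    # 0 °C is defined as exactly 273.15 K.
    return k - 273.15

def celsius_to_fahrenheit(deg_c):
    # 0 °C = 32 °F, and a 100 °C span equals a 180 °F span (ratio 9/5).
    return deg_c * 9 / 5 + 32

def kelvin_to_fahrenheit(k):
    return celsius_to_fahrenheit(kelvin_to_celsius(k))

# Water freezes at 273.15 K = 0 °C = 32 °F.
print(kelvin_to_fahrenheit(273.15))  # 32.0
# Absolute zero: 0 K = -273.15 °C = -459.67 °F (up to float rounding).
print(kelvin_to_fahrenheit(0))
```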