PSA: this is not me making fun of the US as a country, or its citizens. This is me making fun of the US unit system. Again.
So yeah, units, let's talk about that. I've already milked to death the fact that the US is one of three countries that don't follow the SI (the International System of Units), which is a major dick move for, you know, converting stuff and doing calculations and physics and all. I've also already talked about the fact that the US unit system is non-decimal. This means that one mile is 1,760 yards, one yard is 3 feet, one foot is 12 inches, and this goes on for every unit in the book (distance, volume, mass...). But something I've yet to talk about is how units are first defined.
When you want to define a unit (which was done a few centuries ago, remember that), you want to base it on something simple that can be easily reproduced (well, for SI units at least). For example, a unit of distance? Well, we're not gonna take something random, we're gonna take something that can be reproduced with simple rules, so that anyone in the world can easily find out how long a meter is. So a meter
is the length of the string of a pendulum
, if each swing of this pendulum takes one second (so a full period of two seconds)
. Boom, easy to do, not very random since you're basing it on the second, and fairly precise. Note that with time, those units got redefined with more precise measurements and stuff. Now, the meter is defined by the speed of light in a vacuum, but what matters here is how it was first defined. And this is true for every other unit. We have a distance, which means we have a surface
(m²) and a volume
(m³, 1L = 1dm³). Now that we have a volume, we can define a mass. One kilogram
is the mass of one litre of water
(at 4°C and 1 bar, who cares). You're noticing a pattern here? Centuries ago, we didn't have super precise stuff, so we defined units with easily reproducible systems that don't change much, for example pure water (and not liquid nitrogen, milk, or whatever random liquid you could find). To get back on topic, I guess anyone can understand that feet and inches and all that jazz were based on... well, the length of human limbs and body parts. Easy to use ("yeah, this is about one arm and two hands"), but as imprecise as can be. At first, every geographic region had a different foot, a different inch, and so forth. So yeah, kinda stupid.
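For the curious, the pendulum definition above can be sanity-checked with the small-angle period formula T = 2π√(L/g). A quick sketch (assuming standard gravity; the function and variable names are mine):

```python
import math

G = 9.80665  # standard gravity, m/s^2

def pendulum_length(period_s: float) -> float:
    """Length of an ideal pendulum with the given period, from T = 2*pi*sqrt(L/g)."""
    return G * (period_s / (2 * math.pi)) ** 2

# A "seconds pendulum" takes one second per swing,
# which means a full period of two seconds.
length_m = pendulum_length(2.0)
print(f"{length_m:.4f} m")  # → 0.9936 m
```

So the seconds pendulum gives you something close to, but not exactly, one meter, which is part of why the definition eventually had to be replaced by more reproducible ones.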
Now let's talk about the real reason why I'm making this post: temperature. Because making fun of Fahrenheit should be a national sport. First of all, the SI unit for temperature is the kelvin
, not the Celsius degree. But it's okay here, because kelvins are based off of Celsius degrees: a one-kelvin step is the same size as a one-degree-Celsius step, it's just that 0 K is the lowest temperature possible (-273.15°C), because having a unit that can go below 0 (like Celsius and Fahrenheit do) is a bit stupid when you're trying to apply science to it. So Celsius and Kelvin are pretty much the same, it's just that Celsius makes sense for humans and Kelvin makes sense for physicists. Anyways, how were Celsius degrees defined? To define a temperature scale
, you need three things: a low temperature, a high temperature, and how many units you have in between. Following the SI pattern of making things easy, the low Celsius
temperature is the freezing point of water
(at 1 atm), the high temperature is the boiling point of water
(at 1 atm still), and those were separated by 100 units
(water freezes at 0°C, water boils at 100°C). This might seem like a logical but imprecise measurement to most, but if you've had a course on thermodynamics or ever seen the phase diagram of water
(this one is in Celsius degrees and not Kelvin for easy reading), you will know that those are extremely precise and easy to measure. Now, how were Fahrenheit degrees defined? Hold on to your hats because this is gonna get hairy. You might recall something about the human body temperature or something like that being used, but it's actually a lot more complicated (and hilarious). When Fahrenheit
submitted his temperature scale, he said that the low temperature was the temperature of "a mixture of ice, water, and ammonium chloride" (Wikipedia), and the high temperature was "approximately the human body temperature" (Wikipedia still). But that's what he submitted
, now how he first defined it
. The low temperature was in fact the lowest air temperature measured in his hometown
(Gdańsk, Poland) during the 1708/1709 winter
, and the high temperature was the blood temperature of a horse
. I think you can understand now why he tried to make his submission "a bit more scientific". If it sounds like I'm pulling random stuff out of my ass, it's because that's pretty much what the thought process behind this scale was. But wait, there's more! Because yes, I did not mention where those two temperatures were on the scale. You may think 0°F and 100°F, like any normal human being, but no, this had to be bullshit as well. Because Fahrenheit (reminder that I'm talking about a physicist here) wanted the interval between the two to be divisible into 12
, but that wasn't enough, so each twelfth had to be divided into 8
. That means that the low temperature is 0°F, and the high temperature is 12×8 = 96°F. This, this right here, is the temperature scale the US is using. I mean, at this point I don't have anything else to say; I can just close it off there and call it a day. Because, seriously, what the actual fuck is this scale? A temperature scale defined by 0 being the temperature recorded in some small Polish town in January/February of 1709, and 96 being the blood temperature of one random horse.
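To make the comparison concrete, here are the standard conversions between the three scales (a quick sketch; the function names are mine), including where Fahrenheit's two original fixed points land in Celsius:

```python
def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

def celsius_to_fahrenheit(c: float) -> float:
    return c * 9 / 5 + 32

def celsius_to_kelvin(c: float) -> float:
    return c + 273.15

# Fahrenheit's original fixed points, in Celsius:
print(fahrenheit_to_celsius(0))    # the 1709 Gdańsk winter low: about -17.8 °C
print(fahrenheit_to_celsius(96))   # the horse's blood: about 35.6 °C

# Sanity checks against the Celsius fixed points:
print(celsius_to_fahrenheit(0))    # water freezes: 32.0 °F
print(celsius_to_fahrenheit(100))  # water boils: 212.0 °F
```

So the "scientific" anchors of the Fahrenheit scale sit at roughly -17.8°C and 35.6°C, which is about as arbitrary as it sounds.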
tl;dr I invite all of you to invent your own personal unit of temperature
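And if you want to take the tl;dr seriously: following the recipe from earlier (a low point, a high point, and a number of divisions in between), any temperature scale is just a linear map. A toy sketch, where the builder function and names are mine, and the Fahrenheit fixed points are given as their Celsius equivalents:

```python
def make_scale(low_c: float, high_c: float, divisions: float):
    """Build a Celsius-to-X converter where low_c maps to 0
    and high_c maps to `divisions`."""
    def to_scale(c: float) -> float:
        return (c - low_c) / (high_c - low_c) * divisions
    return to_scale

# Celsius itself: 0 = freezing water, 100 = boiling water, 100 divisions.
celsius = make_scale(0, 100, 100)
print(celsius(37))  # maps to itself: 37.0

# Fahrenheit: 0 = the 1709 Gdańsk winter low (about -160/9 °C),
# 96 = horse blood (about 320/9 °C), 12 * 8 = 96 divisions.
fahrenheit = make_scale(-160 / 9, 320 / 9, 96)
print(fahrenheit(0))    # freezing water comes out at 32 °F (up to float rounding)
print(fahrenheit(100))  # boiling water comes out at 212 °F (up to float rounding)
```

Pick any two reference temperatures and any number of divisions you like, and you too can have a scale as scientific as Fahrenheit's.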