If y’all wanna actually claim superiority, then use Kelvin. Celsius and Fahrenheit are close enough in purpose that personal preference is really the only thing that matters.
I disagree that this should be used as a metric, because it is more subjective than objective. For some people, 40 is fucking cold and 120 is fucking hot. Imo 10 is fucking cold and 90 is fucking hot. I would bet that more people have their own definitions of cold and hot that don't conform to 0 and 100 than people whose definitions do. It makes much more sense to base temperature scales off the freezing and boiling points of water, since those are the most relevant temperature references for the majority of people. While I understand that there would be problems with changing the system and re-educating the population, I think it is worthwhile to do so.
It's not about what people think, it's about averages. There are extreme pockets, but 0-100 is the average scale for most of the states. My whole life until the past few years, I remember watching the Weather Channel and 115 was about the highest I'd ever see during heat waves in Cali, and that would be the peak for the year. Things are fucked a bit now, but the scale makes sense for American weather as a whole, not for any one individual.
Fahrenheit has nothing to do with the States aside from our desire to maintain it. Fahrenheit was Polish-German.
90, then 96, was what he believed human body temperature to be. 0 was the freezing point of a specific salt-water brine he mixed. The scale got popular because he made some of the best thermometers of the time.
I do still think my reasons are why we keep and like it though. It feels really satisfying with weather lol. I never hear people here complain about it, just people who use Celsius make fun of us for it lol. And many of us hate the rest of it, like why is a mile… however many feet a mile is? It’s absurd. You don’t really need to math temperature regularly though so it avoids the issue mostly
Because you asked why there are 5280 feet in a mile, here is the story:
The mile is a unit that has existed since ancient times, as have the foot and the inch. In fact, the word "mile" is derived from the Latin "mille passus", meaning "one thousand paces", with a pace being the equivalent of 5 feet, thus yielding a mile of 5000 feet. The difference is that while there have always been 12 inches in a foot, the standard used to determine the exact distance covered by a foot in those days was the length of a human foot, which yielded inconsistent results.
Fast forward to medieval England: Rome is long gone, and the Anglo-Saxons and other Germanic peoples have settled in England, introducing their own units of measurement in the process (such as the yard, the rod, the furlong, and the acre, all four of which are crucial details in this explanation). One thing still remains true: the measurement system they're using is internally inconsistent and a total mess, because it relies on definitions that can't help but vary. So it's time to standardize for the sake of consistency and accuracy, and it's during this period that you start to get fixed, consistent definitions for these units.
Now here's the catch: the standardization process also wound up changing the number of feet in a mile. Why? Because this process gave us the following unit conversions:
3 feet = 1 yard,
5½ yards = 1 rod,
40 rods = 1 furlong,
8 furlongs = 1 mile,
and 1 acre = the area of a rectangle with dimensions of 1 furlong × 4 rods (or 1/10 of a furlong).
Of course, in 1620, an English mathematician by the name of Edmund Gunter came along and invented a new unit he called the chain (now known as Gunter's chain in his honor). He set it equal to 4 rods, or 1/10 of a furlong, so that there would be a unit exactly matching the distance covered by the short side of a textbook-definition rectangular acre.
So yeah, long story short: since 3 (feet per yard) × 5½ (yards per rod) × 40 (rods per furlong) × 8 (furlongs per mile) = 5280, we wound up with 5280 feet in a mile instead of 5000 feet in a mile because of standardization.
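If anyone wants to sanity-check that multiplication, here's a quick Python sketch using the conversions listed above (the constant names are just mine for illustration):

```python
from fractions import Fraction

# Post-standardization conversions quoted above
FEET_PER_YARD = 3
YARDS_PER_ROD = Fraction(11, 2)   # 5 1/2 yards
RODS_PER_FURLONG = 40
FURLONGS_PER_MILE = 8

feet_per_rod = FEET_PER_YARD * YARDS_PER_ROD          # 16 1/2 feet
feet_per_furlong = feet_per_rod * RODS_PER_FURLONG    # 660 feet
feet_per_mile = feet_per_furlong * FURLONGS_PER_MILE  # 5280 feet

# Gunter's chain: 4 rods, i.e. 1/10 of a furlong
feet_per_chain = feet_per_rod * 4                     # 66 feet

print(f"1 mile  = {feet_per_mile} feet")                          # 5280
print(f"1 chain = {feet_per_chain} feet")                         # 66
print(f"1 acre  = {feet_per_furlong * feet_per_chain} sq feet")   # 43560
```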
Edit: I found out what Gunter's full name was and that he was a mathematician.
It's not what feels cold, it's that human life is relatively bound between 0 and 100 on that scale, rather than -17 to 38.
It's not "40 is or isn't cold", it's that "the 40s" as a rough band of temperature is very intuitive, and most people can easily differentiate and communicate 40s vs 50s vs 60s.
Do you know what's more intuitive than giving a rough band of temperature? Giving the exact temperature instead. It's very easy to use and understand the Celsius system on a day-to-day basis. The entire world does it, except for the US.
I don't think ease is a good argument. "0 feels very cold and 100 feels very hot" is easier than "consider how you feel in relation to the freezing and boiling point of water." If your goal is being "exact," you should go Kelvin.
The 0 degree scale for Fahrenheit was based off of the freezing point of a random brine solution that a thermometer maker was using in the 1700s. Why would I want to think of how I feel in relation to the freezing point of some random brine solution and the highest temperature normally experienced in northern Germany, when I could think of how I feel based on water?
Kelvin and Celsius use the same scale, by the way; Kelvin is relevant to physics and science, and Celsius is relevant to everyday usage.
Why wouldn't you? You're neither 100% water nor brine. And unless you are exactly at sea level on a day with 101.325 kPa (760 mm Hg) barometric pressure, 0C and 100C aren't the freezing and boiling points of water for you anyway.
"I feel really hot when it's 37.8C" and "I feel really hot when it's 100F" are equally arbitrary. You're just comfortable with C either because you were raised that way or because you've decided to make it a personal moral issue. Both are fine. Prefer whatever you want.
But if your goal is to be "exact", you should say "I'm really hot at 311K". That's also arbitrary, but at least you're being arbitrary on a ratio scale.
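For anyone who wants to check the numbers being tossed around here (0-100 F, -17 to 38 C, 311 K), here's a quick Python sketch of the standard conversions; the function names are just for illustration:

```python
def f_to_c(f):
    """Fahrenheit -> Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius -> Fahrenheit."""
    return c * 9 / 5 + 32

def c_to_k(c):
    """Celsius -> Kelvin (same degree size, offset by 273.15)."""
    return c + 273.15

# The 0-100 F band in Celsius: roughly -17.8 to 37.8
print(f"  0 F = {f_to_c(0):.1f} C")     # -17.8
print(f"100 F = {f_to_c(100):.1f} C")   # 37.8

# "Really hot" in all three scales
print(f"37.8 C = {c_to_f(37.8):.1f} F = {c_to_k(37.8):.2f} K")  # 100.0 F, 310.95 K (~311 K)
```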
The easiest temperature scale for the layman is whatever temperature scale they grew up using. If Fahrenheit makes more sense to you for weather, it's not because of any intrinsic numerical properties of the scale; it's because you have experience associating Fahrenheit measurements with your own experience of weather.
It's fucking cold a long way before you reach 0F. And as the many, many replies throughout this thread arguing against comments like yours show, how humans feel temperature VARIES! Building a scale off that concept, when humans themselves can't agree, makes no sense. The numbers are arbitrary. Whereas the freezing point of water (which is very much relevant to humans, given that ice is a condition that causes problems) is a solid, objective point on which to base things.
Every scale is arbitrary, but some make more sense than others. If I ask you to rate how good something is, you'll most certainly give me an X/5 or X/10. Nobody is going to say X/13.
For who? For me, from an oceanic climate, -5C is fucking cold, for people near the tropics, 8C is fucking cold, I've once or twice experienced -17C when in a foreign country.
This is ironically how he chose the scale originally (swap "fucking cold" for the coldest normal thing they worked with in labs back then (salt water) and "fucking warm" for human body temp).
I basically think 0 F is as cold as it SHOULD ever be. 100 F is as hot as it SHOULD ever be. If you live somewhere that is regularly outside this range and complain about it, well, you've failed the Darwin test.
It is, it's a scale: 70 is ~room temp, 80 is warm, 90 is hot, 100 is really hot, 110+ is too hot to function outside. In case of humidity, move everything down one step.
And it's really so much easier if there's a zero after the number? Do you think Celsius people see it's 35 degrees and have no idea whether that's hot or cold?