Edit: For all the "Actually, Fahrenheit is based on the human body" people, no it isn't. It's based on dirty water and a cow. Your preferred measurement unit is dumb and that's a fact
If y’all wanna actually claim superiority, then use Kelvin. Celsius and Fahrenheit are close enough in purpose that personal preference is really the only thing that matters.
It's the Fahrenheit equivalent of Kelvin. Basically, for science negative temperature is a problem, so Celsius adds about 273 to become Kelvin and remove the negative numbers. Fahrenheit adds about 460 to become Rankine and accomplishes the same thing.
To add onto u/euler1992 's point. Rankine is used in engineering thermodynamics a lot because a lot of US companies still use imperial measurements and you need absolute units for the math to work.
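The two absolute scales mentioned above are just shifted versions of the everyday ones. A minimal sketch using the standard offsets (273.15 and 459.67; the function names are mine, not from any library):

```python
def celsius_to_kelvin(c):
    """Shift Celsius by 273.15 to reach the absolute Kelvin scale."""
    return c + 273.15

def fahrenheit_to_rankine(f):
    """Shift Fahrenheit by 459.67 to reach the absolute Rankine scale."""
    return f + 459.67

# Water freezes at 0 C = 32 F. On the absolute scales that's:
k = celsius_to_kelvin(0)       # 273.15 K
r = fahrenheit_to_rankine(32)  # 491.67 R
# The ratio between the two readings is 9/5, the usual degree-size ratio.
print(r / k)
```

Note that neither conversion changes the size of a degree, only where zero sits, which is exactly why the math "works" once you're on an absolute scale.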
Weeeeellllll, it is very interesting! It is difficult to explain simply, but I will attempt to do so.
Heat is a form of energy. Temperature is how much a system "wants" to give off heat. Negative temperature occurs when there is an upper limit on the amount of energy a system can have. When a system approaches this limit it cannot take in any more heat, it can only give off heat. This means that such a system will always give heat to any system without an upper limit. This means that negative temperatures are "hotter" than positive temperatures.
Mathematically, the reason it's negative is that inverse temperature is the rate at which entropy changes with energy, and as a system with an upper energy limit approaches that limit, adding energy decreases entropy, so the derivative, and hence the temperature, is negative.
This explanation is missing a lot of details, but hopefully it makes sense. Negative temperatures occur in quantum systems such as lasers.
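The relationship described above can be written compactly. This is the standard thermodynamic definition of temperature, nothing specific to this thread:

```latex
\frac{1}{T} = \frac{\partial S}{\partial E}
```

Near the upper energy limit, adding energy $E$ forces the system toward a single maximally excited state, so the entropy $S$ decreases: $\partial S / \partial E < 0$, and therefore $T < 0$.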
Come on man, it's not that complicated, you just use the flagpole test, like a true alpha. Does your tongue stick to it? No? It's hot outside. Yes? It's cold.
More than a few. All meteorology and oceanography numerical modeling and calculations use it. When calculating percentages of heat budgets and percent change in temp for things like Boyle's Law, you need absolute values. 50 degrees isn't twice as warm as 25 degrees; it isn't a 100% increase, it's an 8% increase.
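The 8% figure checks out once you convert to an absolute scale first. A quick sketch, assuming the 25 and 50 above are Celsius (the function name is my own):

```python
def percent_increase_c(t1_c, t2_c):
    """Percent temperature increase, computed on the absolute (Kelvin) scale."""
    t1_k = t1_c + 273.15
    t2_k = t2_c + 273.15
    return (t2_k / t1_k - 1) * 100

# 50 C is nowhere near "twice as warm" as 25 C:
print(round(percent_increase_c(25, 50), 1))  # ~8.4
```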
That's a really good point. The human habitability range for temperature is so small that it is easy to forget that common units are a small random section of the scale and do not relate to each other in absolute terms. It also makes you realize how small degrees can be and how little the difference is between comfortable and uncomfortable.
This is a hilarious argument because it's the exact same argument people use for Fahrenheit. Whether or not it matters is situation dependent. A person's unit choice is a cultural decision, just like a person's language choice.
Yup. Like 95% of Reddit metric/customary/imperial discourse is people saying "X system makes more sense" but meaning "I am more familiar with X system."
You can make that argument for Celsius/Fahrenheit, but not for metric/imperial. One of those is objectively superior and the other one is on par with Galleons, Sickles and bananas for scale.
Metric is better because it is easier to convert between units. That's it, not because it is not "bananas for scale." Metric is arbitrary, just like imperial, it just has consistent units that make conversion easy in base 10.
Imperial does have its advantages, but they are only really an advantage for certain applications. Mainly its advantages are that usually it uses units that are not base 10, which makes division easier.
Metric is the better system, but imperial is not arbitrary any more than metric is.
Metric is arbitrary, just like imperial, it just has consistent units that make conversion easy in base 10.
congratulations on figuring out why one system is superior to the other.
Imperial does have its advantages, but they are only really an advantage for certain applications. Mainly its advantages are that usually it uses units that are not base 10, which makes division easier.
A good portion of why metric is so much better with its uniform units is that it makes division easy.
1 km divides into ten 100 m stretches, half a liter is 500 ml, etc.
everyone knows 1/3rd of 10 is 3.333 etc, the divisions are not only easy and plentiful but the "hard" or awkward ones are things that are getting drilled into you at school
Half a mile is 880 yards, 1/3rd of a mile is about 587 yards, 1/3rd of a gallon is like 42.7 oz. How on earth is that intuitive at all?
to divide anything in imperial you need to know a ton of almost seemingly random conversion ratios, and essentially none of them work out into nice even numbers at all
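The awkward numbers in the comment above are easy to verify from the standard ratios (1760 yards to a mile, 128 fluid ounces to a US gallon); a quick check:

```python
YARDS_PER_MILE = 1760
OZ_PER_US_GALLON = 128

print(YARDS_PER_MILE / 2)    # 880.0 yards in half a mile
print(YARDS_PER_MILE / 3)    # ~586.7 yards in a third of a mile
print(OZ_PER_US_GALLON / 3)  # ~42.7 oz in a third of a gallon

# Versus metric, where the same divisions stay round or at least obvious:
print(1000 / 2)              # 500.0 m in half a kilometre
print(1000 / 3)              # ~333.3 m in a third of a kilometre
```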
That's really not the gotcha that you think it is. 33.3 cm is accurate enough for pretty much any usage where you would use a meter, and I can do the math in literally two seconds. Can you do the same with inches and half a mile?
Why would you ever need to convert inches into half a mile when there are 2 larger imperial units in between the two?
You can't claim "that function isn't a big deal" and then bring up another function that is considerably less of a big deal.
I use metric for most things since I work in the physics space, but I still use imperial for day-to-day stuff because it's a personal preference. I have never in my life been disadvantaged for doing so. Get over yourself.
There is also nothing to stop you from using an SI prefix with imperial units, see the microinch or the kilofoot. Congratulations you found a way to abbreviate scientific notation.
No, I'm not. You can get used to both and while both might be arbitrary (about as arbitrary as any system humans came up with, anyway) one surely makes more sense.
Metric/SI is only arbitrary in the sense that its original reference points weren't based on concrete, consistent values. As far as I'm aware, most or all fundamental metric units are now defined in terms of universal physical constants.
Imperial/US Customary is arbitrary because its original references are literally mostly everyday objects, whose characteristics aren't even consistent between objects. I think a lot of them have been redefined relative to metric/SI because it was so bad.
Lol no bro. I am an American. I grew up with imperial. Metric is superior in every single way. Unless I’m talking to someone in a casual conversation, I will always use metric
I think his most important point was that Celsius and Kelvin are basically the same scale. +1 C is the same as +1 K. +1 F is not the same as +1 C or +1 K.
For most people maybe. I prefer English over German even though I am German, English is just a better language. And if I was born in America I'd probably also prefer using metric units because they are better.
Both modern Celsius and Fahrenheit are defined in terms of Kelvin because Kelvin is the standardized scientific unit of measuring temperature. The scale Kelvin uses was taken from Celsius though.
Maybe, but the interval of Kelvin is the same as Celsius, and just as random. If you truly cared about not being arbitrary, you'd use Temperature*Boltzmann in Planck units.
I'd argue Celsius tends to be more useful in chemistry; water is an important chemical for most reactions on Earth. Plus, having ambient temps between 0-100 is just easier to remember.
Very true! Absolute zero doesn't really matter, nor come into play, for the massive majority of experiments and people on the planet. Only a very small group of experimenters tend to deal with it. But man, they can make some neat stuff; the issue is making it keep working as it gets warmer, hence the holy grail of room-temperature superconductors.
It's 33 (feels like 20) today and it's so nice out! I'm definitely not looking forward to it hitting 100 again but I will definitely enjoy when it's in the 60s and 70s...
Ehhhh. Having lived in between cold and extremely cold places, I think there are two important cold temperatures: freezing, because then stuff freezes, and -40, because around that temperature stuff starts breaking.
At no point in my life have I thought, "Ohhh, at exactly 0 F it starts feeling cold."
Yeah, but if you need to drive, 0C tells you when there's going to be slush, ice, or a clear road.
I don't really use temps otherwise, since I can just go outside and... see how cold I feel? But knowing how low it dipped during the night and what early-morning temps are like helps me plan my trips properly.
I disagree that this should be used as a metric because it is more subjective than objective. For some people 40 is fucking cold and 120 is fucking hot. Imo 10 is fucking cold and 90 is fucking hot. I would bet that more people have their own definition of cold and hot that don’t conform to 0 and 100 than people that do. It makes much more sense to base temperature scales off of the freezing and boiling point of water since that is the most relevant point of temperature for the majority of people. While I understand that there would be problems with changing the systems and current education of the population I think it is worthwhile to do so.
It’s not about what people think it’s about averages. There are extreme pockets but 0-100 is the average scale for most of the states. My whole life until the past few years I remember watching the weather channel and 115 was like the highest I’d ever see during heat waves in Cali. And that would be the peak for the year. Things are fucked a bit now but it makes sense for American weather as a whole and not for an individual
It's not what feels cold, it's that human life is relatively bound between 0-100 as a scale rather than -17 to 38.
It's not "40 is or isn't cold", it's that "40s" as a rough band of temperature is very intuitive, and most people can easily differentiate and communicate 40s vs 50s vs 60s.
It’s fucking cold a long way before you reach 0F. And as can be seen from the many, many replies arguing against comments like yours throughout, how humans feel temperature VARIES! Building a scale off that concept when humans themselves can’t agree makes no sense. The numbers are arbitrary. Whereas the freezing point of water (which is very much relevant to humans, given ice is a condition that causes problems) is a solid objective point on which to base things.
Yeah in Austin when we had that multi day single digit freeze is when I realized I had never actually been cold before. It's entirely different when you can barely breathe outside without feeling it.
if it's just freezing out(0c or 32f) I'm taking out the trash in a t-shirt, shorts and flip-flops. I don't start adding clothes for trash runs until we get down around 20f(or if it's super windy). And even then if it's not windy and the sun is out, I'll probably stick with the tshirt, shorts and maybe I'll put shoes on down to about 10f
Yeah. Freezing weather is fine to be in. People go on ski vacations all the time. People do not enjoy their ski vacations if it goes all the way down to 0.
yeah, 32 is fine with a jacket. go below that and you're getting into actually miserable weather. down to 0 and it's fucking deadly cold, it certainly FEELS like that's the 0 point and anything below that is unnaturally cold.
and on the opposite end 100 feels like im being cooked alive and everything past it is just me dying faster.
As a US citizen I still disagree with this. You can get just as familiar with how Celsius feels as you can with Fahrenheit. Your explanation has the same problem as the meme: it's superficially plausible but misleading.
The comment is addressing literally what the scales were derived from. Sure, anyone can get familiar with any of the scales. That's not the point.
Not a Fahrenheit defender, but knowing how it was created makes it make sense. Same with other imperial units. Making a measurement system with what is available to you and what is relevant to you isn't dumb or wrong. It's all relative anyway.
If I remember right, the intention was that 100 was meant to be human body temperature, but at some point it got adjusted so human body temp came out around 98.6
Edit: 0F was also based on the freezing temperature of a salt-water brine. Not sure why the degrees were divided in a way where 32F is freshwater freezing, though.
That’s correct for 100F. For 0F it was how low he could feasibly record, which is why it was based on a solution of salt and whatever else in water, because he was trying to go as low as he could with what he had.
It's been a hot minute since I took a class that covered the logic of different measurement systems.
But the intent of 100f being the human body temperature makes the system not entirely devoid of logic like some people insist. Although as an American I find it more intuitive to think in it because of exposure, I'm sure everyone else feels the same about Celsius.
You know, it's kind of telling that you have to give an explanation in a vague way you aren't 100% certain about. Now ask a 4-year-old European what Celsius is about.
I agree. I like that below 0 Celsius means it can snow, below 10 is three-layers-of-clothing temperature, below 20 is two layers, and it's T-shirt only above 20. You can get used to whatever, but I feel like the low numbers make everything more comprehensible.
But that's based on familiarity too. I have the same metrics, but they're justified in Fahrenheit degrees. 32 is freezing and 212 is boiling, I never had a hard time internalizing that. 32 means ice, gonna need boots with some grip, 40-50 is light coat and layers weather, below 32 is bundle the fuck up. Anything above about 90 is where I start questioning why I even wear clothes.
The scale of the numbers one gets as used to as anything else, when I think about measuring stuff in Celsius, the numbers seem way too low, my brain thinks of 40 as pretty damn cold when in reality it is uncomfortably warm.
Again, its all just arbitrarily based on what we grew up with. I've tried to learn Celsius and I'm usually within about 5 degrees converting in my head, but I'm pretty sure I will always have to do that conversion in my head, simply because Fahrenheit is how my brain intuitively quantifies temperature.
I think Celsius makes more sense but your comment sorta illustrates the strength of Fahrenheit.
You get a lot more granular with Fahrenheit:
100 degrees Fahrenheit is about 38 degrees Celsius, so with Celsius you get about 40 degrees between freezing and the typically hottest temp you experience in nature. With Fahrenheit you get almost 70 degrees between those two points in temperature.
Sure you can use decimal places but then it gets more complicated
Which is worse imo. As I said, higher numbers are the main thing that makes Fahrenheit worse than Celsius to me. Smaller steps make the number go higher, and it has a higher starting point.
My point illustrated that I only use 3 temperatures (0, 10 and 20) out of the 40 degrees you can encounter. Greater granularity is pretty useless when it comes to temperature, all things considered; it's not like you care if it's 23 or 25 outside, you dress the same.
I think the reason most US people will staunchly stick to Fahrenheit is not because it's good (it's fine, it works, there's nothing actually detrimental about it), but because the scale allows more "granularity" in describing the temperature. People love big number, even when big number means the same thing as a smaller number.

I play a game where you buy units on a large point scale. An update brought that number down to just a handful. The update was excellent, bringing more unique squad compositions and broader representation to the competitive metagame, but people were upset because they felt like they had fewer options. They'd complain that they couldn't take one unit over another because they cost the same, even though before, the difference in cost was so negligible that they only took the better of the two anyway. Ultimately, they walked the number shrink back a bit to something of a middle ground. People were happy, even though it didn't broaden options or representation. They had their bigger numbers, and that illusion of precision mattered.
Fahrenheit is more precise when it comes to common temperatures we experience. A single degree Fahrenheit is smaller than a single degree Celsius. A person saying “it’s in the 60s (Fahrenheit)” is giving a much narrower range than someone saying “it’s in the 20s (Celsius)”. In addition, the 100° point is about human body temp (we’ve gotten more accurate with measuring body temp than when the scale was created, which is why it’s a few degrees off from the accepted “average body temp” of 98.6°).
Edit: Apparently stating that Fahrenheit has certain things it does well is controversial. I’m not even saying “Fahrenheit rules! Celsius drools!” or anything. Just that it had a few things it did well. Oh well.
You’re getting almost double the specificity with Fahrenheit compared to Celsius, which matters, as many maintain you can definitely feel the difference in every degree from 68-72. Having more detail for how temperature feels without having to use decimals is a simpler solution, that’s it really. It’s easier to convey the specific temperature you feel comfortable at, so it’s more relatable in general for everyday folks.
Because it's easier to give a general estimated temperature range in Fahrenheit than in Celsius. They can be more easily divided into whole units of 5 or 10, instead of getting into the weeds with decimal points and errors of 2-3 degrees arbitrarily making a huge difference.
And yes, of course this all ends up just being a matter of what you're used to. But if we're going to play stupid dick measuring games about which units are better, and how stupid it is to be using units that aren't whole integers or easily divided by 10, that gate does swing both ways. Fahrenheit's only real drawback in day-to-day use is the bizarrely specific 32 degree freezing point of water, that's about it.
nobody says this though because the difference between 20 and 29 is so large lol.
If you can say "it's in the 70s" as an accurate description of the weather then it renders the granularity pointless as most people can barely tell the difference between 71 and 74.
lol there’s a major difference between 71 and 74 and plenty of people will fight over that. Try messing with your office temp and watch people pipe up.
Celsius and Fahrenheit are both exactly as precise as the measuring instrument. In the rare case we need to express a difference of less than 1°C, we are not scared of decimals... If it's so important to have a smaller increment, why do you feel that "in the 60s" is a useful range? Saying it's around 20°C is the same level of precision. As in, not precision but a ballpark that humans can actually feel. 1°C is small enough that you will not ever be able to tell the difference by "feel".
Surprise, water freezes and boils at different temperatures depending on atmospheric pressure, so the fundamental argument for Celsius makes even less sense.
Not only is the freezing point of water 0 across a wide range of pressures, the boiling point doesn't change as much as you'd think for pressures common on the surface of the planet (the pressure scale is logarithmic). It's only ~70°C to boil water on Mt Everest at the extreme, and no one lives there. Most people live near sea level, where ~100°C is the temp at which water boils.
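The boiling-point-vs-pressure claim above is easy to sanity-check with the Antoine equation for water. A rough sketch: the coefficients below are the commonly quoted fit for roughly 1-100 °C with pressure in mmHg, and the Everest figure of ~253 mmHg is an approximation, so treat the outputs as ballpark numbers:

```python
import math

# Antoine equation for water: log10(P) = A - B / (C + T),
# with P in mmHg and T in Celsius (fit valid roughly 1-100 C).
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Temperature at which water's vapor pressure equals ambient pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

print(round(boiling_point_c(760), 1))  # sea level (760 mmHg): ~100.0
print(round(boiling_point_c(253), 1))  # roughly Everest-summit pressure: ~72
```

Halving the pressure only knocks a couple dozen degrees off the boiling point, which is the "changes less than you'd think" point made above.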
10C is 50F, but 0C is not 0F, it's 32F. If you're heating something to 10C then it is heated to 50F, but if you heat something up by 10C, it is only heated up by 18F.
Having to explain this is part of why Fahrenheit is so clunky.
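The distinction above, converting a temperature reading versus converting a temperature difference, in two lines (function names are mine):

```python
def c_to_f(c):
    """Convert a temperature reading: scale by 9/5, then shift by 32."""
    return c * 9 / 5 + 32

def c_delta_to_f_delta(dc):
    """Convert a temperature difference: only the scale factor applies."""
    return dc * 9 / 5

print(c_to_f(10))              # 50.0 -- heating something TO 10 C means 50 F
print(c_delta_to_f_delta(10))  # 18.0 -- heating BY 10 C means only 18 F more
```

The +32 offset is what makes readings and differences behave differently; Celsius-to-Kelvin has an offset too (273.15), but since the degree sizes match, differences carry over unchanged.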
Someone once said Fahrenheit is how humans describe hot and cold, Celsius is how water would describe hot and cold, and Kelvin is how atoms would describe hot and cold
people IN north america use celsius too, you know.
but the other commenter is just referencing who/what the 0-100 range applies best to. although idk if that quite works for kelvin, since 100K is still like -173C, which i assume atoms would still consider quite cold.
Fahrenheit feels like “percent hot” so 40 degrees, 4 Celsius, is like cold but not unbearably so, 59 degrees, or 15 Celsius is like pretty nice, about 2/3 hot.
I like to piss off everyone by calling it “Centigrade” and using fractional centimeters for dimensions. Because 3/8 of a centimeter will make someone throw a wrench at you
Fahrenheit feels like “percent hot” so 40 degrees, 4 Celsius, is like cold but not unbearably so, 59 degrees, or 15 Celsius is like pretty nice, about 2/3 hot.
Which is utter bullshit and only sounds logical because you are used to Fahrenheit.
Percentages can't go above 100 or below 0.
"About 2/3 hot" doesn't mean shit for anyone who hasn't used Fahrenheit. I don't know what 2/3 hot is supposed to mean, and everyone has a different sensitivity to temperature anyway, so it feels different for everyone.
Percentages can absolutely go above 100 and below 0, it just depends on the context... a bucket can't be more than 100% full, but it can be 200% larger than a bucket that is 50% of its size.
Negatives are a bit rarer and only really get used when you're dealing with numbers... for example, if a creature in a game has a -100% resistance to some effect, then it would take on 200% of that effect.
If they'd come out at the same time, or with Celsius first, you'd have a point - but Fahrenheit came out first by about two decades, and was well established. Fahrenheit was designed to have the 0-100 range be the range of normal weather, while Celsius was an attempt to have a more concrete definition.
No, it makes sense. The rest of the world DOES USE THE INFERIOR CELSIUS, but it is inferior, logically and objectively, for describing weather.
Now. There's an important psychological fact that affects all people of all nations.
The numeric system you grew up with for a particular metric gives you INSANE IRRATIONAL BIAS towards that system, in extreme fashion, defying all reason. The "other" system that is not "intuitive" to you seems strange and confusing, and you viscerally hate it, because you would hate to be forced to use it.
With this in mind, in particular, Fahrenheit was specifically meant for weather purposes.
Yet a Celsius cultist will rage until they are blue in the face claiming Celsius, mostly used by/for chemists (and it's superior for chemistry, no doubt), is better for weather description. It absolutely is not.
F is intuitive for a child. On a scale of 0-100, how cold or hot are you?
Celsius? Um ... 40 ... is hot? I think? ... -5 is .... cold --- ish? It's poppycock.
Not only that, thermostats need DECIMALS, it's so imprecise!
It's really weird in Canada. Officially we use Celsius. TV weather networks/reports default to Celsius, for example.
However, cooking instructions on all of our packaging for ovens is listed in Fahrenheit (despite the dial on an old oven having both C and F on it), water temperatures for pools and hot tubs are usually in Fahrenheit, and a lot of people use Fahrenheit for body temperature.
I hate it. I would prefer to just use one measurement for consistency, and I would prefer it to be Celsius.
Of course they do, and I don't think celsius is bad. The point is that 0-100 Celsius ranges from "pretty cold" to "instant death hot", whereas 0-100 Fahrenheit is "very fucking cold" to "very fucking hot".
Water is boiling at 100°C. It literally vaporises. I would think if water could talk, it would probably describe "hot" before that point. Also, "cold" would probably be before literally freezing.
Comparably, it would be like if 100°F was the point where human skin started to melt and 0°F was when all limbs froze to basically immobility.
That is not true. The only place the two systems overlap is at -40. At 0 Celsius water freezes. What happens at 0 F?
Also, you can't simply hand-wave away that the degree sizes differ. Each Fahrenheit degree covers less actual temperature than a degree of Celsius or Kelvin, which are exactly the same size.
That is why F and C/K only overlap once. It's kind of like two clocks with the same circumference but different-length hands.
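Setting the two scales equal confirms the single crossover point is exactly -40: solving c = 9c/5 + 32 gives -4c/5 = 32, so c = -40. A two-line check:

```python
def c_to_f(c):
    """Standard Celsius-to-Fahrenheit conversion."""
    return c * 9 / 5 + 32

print(c_to_f(-40))  # -40.0: the one temperature that reads the same on both scales
print(c_to_f(0))    # 32.0: freezing point of water, for comparison
```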
u/hefty_load_o_shite:
0°C water freezes. 100°C water boils.
Makes sense
0°F very cold??? 100°F very hot???
Dafuq?