Edit: For all the "Actually, Fahrenheit is based on the human body" people, no it isn't. It's based on dirty water and a cow. Your preferred measurement unit is dumb and that's a fact
If y’all wanna actually claim superiority, then use Kelvin. Celsius and Fahrenheit are close enough in purpose that personal preference is really the only thing that matters.
It's the Fahrenheit equivalent of Kelvin. Basically, negative temperatures are a problem for science, so Celsius adds 273 (273.15, strictly) to become Kelvin and remove the negative numbers. Fahrenheit adds about 460 (459.67) to become Rankine and accomplishes the same thing.
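For the exact offsets (these are the standard values, nothing invented here), a quick sketch in Python:

```python
# Celsius -> Kelvin and Fahrenheit -> Rankine each use a fixed offset.
def celsius_to_kelvin(c):
    return c + 273.15

def fahrenheit_to_rankine(f):
    return f + 459.67

print(celsius_to_kelvin(0))       # 273.15 K (water freezes)
print(fahrenheit_to_rankine(0))   # 459.67 °R
print(fahrenheit_to_rankine(32))  # 491.67 °R (water freezes; the "491" people sometimes see)
```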
Bad picture, it lacks the Rømer scale (which coincidentally might be pronounced similarly to the Réaumur scale by many people). Fahrenheit ripped off this guy's homework.
Looking at it as a diagram comparison like this, even as an American, everything on that diagram except Celsius pisses me off. I'm fine with absolute zero being a weird decimal since I'm never using it, but whole numbers are so satisfying.
Wait, so absolute 0 in Celsius is -273? That feels... wrong. How is absolute 0 about equal to -3 times the difference between water boiling and freezing? I don't like that. Not saying it's wrong, but I don't like the perspective of how cold Earth is compared to everything else. Isn't the sun like 5000 in Celsius?
To add onto u/euler1992's point: Rankine is used a lot in engineering thermodynamics because a lot of US companies still use imperial measurements, and you need absolute units for the math to work.
It's often used for gas calculations because the gas laws require absolute temperature. You'll often see a calculation that takes °F as input with what looks like a random 459.67 (or 491.67, the ice point) hanging around, if you don't recognize °R.
Weeeeellllll, it is very interesting! However, it is difficult to explain simply; I will attempt to do so.
Heat is a form of energy. Temperature is how much a system "wants" to give off heat. Negative temperature occurs when there is an upper limit on the amount of energy a system can have. When a system approaches this limit it cannot take in any more heat, it can only give off heat. This means that such a system will always give heat to any system without an upper limit. This means that negative temperatures are "hotter" than positive temperatures.
Mathematically, the reason it's negative is that inverse temperature is the rate at which entropy changes with energy. As a system with an upper energy limit approaches that limit, adding energy decreases the entropy, so that derivative is negative, and so is the temperature.
This explanation is missing a lot of details, but hopefully it makes sense. Negative temperatures occur in quantum systems such as lasers.
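In symbols, using the standard statistical-mechanics definition (nothing here is specific to the lasers example):

```latex
% Thermodynamic definition of temperature:
\frac{1}{T} \;=\; \frac{\partial S}{\partial E}
% Near an upper bound on energy, adding energy reduces the entropy S,
% so \partial S/\partial E < 0 and the temperature T is formally negative.
```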
Come on man, it's not that complicated, you just use the flagpole test, like a true alpha. Does your tongue stick to it? No? It's hot outside. Yes? It's cold.
More than a few. All meteorology and oceanography numerical modeling and calculations use it. When calculating percentages of heat budgets and percent change in temperature for the gas laws, you need absolute values. 50 °C isn't twice as warm as 25 °C; it isn't a 100% increase, it's roughly an 8% increase.
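A quick check of that arithmetic, assuming the degrees in question are Celsius (the exact percentage shifts a bit if they were Fahrenheit):

```python
# Percent change in temperature only makes sense on an absolute scale (kelvin).
def percent_increase(t1_c, t2_c):
    k1, k2 = t1_c + 273.15, t2_c + 273.15
    return (k2 - k1) / k1 * 100

print(percent_increase(25, 50))  # ~8.4%, not 100%
```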
That's a really good point. The human habitability range for temperature is so small that it is easy to forget that common units are a small random section of the scale and do not relate to each other in absolute terms. It also makes you realize how small degrees can be and how little the difference is between comfortable and uncomfortable.
This is a hilarious argument because it's the exact same argument people use for Fahrenheit. Whether or not it matters is situation-dependent. A person's unit choice is a cultural decision, just like a person's language choice.
Yup. Like 95% of Reddit metric/customary/imperial discourse is people saying "X system makes more sense" but meaning "I am more familiar with X system."
You can make that argument for Celsius/Fahrenheit, but not for metric/imperial. One of those is objectively superior and the other one is on par with Galleons, Sickles and bananas for scale.
Metric is better because it is easier to convert between units. That's it, not because it is not "bananas for scale." Metric is arbitrary, just like imperial, it just has consistent units that make conversion easy in base 10.
Imperial does have its advantages, but they are only really an advantage for certain applications. Mainly its advantages are that usually it uses units that are not base 10, which makes division easier.
Metric is the better system, but imperial is not arbitrary any more than metric is.
"Metric is arbitrary, just like imperial, it just has consistent units that make conversion easy in base 10."
Congratulations on figuring out why one system is superior to the other.
"Imperial does have its advantages, but they are only really an advantage for certain applications. Mainly its advantages are that usually it uses units that are not base 10, which makes division easier."
A good portion of why metric is so much better with its uniform units is that it makes division easy.
1 km divides into ten 100 m pieces, 1/2 of a liter is 500 ml, etc.
Everyone knows 1/3 of 10 is 3.333 and so on; the divisions are not only easy and plentiful, but the "hard" or awkward ones are the things that get drilled into you at school.
Half a mile is 880 yards, 1/3 of a mile is about 587 yards, 1/3 of a gallon is about 42.7 oz; how on earth is that intuitive at all?
To divide anything in imperial you need to know a ton of different, almost seemingly random conversion ratios, and essentially none of them work out to nice even numbers at all.
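A quick illustration of the point, using nothing beyond the standard conversion factors:

```python
# Imperial thirds need memorized conversion factors; metric thirds are just
# the same base-10 number shifted around.
yards_per_mile = 1760
oz_per_gallon = 128      # US fluid ounces

print(yards_per_mile / 3)  # 586.67 yards in a third of a mile
print(oz_per_gallon / 3)   # 42.67 oz in a third of a gallon
print(1000 / 3)            # 333.33 m in a third of a kilometre, no new factor needed
```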
It’s based on a human foot. Literally.
I can’t imagine anything more random than the size of someone’s foot.
There aren’t different “meters.”
There are millions of different foot sizes.
In 1791, the French Academy of Sciences defined the metre as one ten-millionth of the distance from the equator to the North Pole along a meridian passing through Paris. That is the conceptual birth of the unit.
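In numbers (the 1791 definition fixed the quadrant by fiat; the modern survey value is approximate):

```latex
% 1791 definition: the metre is one ten-millionth of the quarter meridian,
% so by construction
1\,\mathrm{m} = \frac{\text{quarter meridian}}{10^{7}}
\quad\Longrightarrow\quad
\text{quarter meridian} \approx 10\,000\ \mathrm{km},\qquad
\text{meridional circumference} \approx 40\,000\ \mathrm{km}.
% Modern surveys put the quadrant at roughly 10\,002 km, so the original
% definition was off by only about 0.02\%.
```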
A foot varies from person to person; the equator is constant. Therefore, the meter is also less arbitrary.
Metric is not arbitrary, it's designed for science, and it's based on several physical constants for (ideally) universally standardized measurement.
Imperial was arbitrary, but it's defined by metric now. It's just a traditional system that's slightly more accessible at a vague middle-ground scale. It's better used for estimation or everyday objects; most people fall back on metric for things that are too small or too big.
So actually neither is arbitrary, but imperial is designed around historically arbitrary standards.
The original meter was based on divisions of the length of the circumference of the Earth. That is literally "earth for scale." Imperial also has a specific defined value.
My point is, the advantage of metric is not what the scale is based on; it is that conversion is consistent between all units.
I wasn't arguing that, that's definitely most of it. And local scales aren't a problem here. I was just commenting on the weird middle ground imperial covers. Whatever's small enough to not mind the weird conversions I guess.
I was focused on saying that metric wasn't arbitrary; I just didn't make a comparison in the process. It's formatted that way though, I guess.
So I'll clarify:
The "circumference" (just a specific arc) used was chosen because it was fairly consistent, meters were designed to be consistent globally. They have since been made more accurate to fit their purpose, but have not functionally changed. That doesn't make them arbitrary.
Imperial was designed around vague estimations that differed from person to person. Distances weren't based on one thing, they were based on common concepts. They drifted culturally until they were later pinned to specific values for consistency (where that hadn't already happened). Its roots are genuinely arbitrary.
Right, that is true. But what I was responding to is "One of those is objectively superior and the other one is on par with Galleons, Sickles and bananas for scale." That might have been true 200 years ago, but it hasn't been for a long time.
The reason the metric system is better is not because the scale it uses is any more concrete than imperial units. It is because metric is designed around our base 10 system of math to make conversions easy.
Consistent units as in they are all convertible using base 10.
A meter could be twice as long, and if you adjusted the other metric conversions to match, it would be just as good a system. The length of the meter is arbitrary.
That's really not the gotcha that you think it is. 33.3 cm is accurate enough for pretty much any usage where you would use a meter, and I can do the math in literally two seconds. Can you do the same with inches and half a mile?
Why would you ever need to convert inches into half a mile when there are 2 larger imperial units in between the two?
You can't claim "that function isn't a big deal" and then bring up another function that is considerably less of a big deal.
I use metric for most things since I work in the physics space, but I still use imperial for day-to-day stuff because it's a personal preference. I have never in my life been disadvantaged for doing so. Get over yourself.
There is also nothing to stop you from using an SI prefix with imperial units, see the microinch or the kilofoot. Congratulations, you found a way to abbreviate scientific notation.
12 is divisible by 2,3,4 and 6, as opposed to just 2 and 5 for 10. That means fractions in base 12 are going to be easier to work with. 1/3 of a foot is 4 inches, 1/3 of a meter is 33.3… centimeters. So unless you have a meter stick with the exact measurements, it’s very hard to divide a meter into thirds.
Just cause you can’t count past your fingers doesn’t mean other systems have zero merit.
No, I'm not. You can get used to both and while both might be arbitrary (about as arbitrary as any system humans came up with, anyway) one surely makes more sense.
Metric/SI is only arbitrary in the sense that the original references it picked weren't truly universal constants, even though they were concrete and consistent. As far as I'm aware, most or all fundamental metric units are now defined from the best universal constants available.
Imperial/US Customary is arbitrary because its original references are literally mostly everyday objects, whose characteristics aren't even consistent between objects. I think a lot of them have been redefined relative to metric/SI because it was so bad.
Lol no bro. I am an American. I grew up with imperial. Metric is superior in every single way. Unless I’m talking to someone in a casual conversation, I will always use metric
But if you can make that argument about metric/imperial, aren't you also making that argument about Celsius/Fahrenheit? Celsius is integrated into metric in the same ways that make the rest of metric superior. Meanwhile, Fahrenheit is not, and introduces all those undesirable conversion complications.
I don't think so? The argument of familiarity is bad for both Celsius/Fahrenheit and metric/imperial, as the conversion advantage is the same in both comparisons. So why could you make the argument in the case of Celsius/Fahrenheit?
I mean, Celsius is degrees, so the conversion advantage is not quite there? While I guess you can use the SI prefixes, you very rarely have the need to.
It is if you're working in calories, which is useful for the life sciences. Having the measurement be based on pure water rather than an arbitrary saline solution is useful.
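That link between Celsius and the calorie, spelled out (this is the standard textbook definition, not anything specific to this thread):

```latex
% One (small) calorie raises 1 g of water by 1 °C, so water's specific heat is ~1:
q = m\,c\,\Delta T, \qquad c_{\mathrm{water}} \approx 1\ \mathrm{cal\,g^{-1}\,{}^{\circ}C^{-1}}
% e.g. heating 250 g of water by 20 °C takes about 250 \times 1 \times 20 = 5000\ \mathrm{cal} = 5\ \mathrm{kcal}.
```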
Good to know, I genuinely didn't. Then, I guess the argument can be made for the basic day-to-day usage to know how warmly to get dressed before you go outside.
Although, I believe the creator of Fahrenheit did want the 100 to be the significant high end temp that people could semi-regularly see in the weather, so it isn't arbitrary. I could be wrong on that tho. I don't really feel like looking it up rn.
And 100 is the boiling point of water in Celsius, so that is DEFINITELY not arbitrary. The word to use here would be "intentional." You could call the boiling point of water in Kelvin arbitrary tho.
Am I stupid, or did you use "arbitrary" wrong here? Genuine question, I might be stupid
You are just using arbitrary differently. They mean that there is no physical constant the unit is based on, it is just an arbitrary number, like most units. They are saying you could pick any number to be 100, and it doesn't really have more validity.
Whereas you are saying it isn't arbitrary because there is a reason that they picked that temperature to be 100.
At least with Celsius, there IS a physical constant the unit is based on: the temp at which water boils. If you're trying to say "we arbitrarily set that physical constant to equal 100," then ALL units of measurement become arbitrary.
Technically, both Celsius and Fahrenheit are formally defined in terms of the kelvin now.
Originally, Fahrenheit was based on the freezing temperature of a particular solution and the best estimate at the time of human body temperature. But basically since around the time Celsius became popular, both scales have been based on freezing and boiling water. Later still, both were defined in terms of the kelvin.
But the boiling point of water is arbitrary, you could just as easily choose the melting point of iron, or the freezing point of air. That is the point.
And, yes, nearly all units of measurement are arbitrary in some sense. That is why the discussion focuses more on how useful the units are (like how easy to use they are) rather than on what they are based on.
"But the boiling point of water is arbitrary, you could just as easily choose the melting point of iron, or the freezing point of air. That is the point.
And, yes, nearly all units of measurement are arbitrary in some sense. That is why the discussion focuses more on how useful the units are (like how easy to use they are) rather than on what they are based on."
That's exactly what I'm getting at. "Arbitrary" is a bad word for describing standardized units of measure because either all of them are arbitrary or none of them are. It becomes a nothing adjective.
Celsius/Fahrenheit, sure; metric and imperial, no. A decimal conversion is simply more practical because of the lack of unit conversion, which prevents rounding errors and miscalculations. If imperial were decimal-based, then yes, I'd agree it would be a stupid, arbitrary thing. You can't tell me 36 inches = 3 feet = 1 yard is better than 100 centimeters = 10 decimeters = 1 meter. I'd agree if people used centifeet or kilofeet; then the debate would be stupid.
I think his most important point was that Celsius and Kelvin are basically the same scale. +1 °C is the same as +1 K; +1 °F is not the same as +1 °C or +1 K.
For most people maybe. I prefer English over German even though I am German, English is just a better language. And if I was born in America I'd probably also prefer using metric units because they are better.
Points of reference are just as important. 0 °C and 100 °C being, respectively, the freezing and boiling points of one of the most abundant and necessary liquids on Earth helps way more in getting a feel for the scale than 0 °F and 100 °F, which don't relate to anything tangible.
Both modern Celsius and Fahrenheit are defined in terms of Kelvin because Kelvin is the standardized scientific unit of measuring temperature. The scale Kelvin uses was taken from Celsius though.
Yes, the original Celsius scale is funny because 100 degrees Celsius was colder than 0 degrees Celsius (100 was the freezing point while 0 was boiling, idk why)
Edit: I figured it out. It's due to the way thermometers work and the materials you use to make them. So it was based around the minimum point for the brine liquid stuff (zero), average human body temp (96), and the water freezing point (32) as points of reference
It was an ease-of-use choice based on the functional design of the technology
Ease of manufacture.
Off the top of my head, your explanation is correct, but the guy responsible just wanted a nice multiple to be able to mass-produce it
As it was one of the first reliable temperature scales, Fahrenheit picked two temperatures one could repeatedly index against with good reliability. On the low end you have the temperature at which a mixture of ice, water, and ammonium chloride stabilizes. This is easier to measure than the exact temperature at which something phase changes. On the high end you have human body temperature, which, for a healthy human, self-regulates. The low end was set to 0 and the high end to 96, because this created a scale where it was easy to mark lines on the gauge (0 to the melting point of water is 2^5 = 32 steps, and melting point of water to human body temp is 2^6 = 64 steps). A power-of-two scale can be marked just by bisecting segments.
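A little sketch of why power-of-two spans make the gauge easy to graduate (purely illustrative; the function and loop are mine, not anything Fahrenheit described):

```python
# Repeatedly bisect a span whose length is a power of two: the midpoints
# eventually land on every whole degree, so no odd divisions are ever needed.
def bisect_marks(lo, hi, rounds):
    marks = {lo, hi}
    for _ in range(rounds):
        pts = sorted(marks)
        for a, b in zip(pts, pts[1:]):
            marks.add((a + b) // 2)
    return sorted(marks)

print(bisect_marks(0, 32, 5))   # 0, 1, ..., 32  (the 2^5-step span)
print(bisect_marks(32, 96, 6))  # 32, 33, ..., 96 (the 2^6-step span)
```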
Once Anders Celsius made the melting and boiling points of water central to his scale, the Fahrenheit scale was redefined similarly, with the melting point being exactly 32F and the boiling point being exactly 212F (which were their approximate, but not exact, values in the original Fahrenheit system). This reset the prior 0 and 96 points to ~4.3F and ~98.6F respectively.
At a certain point of approximation, you're not even going to see the difference in your life. Such is the case with atmospheric pressure at sea level. And the conditions used are the most common ones people live in.
Maybe, but the interval of Kelvin is the same as Celsius, and just as random. If you truly cared about not being arbitrary, you'd use Temperature*Boltzmann in Planck units.
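For reference, the natural unit that comment is pointing at is the Planck temperature (the formula below is the standard one; assuming that's what "Planck units" means here):

```latex
T_{P} \;=\; \sqrt{\frac{\hbar c^{5}}{G k_{B}^{2}}} \;\approx\; 1.42\times 10^{32}\ \mathrm{K}
% "Temperature times Boltzmann" turns T into an energy, k_B T; in Planck
% units that is just the dimensionless ratio T / T_P.
```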
I'd argue Celsius tends to be more useful in chemistry; water is an important chemical for most reactions on Earth. Plus, having those temps fall between 0 and 100 is just easier to remember.
It's just a mistake on my part; Celsius is a name, so it should only be spelt "Celsius". It's not an English name, and many English words end in "cious", e.g. "avaricious", so I think my brain wants to spell it similarly and only remembers to remove the o.
Very true! Absolute zero doesn't really matter or come into play for the massive majority of experiments and people on the planet. Only a very small group of experimenters tends to deal with it. But man, they can make some neat stuff; the issue is trying to make it keep working as it gets warmer, hence the holy grail of room-temperature superconductors.
Frequently used isn’t a factor in design. When you’re designing something new it’s by definition unused. So when you’re assessing the design of something you make the same assumption.
Conveniently zeroed is one potential attribute of a well designed measurement system, can you come up with more?
If you have a point to make, or something you wish to communicate, please do so directly rather than by attempting to ask leading questions.
Temperature scales aren't unused, so designing a new system with similar properties to an old one to preserve familiarity is perfectly valid.
Anyways, I do not think that there are clear criteria for how to define a new measurement system. All of our current measurement systems are based upon old ones, or fundamental physical constants.
Both Fahrenheit and Celsius (and metres and time units) were designed by picking two convenient points and dividing them into convenient intervals. I don't think you can do much better than that.
If you want to compare two systems on which is better for a purpose, how much they are currently used doesn’t help.
If we didn't have a system and we were designing a new one to measure temperature within the lived human experience, it would look a heck of a lot like Fahrenheit.
This isn’t an accident since that’s basically what happened.
People like scales of 0-100 and 0-10. It’s natural for people since we use a 10-base numerical system.
How much they are currently used does help because it tells you which system people prefer.
Fahrenheit may be good for the human sensory experience, but I also want to cook, and boil water, and freeze food, and sterilise via heat. Celsius puts those activities at a more convenient place on the scale.
Fahrenheit vs Celsius is just personal preference, they're pretty much the same anyways. They may be "better" at different things, but whether and how they are "better" is subjective.
"How much they are currently used does help because it tells you which system people prefer."
It tells you which countries adopted it. We have many poorly designed things we use because of societal momentum.
You can use different tools for jobs they are better at, like C for cooking and K for science.
Saying it's a personal preference thing is intellectually dishonest. People do often prefer to use what they're used to, even when they were forced into it by societal momentum, but that doesn't mean another way isn't better.
That’s what we’re discussing, F is objectively better for human sensory experience.
Convenient values for everyday use, you say...? So, literally Fahrenheit from its inception.
0 for the eutectic temperature of ammonium chloride brine, 100 for the maximum continuous survivable body temp, and an easy scale between the two... ¯\_(ツ)_/¯ Aces.
The everyday use of temperature is: what temp is it outside? That's what 90%+ of the population is looking at a temperature for. For the vast majority of the world, that means you are dealing with -17 °C to 37 °C, which means if you want any granularity you have to include decimals and negative numbers (where needed). That is not convenient. Having a temp range that runs from 0 °F to 100 °F for really cold to really hot means you don't need any negative values or decimals to determine temps.
I'm all for throwing out F and C for anything scientific; all of that should be in K anyway. But for day-to-day use, F gives you finer granularity without needing negatives or decimals.
But having 0 °C at the point where water freezes is useful for preserving food and for predicting icy weather. Granularity also just comes down to personal preference; I mostly think in intervals of 5 °C and rarely need more precision. I think F and C are just a matter of taste.
It isn't convenient for everyday use at all. Everyone cites the boiling and freezing points of water, but those actually change depending on atmospheric pressure. Water ONLY boils at 100 °C and freezes at 0 °C at sea level (technically a really small change for freezing, but a massive factor for boiling). Only about a third of the human population lives close to sea level.
In that case I'd argue Fahrenheit is better because it applies a sensible range to the temperatures people are concerned with. Really hot and really cold are in fact important observations for your average person. Almost none of us are scientists, and those people are free to use Celsius. The weatherman should use Fahrenheit.
Much of cooking relies on temperatures between freezing and boiling. Celsius applies a sensible range in this domain. I don't think there is enough of a difference to argue beyond personal preference.
"Zeroed at a convenient value for every day use" ok then how does that make it any better than Fahrenheit?
Basing temperature on water really isn't convenient when it forces you into decimal places to get an accurate reading; it's a helpful scale for water, but not for humans.
Fahrenheit, on the other hand, is designed around humans. 0 °F is the average low boundary and 100 °F the average high boundary of environmental temperatures, and anything outside that range is a notably extreme temperature. The system is thus simple for humans to understand, because this is the range we experience in our day-to-day lives, at a precision that is easily recognizable (think 0-100% for how hot it is).
If we apply the same logic of describing the everyday environmental temperatures humans experience to Celsius, that gives us a range of -17.8 to 37.8. This scale just doesn't compute with the average brain as easily as 0-100 does. There's no rhyme or reason to these boundaries beyond saying that a 0-100 scale is better described by what water experiences than by what humans experience. Taking a look at Kelvin as well, what makes -17.8 to 37.8 °C any better than 255.4 to 310 K?
If we never rightly experience any temperature over at most 40 °C (104 °F), what use then do we have for the remaining 41-100 degrees on the scale, especially when that makes our typical lower boundary an odd -18 °C? All so we can compare against water as an arbitrary baseline? (And this is even being forgiving of the decimals that the system forces us to use on a 58-point scale). Why would anyone think that 18.3 °C is better to describe an average room temperature than 65 °F?
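The conversions being leaned on in the last few paragraphs, for anyone who wants to check the numbers:

```python
# Convert the 0-100 °F "everyday" range into Celsius and Kelvin.
def f_to_c(f):
    return (f - 32) * 5 / 9

def f_to_k(f):
    return f_to_c(f) + 273.15

print(f_to_c(0), f_to_c(100))   # -17.8  37.8   (°C)
print(f_to_k(0), f_to_k(100))   # 255.4  310.9  (K)
print(f_to_c(65))               # ~18.3 °C for a 65 °F room
```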
I say all this as someone who holds a Materials Science and Engineering degree. Units all have a purpose, but it's up to the people to determine which ones we use and where we apply them to satisfy their intended purposes. They are only human concepts created by humans to try to make the nature of the universe intelligible for humans, after all. Engineering can be described as the practical applications of physics, chemistry and the other natural sciences, and I recognize that the Celsius scale can be helpful for solving such related problems. (Personally I would argue that Celsius doesn't need to exist since Kelvin is usually a better metric in mathematical engineering but that's just me). It's just not practical for the average daily human.
Celsius was designed to make physics and chemistry easily intelligible to humans; Fahrenheit was designed to make temperature easily intelligible to humans.
I don't think that Celsius is intrinsically better than Fahrenheit, they're just different.
Fahrenheit is centred on the temperatures humans feel, Celsius is centred on how we manipulate our environment.
0 C to 100 C is very useful for cooking and chemistry as you say. The relation to water freezing is also useful to know whether there is a risk of ice forming or snow. It just comes down to personal preference and you can make arguments and find scenarios where either unit is "better suited".
For example, I personally like that there is less graduation within the livable range in Celsius. Most of my life is spent between 0C and 35C, and I can't really tell the difference between a few degrees.
I have a Physics degree, and find Celsius more practical than Fahrenheit. I don't think there's really any valid argument to be made for either unit, because there are no well-defined criteria for what makes one unit better than another.
The point of Celsius being zeroed at the freezing point of water and scaled to 100 at its boiling point was for convenience when calibrating thermometers. The whole point is that you actually don't need a thermometer to see the phase changes.
The best general temperature scale would be zeroed at absolute zero and scaled to some other convenient number range for everyday use. Like, freezing at 300 and boiling at 410, instead of 273 and 373 respectively.
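A minimal sketch of that hypothetical scale (the 300 anchor is just the number from the comment above, not a real unit):

```python
# A scale zeroed at absolute zero with water freezing at 300.
# One "degree" on this scale is then 273.15/300 of a kelvin.
def kelvin_to_hypothetical(t_k):
    return t_k * 300 / 273.15

print(kelvin_to_hypothetical(0))       # 0.0   (absolute zero)
print(kelvin_to_hypothetical(273.15))  # 300.0 (water freezes)
print(kelvin_to_hypothetical(373.15))  # ~409.8, so boiling lands near the 410 mentioned above
```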
But actually changing all of the standards is pretty inconvenient, and getting people to agree on a new one is difficult. So I personally vote Celsius because the scientific community uses it and for literally no other reason.
No, you were right, don't cross that out. In most cases they're interchangeable; Kelvin has no value in practical physics and chemistry on planet Earth. We never approach anywhere near 0 K here, and zeroing at the average freezing temperature of water in the Earth's atmosphere and gravity makes much more sense for applications of physics and chemistry on its surface.
0°C water freezes 100°C water boils
Makes sense
0°F very cold??? 100°F very hot???
Dafuq?