Taking this opportunity to argue why Fahrenheit over Celsius makes way more sense. Fahrenheit is essentially based on a scale of 0 to 100, which is a scale we use for most things in life. Yes we can go below 0 or above 100, but it’s such a perfect scale to understand exactly HOW hot or cold something is.
It’s 70 degrees outside? Wow, we’re at 70% heat. I bet that feels really nice and not too hot. Oh no it’s 100 degrees so we’re at 100% heat? Probably want to stay inside today. Water freezes once we get to 32%? Makes sense, that’s pretty cold. I’ll need to wear a winter coat. 0% outside? No way I want to go out in that.
In terms of understanding how hot or cold it is outside, looking at the temperature as a number out of 100% is a perfect way of understanding that and having a frame of reference.
Celsius is so dumb. A 70 degree day in summer sounds great. 70% heat, not too hot, not too cold, just right. But convert that to Celsius? It’s 21 degrees in the summer? What does that even mean? Stupid.
Also because of the way the math works, the scale for Celsius makes no sense. It’s 0 degrees out in the U.S., that’s -18 Celsius. But if it’s 100 in the U.S., that’s only 38 Celsius? What kind of stupid scale runs from -18 to 38? 0 to 100 is the way to go.
Imagine if test scores ran from -18 to 38. Would you support this nonsensical scale then?
To be clear, I’m on board with the metric system and I definitely don’t think the U.S. does everything right. But Celsius is trash.
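If you want to sanity-check the conversions quoted above (0°F ≈ -18°C, 100°F ≈ 38°C), here's a minimal Python sketch using the standard formula; the helper name is just illustrative:

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The reference points from the argument above:
for f in (0, 32, 70, 100):
    print(f"{f}°F = {f_to_c(f):.0f}°C")
# 0°F lands at about -18°C and 100°F at about 38°C, as claimed.
```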
To a European like me, 90 '%' or 20 '%' human comfort would be very confusing. You could probably guess that one is hot and the other's not, but I'd have no point of reference until I convert it to Celsius. I think the numbers that someone grows up with will always make more sense, no matter what.
So 0°F was defined as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride (why in the world?). Originally, 90°F was set as human body temperature, which was later changed first to 96°F, and now it’s about 98.6°F.
Celsius is just:
0°C is the freezing temperature of water
100°C is the boiling temperature of water
Nobody uses a scale between -18 and 38. People in countries using Celsius just learned as a child that body temperature is 38°C, that’s all. -18°C has no special meaning to us.
At 0°C outside it’s freezing (32°F).
10°C is quite cool (50°F), you’ll need a jacket.
20°C is a comfortable temperature for me, if it’s sunny (68°F).
30°C is getting rather warm (86°F).
40°C is hell outside, or a bad fever (104°F).
To boil water, heat it to 100°C (212°F).
I get that this seems confusing at first when you’re used to completely different orientation points, but for people who are used to C, it’s very intuitive.
I get where you're coming from because you're used to F, so comparing it to C naturally keeps you thinking of F as the baseline (it would be the same for me the other way around). But saying that C isn't on a 0-100 scale is just objectively wrong. At 0°C water freezes, at 100°C water boils. It's still a 0-100 scale, just based on something different. When it comes to metric vs imperial, it's an easy conclusion to me; when it comes to temperature, I think it's more nuanced. I don't have a better explanation of the difference other than "F is human focused" while "C is science focused" (I know, that doesn't quite cover it, it's just the best I've got).
But then again, it really comes down to what you’re used to feeling “right”. For example, I could make similar conversions from C to F instead, and get weird numbers (water freezes at 32, and boils at 212? That makes no sense)
Never had too strong an opinion on which temperature unit to use, but I will say this: C is a lot closer to K (Kelvin), which is what's used in science, while converting F to K is a mess. So the one benefit C gives is a slightly easier path into that.
I do need to let the nerd in me get this out too, F used to also be defined by water freezing/boiling. Meaning that technically C is a 0-100 scale, while F is a 32-212 scale. (Nowadays they’re both defined by K, making this point kinda irrelevant)
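To illustrate the Kelvin point: Celsius to Kelvin is a single offset, while Fahrenheit needs both a rescale and an offset. A minimal sketch (helper names are mine):

```python
def c_to_k(c):
    """Celsius to Kelvin: just add the offset."""
    return c + 273.15

def f_to_k(f):
    """Fahrenheit to Kelvin: rescale by 5/9, then add the offset."""
    return (f - 32) * 5 / 9 + 273.15

# Both routes agree on the boiling point of water at sea level:
print(c_to_k(100))   # 373.15
print(f_to_k(212))   # 373.15
```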
Fahrenheit has one advantage here: You’re used to it. If you’re used to Celsius, you know that 25° is warm and 5° is cold and don’t give a shit about it not being a 0-100 scale for that particular use case.
The 0-100 thing is pretty much the only argument I’ve ever heard in favor of Fahrenheit btw. Again, if you’re used to one of them, that’s the one that will make the most sense.
Being used to Celsius has the advantage of automatically being used to Kelvin. For example, if you ever want to calculate anything to do with the energy required to heat something to a certain temperature, you will have a way better time with Kelvin. Being used to and measuring in Celsius helps a lot here.
But sure, I get that you’re used to Fahrenheit. It’s just that the whole world has decided to use Celsius. Honestly, for good reason.
You are entitled to your opinion and obviously have your own preference and your way of explaining the scale is very good as well, but saying that Celsius is objectively worse is just wrong.
The argument for Fahrenheit based on a perceived “0 to 100 scale” representing a percentage of heat can be critiqued as it misunderstands how temperature scales work. Temperature is not a percentage system; it’s a measure of thermal energy. The notion that 70 degrees Fahrenheit represents “70% heat” is not scientifically accurate as it implies that temperature is a linear scale capped at 100, which it is not. The Fahrenheit scale was actually based on arbitrary points: the freezing point of brine (0°F) and the average human body temperature (96°F at the time, which has since been adjusted to 98.6°F).
Celsius, on the other hand, is based on the freezing and boiling points of water at 0°C and 100°C respectively, under standard atmospheric conditions. This makes it a decimal and scientifically consistent system that is easier to relate to the states of water, an essential reference in science and daily life.
Comparing temperatures to percentages, like test scores, is a flawed analogy because temperature doesn’t have an upper limit “score” and is not designed to be read as a proportion. The scale from -18 to 38 in Celsius correlates directly with the physical properties of water, which is logical for scientific purposes.
Moreover, many argue that Celsius is more intuitive for everyday weather-related use outside of the U.S., as the scale is more granular for colder climates (where a one-degree change in Celsius is noticeable) and aligns well with the metric system, which is used globally for scientific measurement.
Fahrenheit is the way to go.
Except that my winters are at -10% temperature and my summers at 115% temperature.
With the current climate policy, it'll soon be winters at -10°C and summers at 115°C.
That just means it’s time to move
It means your temperatures are not compatible with human life. There is no confusion that 120% Hot would be “really fucking hot”.
Both of those temperatures mean you’re outside the human scale and should limit your time outside.
It's pretty unclear that I can go for a walk outside at 40°C but if I do that at 48°C I might die.
Not unclear for someone familiar with the system
I mean we use scales from 0 to 100 in every field. -18 to 38 is a scale used absolutely nowhere.
But I agree Humans can get used to pretty much anything, and once they do - it’s all they will prefer over the unknown.
But why do I care when water boils at sea level? What am I to do with that knowledge in my day to day? The 0-100 is irrelevant to me.
I’m dead long before water boils. And I’m very uncomfortable below water freezing but it won’t kill me quickly.
Knowing when it's snowing and freezing outside is very helpful for places where that happens. Freezing being at zero is nice.
No sauna for you I guess
Why does this matter? I have no idea what 0% heat means, but I know what -18° or 38° mean. For me, 32% heat doesn't mean anything.
How hot it feels can change. In summer, 10°C feels cold, but in winter it feels warm.
A lot more people use °C, so more people would have to relearn how temperature is measured.
I don't think either of them is better than the other; it's just that you've gotten used to which numbers mean it's hot or cool.