Oh, America. You do so confuse the rest of the world, with your “pounds per square inch” and “cups”, instead of sensible measurements like pascals and liters. And part and parcel of that is your stubborn refusal to drop Fahrenheit in favor of Celsius.
Of course, we admit that Fahrenheit does have some advantages. But are they really enough to justify its use in the 21st century? Let’s find out.
The case for Fahrenheit
We’ll say it: Fahrenheit gets a bad rap. Perhaps it’s the scale’s weird values for water’s freezing and boiling points – 32 and 212, versus Celsius’s nice round 0 and 100; perhaps it’s how intrinsically linked the scale is to the USA versus… well, just about everybody else in the world. Either way, it’s got lumped in with all those nonsense measures like the “pound” or the “inch” – units which ultimately come from some guy 1,000 or so years ago saying “eh, this is about how big a barleycorn is, probably.”
But in fact, there are quite a few advantages to using the Fahrenheit scale – and it’s no less scientific just for its lack of a decimal base. Indeed, “any scale, including Fahrenheit or Celsius or even Réaumur, can be linked to the metric system with equal ease,” wrote Eric Pinder, author of Tying Down the Wind: Adventures in the Worst Weather on Earth.
“The Celsius scale is not really ‘metric’ in the same practical way that, say, centimeters and kilometers are,” Pinder pointed out. “Had Gabriel Fahrenheit lived in France and Anders Celsius in Britain, it might have been the Fahrenheit scale which was ‘attached’ to the metric system instead of vice versa.”
If anything, the Fahrenheit scale may be more accurate than Celsius – or at least, more precise. Because there are nine degrees Fahrenheit for every five Celsius, the scale “has more degrees over the range of ambient temperatures that are typical for most people,” Jay Hendricks, a researcher in the National Institute of Standards and Technology’s Fundamental Thermodynamics Group, told HowStuffWorks.
“This means that there is a ‘finer grain’ temperature difference between 70 degrees F and 71 degrees F than there is between 21 degrees C and 22 degrees C,” he explained. “Since a human can tell the difference of a 1 degree F, this scale is more precise for the human experience.”
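The 9-to-5 ratio behind that “finer grain” argument is easy to check with the standard conversion formulas (a quick sketch; the function names are ours, not anything official):

```python
# Fahrenheit <-> Celsius: 9 degrees F span the same interval as
# 5 degrees C, so one degree F is only 5/9 of a degree C.

def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# A 1-degree step on each scale, expressed in the other:
print(f_to_c(71) - f_to_c(70))  # 5/9, i.e. roughly 0.56 degrees C
print(c_to_f(22) - c_to_f(21))  # roughly 1.8 degrees F
```

Which is Hendricks’s point in miniature: going from 70°F to 71°F is a smaller, more human-sized step than going from 21°C to 22°C.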
In fact, it’s this “human experience” element that many people used to the Fahrenheit scale will point to as its biggest selling point. Despite having its origins in the astronomical and meteorological sciences, Fahrenheit has sort of worked out to be really intuitive for human use: “When we’re talking about temperature, we’re usually talking about comfort level,” Pinder pointed out. “In general, ‘Temperature will rise into the triple digits today!’ means the outside air will be warmer than your internal body temperature.”
In other words, “it’s the point at which your body becomes a heat sink instead of a heat source,” he explained. “In terms of comfort, that’s very significant and not at all arbitrary.”
The case for Celsius
In the opposing camp, we have Fahrenheit’s younger, prettier, and more popular cousin, Celsius – at least, if its proponents are to be believed.
First proposed 18 years after Fahrenheit put his temperature scale forward, Anders Celsius’s original plan was to have zero represent the boiling point of water at sea level, and 100 to be the freezing point – but, since that makes no sense, it was quickly reversed by other scientists.
Still, the plan from the beginning was for a 100-point scale based on the physical properties of water, which made it a natural fit for the new metric systems of the burgeoning Age of Enlightenment. Indeed, it’s such an intuitive idea that Celsius was far from the only person to come up with it – ask a French person who invented the 100-degree temperature scale, and you might get the answer “Jean Pierre Christin”, who was working on the same idea at around the same time as Celsius.
What set Celsius apart, though, was the careful scientific work that went into establishing his two fixed points – zero and 100 – and that’s why, in the end, the scale took his name. But the 100-degree scale would likely have taken off whoever’s name it bore: by the mid-19th century, scientists such as Carl August Wunderlich were working entirely in centigrade, arguing that “the convenience of this scale will probably shortly lead to its general adoption by all scientific men.”
And boy has he been proven right. Assuming that, here in the 21st century, we’re all “scientific” in our outlook – which, admittedly, isn’t necessarily as true as we’d like to think, but hey – centigrade, and Celsius in particular, really is the temperature scale to use. Just about everyone on the planet is familiar with it, apart from the usual suspects; there’s even a jaunty little poem to help you remember how it works.
Celsius is, generally, way easier to work with when it comes to science. Take the calorie, for example: you may think of it as 1/550 of a Big Mac, but that’s really its bigger sibling, the kilocalorie – the calorie proper is defined as the amount of heat needed to raise the temperature of one gram of water by one degree Celsius.
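That definition makes water-heating arithmetic trivial in Celsius. Here’s a little worked example (the quantities are just an illustration, and we’re using the standard conversion of 4.184 joules per calorie):

```python
# One (small) calorie warms 1 gram of water by 1 degree C;
# a dietary Calorie on a food label is 1,000 of these (a kilocalorie).
JOULES_PER_CALORIE = 4.184

def heat_for_water(grams, delta_celsius):
    """Heat, in small calories, to warm `grams` of water by `delta_celsius`."""
    return grams * delta_celsius  # 1 cal per gram per degree C

# Warming a liter (1,000 g) of water by 1 degree C takes one kilocalorie:
cal = heat_for_water(1000, 1)
print(cal)                        # 1000 small calories = 1 dietary Calorie
print(cal * JOULES_PER_CALORIE)  # 4184.0 joules
```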
But what really cemented Celsius’s place in the scientific world was its adoption by Lord Kelvin – aka the guy who gave us, well, Kelvin: the scale that starts at absolute zero and increases in increments exactly equal to Celsius degrees. This makes conversion between the two systems extremely easy: no multiplication by funny fractions needed, just add 273.15 to get from Celsius to Kelvin, or subtract it to go the other way.
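That offset-only conversion is about as simple as unit conversions get (another quick sketch; the names are ours):

```python
ABSOLUTE_ZERO_C = -273.15  # absolute zero on the Celsius scale

def c_to_k(celsius):
    """Convert Celsius to Kelvin: same step size, shifted origin."""
    return celsius - ABSOLUTE_ZERO_C

def k_to_c(kelvin):
    """Convert Kelvin back to Celsius."""
    return kelvin + ABSOLUTE_ZERO_C

print(c_to_k(0))    # 273.15 -- water freezes
print(c_to_k(100))  # 373.15 -- water boils at sea level
print(k_to_c(0))    # -273.15 -- absolute zero
```

Contrast that with Fahrenheit’s absolute counterpart, the Rankine scale, where you still need the 9/5 fraction to get back to Celsius or Kelvin.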
Why was Kelvin’s use of a centigrade scale so important? It’s conceptual: the Kelvin scale measures not just temperature, but thermodynamic temperature – it is, as Julia Scherschligt, an expert in vacuum and pressure metrology at the National Institute of Standards and Technology in the United States, told Live Science in 2021, “absolute, not relative to fixed points.”
“It describes the amount of kinetic energy contained by the particles that constitute a blob of matter, that wiggle and jiggle around at sub-microscopic levels,” she explained. “As the temperature drops, the particles slow down until at some point, all motion ceases. This is absolute zero, which is the benchmark of the Kelvin scale.”
Verdict
So, who comes out on top? Well, it depends: are you after an easy, intuitive way to think about the weather, or are you looking to describe some kind of cosmological phenomenon?
Certainly, Fahrenheit has some things going for it. Not least of which is ease of use – let’s face it: if you’re reading this in the US, then fluency with Celsius probably isn’t going to be that helpful in day-to-day life.
But if we’re speaking scientifically – and, given the name of the publication, let’s hope we are – Celsius probably beats Fahrenheit. As a unit derived from the kelvin, it can be defined purely in terms of physical constants; it’s handy in experiments since it’s based on the physical properties of water; and it fits in neatly with measurements of other quantities.
Overall then, perhaps the real winner is… Kelvin. “A number can be measured with arbitrary precision on any scale,” pointed out Scherschligt. “But only the Kelvin is physics-based, which means it is the most accurate scale.”