If you're an American and you've ever had a conversation with someone from another country about the weather, you've probably been a little confused when he or she says that the afternoon temperature is a nice 21 degrees. To you, that might sound like a chilly winter day, but to them, it's a pleasantly warm springtime temperature.
That's because virtually every other country in the world uses the Celsius temperature scale, part of the metric system, which denotes the temperature at which water freezes as 0 degrees, and the temperature at which it boils as 100 degrees. But the U.S. and a few other holdouts – the Cayman Islands, the Bahamas, Belize and Palau – cling to the Fahrenheit scale, in which water freezes at 32 degrees and boils at 212 degrees. That means the 21 degrees C temperature mentioned earlier is the equivalent of a balmy 70 degrees F in the U.S.
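The two scales are related by a simple linear formula, so the conversion is easy to sketch in a few lines of Python (the function names here are my own, not anything from the article):

```python
def celsius_to_fahrenheit(c):
    """Convert degrees Celsius to degrees Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The 21 degrees C reading mentioned above is roughly 70 degrees F:
print(round(celsius_to_fahrenheit(21), 1))   # 69.8
# The scales' fixed points for water check out:
print(fahrenheit_to_celsius(32))             # 0.0 (freezing)
print(fahrenheit_to_celsius(212))            # 100.0 (boiling)
```

The 32-degree offset and the 9/5 ratio fall directly out of the two scales' choices of fixed points, as the sections below explain.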
The persistence of Fahrenheit is one of those puzzling American idiosyncrasies, the equivalent of how the U.S. uses the word soccer to describe what the rest of the planet calls football. So why is it that the U.S. uses a different temperature scale, and why doesn't it switch to be consistent with the rest of the world? There doesn't seem to be a logical answer, except perhaps inertia. Americans generally loathe the metric system – a 2015 poll found that just 21 percent of the public favored converting to metric measures, while 64 percent were opposed.
It might make more sense if Fahrenheit were old-school and Celsius were a modern upstart, sort of the New Coke of temperature. But in reality, they were created only about two decades apart. Fahrenheit was created by its namesake, a German scientist named Daniel Gabriel Fahrenheit, who in the early 1700s was the first to design alcohol and mercury thermometers that were both precise and consistent, so that any two of his instruments would register the same temperature reading in a given place at a given moment. "His great mechanical skill in working glass enabled him to carry out his designs," as Henry Carrington Bolton explained in his 1900 book, "Evolution of the Thermometer, 1592-1743."
Invention of the Fahrenheit Scale
When Fahrenheit started out, what he cared about most was reproducibility: getting the same temperature reading under the same conditions every time, not comparing temperatures of different things or different times of day. But when he presented a paper on his system for measuring temperature to the Royal Society of London in 1724, he apparently realized that he had to come up with a standard temperature scale as well.
"Basically, the Fahrenheit scale was devised with zero as the coldest temperature for a mixture of ice and salt water, and the upper end was thought to be body temperature (approx. 96 degrees F), making a scale that could be progressively divided by 2," explains Don Hillger, a research meteorologist at Colorado State University's Cooperative Institute for Research in the Atmosphere, and also president of the U.S. Metric Association, a group that advocates conversion to the metric system. "This resulted in the freezing/melting temperature being 32 degrees F, not a very useful number! The boiling temperature for water was then set at 212, again not a very useful number. The two temperatures are 180 degrees apart, again a multiple of 2."
The Celsius Scale
Nevertheless, the system apparently sounded pretty good to officials of the British Empire, who adopted Fahrenheit as their standard temperature scale, which is how it eventually became established in the American colonies as well. Meanwhile, though, in 1742, a Swedish astronomer named Anders Celsius came up with a less unwieldy system based on multiples of 10, in which there was a precise 100-degree difference between the freezing and boiling temperatures of water at sea level. (Oddly, according to ThoughtCo, he started out with water freezing at 100 and boiling at zero, but eventually, someone flipped it around.)
The neat 100-degree symmetry of the Celsius scale made it a natural fit for the metric system, which was formally developed by the French in the late 1700s. But the English-speaking world nevertheless clung stubbornly to its preference for awkward units such as the pound and the inch, and Fahrenheit went along for the ride. But finally, in 1961, the U.K. Met Office switched to using Celsius to describe temperatures in weather forecasts, in order to be consistent with other European countries. Most of the rest of the world soon followed suit – with the notable exception of the U.S., where the National Weather Service still publishes temperature data in Fahrenheit – even though its own staff long ago switched to Celsius.
"The NWS is catering to the public by reporting in degrees Fahrenheit, whereas much of their operations, such as forecast models, use degrees Celsius," Hillger explains. "And, for most automated weather observations the temperatures are recorded in Celsius as well. Should we choose to go metric in weather reports, the Fahrenheit layer that's now added for the U.S. public could be removed. Yet, the NWS is more attuned to metric than TV meteorologists, most of whom are catering to their audiences and seldom if ever use degrees Celsius, except maybe some stations near our borders with Canada and Mexico?"
Jay Hendricks, who heads NIST's Thermodynamic Metrology Group, points out that the Fahrenheit scale does have one significant advantage. "It has more degrees over the range of ambient temperatures that are typical for most people," he says via email. "This means that there is a 'finer grain' temperature difference between 70 degrees F and 71 degrees F than there is between 21 degrees C and 22 degrees C. Since a human can tell the difference of a 1 degree F, this scale is more precise for the human experience."
That advantage disappears, though, if fractional Celsius temperatures are used. "For example, the equivalent Celsius temperature for 70 and 71 Fahrenheit are equivalent to 21.1, 21.7 Celsius," Hendricks explains.
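Hendricks' figures are easy to verify: one Fahrenheit degree spans only 5/9 of a Celsius degree, so consecutive whole-number Fahrenheit readings land on fractional Celsius values. A quick Python check (rounding to one decimal place to match the figures quoted above):

```python
def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# A 1-degree-F step is a 5/9 (about 0.56) degree-C step, so adjacent
# whole Fahrenheit degrees map to fractional Celsius temperatures.
for f in (70, 71):
    print(f, "F =", round(fahrenheit_to_celsius(f), 1), "C")
# 70 F = 21.1 C
# 71 F = 21.7 C
```

So the "finer grain" of Fahrenheit is real only if Celsius readings are restricted to whole degrees; with one decimal place, Celsius resolves the same difference.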
Originally Published: Jun 11, 2019