Digital contactless thermometer with infrared light taking temperature of a woman on the forehead

Thermometer: Extraordinary Ordinary Things

I live in Brussels. Every time I leave my house, I am bombarded with information about the air temperature because most pharmacies here seem obsessed with showing the air temperature on electronic signs outside their shops, along with the time of day (24-hour clock) and the establishment’s business hours. I imagine the same is true in cities throughout Europe, North America, and elsewhere. We can’t seem to live without constantly being reminded of how warm or cold it is. It is virtually an obsession.

But of course most of us don’t need to leave home to get temperature information. In my case, all I need to do is go out to my terrace and look at the thermometer hanging on the wall. This is something I do faithfully virtually every morning when I wake up, as well as two or three times during the day. It is like a game. I check the thermometer in the morning (usually about 7 a.m.), look at the sky, feel the moisture in the air, and try to guess how high the temperature will rise during the day.

I am talking about a simple liquid thermometer, i.e. the type in which liquid in a glass tube rises and falls as the temperature rises and falls. This is generally what most people mean when they say “check the thermometer.” However, there are many other types of thermometers they might be checking, such as the type you reach for when feeling ill, the type you stick into meat when cooking it, the temperature gauge in your automobile, etc. Still, no matter where you go, there is almost always a thermometer around, whether we notice it or not. And we cannot seem to live without them.

This is why I consider the thermometer, in whatever form, fully deserves to be included on the list of what I like to call “extraordinary ordinary things.”

What Is a Thermometer?

By simplest definition, a thermometer is a device that measures and indicates changes in temperature, most commonly air (ambient) temperature. The most familiar and oldest type of thermometer is the classic liquid (alcohol or mercury) in a thin tube, which rises and falls in concert with rises and falls in the ambient temperature.

The principles that govern liquid thermometers have been known since antiquity; however, the instruments themselves have become part of daily life only within fairly recent times. Why? For the same reason that most other things (fridge magnet, toothbrush, razor blade, umbrella, etc.) have become part of daily life only within fairly recent times. Being invented before the Industrial Revolution, thermometers were largely handmade, which meant that producing them was slow, difficult, and expensive. Only when their production became industrialized could they be made in large quantities and sold at a price that made them readily available.

The “invention” of the liquid-in-a-tube thermometer is generally attributed to Hero of Alexandria (10–70 CE). But this is an overstatement. Hero made observations that were fundamental to later development of the thermometer, but never actually built such an instrument.

Hero studied the principle that certain substances, notably air, expand and contract in response to heat. Add heat and the substance expands; subtract heat and the substance contracts.

As a demonstration, he described an apparatus constructed of a closed tube partially filled with air and having one end immersed in a container of water. When heat was applied, expansion of the air caused the water/air interface to move along the tube, traveling in one direction when heat was added and in the opposite direction when the apparatus was allowed to cool.

However, it was only in the 16th century (some 1500 years later) that this phenomenon was seriously re-envisaged by European scientists such as Galileo Galilei (1564–1642) and Santorio Santorio (1561–1636). As a result of their work, any device that reliably exhibits this expansion/contraction effect became known as a “thermoscope.” The term “thermometer” quickly followed. The difference? A thermoscope simply exhibits expansion and contraction in response to heat, while a thermometer also has a scale to quantify the extent of the expansion and contraction.

Whatever they were called, the two types of instruments suffered from the same defect. They were not sealed, so in addition to being responsive to heat, they were also responsive to air pressure, i.e. they acted like barometers. Moreover, the liquid in the instrument would evaporate.

In 1629, Joseph Solomon Delmedigo, a student of Galileo and Santorio, published an illustrated proposal for a sealed liquid-in-glass thermometer. Like today’s liquid thermometers, it was described as having a bulb (reservoir) at the bottom of a sealed tube. There was also a numbered scale to quantify the movement of the liquid in the tube.

There is no evidence that Solomon ever built such a thermometer. Apparently, this first happened in 1654, some 25 years later, with the credit going to Ferdinando II de’ Medici, Grand Duke of Tuscany (1610–1670).

Many other scientists and inventors continued to work on thermometers; however, there was little or no coordination among them. In particular, there was no agreed-upon standard scale for measuring the movement of the liquid up and down the tube. In 1665, Dutch scientist Christiaan Huygens proposed using the boiling point of water and the melting point of ice as the standard. In 1701, Isaac Newton proposed the standard should be a 12-degree scale between the melting point of ice and normal body temperature.

Many other scales were proposed, but the clear winners turned out to be the Celsius scale and the Fahrenheit scale.

Daniel Gabriel Fahrenheit (1686–1736) was a Dutch scientist credited with inventing the first reliable thermometer, i.e. one that accurately and repeatably tracked temperature changes. He insisted on using mercury as the heat-sensitive liquid rather than mixtures of alcohol and water, which he considered unreliable. He also manufactured thermometers, which gave him the perfect opportunity to impose his scale on users.

For people used to the Celsius scale, where water freezes at 0° and boils at 100°, the fact that on the Fahrenheit scale water freezes at 32° and boils at 212° must seem quite odd. However, there is no reason why a temperature scale must necessarily reflect the freezing and boiling points of water, no matter how important these may be in daily life. The important thing is that the scale must be reliable over the range of temperatures being measured. At very low temperatures, certain liquids expand and contract more rapidly or slowly than they do at higher temperatures. Likewise, at very high temperatures, certain liquids expand and contract more rapidly or slowly than they do at lower temperatures. To ensure the reliability of a thermometer, these physical characteristics must be taken into account.

Fahrenheit decided to create his scale based on three fixed temperature points: freezing water, the human body, and the coldest point to which he found it possible to repeatedly cool a solution of water, ice, and ammonium chloride.

Anders Celsius (1701–1744) was a Swedish astronomer, mathematician, and physicist. In 1742 Celsius established that the freezing point of water is independent of latitude, which had not previously been confirmed. More germane to our discussion, he also developed a consistent means of determining the boiling point at different barometric (atmospheric) pressures.

The mantra that water freezes at 0°C and boils at 100°C is too facile. These figures are essentially accurate at sea level; however, as one rises to higher elevations, the boiling point of water decreases due to decreasing atmospheric pressure. For example, water boils at 100°C (212°F) at sea level, but at only 93.4°C (200.1°F) at an altitude of 1,905 meters (6,250 feet). As you go higher, the boiling point continues to drop.

Unlike Daniel Fahrenheit, Anders Celsius was deeply concerned about the freezing point and boiling point of water. To avoid the need for negative numbers, the original Celsius scale at sea level placed the boiling point of water at 0°C and the freezing point at 100°C, i.e. the colder the water the higher the number. However, the taxonomist Carl Linnaeus (1707–1778), a colleague at Uppsala University, apparently didn’t like this arrangement, so in a paper he delivered sometime later, he proposed reversing the numbers so that “our thermometer shows 0 (zero) at the point where water freezes and 100 degrees at the boiling-point of water.” And the rest, as they say, is history.

For the more scientifically inclined, there is a third temperature scale that today is widely used. This is the Kelvin scale, named for Irishman William Thomson, Lord Kelvin (1824–1907).

The Kelvin scale is in fact the Celsius scale with the starting point moved from the freezing point of water to “absolute zero.” Absolute zero is the temperature at which all molecular motion stops. Since heat is the energy of molecules in motion, in theory nothing can be colder than absolute zero. So this is where Kelvin starts.

On the Celsius scale, absolute zero occurs at -273.15°C. Since this is what Lord Kelvin called 0°K, there are no negative numbers on the Kelvin scale. Thus, to convert Kelvin to Celsius, it is only necessary to subtract 273.15° from the Kelvin reading. Example: 293.15°K = 293.15 – 273.15 = 20°C.

To convert Celsius to Kelvin is slightly more complicated because Celsius has both positive and negative values, while Kelvin has only positive values. This apparent problem can easily be overcome by applying the very simple formula °K = °C + 273.15 where C can be either positive or negative. For example:

  • 10°C = 273.15 + 10 = 283.15°K
  • -10°C = 273.15 + (-10) = 263.15°K
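For those who like to see it in code, the two conversions can be sketched in a couple of one-line Python functions (the function names here are my own, purely for illustration):

```python
def celsius_to_kelvin(c):
    """K = C + 273.15; works equally for positive and negative Celsius values."""
    return c + 273.15

def kelvin_to_celsius(k):
    """C = K - 273.15; the inverse is simple subtraction."""
    return k - 273.15

# The two bullet examples above, rounded to sidestep floating-point noise:
print(round(celsius_to_kelvin(10), 2))   # 283.15
print(round(celsius_to_kelvin(-10), 2))  # 263.15
```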

To complete the picture, “absolute heat” is the theoretically highest possible temperature. This is not easy to determine because it lies at the edge of current knowledge of physical cosmology. Today, the consensus is that absolute heat is most probably the “Planck temperature,” evaluated as 1.41 × 10³² degrees K. This is unimaginably hot. The temperature at the core of our sun is only about 15 × 10⁶ K, which by comparison would be like diving into the icy waters of the Arctic Ocean at the height of winter.

However, absolute heat may not be quite as absolute as absolute zero. According to current scientific theories, above the Planck temperature, particle energies would be so great that gravitational forces between them would become as strong as other fundamental forces. At present, there is no theory to predict the behavior of particles at these extraordinary energies. So stand by. The story of absolute heat and how it might be evaluated is yet to be written.

Misconceptions About Thermometers

A number of misconceptions have grown up about thermometers. Here are a few of the more important ones.

  • Degree is an absolute quantity.

Thermometers (with some exceptions) use scales that measure temperature in degrees. Since the term “degree” is virtually universal, it is normal to believe that it means the same thing whenever it is used. This is almost correct, but not quite.

For the Celsius and Kelvin scales, it is completely correct. Remember: the K scale is simply the C scale with the starting point moved from the freezing point of water to absolute zero. By contrast, a degree on the Fahrenheit scale is quite different. On both the C scale and the K scale, the difference between the freezing point and boiling point of water is 100 degrees. However, on the F scale, it is 180 degrees (32–212°F). Thus, a degree C or a degree K is 80 percent bigger than a degree F (180/100 = 1.8). This is why converting from C to K or from K to C requires only straightforward addition or subtraction (K = C + 273.15; C = K – 273.15). However, converting from F to C or C to F requires mathematical formulas: F = 9/5 C + 32; C = 5/9(F – 32).
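The asymmetry is easy to demonstrate in code: K↔C needs only addition or subtraction, while F↔C needs the 1.8 scaling factor. A minimal Python sketch (the function names are mine):

```python
def c_to_f(c):
    """F = 9/5 C + 32."""
    return 9 / 5 * c + 32

def f_to_c(f):
    """C = 5/9 (F - 32)."""
    return 5 / 9 * (f - 32)

def c_to_k(c):
    """K = C + 273.15 -- no scaling, because the degrees are the same size."""
    return c + 273.15

# Freezing and boiling points of water, Celsius to Fahrenheit:
print(c_to_f(0), round(c_to_f(100), 6))  # 32.0 212.0
# One C (or K) degree spans 1.8 F degrees: 180 F degrees per 100 C degrees.
```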

Few people with any science background are likely to mistake the difference between a C or K degree and an F degree. However, in other areas, dealing with common terms that mean something quite different depending on where you are can be somewhat discomforting.

I have heard that in the 1940s a slight difference in the definition of the British inch and the American inch caused serious problems of incompatibility of spare parts during World War II. I can find no confirmation of this (the two inches were brought into line in the 1950s). However, here is a story I know is true, because I lived it.

In the mid-1960s, I was a volunteer math and physics teacher in Tanzania, which had only recently gained its independence from Britain. Being a good American, I loved fresh milk. So whenever the opportunity arose (which was rare), I would pop into a local shop and buy two pints (approximately a liter) of milk, which I then proceeded to drink instead of eating lunch. Each time I did, I ended up feeling bloated. This had never happened to me in Los Angeles, so I concluded that the milk I was drinking in Tanzania was richer than what I had been drinking at home. Only later did I discover that a British pint was about 20 percent larger than an American pint. I was feeling bloated because I had been drinking 20 percent more milk than I had thought. This difference still exists today.

So just as a degree is not a degree is not a degree, a pint is not a pint is not a pint (apologies to Gertrude Stein).

  • Temperature scales are parallel.

Many liquid thermometers show two scales, with Celsius on one side of the tube and Fahrenheit on the other. This gives the impression that the two scales are “parallel,” i.e. that there is a simple one-to-one correspondence between the reading on one scale and the equivalent reading on the other. But as we saw above, this is not the case.

Rather than being parallel, the Celsius and Fahrenheit scales mathematically cross each other, i.e. instead of being || they are actually X. This suggests that at some point the temperature on the C scale and on the F scale will be equal, which in fact is the case. You can easily find this equality point by simple mathematics: setting F = C in F = 9/5 C + 32 gives C = 9/5 C + 32, and solving this equation gives F = C = -40 degrees. Using the other conversion formula of course gives the same result: setting C = F in C = 5/9(F – 32) again yields C = F = -40 degrees.
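A quick numerical check of this crossover in Python (a throwaway sketch; it simply re-applies the conversion formula above):

```python
def c_to_f(c):
    """F = 9/5 C + 32."""
    return 9 / 5 * c + 32

# Setting F = C means c = 9/5*c + 32, i.e. -4/5*c = 32, so c = -40:
crossover = 32 / (1 - 9 / 5)
print(round(crossover, 6))    # -40.0
print(round(c_to_f(-40), 6))  # -40.0  (the two scales agree here)
```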

Thus, the closer you get to this -40 degree crossover point, the smaller the difference in temperature readings between C and F. Likewise, the farther away you get from this crossover point, the greater the difference in temperature readings between C and F.

  • The Celsius scale is part of the metric system.

Well, yes and no. The Celsius scale was developed several decades before the metric system was created. It was adopted into the metric system because it runs 100 degrees from the freezing point to the boiling point of water. But it was not created for the metric system; it was absorbed into it.

  • Metric countries don’t use Fahrenheit thermometers.

Today, there are only three countries in the world that have not adopted the metric system: Liberia, Myanmar, and the United States. However, this does not mean all the other countries of the world have abandoned the Fahrenheit thermometer in favor of the Celsius thermometer; some retain it for specific purposes because they find certain advantages in doing so.

For example, Canada uses Fahrenheit primarily for baking and measuring temperature in swimming pools. Celsius is preferred for human body temperature and weather.

Some countries, such as India, use Celsius mainly for weather reports but commonly use Fahrenheit for body temperature.


Before the Industrial Revolution, when people had to manually paint lines on medical thermometers, having exactly 180 degrees between the freezing and boiling points of water (freezing = 32º, boiling = 212º) made the onerous task considerably easier. This is because the number 180 has so many divisors. Thus, a person hand-painting a thermometer could paint temperature marks every 3 degrees (180/3 = 60 marks), every 4 degrees (180/4 = 45 marks), every 10 degrees (180/10 = 18 marks), every 12 degrees (180/12 = 15 marks), and so on. The number 180 in fact has 18 integer divisors: 1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60, 90, 180. By contrast, the 100 degrees between freezing and boiling on the Celsius scale has only nine integer divisors: 1, 2, 4, 5, 10, 20, 25, 50, 100.

Thus, it would have been significantly more difficult to hand-paint Celsius scales on a thermometer with the precision needed for measuring body temperature. This is one reason why, in the early 20th century, Fahrenheit dominated medical device labeling even in countries that were using Celsius for virtually everything else.
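The divisor counts are easy to verify with a few lines of Python (a quick sketch):

```python
def divisors(n):
    """Return all positive integer divisors of n, in ascending order."""
    return [d for d in range(1, n + 1) if n % d == 0]

print(len(divisors(180)))  # 18
print(len(divisors(100)))  # 9
print(divisors(180))  # [1, 2, 3, 4, 5, 6, 9, 10, 12, 15, 18, 20, 30, 36, 45, 60, 90, 180]
```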

  • The Kelvin scale is unsuitable for reporting air (ambient) temperature because the figures would be too high.

I have seen this argued, but I can’t really take it seriously. The contention is that reporting air temperature in K would put figures into the hundreds, i.e. 18°C would be 18 + 273 = 291°K. Having grown up in Los Angeles with Fahrenheit, I distinctly remember my first encounters with Celsius in Brussels. If I heard that the air temperature was going to be 27°, I viscerally felt the need to put on a scarf and mittens; instead, I should have been putting on a T-shirt and shorts. 27°F is below freezing, while 27°C is beach weather. How could a number that low indicate an air temperature that high?

It is all a question of habit, i.e. what you are used to. I very quickly became acclimatized to Celsius temperatures and very quickly abandoned my reliance on Fahrenheit because Celsius was all around me.

I am not certain that replacing C with K would bring any advantages. I am certain that if it were proposed, people would rail against the idea at the top of their lungs. Just as I am certain that if it were implemented, these same people fairly quickly would stop complaining and eventually wonder what the ruckus had been all about.

Types of Thermometers

Lest we get distracted by all this talk of thermometers and liquids, we should not forget that by definition a thermometer is a device that can measure temperature in any state of matter. Thermometers can measure the temperature of solids such as food, liquids such as water, and gases such as air. However, there are various ways of making these measurements, so there are various types of thermometers. These include but are not limited to:

  • Liquid thermometer. This is the type of thermometer most people think of when the word thermometer is mentioned, consisting of a bulb (reservoir), a thin tube, and a liquid that rises and falls in the tube in response to temperature changes.
  • Maximum thermometer. This is the basic liquid thermometer with a very useful added feature, a carefully designed restriction inserted just above the bulb. With a basic thermometer, when heat causes the liquid in the bulb to expand and rise, it will immediately fall back once the heat is reduced, e.g. when the instrument is taken out of the user’s mouth. With a maximum thermometer, when heat causes the liquid to expand and rise, because of the restriction it is unable to fall back when the heat is reduced.

    Because it remains at the maximum temperature reached, the maximum thermometer is especially prized for measuring body temperature. To use it again, it is only necessary to vigorously shake the instrument to force the liquid above the restriction back into the bulb. 
  • Electronic thermometer. The Kelvin scale is used primarily to measure very low temperatures, very much lower than the freezing points of the mercury or alcohol contained in a liquid thermometer. Mercury freezes at -38.83°C. There are different types of alcohol, but whatever the type, alcohol generally freezes around -100°C. Thus, low-temperature thermometers are usually electronic. Electronic thermometers incorporate a device called a “thermistor,” which changes its resistance to an electric current as a function of temperature. A computer records the change in the thermistor’s electrical resistance and converts the result into a temperature reading, usually in Kelvin.
  • Pill thermometer. Pill thermometers, which are swallowed, are used in sports to prevent and treat heat-related illnesses such as heatstroke. Pill thermometers use liquid crystals to track changes in body heat and transmit radio waves to a device outside the body which records and displays the data. Once swallowed, a pill thermometer can transmit data about the body’s core temperature for up to 30 hours.
  • Nanothermometer. Permitting extremely high precision, nanothermometers are used to measure temperature variations inside a single living cell.

Thermometers and Computers

Because the liquids in liquid thermometers can themselves freeze or boil, such instruments work only within a limited range of temperatures, so specialized thermometers are needed to measure temperatures not attainable by liquid thermometers. A thermometer that measures very low temperatures is called a “cryometer” (cryo = freeze); a thermometer that measures very high temperatures is called a “pyrometer” (pyro = fire). In either case, most such thermometers are electronic. Basically, this means they consist of a sensor used as a probe plus a computer to interpret the data the sensor gathers.

Low-temperature thermometers (cryometers) usually incorporate a device called a “thermistor” as the sensor. The thermistor changes its resistance to an electric current as a function of temperature. A computer measures the thermistor’s changed resistance and converts it into a temperature reading. Without this computer-thermistor tandem, accurately measuring very low temperatures would be virtually impossible.

The same sort of assembly (sensor + computer) is used in pyrometers to measure very high temperatures. For example, in industry pyrometers are indispensable for measuring the temperatures inside a furnace such as used in steel making.

The popular types of pyrometers are:

  • A resistance pyrometer determines temperature by having a computer calculate how radiation coming from a hot body affects the electrical resistance generated in the sensor.
  • A voltage pyrometer determines temperature by having a computer calculate how radiation from a hot body affects the voltage being generated in the sensor.

A particular type of computerized thermometer has become almost a household word over the past couple of years. This is the so-called “infrared thermometer.”

We need to be careful with terminology here. “Infrared” is the name given to a certain segment of the electromagnetic radiation spectrum. There are several other segments on the electromagnetic radiation spectrum, e.g. radio waves, microwaves, visible light, X-rays, and gamma rays. But where one segment ends and another begins is largely a matter of definition. There are no sharp physical dividing lines from one segment to another in terms of their properties; they simply merge into each other.

The different types of electromagnetic radiation are defined by the amount of energy found in the photons. Radio waves have photons with low energies, microwave photons have a little more energy than radio waves, infrared photons have still more, then visible, ultraviolet, X-rays, and, the most energetic of all, gamma rays.

The infrared segment is used largely for detecting and measuring heat. This is the type of computer-based thermometer that has proved so useful for rapidly checking people’s temperature in airports, hospitals, department stores, schools, and other locations as an indication of COVID-19 infection.

The design basically consists of a lens that focuses infrared radiation coming from an object onto a detector. The device converts the detected radiation into an electrical signal that can be displayed as temperature. Since an infrared thermometer permits temperature measurement from a distance, there is no contact with the object being measured. In the case of COVID-19, the objects being measured are people and no-contact is a sine qua non of the ongoing battle against this deadly disease.

Long before the emergence of COVID-19, sophisticated sensing devices were developed by NASA (United States National Aeronautics and Space Administration) to measure the temperature of distant celestial bodies. In 1964, while searching the sky with such a device, astronomers detected diffuse radiation with a temperature of 3°K (-270°C). Since the radiation was detected in all directions everywhere they looked, the researchers concluded that this very cold radiation was almost certainly the theoretically predicted faint remnant of the Big Bang.

In short, this was strong evidence that the universe as we know it began expanding from a single, unimaginably dense point (singularity) and has been spreading out ever since for some 13.7 billion years.

Detection of this background radiation was made in 1964 by Arno Penzias and Robert A. Wilson at Bell Laboratories while the two scientists were conducting experiments with the Holmdel Horn Antenna, an extremely sensitive device originally used to detect radio waves that were bounced off Echo balloon satellites, and later Telstar, the first active communications satellite. The two scientists were awarded the Nobel Prize in Physics in 1978.

Editor's Note. Technically, the instruments used in this research are called microwave detectors rather than infrared detectors because the radiation being studied was not in the infrared segment of the electromagnetic spectrum. However, the principle is the same.

Post Scriptum

As a final word, it would be useful to remember that not all of mankind is obsessed with temperature. Indeed, perhaps the majority aren’t. I know from experience. Early in my life, I lived for two years in a country where there was no such obsession. In fact, for most people, there was no way of knowing the temperature even if they had wanted to.

From 1965–1967, I was a volunteer teacher of mathematics and physics in the then newly independent nation of Tanzania (formerly Tanganyika). The vast majority of people lived in rural villages rather than in cities. Being mainly subsistence farmers, they were more concerned about the changing seasons than the changing temperature on any given day.

I imagine this has somewhat changed over the past half-century as developing countries have become more urbanized, with more of the populace living in larger and larger cities and farther and farther away from the land. However, I believe it would be a safe bet that hundreds of millions, and possibly even billions, of people around the globe still have never seen a thermometer and wouldn’t know (or care about) what to do with it if they ever did see one.