Grant Maloy Smith

Wednesday, February 26, 2025

The History of Temperature Measurement

In this article, you will learn about the importance and development of temperature measurement technology. We will cover the topic in enough depth that you will:

  • Understand what temperature is and why it’s so important

  • Learn about the history of temperature sensors

  • See how important temperature sensors are in test and measurement applications

The most measured physical property in the world 

Temperature is critically important to our health and environment: the buildings we live and work in, our roads and infrastructure, and every machine in the world. Abnormal human body temperature is one of the first indications that we are sick. Extreme temperatures can degrade or destroy machines and structures. Measuring temperature has always been vitally important, but it wasn’t always possible to measure it accurately.

What is temperature?

The freezing and boiling points of water are a practical reference in both the Celsius and Fahrenheit temperature measurement scales

Temperature is a physical measurement of the hotness or coldness of matter or radiation. It also determines the direction of energy flow: heat moves spontaneously from hotter bodies to colder ones.

The kelvin (K) is the SI (International System of Units) unit of temperature. The coldest possible temperature is “absolute zero,” a state of virtually no molecular activity. This point is defined as 0 K.

If you were wondering, absolute zero is −273.15 °C (−459.67 °F). That’s cold!

On the Celsius scale, 0 degrees is the freezing point of water, and 100 is the boiling point. Fahrenheit also uses these two reference points, but they are set to 32 and 212 °F. 

Once the world standard, Fahrenheit was replaced by the Centigrade scale in the 20th century. Today, Fahrenheit is only used in the USA and a few other countries. Centigrade was renamed “Celsius” in 1948 to honor Anders Celsius, the Swedish scientist who developed it in 1742. He proposed that 0 degrees should be the boiling point of water and 100 degrees the freezing point. This scale was later reversed to make it more intuitive.

The Celsius and Fahrenheit scales intersect at −40°: a reading of −40 °C is exactly −40 °F.
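
These relationships are easy to verify numerically. Below is a minimal Python sketch of the standard conversion formulas (the function names are our own, for illustration only):

```python
def c_to_f(c):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def c_to_k(c):
    """Convert degrees Celsius to kelvin."""
    return c + 273.15

print(c_to_f(100))      # 212.0  -- boiling point of water
print(c_to_f(0))        # 32.0   -- freezing point of water
print(c_to_f(-40))      # -40.0  -- the two scales intersect here
print(c_to_k(-273.15))  # 0.0    -- absolute zero
```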

Atmospheric pressure (aka barometric or air pressure) affects water’s behavior. Because water’s freezing and boiling points are used as references in both the Celsius and Fahrenheit scales, the measurements are specified to be made at sea level, at a pressure of 1 atmosphere.

We write temperature values in this format:  

  • 25 °C

  • 107 °F

  • 23.45 K

A space separates the value from the unit. For Celsius and Fahrenheit, the degree symbol (°) is followed by the unit abbreviation (°C, °F); kelvin is written without the degree symbol (K). Values can be integer or floating point, like 32 or 32.938, depending on the application’s resolution requirement.
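
To make the convention concrete, here is a small illustrative Python helper (a sketch of our own, not a standard library function):

```python
def format_temperature(value, unit):
    """Format a temperature: a space after the value, a degree symbol
    for Celsius and Fahrenheit (°C, °F), and none for kelvin (K)."""
    if unit == "K":
        return f"{value:g} K"
    if unit in ("C", "F"):
        return f"{value:g} °{unit}"
    raise ValueError(f"unknown unit: {unit}")

print(format_temperature(25, "C"))      # 25 °C
print(format_temperature(32.938, "F"))  # 32.938 °F
print(format_temperature(23.45, "K"))   # 23.45 K
```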

Early relative measurements

Until the 17th century, temperature measurement was crude at best. No instruments could measure an actual temperature value; they could only make relative measurements. 

Circa 250 BC, Greek engineer Philo Mechanicus of Byzantium filled a hollow lead globe with air and water and connected it by a tube to an open-air pitcher. As the air inside the globe warmed, it expanded and pushed water along the tube. When the temperature fell, the water receded.

Hero of Alexandria developed a primitive thermoscope around 50 AD. It was also based on the idea that water or air in a globe would expand when heated and contract when cooled. This could move water along a connected pipe to indicate rising or falling temperature.

Science Museum Group. Reconstruction of Hero's Thermoscope (c. 50 A.D.). 1925-198 Science Museum Group Collection Online.

A hundred years later, Greek-born Roman physician Galen created a similar thermoscope, but he added a scale marked hot, cold, and a neutral point that was neither. It would not be until the late 1500s and 1600s that a significant advance was made in measuring temperature.

The thermoscope

Famed Italian astronomer Galileo Galilei (1564-1642) is one of the scientists behind the invention of the thermoscope, along with Santorio Santorio, Robert Fludd, and Cornelius Drebbel. A bulb attached to a long tube was placed in a jar of colored water. The air in the bulb expanded or contracted as the temperature changed, moving the level of the liquid in the tube.

Pucicu, CC BY-SA 4.0 via Wikimedia Commons

Galileo and his contemporaries did not realize it at the time, but changes in barometric pressure also influenced these measurements. The thermoscope could confirm relative temperature changes, but it did not provide accurate readings.

Science Museum Group. Reconstruction of Galileo's Thermoscope, 1592-1600. 1988-507 Science Museum Group Collection Online

The advent of calibration

In the early 1700s, Danish astronomer Ole Christensen Rømer devised a temperature scale based on water’s freezing and boiling points, which he set at 7.5 and 60 degrees. This was the first calibrated temperature scale, and it was completely new: all earlier temperature measurements had been relative ones.

Coincidentally, British physicist Isaac Newton proposed a calibrated temperature scale at almost precisely the same time. Newton set zero degrees to be the freezing point of water. However, because he was the warden of the Royal Mint, he used secondary reference points such as the melting points of tin, lead, and other metals.

The first practical thermometer

To eliminate the effects of barometric pressure on the reading, scientists experimented with using a liquid inside a sealed glass vessel. Ferdinando II de' Medici, the Grand Duke of Tuscany, made such a thermometer using alcohol as early as 1654.

In 1709, Polish-born Daniel Gabriel Fahrenheit created the first practical glass thermometer, a sealed tube partially filled with mercury. Fahrenheit got the idea for his scale after visiting Rømer in 1708. He marked the tube with a scale running from 32 at the freezing point of water to 212 at its boiling point. Fahrenheit’s invention made reliable, repeatable measurements and found practical use in medicine and other applications.

Early mercury thermometer. Image by Pieter Kuiper via Wikimedia Commons

The pyrometer

To measure the temperature inside his kilns, English potter Josiah Wedgwood invented a mechanical pyrometer in the late 18th century.

Wedgwood’s pyrometer. Darling, Charles R. “Industrial Pyrometry. Lecture I.” Journal of the Royal Society of Arts, vol. 59, no. 3031

Other pyrometers were more complex in construction, like the one shown below. It was based on the expansion of a metal bar (a) when heated by the kiln: the expanding bar pushed a needle along a marked scale, and when the temperature fell, a spring pushed the bar and needle back to their original positions.

Early pyrometer circa 1852, John Draper. Public domain, via Wikimedia Commons

Seebeck’s thermocouple

In 1821, Baltic German physician and physicist Thomas Seebeck became fascinated by the relationship between heat and magnetism. He discovered that connecting two dissimilar conductive metals at two junctions and exposing one junction to a heat source generated a tiny voltage, and with it a detectable magnetic field around the circuit.

The electron had not yet been discovered, so he did not realize it, but he was observing an electromotive force, later named the Seebeck effect in his honor. Seebeck also observed that the effect grew stronger or weaker with the temperature difference between the junctions. His work led directly to the invention of the thermocouple.
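
To get a feel for the size of the effect, here is a rough numerical sketch in Python. The single-coefficient linear model and the value of about 41 µV/°C (a typical figure for a modern type K thermocouple) are simplifying assumptions; real thermocouples are characterized by standardized polynomial reference tables.

```python
# Assumed mean Seebeck coefficient for a type K thermocouple, in µV/°C.
SEEBECK_COEFF_UV_PER_C = 41.0

def thermocouple_voltage_uv(t_hot_c, t_cold_c):
    """Approximate open-circuit output in microvolts using the
    linear model V = S * (T_hot - T_cold)."""
    return SEEBECK_COEFF_UV_PER_C * (t_hot_c - t_cold_c)

# A 100 °C junction difference yields only about 4.1 mV --
# the "tiny voltage" Seebeck observed.
print(thermocouple_voltage_uv(125.0, 25.0))  # 4100.0 µV
```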

Typical thermocouple sensor. The bare wire end is the measuring point. Hartke, Wikimedia Commons

Learn more about thermocouples and their history in this related article:

Enter the RTD 

At the Bakerian Lecture of 1871, Sir Carl Wilhelm Siemens presented a paper on the tendency of electrical conductors to increase their electrical resistance with rising temperature. Because this change is predictable and repeatable in certain metals, it can be used to measure temperature.

He invented a platinum-based RTD (resistance temperature detector), which British physicist Hugh Longbourne Callendar made more reliable and commercially successful in 1885. 
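
Callendar’s platinum sensor survives today as the Pt100, which measures 100 Ω at 0 °C. As a rough illustration, here is the common linear approximation using the IEC 60751 mean coefficient of 0.00385 per °C; precise work uses the full Callendar-Van Dusen equation instead:

```python
R0 = 100.0       # Pt100 resistance in ohms at 0 °C
ALPHA = 0.00385  # mean temperature coefficient per °C (IEC 60751)

def pt100_resistance(t_c):
    """Approximate Pt100 resistance (ohms) at temperature t_c (°C)."""
    return R0 * (1 + ALPHA * t_c)

def pt100_temperature(r_ohms):
    """Invert the linear model: temperature (°C) from resistance."""
    return (r_ohms / R0 - 1) / ALPHA

print(pt100_resistance(100.0))   # 138.5 Ω at 100 °C
print(pt100_temperature(138.5))  # 100.0 °C
```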

Left Wire Wound Pt100 RTD, 25 x 3 mm. Right: Flat Film Pt100 RTD, 2 x 2 x 0.4 mm. Via Wikimedia Commons

Unlike thermocouples, an RTD’s output is very nearly linear. However, RTDs require excitation power, so wiring them is more complex than wiring thermocouples. You can learn more about RTDs and how they are used in this article:

The thermistor

A thermistor is a semiconductor made of metal oxides pressed into a small bead, disk, wafer, or other shape and sintered at high temperatures. It is typically coated with epoxy or glass.

Typical bead thermistor, Ansgar Hellwig, CC BY-SA 2.0 DE. Via Wikimedia Commons

When a current is passed through a thermistor, you can read the voltage across it and determine its temperature. A typical thermistor has a resistance of 2000 Ω at 25 °C and a temperature coefficient of about −3.9% per °C. You can learn more about thermistors in this article:
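
Converting a measured thermistor resistance back to temperature is commonly done with the simple “beta” model. In the sketch below, the 2000 Ω at 25 °C figure comes from the text above, while the beta constant of 3470 K is an assumed value chosen to match a roughly −3.9% per °C coefficient (the coefficient is approximately −beta/T², with T in kelvin):

```python
import math

R25 = 2000.0    # resistance at 25 °C, from the text
BETA = 3470.0   # assumed beta constant, in kelvin
T25_K = 298.15  # 25 °C expressed in kelvin

def thermistor_temperature_c(r_ohms):
    """Temperature (°C) from resistance via the beta model:
    1/T = 1/T25 + ln(R/R25) / beta."""
    inv_t = 1.0 / T25_K + math.log(r_ohms / R25) / BETA
    return 1.0 / inv_t - 273.15

print(thermistor_temperature_c(2000.0))  # 25.0 °C
```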

Contactless infrared temperature measurement

The infrared pyrometer

German-born British scientist William Herschel discovered infrared radiation in 1800. However, it was about a hundred years before German physicist Max Planck developed his mathematical description of electromagnetic radiation, the law of spectral radiance that launched quantum theory.
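
Planck’s law gives the spectral radiance of an ideal black body as a function of wavelength and absolute temperature, and it is the physical foundation of all infrared thermometry. A short numerical sketch using the SI-defined constants:

```python
import math

H = 6.62607015e-34  # Planck constant, J*s
C = 2.99792458e8    # speed of light, m/s
K_B = 1.380649e-23  # Boltzmann constant, J/K

def spectral_radiance(wavelength_m, temp_k):
    """Black-body spectral radiance, W / (sr * m^3):
    B = (2hc^2 / lambda^5) / (exp(hc / (lambda k T)) - 1)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K_B * temp_k)) - 1.0
    return a / b

# A human body (~310 K) radiates strongly near 10 µm -- the band
# that IR thermometers and thermal cameras are built to detect.
print(spectral_radiance(10e-6, 310.0))
```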

The first IR thermometer was introduced in 1931, but the technology progressed rapidly because of the Second World War in the 1940s. Early IR sensors remained expensive and impractical for daily use until the 1960s, when German physician Theodor Benzinger developed an inexpensive handheld pyrometer for medical applications.

The most commonly used IR thermometer is the “spot” infrared pyrometer. A harmless red laser dot helps the user aim it, and it is used in inspection applications across nearly every industry.

IR forehead thermometer

Thermal imaging cameras

A spot infrared pyrometer can only measure temperature at one specific location at a time. But what if we want to look at a running engine and see the temperatures across it? Or a whole human body? Luckily, there is a solution for that called thermal imaging.

Hungarian physicist Kálmán Tihanyi patented numerous ground-breaking technologies related to the invention of television in the early 1920s. In 1929, he moved to London and worked with the British Air Ministry to create a camera that used IR technology so that anti-aircraft defenses could effectively see in the dark. He invented and patented the first IR camera that same year.

Near the end of World War II, German scientists extended Tihanyi’s “night vision” technology for military targeting applications. By the 1970s, IR technology had progressed, but the sensors still had to be cooled with liquid nitrogen and were quite large and expensive.

One of the offshoots of the US Strategic Defense Initiative of the 1980s was the invention of “smart sensors,” which combined a sensor with processing, filtering, and other advanced computation in a single package. Advances in smart sensors and processing power pushed the technology along even further.

Today, thermal imaging cameras are used in a wide range of applications, including machine inspection, medicine, fire and rescue operations, and more.

Temperature sensor signal conditioning

Most data loggers and DAQ systems provide signal conditioning for thermocouples and RTD sensors. Although it is possible to interface thermometers directly with DAQ systems, it is rarely done. IR cameras with digital outputs and hardware drivers can be interfaced directly with computer-based DAQ systems.

Summary

Temperature measurement has been important for centuries, but being able to measure it accurately and consistently is a relatively recent innovation. Today, billions of temperature sensors are used worldwide every second of every day. From the tiniest thermistors to the most advanced IR cameras, temperature remains a mission-critical measurement. 

Dewesoft DAQ systems provide best-of-breed signal conditioning for today’s thermocouples, RTDs, thermistors, and digital interfaces for state-of-the-art thermal imaging systems. 

We hope this brief history of temperature measurement and review of the key technologies in use today has been interesting and valuable. 

To learn more about today’s signal conditioners and DAQ systems for temperature measurements, please visit: