Resistance Thermometer Basics + RTD Calibration in 5 Steps

What Is a Resistance Thermometer (Resistance Temperature Detector)?

Also known as a resistance temperature detector (RTD), a resistance thermometer is a temperature measurement sensor commonly made from platinum. Resistance thermometers are simple: as temperature rises, the electrical resistance of the platinum (or an alternative material) increases in a highly predictable, nearly linear way.
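For platinum sensors, this relationship is standardized by IEC 60751 as the Callendar-Van Dusen equation. A minimal sketch for a Pt100 (100 Ω at 0 °C) above 0 °C, using the standard nominal coefficients:

```python
# Resistance-temperature relationship for a nominal Pt100 RTD,
# per the IEC 60751 Callendar-Van Dusen equation for t >= 0 °C:
#   R(t) = R0 * (1 + A*t + B*t^2)
R0 = 100.0       # resistance at 0 °C, in ohms (Pt100)
A = 3.9083e-3    # standard IEC 60751 coefficients
B = -5.775e-7    # the small B term is why the curve is only *nearly* linear

def pt100_resistance(t_celsius: float) -> float:
    """Nominal Pt100 resistance in ohms for 0 °C <= t <= 660 °C."""
    return R0 * (1 + A * t_celsius + B * t_celsius**2)

print(round(pt100_resistance(100.0), 2))  # ~138.51 ohm at 100 °C
```

A real probe's coefficients differ slightly from these nominal values, which is exactly what the calibration described below determines.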

Calibrating an RTD with a Drywell

Resistance thermometers are used in a wide range of industrial applications, especially where precision is paramount. RTDs are known for their accuracy and stability, and their measurements are highly repeatable. Typically, they measure temperatures from -200 °C up to 660 °C, but some resistance thermometers, such as the Fluke 5624, can withstand temperatures up to 1000 °C.

How Accurate Are Resistance Thermometers (RTDs)?

Resistance thermometers, or resistance temperature detectors (RTDs), are highly accurate, stable temperature measurement devices. They produce repeatable measurements under identical conditions, which makes them well suited to monitoring applications where readings must stay consistent over time.

Types of Resistance Thermometers (RTDs)

Resistance thermometers (RTDs) come in a variety of designs and sizes to suit different applications. Some of the most common types of RTDs include:

  • Industrial resistance thermometers: These typically have lower accuracy than a secondary reference RTD but come in a variety of shapes and sizes. Examples include the Fluke 5606, 5627A, 5608, and 5618A.
  • Secondary resistance thermometers: These provide higher accuracy than an industrial probe because a higher-grade platinum is used in the sensing element. An example is the Fluke 5615 series.
  • Thin-film RTDs: These feature a thin layer of platinum deposited on a substrate, making them more robust and cost-effective. Fluke offers a thin-film probe in its 1551A thermometer.

Resistance Thermometer (RTD) Calibration

If you are calibrating a medium- to high-accuracy resistance thermometer (RTD), you will probably use the characterization method of calibration. Characterization is the type of calibration in which the device under test (DUT) resistance is determined at several temperature points, and the data are fitted to a mathematical expression.

5 Steps for RTD Calibration:

  1. Place the reference probe and the DUTs in the temperature source. Make sure they are all placed as close together as possible, in a radial pattern with the reference probe in the center of the circle.
  2. Connect the leads to the readout(s), using the proper 2-, 3-, or 4-wire connection.
  3. Measure the reference probe and determine the temperature. Ideally, you would use a readout designed for temperature work that can measure the resistance and calculate the temperature from calibration coefficients previously entered into the readout.
  4. Measure and record the resistance of the DUT(s). Since the DUTs are resistance thermometers similar to the reference probe, they are measured in a similar manner. It is also good practice to close the process by measuring the reference probe one more time.
  5. Fit the data. Data fitting is a process in which you solve a set of simultaneous equations that contain the calibration data to arrive at a set of coefficients unique to the RTD and calibration. There are several software programs available to accomplish this task.
