Once a sensor is calibrated, the accuracy of its future measurements is dictated by the accuracy of that first calibration. It is therefore vital that this first calibration is performed as accurately as possible.
Errors in temperature sensor calibration using a dry-block or heat bath can arise from several sources, ranging from temperature homogeneity and sensor geometry to the age of the inserts used. This article explores the most common causes of error in temperature calibration and details how to avoid the worst of them in order to minimize uncertainty in calibration.
Temperature sensor calibrations often require the creation of an “uncertainty budget”, which represents the combined calibration uncertainty of a sensor for a given application.
Many factors contribute to the total calibration uncertainty in temperature sensing. The list of error contributors is extensive and includes:
- Curve fit error
- Spatial variation in temperature (axial and radial gradients)
- Thermal load in the calibration unit
- Thermal stability
- Type and age of the insert used
The factors that most commonly contribute to errors are discussed in more detail below.
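As a rough illustration of how an uncertainty budget works, individual contributors are commonly combined by root-sum-of-squares and expanded with a coverage factor. The sketch below uses invented contribution values purely for illustration; none of them are measured figures from this article.

```python
import math

# Hypothetical k=1 standard uncertainties in °C (illustrative values only).
contributions = {
    "curve_fit": 0.010,
    "axial_gradient": 0.050,
    "radial_gradient": 0.010,
    "thermal_load": 0.075,
    "stability": 0.005,
}

def combined_expanded_uncertainty(contribs, k=2):
    """Root-sum-of-squares combination, expanded with coverage factor k
    (k=2 corresponds to roughly 95% confidence)."""
    u_c = math.sqrt(sum(u**2 for u in contribs.values()))
    return k * u_c

print(f"U (k=2) = {combined_expanded_uncertainty(contributions):.3f} °C")
```

This shows why the largest contributors dominate the budget: halving a small term barely changes the total, while reducing the dominant term (here, thermal load) pays off directly.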
Reducing Error Due to Thermal Load
Typically, the specifications of a temperature calibrator are based on its performance with a low-thermal-load reference sensor. As a result, the calibrator’s performance begins to deviate from its specifications when equipment other than a reference sensor is used in the insert. This can cause inaccuracies when calibrating large-diameter sensors or multiple sensors simultaneously. For example, the thermal load of a 10 mm sensor in a typical dry-block calibrator can result in errors of over 0.15 °C.
Fortunately, this error contributor can be reduced by an order of magnitude or more by installing an external reference. An additional reference sensor can be placed into the insert together with the unit under test and used as the reference against which accuracy is specified.
It can also be used as the controlling sensor. While it is possible to connect the external sensor to a handheld thermometer as an independent reference, it is preferable to connect it directly to the calibrator.
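The arithmetic behind an external-reference calibration is simple: at each set point, the error of the device under test (DUT) is its reading minus the external reference sensor's reading. The function name and the readings below are hypothetical, for illustration only.

```python
def calibration_error(dut_reading, reference_reading):
    """Error of the device under test at a set point:
    DUT reading minus the external reference sensor's reading (°C)."""
    return dut_reading - reference_reading

# Illustrative set points: (DUT reading, reference reading) in °C.
readings = [(50.12, 50.03), (100.21, 100.05), (150.33, 150.07)]
for dut, ref in readings:
    print(f"set point {ref:7.2f} °C  error {calibration_error(dut, ref):+.2f} °C")
```

Because the reference sits in the same insert as the DUT, errors common to both sensors (such as the block temperature deviating from its display) cancel out of this difference.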
Reducing Error Due to Axial Gradient
The laws of thermodynamics dictate that, even in highly controlled environments, a temperature gradient will exist near any heat source. Although this gradient cannot be eradicated entirely, temperature calibration devices aim to minimize spatial variation by maximizing the temperature homogeneity in the space around the sensor being calibrated.
In an ideal setup, temperature sensors would be calibrated in heat baths with rapidly stirred, low-viscosity fluid in order to achieve very high temperature homogeneity around the sensor. However, this method has several drawbacks, including the size of heat baths, safety concerns around the use of hot oil, and the risk of contaminating sensors with silicone oil. Heat baths are therefore impractical for many applications, and a dry-block calibrator is often the solution for on-site calibration.
Temperature sensors normally have a small radius relative to their length. For this reason, the error caused by radial temperature gradients is generally very small, typically around 0.01 °C. The error due to the axial gradient (i.e. the temperature gradient along the length of the sensor) is commonly much larger and is also influenced by the effects of different loads and different temperatures.
Minimizing Axial Gradient Errors with Dynamic Load Compensation
The most effective way of minimizing errors caused by the axial gradient is to use a calibrator with a dual-zone design and dynamic load compensation (DLC). Whereas conventional dry-block calibrators often have only one heating zone, dual-zone dry-block calibrators use two heating zones to compensate for heat loss.
The temperature difference between the two zones can be continuously measured by embedding additional sensors within the insert along with the unit under test. This enables dynamic control over the thermal output of each heater, compensating for heat losses and minimizing the temperature gradient.
The result of this set-up is a dry-block system that behaves like a heat bath in terms of thermal homogeneity and can notify the user of the internal temperature distribution.
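The control idea described above can be sketched as a simple feedback loop: the second zone's heater power is trimmed in proportion to the measured temperature difference between the zones until the axial gradient approaches zero. The gain and the crude plant model below are assumptions for illustration, not a real calibrator's control firmware.

```python
def dlc_step(t_main, t_bottom, bottom_power, gain=0.5):
    """One control step: add bottom-zone power if the bottom zone runs cold."""
    gradient = t_main - t_bottom  # measured by the embedded zone sensors
    return bottom_power + gain * gradient

t_main, power = 150.0, 10.0
for _ in range(30):
    t_bottom = 149.2 + 0.4 * (power - 10.0)  # toy plant: zone temp follows power
    power = dlc_step(t_main, t_bottom, power)

print(f"residual axial gradient: {t_main - t_bottom:.3f} °C")
```

In this toy model the gradient shrinks geometrically on each step, which mirrors how continuous compensation in a dual-zone block drives the axial gradient toward zero under changing load.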
The superior heat distribution within DLC systems makes them well suited to testing large-diameter sensors. DLC systems can also save time by enabling the calibration of multiple sensors simultaneously. Our tests show that the total uncertainty in sensor calibration (at a 95% confidence interval) can be reduced from 0.185 °C to 0.034 °C.
Ametek is a world-leading producer of temperature calibration instruments for research and industry [1]. The Ametek STC RTC range of Reference Temperature Calibrators comprises the most advanced and accurate portable temperature calibrators yet. The range covers temperatures from -100 to 700 °C across seven models, all featuring DLC and dual- or triple-zone temperature control for unbeatable temperature homogeneity, even when testing large sensors or multiple sensors at once [2].
References and Further Reading
1. Ametek Dry Block Calibrators. Available at: https://www.ametekcalibration.com/industries/oil-and-gas/dry-block-calibrators.
2. Ametek Reference Temperature Calibrators. Available at: https://www.ametekcalibration.com/products/temperature/temperature-calibrators/rtc-series-reference-temperature-calibrator.
This information has been sourced, reviewed and adapted from materials provided by AMETEK STC.