r/AskElectronics 1d ago

What is the secret behind high accuracy temperature readings?

The most accurate thermistors that I've ever seen are specified to a temperature sensitivity of about 0.1°C. How can DAQs and instrumentation devices go far beyond that?

20 Upvotes

24 comments

28

u/Adversement 1d ago

High accuracy: With a high accuracy calibration target to remove the initial uncertainty of component values at a fixed and known temperature.

To get precision from a thermistor is easy. Like, even down to 0.001 °C in a temperature controller. Well, not easy. But doable even with reasonably inexpensive (precision) components.

Let's take a basic 10,000 ohm NTC thermistor with the standard ~3950 K curve.

At 25 °C the resistance is R. This might be anything between 9,900 and 10,100 ohms for a generic thermistor (which is good enough for what follows).

At 24.999 °C the resistance is 1.000044*R. We do not really care if it is 9900.44, 10000.44 or 10100.45 ohms.
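In Python, with the usual beta model and my own nominal values, just to show where that 44 ppm comes from:

```python
import math

# Beta-model NTC: R(T) = R25 * exp(B * (1/T - 1/T25)), temperatures in kelvin.
# Nominal 10 kohm, B = 3950 K part; values are illustrative, not from a datasheet.
def ntc_resistance(t_celsius, r25=10_000.0, beta=3950.0):
    t, t25 = t_celsius + 273.15, 25.0 + 273.15
    return r25 * math.exp(beta * (1.0 / t - 1.0 / t25))

print(ntc_resistance(24.999) / ntc_resistance(25.000))  # ~1.000044, i.e. ~44 ppm per mK
```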

Let us also take our (high quality precision thin film) reference resistor that we keep at nearly constant temperature (even though it was also selected for its low temperature coefficient of resistance). Its resistance is a random, but quite well fixed value between about 9990 and 10010 ohms, with only very small changes over our narrow temperature range of interest, and it is also known to age slowly. Now, if we divide our analogue reference voltage between these two resistors, the voltage at the node between them will change by about 11 ppm of the analogue reference voltage (nominally, we go from 0.5 to 0.500011, but in the real world we might go from, say, 0.497500 to 0.497511 or from 0.502500 to 0.502511 as the two resistors are not exactly 10,000 ohm).

To read this change, we need a bit less than 17 noise-free bits. This is doable with any decent modern ADC at appreciably fast rates, of hundreds of readings per second. Given that we are reading a ratio of two voltages, we do not even need a good voltage reference. Like, we can use almost anything for this. And, well, a cheap kitchen scale has a harder precision problem than our system will have! It needs something like all of its 24 bits to be noise free (which it can only do a few times a second) to reach the resolution typical of modern cheap digital kitchen scales.
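The divider arithmetic, as a quick sketch (component values taken as exactly nominal here):

```python
import math

r_ref = 10_000.0                                 # precision reference resistor
ratio = lambda r_ntc: r_ntc / (r_ntc + r_ref)    # node voltage as a fraction of V_ref

step = ratio(10_000.44) - ratio(10_000.00)       # thermistor moves ~44 ppm for 1 mK
print(step)                                      # ~1.1e-5 of V_ref per millikelvin
print(math.log2(1.0 / step))                     # ~16.5 -> a bit less than 17 noise-free bits
```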

The only big problem is, we know that the temperature changed by about 0.001 °C, but we do not know if it was 25 °C or say 24 °C to begin with (or 24.9 or 25.1 °C with a fancier thermistor that has better factory calibration).

That is where we need something other than the thermistor: something we can calibrate against a higher standard (classically the triple point of water) and then transfer to the reading range of our thermistor, which won't be very wide if we do. Now we are limited mostly by the calibration accuracy.
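One way to do that transfer (not the only one) is to fit a Steinhart-Hart curve to a few points measured against the better standard. A sketch with made-up calibration data:

```python
import numpy as np

# 1/T = a + b*ln(R) + c*ln(R)^3 fitted through three (R, T) calibration points.
# The points below are invented, but roughly consistent with a 10k, B ~ 3950 K thermistor.
cal = [(33_600.0, 273.16),   # near the triple point of water
       (10_000.0, 298.15),
       (6_505.0, 308.15)]

x = np.log([r for r, _ in cal])
a, b, c = np.linalg.solve(np.column_stack([np.ones(3), x, x**3]),
                          [1.0 / t for _, t in cal])

def temperature_K(r_ohm):
    lnr = np.log(r_ohm)
    return 1.0 / (a + b * lnr + c * lnr**3)

print(temperature_K(10_000.44) - temperature_K(10_000.00))  # ~ -0.001 K, the 1 mK step
```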

3

u/polongus 19h ago

You're making this sound a lot easier than it really is :)

Typical ADCs have a full scale accuracy (gain error) of ~ +/-1%.

It's also not straightforward to make a heat bath that's stable and uniform even within 0.1C, let alone 0.001C.

1

u/Adversement 15h ago

I must agree with you ... the "well not easy" undersells just how badly the accuracy can be lost at any individual step through the process.

Absolute accuracy of 0.001 °C requires some voodoo.

Though, the ADC gain error should not be one of the hard parts, unless one also wants a large measurement range around the calibration point. The analogue side is where things can go bad much more easily. And, of course, the calibration, which will be a nightmare.

Fortunately, most times one only wants to stabilise temperature to millikelvin stability. In which case the whole calibration issue becomes moot (this leaves just the still-not-trivial issue with the thermal stability of the readout, unless one also temperature stabilises the readout with a second lesser temperature sensor).

1

u/PLANETaXis 10h ago

If you are trying to make anything that is highly accurate, then you need to characterise every part of the instrument chain and compensate for it.

By that I mean you keep the temperature probe constant but subject the instrument hardware and electronics to a range of voltage, temperature, pressure and humidity conditions, and plot that on a chart. You can then back-calculate compensation values at any point. The "turtles all the way down" part is that you need to embed extra instruments within your instrument chain to measure the conditions and compensate.
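A toy version of that compensation step (all numbers invented): log the reading error against, say, the electronics' own board temperature during characterisation, then interpolate a correction at run time.

```python
import numpy as np

board_temp_C = np.array([10.0, 20.0, 30.0, 40.0, 50.0])       # conditions swept in the lab
reading_error_C = np.array([-0.08, -0.02, 0.00, 0.05, 0.11])  # bias measured at each point

def compensate(raw_reading_C, board_temp_now_C):
    bias = np.interp(board_temp_now_C, board_temp_C, reading_error_C)
    return raw_reading_C - bias

print(compensate(25.003, 42.0))  # subtract the interpolated bias for the current conditions
```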

1

u/polongus 35m ago

Yep, but that "temperature probe constant" is its own barrel of worms.

For example, you might think a pot of boiling water gives you a nice 100C reference. But in reality the temperature within will fluctuate by several C, and that's not even considering the effects of atmospheric pressure.

You can pay $10k for a deep well calibrator that will claim accuracy/stability to 0.01C, but if you're not careful, thermal conduction down your probe cables can easily throw that off.

4

u/nixiebunny 1d ago

Use a delta-sigma ADC with a well-calibrated thermistor and average many readings together to get high repeatability and resolution, by averaging out the noise. 
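A quick simulated illustration of the averaging (not real ADC data): the noise shrinks roughly with the square root of the number of readings averaged.

```python
import numpy as np

rng = np.random.default_rng(0)
true_code = 1234.5
readings = true_code + rng.normal(0.0, 2.0, size=(1000, 256))  # 2-LSB RMS noise, simulated

print(readings[:, 0].std())         # ~2.0 LSB for single readings
print(readings.mean(axis=1).std())  # ~0.125 LSB after averaging 256 readings (2/sqrt(256))
```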

5

u/Chagrinnish 1d ago

Why not just buy a delta-sigma-delta-sigma ADC? Then delta-sigma that a few times, and at the bottom put a turtle.

9

u/1310smf 1d ago

Typically moving from a resistive/voltage measurement to a frequency-based one (e.g. using quartz crystal temperature sensing elements.)

I mean, assuming the device is actually managing accuracy (the number reported is real), not just precision (the number reported has many digits to the right of the decimal point, but they might as well have come out of a bovine's hind orifice).

One example I found in a quick search, no affiliation or endorsement: https://statek.com/wp-content/uploads/2019/05/Temp-Sensor-10162-Rev-D.pdf

3

u/3X7r3m3 1d ago

Just throw in a 24 or 32b ADC, then realise how hard it is to not measure just noise beyond the second decimal place.

3

u/paulusgnome 1d ago

You will get better accuracy by using PT100 or PT1000 temp sensors, but you need a suitable interface. The sensor current must be accurately controlled, and the ADC must have enough bits for the required resolution.
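For reference, the standard conversion for platinum sensors above 0 °C is the Callendar-Van Dusen polynomial (coefficients per IEC 60751); the excitation current and ADC voltage below are just example values.

```python
import math

A, B = 3.9083e-3, -5.775e-7   # IEC 60751 coefficients, valid for 0..850 degC

def pt_temperature_C(r_measured, r0=100.0):
    """Invert R = R0 * (1 + A*T + B*T^2) for T."""
    return (-A + math.sqrt(A * A - 4 * B * (1.0 - r_measured / r0))) / (2 * B)

i_excite = 1e-3               # 1 mA sense current, kept small to limit self-heating
v_adc = 0.109735              # example voltage read across the PT100
print(pt_temperature_C(v_adc / i_excite))  # ~25.0 degC for ~109.73 ohm
```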

3

u/SmartLumens Power 1d ago

A platinum RTD that has its NIST calibration data, connected to a four-wire ohmmeter that has been calibrated with NIST-certified resistors.

What temp range are you working with?

4

u/blue_eyes_pro_dragon 1d ago

Sensitivity and accuracy are not the same.

Just because it can tell that the temperature went up 0.1°C does not mean it can tell you it's 22.5°C.

So keep in mind that these specs are highly specific, and depend on application.

Some Sensirion temperature sensors do 0.1°C accuracy* but that's because they do calibration. And they typically have a graph showing how far off it can get depending on temperature.

Another point to add: at those levels it's extremely difficult to separate out self-heating. Even a tiny amount of power dissipation (milliwatts on an MCU, for example) is enough to skew the sensor if it's on the same PCB.

(The problem is that heat transfer is proportional to temperature difference, so if you have a small temperature difference you have small heat transfer.)
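To put a rough number on that, for a 3.3 V supply driving a 10k/10k divider (illustrative values; the ~1.5 mW/K dissipation constant is a typical still-air datasheet figure, assumed here):

```python
v_ref = 3.3                        # divider excitation voltage
r_ntc = r_ref = 10_000.0           # example 10k/10k divider

i = v_ref / (r_ntc + r_ref)        # ~165 uA through the divider
p_ntc = i ** 2 * r_ntc             # ~0.27 mW dissipated in the thermistor itself
dissipation_W_per_K = 1.5e-3       # thermistor-to-ambient thermal conductance (assumed)
print(p_ntc / dissipation_W_per_K) # ~0.18 K of self-heating error in still air
```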

1

u/Rough_Treat_644 1d ago

Wouldn't a high-value resistor limit the self-heating effect by keeping the current to a minimum? And furthermore, if I want to measure a specific point of interest, wouldn't I try to keep the device as far from the PCB as possible?

1

u/Rough_Treat_644 1d ago

And by device I mean the thermistor itself

1

u/PLANETaXis 10h ago

Yes, these are all good ideas, but there are trade-offs.

You can't measure resistance directly; generally the thermistor is used in a voltage divider, which is then read by an ADC. Using a larger resistance as the detecting element means that the internal/effective input resistance of the ADC becomes more significant and can influence the results. It also makes the signal more susceptible to noise.
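Rough numbers for that loading effect (the 10 Mohm ADC input resistance is an assumed, typical-ish figure):

```python
def divider_ratio(r_top, r_bottom, r_adc_in=None):
    if r_adc_in is not None:                                     # ADC input loads the lower leg
        r_bottom = r_bottom * r_adc_in / (r_bottom + r_adc_in)
    return r_bottom / (r_top + r_bottom)

r_adc = 10e6                                                     # assumed ADC input resistance
for r in (10e3, 100e3, 1e6):
    err = divider_ratio(r, r) - divider_ratio(r, r, r_adc)
    print(f"{r:>9.0f} ohm legs: ratio error {err * 1e6:.0f} ppm")
```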

1

u/kangadac 5h ago

Keep in mind that the Johnson noise from the resistor goes up as the resistance goes up (V ~ sqrt R). You get some benefit by increasing the resistance/lowering the current but only up to a point.
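For a sense of scale, v_rms = sqrt(4*k*T*R*bandwidth); the 10 Hz bandwidth below is just an assumed, heavily filtered reading rate.

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 298.15                  # kelvin
bandwidth = 10.0            # Hz, assumed measurement bandwidth

for r in (10e3, 100e3, 1e6):
    v_rms = math.sqrt(4 * k_B * T * r * bandwidth)   # Johnson noise, grows as sqrt(R)
    print(f"{r:>9.0f} ohm: {v_rms * 1e9:.0f} nV rms")
```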

2

u/snp-ca 1d ago

Use an RTD.

2

u/Mal-De-Terre 1d ago

I've got some upsetting news for you. In many settings, the temperature may not be all that uniformly distributed, which decidedly complicates accurate measurements.

1

u/edman007 1d ago

Why would the DAQ be a limit? I haven't exactly designed one, but I would think you'd have an op-amp or something to bump up the sensitivity.

The real trick is to put all the things that affect accuracy inside an oven, so you operate the circuit under +/- 5 degree conditions, and then calibrate it. I don't think 0.1C would be all that difficult, but calibrating it to that level probably takes some work.

1

u/PLANETaXis 10h ago edited 10h ago

You're probably confusing accuracy vs precision.

It is easy to make a temperature probe that is highly precise, e.g. one that measures changes of 0.0001 degC. The change in characteristics is continuous, and you can detect extremely small changes in resistance or voltage with a high-gain operational amplifier or a high-bit-count ADC.

What is hard is being accurate - e.g. is it truly 20.0000degC or 20.0001degC. This would require characterising the response of the detecting element and all of the wiring/hardware along the instrument chain, so that you can compensate for all of the biases and influences.

0

u/CompetitiveGuess7642 1d ago

I think you use a thermocouple when you want accuracy; at least that's how I measure my stove pipe, and with a proper ADC I have like 1/4 Celsius resolution from 0 to 700C or something.

3

u/1310smf 1d ago edited 1d ago

Thermocouples are terribly useful, and widely used, especially for wide-range measurements, but not all that accurate:

https://www.thermocoupleinfo.com/thermocouple-accuracies.htm

2

u/Rough_Treat_644 1d ago

1/4 = 0.25°C