Taking a temperature – how hard is it, really?

January 15, 2020

On Tuesday, we had a paper out on the arXiv: https://arxiv.org/abs/2001.04096. This one is about ‘taking the temperature’ of quantum systems. In particular, we ask how precisely it is possible to measure temperature when it is very low and the thermometer is imperfect.

Everyday thermometers – like the ones you use to check if your kids have a fever or whether your beard is likely to grow icicles on your way to work – have a precision of about 0.1°C. At everyday temperatures, that is. If things get much warmer or much colder, these thermometers tend to be less precise. As you might imagine, a medical thermometer is no good for measuring inside a 1000 °C melting furnace (even if it wouldn’t melt itself…) or at cryogenic temperatures in a vacuum chamber, as used in many physics experiments. Of course, at these extreme temperatures, no one sane would use a regular medical thermometer. Instead, specialised instruments adapted to the temperature range are developed. Any sensor – like a thermometer – is only good within a certain range, and one should clearly use one designed for the range one is interested in. But even so, there may be restrictions on how good the precision can be. Maybe it is just fundamentally harder to measure at very cold or very hot temperatures?

In this paper, we look at such fundamental restrictions when the temperature tends to absolute zero (that is, as cold as it gets). How hard is it to measure cold temperatures?

One answer, which was already well known, is that for quantum systems it is very hard, in the following sense. In quantum systems, the energy is – well, quantised. That is, different configurations of the system have energies that differ by a nonzero amount. In particular, it takes some amount of energy to bring the system from the configuration with the least energy (usually called the ground state) to the configuration with the second lowest energy (usually called the first excited state). There is a gap in energy between these two states, and we say that the system is gapped. When the temperature is so low that the corresponding thermal energy becomes smaller than this gap, the precision of any temperature measurement starts to degrade very quickly. The reason is that the temperature information is carried by the probability of finding the system excited above its ground state, and that probability is exponentially suppressed once the thermal energy drops below the gap. So the precision goes bad exponentially quickly as the temperature decreases.
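To make this concrete, here is a minimal numerical sketch – my own illustration of the well-known exponential scaling, not the analysis in the paper. It takes the simplest gapped system, a two-level system with gap Δ, and computes the best-case relative error of a temperature estimate from a single population measurement, via the Fisher information and the Cramér–Rao bound. The unit choice k_B = Δ = 1 is just for the sketch.

```python
import numpy as np

# Sketch: best-case temperature sensitivity of a two-level (gapped)
# thermometer. Units: Boltzmann constant k_B = 1 and gap Delta = 1
# (both choices are purely for illustration).

def excited_population(T, gap=1.0):
    """Thermal occupation of the excited state of a two-level system."""
    return np.exp(-gap / T) / (1.0 + np.exp(-gap / T))

def fisher_information(T, gap=1.0):
    """Fisher information about T from measuring the excited-state
    population: F(T) = (dp/dT)^2 / (p(1-p)) for a two-outcome measurement."""
    p = excited_population(T, gap)
    dp_dT = (gap / T**2) * p * (1.0 - p)  # derivative of the thermal occupation
    return dp_dT**2 / (p * (1.0 - p))

# Cramer-Rao bound: any unbiased estimate of T from a single measurement
# has standard deviation dT >= 1/sqrt(F(T)).
for T in [0.5, 0.2, 0.1, 0.05]:
    rel_err = 1.0 / (T * np.sqrt(fisher_information(T)))
    print(f"T = {T:4.2f}  ->  relative error dT/T >= {rel_err:10.3e}")
```

Running this, the bound on the relative error grows from order one at k_BT = Δ/2 to around 10³ at k_BT = Δ/20 – the exponential blow-up described above, roughly (T/Δ)·exp(Δ/2k_BT) at low temperature.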

This limits how well any thermometer can do when there are no other constraints. However, the gap is not always the relevant energy scale. In many cases, the individual energy levels of the system cannot be resolved by measurements accessible in practice. For example, the levels grow closer and closer together as the system becomes bigger, so for large systems we may not have access to any measurement that can distinguish them. The size of our thermometer also limits its energy resolution. In these situations, the relevant limit on precision is not set by the size of the gap. Nevertheless, the temperature may still be cold relative to other energy scales of interest, and we may well ask how the precision then behaves with temperature.
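To get a feel for how quickly the levels crowd together, here is one more small sketch – again my own illustration, using the textbook particle-in-a-box as a stand-in for ‘the system’. Its levels are E_n = n²π²ħ²/(2mL²), so the lowest gap shrinks as 1/L² as the box grows.

```python
import numpy as np

# Sketch: how fast energy levels crowd together as a system grows.
# Example system (my choice): a single electron in a 1D box of length L,
# with levels E_n = n^2 * pi^2 * hbar^2 / (2 m L^2).

hbar = 1.0545718e-34   # reduced Planck constant, J s
k_B = 1.380649e-23     # Boltzmann constant, J/K
m = 9.109e-31          # electron mass, kg

def lowest_gap(L):
    """Gap E_2 - E_1 (in joules) for a particle in a box of length L (m)."""
    E1 = (np.pi * hbar)**2 / (2.0 * m * L**2)
    return 3.0 * E1  # E_2 - E_1 = (4 - 1) * E_1

for L in [1e-9, 1e-6, 1e-3]:  # nanometre, micrometre, millimetre box
    gap = lowest_gap(L)
    print(f"L = {L:7.0e} m  ->  gap = {gap:9.3e} J  "
          f"(~ {gap / k_B:9.3e} K in temperature units)")
```

Already for a micrometre-sized box the gap corresponds to roughly 10 millikelvin, and for a millimetre-sized box to roughly 10 nanokelvin – far below the resolution of any realistic measurement, even though the temperatures of interest may be colder still.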

This is the question we answer in this paper. In previous work, together with Patrick Potts and Nicolas Brunner, we developed a mathematical framework able to deal with temperature measurements that have a finite energy resolution. However, at that time we were not able to determine the ultimate precision at low temperatures. In the new paper, by identifying a new criterion for finite resolution and extending our previous framework, Mathias Rønnow Jørgensen was able to derive a tight bound on how the best precision behaves with temperature. Quite surprisingly, the precision can actually get better as the temperature decreases, for systems with an energy spectrum that is just right.

Published paper: https://journals.aps.org/prresearch/abstract/10.1103/PhysRevResearch.2.033394