AN INTERESTING ITEM in an old post at Watts Up With That (the blog by Anthony Watts, a meteorologist who has been at the forefront of debunking the CAGW fraud).
“Unless temperature sensors are regularly calibrated I think it is unreasonable to expect accuracy of greater than a couple of degrees,” writes Lon Glazner, a blogger and electronics engineer with some special expertise in temperature measurement instruments.
Using the clock analogy, if you and your friend’s watches are off by 1 minute how do you know which one is correct? You might call “Time” … which is essentially calibrating your clock. That might make both watches accurate today, but which one is more accurate in 10 years? Now what if you wanted to make sure your watch had been accurate 50 years ago? How would you calibrate your watch to be accurate to half a second if you knew calling “Time” was accurate, but only had a resolution of 10 seconds?
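Glazner's point about calibration resolution can be sketched in a few lines of Python. This is only an illustration of his analogy, not anything from the post itself: it assumes a "Time" reference that reports true time rounded to 10-second ticks, and shows that calibrating against it leaves a residual error of up to half a tick no matter how good your watch is.

```python
# Illustrative sketch of Glazner's analogy (numbers are mine, not his):
# a calibration reference quantized to coarse ticks cannot correct
# offsets smaller than half a tick.

def calibrate(watch_offset_s: float, resolution_s: float = 10.0) -> float:
    """Return the residual offset (seconds) after calibrating against a
    reference whose readings are quantized to resolution_s seconds."""
    # The reference reports true time rounded to the nearest tick, so the
    # correction we can apply is itself a multiple of resolution_s.
    correction = round(watch_offset_s / resolution_s) * resolution_s
    return watch_offset_s - correction

# A watch running 4 s fast cannot be corrected at all by a 10 s reference,
# and a watch 14 s fast is only corrected back to the same 4 s residual.
print(calibrate(4.0))
print(calibrate(14.0))
```

The point carries over directly to thermometers: calibrating against a reference that only resolves whole degrees cannot, by itself, deliver tenth-of-a-degree accuracy.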
–Glazner in comments to the linked post
In comments to Our Curmudgeon’s post here, it appears I may have been too charitable in attributing ±0.5°C accuracy to the thermometers used at USHCN surface stations. Glazner’s statement doesn’t make clear whether he means ±2° or a total range of 2° (±1°), but either way it lends little authority to results that claim to derive, from the record of such instruments, a delta of 0.2°C per century. (Or more.)
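To put rough numbers on the mismatch, here is a back-of-the-envelope sketch (my own assumptions throughout, not anything from the post): suppose a single station takes one reading per year for a century, with purely random, independent instrument errors of ±1°C standard deviation. The standard error of a least-squares trend fitted to those readings follows from the usual OLS slope formula.

```python
import math

# Back-of-the-envelope sketch (assumptions mine): how much trend
# uncertainty does *random* instrument error alone contribute to a
# century-long record from one station?

def trend_std_error(n_years: int, sigma_c: float) -> float:
    """Standard error (°C per year) of an OLS slope fitted to n_years
    annual readings carrying independent noise of std dev sigma_c."""
    # For x = 0, 1, ..., n-1, sum((x - mean(x))**2) = n*(n**2 - 1)/12.
    sxx = n_years * (n_years**2 - 1) / 12
    return sigma_c / math.sqrt(sxx)

per_century = trend_std_error(100, 1.0) * 100
print(f"trend std error: {per_century:.2f} °C per century")
```

Under these assumptions the random component alone works out to roughly 0.35°C per century, already larger than the claimed 0.2°C signal; and this covers only errors that average out. Systematic drift and calibration bias, which are Glazner's actual concern, do not cancel with more readings and can be far larger.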