The invention of dual-slope integration and the ready availability of low-cost digital electronics saw the spread of the precision digital voltmeter (DVM) from the laboratory to the shop floor. But was it possible for the user to achieve even greater accuracy in measurement?
Marconi Instruments, circa 1980
Digital voltmeters with six-decade precision were starting to be fitted to the large Automatic Test Equipment (ATE) used for testing military hardware. It was the nature of the business that these ATE systems could spend a long time in the manufacturer's works, as both the test programs and the military equipment were under development in parallel. As a result, both the Test and Commissioning Engineers became very familiar with the ATE and with the dependability of the various sub-systems of which it was composed. The relatively new precision DVMs were considered to be especially reliable, so it came as a surprise when, all of a sudden, voltage measurements never seemed to be 'right'.
Of course, the first thought was that the 'wrong' readings were genuine, or that build faults of the sort one finds during commissioning were to blame. But then it was found that previously highly repeatable tests were now in error. Even 'simple' test stimuli such as an ordinary dry cell (a cheap but very reliable voltage 'standard') were producing the 'wrong' result. To top it all, once the front panels were removed from the ATE, the DVM's own built-in display confirmed that test voltages were still being measured accurately. So what had gone wrong?
It turned out that an engineer with informal access to one of the software team had decided that he could improve on the performance of the DVM as supplied by making it take ten readings for every one called for by the test program, then calculating and using the average instead. Essentially, a small software loop was added to the device driver that took a reading and added it to the voltage variable. This was executed ten times and the variable divided by ten to produce the final result. What they forgot was that the variable already held the initial DVM reading, so effectively eleven readings were added together and then divided by ten. Six digits of precision on a result that was intrinsically ten per cent in error!
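The bug described above can be sketched in a few lines of Python (the function and variable names are illustrative, not taken from the original device driver):

```python
# A minimal sketch of the averaging bug: the accumulator variable
# already holds an initial reading before the ten-iteration loop runs.

def buggy_average(dvm_read):
    voltage = dvm_read()        # variable starts with one reading already in it
    for _ in range(10):         # loop adds ten more readings
        voltage += dvm_read()
    return voltage / 10         # eleven readings divided by ten

def correct_average(dvm_read):
    voltage = 0.0               # accumulator starts empty
    for _ in range(10):
        voltage += dvm_read()
    return voltage / 10

# A steady 1.5 V 'dry cell' as the test stimulus:
cell = lambda: 1.5
print(buggy_average(cell))      # 1.65 -> six digits of precision, 10% high
print(correct_average(cell))    # 1.5
```

With a perfectly steady source, every 'averaged' result comes out exactly ten per cent high, which matches the repeatable-but-wrong behaviour the commissioning engineers observed.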
The pair were given a 'rap over the knuckles' and told to use the formal change procedure in future. The other really foolish aspect of this case was that the DVM already incorporated averaging, in the guise of various selectable 'filter' modes.