Chapter 9 Specifications

Interpreting Internal DMM Specifications

Resolution

Resolution is the numeric ratio of the maximum displayed value divided by the minimum displayed value on a selected range. Resolution is often expressed in percent, parts-per-million (ppm), counts, or bits.

For example, a 6½-digit multimeter with 20% overrange capability can display a measurement with up to 1,200,000 counts of resolution.

This corresponds to about 0.0001% (1 ppm) of full scale, or 21 bits including the sign bit. All four specifications are equivalent.
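
As a sanity check, the short Python sketch below converts the 1,200,000-count figure into its percent, ppm, and bit equivalents; the arithmetic, not the tool, is the point:

    import math

    counts = 1_200_000               # 6½ digits with 20% overrange
    fraction = 1 / counts            # smallest resolvable fraction of full scale

    print(f"{fraction * 100:.5f} %")    # ~0.00008 %, i.e. about 0.0001 %
    print(f"{fraction * 1e6:.2f} ppm")  # ~0.83 ppm, i.e. about 1 ppm

    # Counting both polarities, the display spans about 2 x 1,200,000 states,
    # so it takes roughly log2(2,400,000) ≈ 21.2 bits, sign bit included.
    print(f"{math.log2(2 * counts):.1f} bits")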

Accuracy

Accuracy is a measure of the “exactness” to which the internal DMM’s measurement uncertainty can be determined relative to the calibration reference used. Absolute accuracy includes the internal DMM’s relative accuracy specification plus the known error of the calibration reference relative to national standards (such as the U.S. National Institute of Standards and Technology). To be meaningful, accuracy specifications must be accompanied by the conditions under which they are valid. These conditions should include temperature, humidity, and time.
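
As an illustration of how the two terms combine, the sketch below sums a relative accuracy specification and a calibration-reference error into an absolute uncertainty. The numeric values are hypothetical placeholders, not HP 34970A specifications; substitute the figures from the tables in this chapter and from your calibration certificate:

    # Hypothetical values for illustration only -- not HP 34970A specifications.
    relative_ppm = 20.0    # assumed DMM relative accuracy, ppm of reading
    reference_ppm = 5.0    # assumed calibration-reference error vs. national standards

    reading = 10.0         # volts
    total_ppm = relative_ppm + reference_ppm
    uncertainty_volts = reading * total_ppm / 1e6

    print(f"absolute uncertainty at {reading} V: "
          f"±{uncertainty_volts * 1e6:.0f} µV ({total_ppm} ppm of reading)")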

There is no standard convention among instrument manufacturers for the confidence limits at which specifications are set. The table below shows the probability of non-conformance for each specification criterion, assuming normally distributed errors.

Specification Criteria      Probability of Failure
----------------------      ----------------------
Mean ± 2 sigma              4.5%
Mean ± 3 sigma              0.3%

 

 

Variations in performance from reading to reading, and from instrument to instrument, decrease as the number of sigma used to set a given specification increases. In other words, for the same published accuracy number, a specification set at a higher sigma delivers greater actual measurement precision. The HP 34970A is designed and tested to meet performance better than mean ± 3 sigma of the published accuracy specifications.
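
The failure probabilities in the table above follow directly from the normal distribution; the minimal sketch below reproduces them, assuming normally distributed errors:

    from statistics import NormalDist

    nd = NormalDist()  # standard normal distribution
    for k in (2, 3):
        # Two-sided probability of falling outside mean ± k sigma
        p_fail = 2 * (1 - nd.cdf(k))
        print(f"mean ± {k} sigma: {p_fail:.2%} probability of non-conformance")

    # Prints ~4.55% and ~0.27%, consistent with the 4.5% and 0.3% in the table.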
