Resolution
Resolution is the numeric ratio of the maximum displayed value divided
by the minimum displayed value on a selected range. Resolution is
often expressed in percent, parts-per-million (ppm), counts, or bits.
For example, a 6½-digit multimeter with 20% overrange capability can
display a measurement with up to 1,200,000 counts of resolution.
This corresponds to about 0.0001% (1 ppm) of full scale, or 21 bits
including the sign bit. All four specifications are equivalent.
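These equivalences follow from simple arithmetic. The short Python sketch below reproduces them for the 1,200,000-count example above; the rounding conventions in the comments match the figures quoted in the text.

```python
import math

# Resolution equivalences for a 6.5-digit multimeter with 20% overrange
# (1,200,000 counts of resolution, as in the example above).
counts = 1_200_000

ppm = 1e6 / counts            # parts-per-million of full scale
percent = 100 / counts        # percent of full scale
bits = math.log2(counts)      # bits of magnitude, before the sign bit

print(f"{ppm:.2f} ppm of full scale")      # ~0.83 ppm   -> "about 1 ppm"
print(f"{percent:.5f} % of full scale")    # ~0.00008 %  -> "about 0.0001 %"
print(f"{bits:.1f} bits + sign bit")       # ~20.2 bits + sign -> "about 21 bits"
```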
Accuracy
Accuracy is a measure of the “exactness” to which the multimeter’s
measurement uncertainty can be determined relative to the calibration
reference used. Absolute accuracy includes the multimeter’s relative
accuracy specification plus the known error of the calibration reference
relative to national standards (such as the U.S. National Institute of
Standards and Technology). To be meaningful, the accuracy specifications
must be accompanied by the conditions under which they are valid.
These conditions should include temperature, humidity, and time.
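As an illustration of how these terms combine, the sketch below sums a hypothetical relative accuracy specification (expressed as a percentage of reading plus a percentage of range) with an assumed calibration-reference uncertainty. The numbers are placeholders for illustration only, not 34401A specifications.

```python
# Hypothetical illustration of relative vs. absolute accuracy.
reading = 5.0                      # measured value, in volts
range_full_scale = 10.0            # selected range, in volts

# Assumed relative accuracy spec: (% of reading + % of range)
pct_of_reading = 0.0035 / 100
pct_of_range = 0.0005 / 100
relative_error = reading * pct_of_reading + range_full_scale * pct_of_range

# Assumed uncertainty of the calibration reference relative to
# national standards (2 ppm of reading).
reference_error = 2e-6 * reading

absolute_error = relative_error + reference_error
print(f"relative: ±{relative_error * 1e6:.0f} µV, "
      f"absolute: ±{absolute_error * 1e6:.0f} µV")
```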
There is no standard convention among multimeter manufacturers for
the confidence limits at which specifications are set. The table below
shows the probability of non-conformance for each specification with the
given assumptions.
Variations in performance from reading to reading, and from instrument to
instrument, decrease as the number of sigma used for a given specification
increases. This means that you can achieve greater actual measurement
precision for the same published accuracy specification.
The Agilent 34401A is designed and tested to meet performance better
than mean ±4 sigma of the published accuracy specifications.
Specification Criteria     Probability of Failure
Mean ± 2 sigma             4.5%
Mean ± 3 sigma             0.3%
Mean ± 4 sigma             0.006%
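The failure probabilities in the table are the two-sided tail areas of a normal distribution. A minimal Python check, assuming normally distributed measurement errors, reproduces them:

```python
from math import erf, sqrt

# Probability that a normally distributed error falls outside ±k sigma,
# matching the "Probability of Failure" column above.
for k in (2, 3, 4):
    p_outside = 1 - erf(k / sqrt(2))   # two-sided tail probability
    print(f"mean ± {k} sigma: {p_outside * 100:.3g} % probability of failure")
```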