Section 5 – Signal Management
5.1 Channel Control and Expansion
In the
5.2 Scan Time and Resolution
5.2.1 Scan Time
Scan Time (per scan) – The amount of time used to scan (sample) all selected channels’ input signals. For the
5.2.2 Resolution (Effective Number of Bits – ENOB, RMS)
Resolution (Effective Number of Bits – ENOB, RMS) – The number of reliable data bits available for a signal’s measurement. The greater the resolution, the more detailed the reading; for example, with increased resolution a reading of 5.12 V could become 5.11896 V. The DAQ actually provides 24 bits of data; however, the accuracy of the least significant bits decreases as the scan time shortens.
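To make the effect of extra bits concrete, the short sketch below computes the smallest voltage step a given effective bit count can resolve. The ±10 V input span is an assumed value chosen only for illustration (this section does not state the channel’s range); substitute the actual input range in use.

# Illustration only: step size (LSB) for a given effective resolution.
# The +/-10 V span below is an assumed value, not taken from this manual.

def lsb_size(span_volts: float, effective_bits: float) -> float:
    """Smallest distinguishable voltage step for the given effective bit count."""
    return span_volts / (2 ** effective_bits)

span = 20.0  # assumed -10 V to +10 V input span

print(f"17.0 bits: {lsb_size(span, 17.0) * 1e6:.1f} uV per step")   # ~152.6 uV
print(f"22.5 bits: {lsb_size(span, 22.5) * 1e6:.3f} uV per step")   # ~3.372 uV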
At a scan time greater than 6.5 seconds, the last 1.5 bits are considered unreliable, resulting in a resolution of 22.5 bits. At a very fast scan time (1 millisecond), the seven least significant bits are unreliable, resulting in 17-bit accuracy.
When you select the Scan Time, you also determine the scan (sample) rate and resolution for the applicable channel. For the DAQ’s analog input applications, scan (sample) rates range from 0.0002778 samples/sec up to 1000 samples/sec, and the corresponding resolution ranges from 22.5 to 17 bits.
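As a rough illustration, the sketch below assumes the sample rate is the reciprocal of the scan time, which is consistent with the endpoints quoted above (a 1 ms scan time corresponds to 1000 samples/sec, and 0.0002778 samples/sec corresponds to a scan time of about 3600 s), and derives the effective bits by subtracting the unreliable bits from the 24-bit converter. Behaviour between these endpoints is device-specific and not modelled here.

# Minimal sketch of the scan-time / sample-rate relationship described above.
# Only the endpoint figures quoted in the text are reproduced.

def sample_rate(scan_time_s: float) -> float:
    """Sample rate in samples/sec for a given scan time (per scan) in seconds."""
    return 1.0 / scan_time_s

# Fastest quoted setting: 1 ms scan time -> 1000 samples/sec, 24 - 7 = 17 bits.
print(f"{sample_rate(0.001):.0f} samples/sec, {24 - 7} bits")

# Slowest quoted setting: 3600 s scan time (reciprocal of 0.0002778 samples/sec)
# -> 24 - 1.5 = 22.5 bits.
print(f"{sample_rate(3600.0):.7f} samples/sec, {24 - 1.5} bits")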