Chapter 2
Performance Verification Tests

1. 10 MHz Reference Output Accuracy

The settability is measured by changing the settings of the
digital-to-analog converter (DAC), which controls the frequency of the
timebase. The change in frequency for each DAC step is calculated
and compared to the specification.
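The per-step comparison described above can be sketched as follows. This is a minimal illustration only; the counter readings and the per-step limit are hypothetical values, not figures from this manual or the analyzer's specification.

```python
# Sketch of the settability check: frequency change per DAC step.
# All readings and the limit below are hypothetical illustration values.

def settability_deltas(readings_hz):
    """Return the frequency change between consecutive DAC settings."""
    return [b - a for a, b in zip(readings_hz, readings_hz[1:])]

# Hypothetical counter readings (Hz) at successive DAC settings.
readings = [9_999_998.2, 9_999_999.1, 10_000_000.0, 10_000_000.9]

MAX_STEP_HZ = 2.0  # hypothetical per-step limit; use the published spec
for i, delta in enumerate(settability_deltas(readings)):
    status = "PASS" if abs(delta) <= MAX_STEP_HZ else "FAIL"
    print(f"DAC step {i}: delta = {delta:+.1f} Hz  {status}")
```

Each delta is simply the difference between adjacent readings, so an out-of-spec step shows up as a single oversized entry rather than being averaged away across the sweep.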
The related adjustment for this performance verification test is the
“10 MHz Reference Frequency Adjustment.”
Equipment Required
Universal counter (Instructions are for Agilent 53132A. For Agilent
5316B, refer to its user documentation.)
Frequency standard
Cable, BNC, 122-cm (48-in) (2 required)
Figure 2-1 10 MHz Reference Test Setup
Procedure
1. Connect the equipment as shown in Figure 2-1. The frequency
standard provides the reference for the universal counter.
2. Check that the analyzer is not in external reference mode. If
Ext Ref appears on the screen, the analyzer is in external reference
mode; disconnect the external reference.
3. Ensure that the analyzer has been on and in internal frequency mode
for at least five minutes before proceeding.
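Once the analyzer has warmed up and the counter reading is stable, the reference error is conventionally expressed relative to the 10 MHz nominal. The arithmetic can be sketched as below; the measured value shown is hypothetical and not taken from this procedure.

```python
# Sketch of converting a counter reading into a fractional frequency error.
NOMINAL_HZ = 10_000_000.0  # 10 MHz nominal reference frequency

def error_ppm(measured_hz, nominal_hz=NOMINAL_HZ):
    """Fractional frequency error in parts per million."""
    return (measured_hz - nominal_hz) / nominal_hz * 1e6

measured = 10_000_001.5  # hypothetical counter reading, Hz
print(f"Reference error: {error_ppm(measured):+.3f} ppm")
```

Working in ppm makes the result directly comparable to the aging and temperature-stability figures that timebase specifications are usually quoted in.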
4. Set the universal counter controls as follows:
a. Press Gate & ExtArm.
b. Press any one of the arrow keys until TIME is displayed.
c. Press Gate & ExtArm again. Using the arrow keys, set the time to