
Chapter 3: Testing Performance

To test time measurement accuracy

Figure 3-12: Measurement Settings for Time Interval Measurement

For valid statistical data

In equivalent time mode, measurement specifications are valid only when sixteen or more acquisitions are averaged. Statistics accumulated before the required number of acquisitions has been averaged may show the instrument failing the specification. This is particularly true for the minimum and maximum values, since they are set by the measurements taken with the fewest averages.

If the procedure above is followed exactly, the required number of acquisitions is averaged before statistics are turned on. If you clear and restart the measurements, however, averaging and statistics restart simultaneously, and the statistics will include erroneous data collected from the early averages.

If in doubt about the statistical data, wait until #Avg is complete, then select Clear Measurements or Clear All from the Measure menu tool bar and select the custom measurement again. This restarts the statistics without restarting averaging, so the resulting statistical data is valid.
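If the oscilloscope is driven remotely instead of from the front panel, the same ordering applies: let the averages accumulate first, then clear and re-select the measurement so the statistics restart over fully averaged data. The sketch below is a minimal illustration of that ordering, assuming PyVISA for the connection and Infiniium-style SCPI commands; the VISA address and every command string are assumptions and should be verified against the programmer's guide for these models.

    # Minimal sketch (assumed commands): clear statistics only after averaging completes.
    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    scope = rm.open_resource("GPIB0::7::INSTR")  # assumed VISA address

    scope.write(":ACQuire:AVERage ON")           # assumed command: enable averaging
    scope.write(":ACQuire:COUNt 16")             # assumed command: average sixteen acquisitions

    time.sleep(5)                                # placeholder wait for #Avg to reach 16

    scope.write(":MEASure:CLEar")                # assumed command: Clear Measurements / Clear All
    # Re-select the delta time measurement here so statistics restart without restarting averaging.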

15 Verify that the period is 25 ns ± 44 ps (minimum 24.956 ns, maximum 25.044 ns). Record the minimum and maximum readings in the Performance Test Record.

16 Change the signal generator frequency to 100 MHz (10 ns period).

17 Select Horizontal from the Setup menu. Set the position to –11 ns.

18 Clear the measurement statistics.

Do this by clicking Clear Meas (Clear All) on the measurement toolbar, then selecting Delta time from the Time submenu of the Measure menu.

19 The delta time reading should be 10 ns ± 43 ps (minimum 9.957 ns, maximum 10.043 ns). Record the minimum and maximum readings in the Performance Test Record.

20 Change the signal generator frequency to 20 MHz (50 ns period).

21 Select Horizontal from the Setup menu. Set the scale to 100 ns/div and the position to –11 ns.

22 Clear the measurement statistics as in step 18 and restart the measurement.

23 The delta time reading should be 50 ns ± 283 ps (minimum 49.72 ns, maximum 50.28 ns). Record the minimum and maximum readings in the Performance Test Record. (The limit arithmetic for steps 15, 19, and 23 is sketched after step 25.)

24 Change the signal generator frequency to 1 MHz (1 µs period).

25 Select Horizontal from the Setup menu. Set the scale to 1 µs/div and the position to –11 ns.
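The pass/fail limits recorded in steps 15, 19, and 23 are simply the nominal reading plus or minus the stated accuracy. The helper below is a hypothetical illustration of that arithmetic, not part of the test procedure:

    # Hypothetical helper: derive the minimum/maximum limits used in steps 15, 19, and 23.
    def limits(nominal_s: float, tolerance_s: float) -> tuple[float, float]:
        """Return the (minimum, maximum) allowed readings, in seconds, for nominal ± tolerance."""
        return nominal_s - tolerance_s, nominal_s + tolerance_s

    print(limits(25e-9, 44e-12))   # step 15: 25 ns ± 44 ps  -> 24.956 ns, 25.044 ns
    print(limits(10e-9, 43e-12))   # step 19: 10 ns ± 43 ps  -> 9.957 ns, 10.043 ns
    print(limits(50e-9, 283e-12))  # step 23: 50 ns ± 283 ps -> 49.717 ns, 50.283 ns (recorded as 49.72 ns and 50.28 ns)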
