Verification and Calibration - B

Voltage Programming and Measurement Accuracy

This test verifies the voltage programming, GPIB measurement, and front panel meter functions. Values read back over the GPIB should be the same as those displayed on the front panel.

Figure B-1 shows the setup. Measure the ac output voltage directly at the output terminals. If you are verifying a three-phase source, start by verifying output phase 1. An illustrative GPIB programming sketch follows the procedure.

Action and normal result for each step:

1. Make sure the ac source is turned off. Connect the DVM and ratio transformer as shown in the test setup in Figure B-1.

2. Turn on the ac source with no load. In the Output menu, execute the *RST command to reset the unit to its factory default state.

3. Program the output voltage to 150 volts and set the output current limit to its maximum value.
   Normal result: CV annunciator on. Output voltage near 0. Output current near 0.

4. Enable the output by pressing Output On/Off.
   Normal result: Output voltage near 150 V.

5. Record the voltage readings at the DVM¹ and on the front panel display.
   Normal result: Readings within the low voltage limits specified in table B-2.

6. Program the output voltage to 300 volts.
   Normal result: Output voltage near 300 V.

7. Record the voltage readings at the DVM¹ and on the front panel display.
   Normal result: Readings within the high voltage limits specified in table B-2.

8. If you are verifying a 3-phase source, repeat steps 1 through 7 for phases 2 and 3. Press Phase Select to select the next phase.
   Normal result: Readings within the specified High range limits (300 V/1 kHz).

¹ Multiply the DVM reading by the transformer ratio if a ratio transformer is used.
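The GPIB portion of steps 2 through 7 can also be scripted. The sketch below is an illustration only, not part of the printed procedure: it assumes the ac source is reachable through PyVISA at a placeholder GPIB address and that it accepts the standard SCPI forms *RST, CURR, VOLT, OUTP, and MEAS:VOLT:AC?; confirm the exact command syntax against the programming guide for your model.

```python
# Illustrative automation sketch only -- not part of the printed procedure.
# The GPIB address and SCPI command forms below are assumptions; verify them
# against the programming guide for your ac source.
import pyvisa

rm = pyvisa.ResourceManager()
src = rm.open_resource("GPIB0::5::INSTR")   # hypothetical GPIB address

src.write("*RST")        # step 2: reset to the factory default state
src.write("CURR MAX")    # step 3: current limit to its maximum value
                         #         (MAX is a common SCPI shorthand; confirm it)
src.write("VOLT 150")    # step 3: program the output voltage to 150 V
src.write("OUTP ON")     # step 4: enable the output

low = float(src.query("MEAS:VOLT:AC?"))    # step 5: GPIB voltage readback
print(f"150 V point: GPIB readback = {low:.3f} V")

src.write("VOLT 300")                      # step 6: program 300 V
high = float(src.query("MEAS:VOLT:AC?"))   # step 7: GPIB voltage readback
print(f"300 V point: GPIB readback = {high:.3f} V")

src.write("OUTP OFF")
```

Compare each GPIB readback with the front panel display and with the external DVM reading (multiplied by the transformer ratio, if a ratio transformer is used) against the limits in table B-2.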

RMS Current Readback Accuracy

This test verifies the current readback. Use the appropriate current shunt with the accuracy specified in table B-1. Use wire of sufficient size to carry the maximum rated current of the ac source (see table 2-1). If you are verifying a 3-phase source, start by verifying phase 1. A worked example of the shunt-current calculation appears after the procedure.

Action and normal result for each step:

1. Turn off the ac source. Connect the load resistor, current shunt, and the DVM across the current shunt as shown in Figure B-1. Use the following load resistor values:
   Agilent 6814B = 7.5 Ω; Agilent 6834B = 15 Ω; Agilent 6843A = 5 Ω

2. Turn on the ac source. In the Output menu, execute the *RST command to reset the unit to its factory default state.

3. Program the output voltage to 100 volts and set the current limit as follows:
   Agilent 6814B = 10 A; Agilent 6834B = 5 A; Agilent 6843A = 15 A
   Then enable the output by pressing Output On/Off.
   Normal result: CC annunciator on. Output current near 10 A for Agilent 6814B, near 5 A for Agilent 6834B, near 15 A for Agilent 6843A.

4. Record the DVM voltage reading and calculate the rms current by dividing the DVM reading by the current monitor resistor value. Record the front panel reading.
   Normal result: The difference between the measured output current and the front panel reading is within the specified limits.

5. If you are verifying a 3-phase source, repeat steps 1 through 4 for phases 2 and 3. Press Phase Select to select the next phase.
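The only calculation in this test is in step 4: the rms output current equals the DVM's rms voltage reading across the shunt divided by the shunt resistance. The short sketch below shows that arithmetic; every value in it is an invented placeholder, so substitute the shunt resistance from table B-1 and your own recorded readings.

```python
# Illustrative arithmetic only: rms current = DVM rms voltage across the shunt
# divided by the shunt resistance. All values are invented placeholders.
SHUNT_OHMS = 0.010         # placeholder current-monitor (shunt) resistance
dvm_rms_volts = 0.0995     # placeholder DVM rms reading across the shunt
front_panel_amps = 9.98    # placeholder front panel current reading

measured_amps = dvm_rms_volts / SHUNT_OHMS
difference = front_panel_amps - measured_amps

print(f"Shunt-derived rms current: {measured_amps:.3f} A")
print(f"Front panel reading:       {front_panel_amps:.3f} A")
print(f"Difference:                {difference:+.3f} A (must be within the specified limits)")
```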

