
For more information, refer to the HBA documentation.

Optical Transceivers

The F100 cluster uses optical transceivers, which convert between the electrical signals of the host hardware and the optical signals transmitted and received over fiber-optic cable. The F100 cluster uses either SFP transceivers for the MSA1000 or GBIC-SW transceivers for the RA4100.

The transceivers hot-plug into switches, Fibre Channel storage hubs, array controllers, and HBAs. The SFP transceiver provides 200-Mb/s performance, and the GBIC-SW transceiver provides 100-Mb/s performance. Both transceiver types support distances of up to 500 meters over multimode fiber-optic cable.
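For context, the GBIC-SW and SFP transceivers correspond to 1-Gb/s and 2-Gb/s Fibre Channel links, whose payload throughput is commonly quoted as roughly 100 and 200 megabytes per second. A minimal sketch of that arithmetic, assuming the standard Fibre Channel signaling rates and 8b/10b encoding (these values are general Fibre Channel figures, not taken from this guide):

    # Rough arithmetic relating Fibre Channel line rates to payload throughput.
    # The signaling rates (1.0625 and 2.125 Gbaud) and 8b/10b encoding are
    # standard Fibre Channel values; the result is an approximation only.
    def fc_payload_mb_per_s(signaling_rate_gbaud: float) -> float:
        """Approximate one-direction payload throughput in MB/s."""
        data_bits_per_s = signaling_rate_gbaud * 1e9 * (8 / 10)  # 8b/10b encoding
        return data_bits_per_s / 8 / 1e6                         # bits -> bytes -> MB

    print(f"GBIC-SW (1-Gb/s FC): ~{fc_payload_mb_per_s(1.0625):.0f} MB/s")  # ~106 MB/s
    print(f"SFP (2-Gb/s FC):     ~{fc_payload_mb_per_s(2.125):.0f} MB/s")   # ~212 MB/s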

For more information, refer to the transceiver documentation.

Cluster Interconnect

The cluster interconnect is a data path over which nodes of a cluster communicate. This type of communication is termed intracluster communication. At a minimum, the interconnect consists of two network adapters (one in each server) and a crossover cable connecting the adapters.

The cluster nodes use the interconnect data path to:

- Communicate individual resource and overall cluster status
- Send and receive heartbeat signals
- Update modified registry information

IMPORTANT: TCP/IP must be used as the cluster communication protocol. When configuring the interconnects, be sure to enable TCP/IP.
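The cluster software manages this intracluster traffic itself, but the idea of two nodes exchanging periodic heartbeat messages over a dedicated TCP/IP interconnect can be illustrated with a short conceptual sketch. The addresses, port, and interval below are hypothetical examples; this is not the Microsoft Cluster Service implementation.

    # Conceptual heartbeat over a dedicated TCP/IP interconnect (UDP datagrams).
    # NOT the Microsoft Cluster Service implementation; the 10.0.0.x subnet,
    # port, and one-second interval are made-up example values.
    import socket
    import time

    LOCAL = ("10.0.0.1", 3343)   # hypothetical interconnect address of this node
    PEER = ("10.0.0.2", 3343)    # hypothetical interconnect address of the other node
    INTERVAL_S = 1.0             # hypothetical heartbeat period

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LOCAL)
    sock.settimeout(INTERVAL_S)

    while True:
        sock.sendto(b"alive", PEER)         # announce that this node is up
        try:
            data, addr = sock.recvfrom(64)  # wait briefly for the peer's heartbeat
            print(f"peer {addr[0]} is alive")
        except socket.timeout:
            print("missed heartbeat; peer may be down")
        time.sleep(INTERVAL_S)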

Client Network

Every client/server application requires a LAN over which client machines and servers communicate. The components of this LAN are no different from those in a stand-alone server configuration.
