Dell CX4 Manual: Connecting a PowerEdge Cluster to Multiple Storage Systems

Page 34

c. Connect a cable from Fibre Channel switch 0 (sw0) to the second front-end Fibre Channel port on SP-A.

d. Connect a cable from Fibre Channel switch 0 (sw0) to the second front-end Fibre Channel port on SP-B.

e. Connect a cable from Fibre Channel switch 1 (sw1) to the third front-end Fibre Channel port on SP-A.

f. Connect a cable from Fibre Channel switch 1 (sw1) to the third front-end Fibre Channel port on SP-B.

g. Connect a cable from Fibre Channel switch 1 (sw1) to the fourth front-end Fibre Channel port on SP-A.

h. Connect a cable from Fibre Channel swit 1 (sw1) to the fourth front-end Fibre Channel port on SP-B.

NOTE: Additional cables can be connected from the Fibre Channel switches to the storage system if there are available front-end Fibre Channel ports on the storage processors.

Zoning Your Dell/EMC Storage System in a Switched Environment

Dell supports only single-initiator zoning for connecting clusters to a Dell/EMC storage system in a switched environment; each zone contains exactly one HBA port (the initiator). When using EMC PowerPath, a separate zone is created from each HBA port to the SPE.
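The single-initiator pattern described above can be sketched as follows. This is an illustrative sketch only: the helper name, zone-naming scheme, and all WWPN values are hypothetical examples, not actual Dell/EMC or switch-vendor syntax.

```python
# Illustrative sketch: building single-initiator zones for a switched fabric.
# Each zone pairs exactly one HBA port (the initiator) with the
# storage-processor front-end (target) ports it should reach.
# All WWPNs below are hypothetical examples.

def single_initiator_zones(hba_wwpns, sp_wwpns):
    """Return one zone per HBA port; each zone holds that single
    initiator plus the storage-processor target ports."""
    zones = {}
    for i, hba in enumerate(hba_wwpns, start=1):
        zones[f"zone_node_hba{i}"] = [hba] + list(sp_wwpns)
    return zones

hba_ports = ["10:00:00:00:c9:2b:5a:01",  # node HBA port 0 (example)
             "10:00:00:00:c9:2b:5a:02"]  # node HBA port 1 (example)
sp_ports = ["50:06:01:60:41:e0:12:34",   # SP-A front-end port (example)
            "50:06:01:68:41:e0:12:34"]   # SP-B front-end port (example)

zones = single_initiator_zones(hba_ports, sp_ports)
for name, members in zones.items():
    print(name, "->", "; ".join(members))
```

Note that each generated zone contains one and only one HBA port, which is what distinguishes single-initiator zoning from a single flat zone containing every port in the fabric.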

Connecting a PowerEdge Cluster to Multiple Storage Systems

You can increase your cluster storage capacity by attaching multiple storage systems to your cluster using a redundant switch fabric. Failover Clusters can support configurations with multiple storage units attached to clustered nodes. In this scenario, the Microsoft Cluster Service (MSCS) software can fail over disk drives in any cluster-attached shared storage array between the cluster nodes.

NOTE: Throughout this document, MSCS is used to refer to either the Microsoft Windows Server 2003 Cluster Service or the Microsoft Windows Server 2008 Failover Cluster Service.

When attaching multiple storage systems to your cluster, the following rules apply:

• There is a maximum of four storage systems per cluster.

• The shared storage systems and firmware must be identical. Using dissimilar storage systems and firmware for your shared storage is not supported.
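The two rules above can be expressed as a simple configuration check. This is a minimal sketch, not a Dell tool: the function name, the model strings, and the firmware revision strings are all hypothetical examples.

```python
# Illustrative check of the multi-storage attachment rules: at most four
# storage systems per cluster, and all attached systems must be identical
# in model and firmware. Model/firmware strings are made-up examples.

MAX_STORAGE_SYSTEMS = 4

def validate_shared_storage(systems):
    """systems: list of (model, firmware) tuples.
    Returns a list of rule violations (empty if the config is valid)."""
    problems = []
    if len(systems) > MAX_STORAGE_SYSTEMS:
        problems.append(
            f"{len(systems)} systems attached; maximum is {MAX_STORAGE_SYSTEMS}")
    if len(set(systems)) > 1:
        problems.append(
            "dissimilar storage systems or firmware are not supported")
    return problems

ok = [("CX4-480", "04.28"), ("CX4-480", "04.28")]
bad = [("CX4-480", "04.28"), ("CX4-960", "04.28")]
print(validate_shared_storage(ok))   # []
print(validate_shared_storage(bad))
```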


Cabling Your Cluster Hardware
