Dell CX4 Manual: Preparing Your Systems for Clustering


5 Configure each cluster node as a member of the same Windows Active Directory domain. For an illustrative scripting sketch of this step, see the first example following this procedure.

NOTE: You can configure the cluster nodes as Domain Controllers. For more information, see the “Selecting a Domain Model” section of Dell Failover Clusters with Microsoft Windows Server 2003 Installation and Troubleshooting Guide or Dell Failover Clusters with Microsoft Windows Server 2008 Installation and Troubleshooting Guide located on the Dell Support website at support.dell.com.

6 Establish the physical storage topology and any required storage network settings to provide connectivity between the storage array and the systems that you are configuring as cluster nodes. Configure the storage system(s) as described in your storage system documentation.

7 Use the storage array management tools to create at least one logical unit number (LUN). The LUN is used as the quorum disk for a Windows Server 2003 failover cluster and as the witness disk for a Windows Server 2008 failover cluster. Ensure that this LUN is presented to the systems that you are configuring as cluster nodes.

NOTE: For security reasons, it is recommended that you present the LUN to a single node when you set up the cluster, as described in step 8. Later, you can present the LUN to the other nodes, as described in step 9, so that they can access it.

8 Select one of the systems and form a new failover cluster by configuring the cluster name, cluster management IP address, and quorum resource. For more information, see "Preparing Your Systems for Clustering" on page 39. A scripting sketch covering this step and step 9 follows this procedure.

NOTE: For Failover Clusters configured with Windows Server 2008, run the Cluster Validation Wizard to ensure that your system is ready to form the cluster.

9 Join the remaining node(s) to the failover cluster. For more information, see "Preparing Your Systems for Clustering" on page 39.

10 Configure roles for the cluster networks. A scripting sketch covering this step and step 11 follows this procedure.

11 Test the failover capabilities of your new cluster.

NOTE: For Failover Clusters configured with Windows Server 2008, you can also use the Cluster Validation Wizard.
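The following sketch shows one way to script the domain membership described in step 5. It is illustrative only and is not part of the Dell installation procedure: it is written in Python, shells out to Windows PowerShell, assumes the Add-Computer cmdlet is available on the node, and uses a placeholder domain name and account that you must replace with your own values.

import subprocess

DOMAIN = "cluster.example.com"          # placeholder domain name
JOIN_ACCOUNT = "CLUSTER\\clusteradmin"  # placeholder account with join rights

# Join this node to the Active Directory domain by calling Windows PowerShell.
# Get-Credential prompts interactively for the account password.
ps_command = (
    f"Add-Computer -DomainName {DOMAIN} "
    f"-Credential (Get-Credential {JOIN_ACCOUNT})"
)
subprocess.run(
    ["powershell.exe", "-NoProfile", "-Command", ps_command],
    check=True,  # raise an error if the domain join fails
)

Run an equivalent command on each node that you are configuring as a cluster member, and restart the node so that the domain membership takes effect before you continue with step 6.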
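Steps 8 and 9 are normally performed with the Microsoft cluster management tools described in the Installation and Troubleshooting Guide for your operating system. As an illustration only, the following Python sketch drives the Test-Cluster, New-Cluster, and Add-ClusterNode cmdlets from the Microsoft FailoverClusters PowerShell module, which is available on Windows Server 2008 R2 and later but not on Windows Server 2003. The node names, cluster name, and management IP address are placeholders.

import subprocess

NODES = ["node1", "node2"]      # placeholder cluster node host names
CLUSTER_NAME = "cluster1"       # placeholder cluster name
CLUSTER_IP = "192.168.10.50"    # placeholder cluster management IP address

def run_ps(command: str) -> None:
    """Run one PowerShell command with the FailoverClusters module loaded."""
    subprocess.run(
        ["powershell.exe", "-NoProfile", "-Command",
         f"Import-Module FailoverClusters; {command}"],
        check=True,
    )

# Validate the configuration before forming the cluster (see the note after step 8).
run_ps(f"Test-Cluster -Node {','.join(NODES)}")

# Step 8: form the cluster on the first node with a cluster name and management IP.
run_ps(
    f"New-Cluster -Name {CLUSTER_NAME} -Node {NODES[0]} "
    f"-StaticAddress {CLUSTER_IP}"
)

# Step 9: join the remaining node(s) to the new failover cluster.
for node in NODES[1:]:
    run_ps(f"Add-ClusterNode -Cluster {CLUSTER_NAME} -Name {node}")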
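The following sketch illustrates one way to script the network role assignment in step 10 and a basic failover test for step 11, again for illustration only and again assuming the FailoverClusters PowerShell module. The network names, group name, and node name are placeholders; the numeric Role values follow the module's convention, where 1 allows cluster (private) traffic only and 3 allows both cluster and client traffic.

import subprocess

# Steps 10 and 11: assign cluster network roles, then perform a basic
# failover test by moving a clustered group to another node.
ps_script = "; ".join([
    "Import-Module FailoverClusters",
    "(Get-ClusterNetwork 'Private Network').Role = 1",      # cluster traffic only
    "(Get-ClusterNetwork 'Public Network').Role = 3",       # cluster and client traffic
    "Move-ClusterGroup -Name 'Cluster Group' -Node node2",  # move to the other node
    "Get-ClusterGroup -Name 'Cluster Group'",               # confirm the group is online
])
subprocess.run(
    ["powershell.exe", "-NoProfile", "-Command", ps_script],
    check=True,
)

If the group comes online on the other node, basic failover is working; move it back to the original node to verify failback, and use the Cluster Validation Wizard noted above for a more thorough check on Windows Server 2008.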

