Cluster Configuration Repository (CCR)
- /etc/cluster/ccr (directory)
Important Files
- /etc/cluster/ccr/infrastructure
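The CCR files are plain ASCII tables maintained by the cluster framework and should never be edited by hand. A quick, illustrative way to see the node entries held in the infrastructure table (the key names and node names below are examples, not authoritative output):
# ls /etc/cluster/ccr
# grep 'cluster.nodes' /etc/cluster/ccr/infrastructure
cluster.nodes.1.name    sc-node1     # example entries; exact keys vary by release
cluster.nodes.2.name    sc-node2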
Global Services
- One node at a time acts as the primary for a given global service. All other nodes communicate with the global services (devices, filesystems) via the cluster interconnect.
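To see which node is currently the primary for a global service such as a device group, scstat can be used. A rough sketch, assuming a device group named nfsdg; the output layout is approximate:
# scstat -D
-- Device Group Servers --
                          Device Group     Primary      Secondary
                          ------------     -------      ---------
  Device group servers:   nfsdg            sc-node1     sc-node2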
Global Naming (DID Devices)
- /dev/did/dsk and /dev/did/rdsk
- DID devices are used only for global naming, not for access
- DID device names cannot be used in VxVM
- DID device names are used in Sun/Solaris Volume Manager (SDS/SVM)
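The mapping between DID names and the underlying physical paths on each node can be listed with scdidadm; the instance numbers, node names, and controller/target paths below are purely illustrative:
# scdidadm -L
1    sc-node1:/dev/rdsk/c0t0d0    /dev/did/rdsk/d1
2    sc-node1:/dev/rdsk/c1t0d0    /dev/did/rdsk/d2
2    sc-node2:/dev/rdsk/c1t0d0    /dev/did/rdsk/d2    # shared disk: same DID name from both nodes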
Global Devices
- provide global access to devices irrespective of their physical location.
- Most commonly SDS/SVM or VxVM devices are used as global devices. The volume manager software itself is unaware that its devices are being made global.
/global/.devices/node@nodeID
- nodeID is an integer representing the node in the cluster
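For example, on the node whose ID is 1, df shows the per-node global devices filesystem mounted as sketched below; the backing device name is only an assumption:
# df -k /global/.devices/node@1
Filesystem            kbytes    used   avail capacity  Mounted on
/dev/did/dsk/d2s3     493688   51234  393086    12%    /global/.devices/node@1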
Global Filesystems
- # mount -o global,logging /dev/vx/dsk/nfsdg/vol01 /global/nfs
or edit the /etc/vfstab file to contain the following:
/dev/vx/dsk/nfsdg/vol01 /dev/vx/rdsk/nfsdg/vol01 /global/nfs ufs 2 yes global,logging
The Global Filesystem is also known as the Cluster Filesystem (CFS) or PxFS (Proxy File System).
NOTE: Local failover filesystems (i.e. filesystems directly attached to a storage device) cannot be used for scalable services; global filesystems must be used instead.
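Once a filesystem is mounted with the global option it appears at the same mount point on every cluster node; a simple check from any node (output is approximate and trimmed):
# mount | grep /global/nfs
/global/nfs on /dev/vx/dsk/nfsdg/vol01 read/write/setuid/logging/global ...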
Console Software
- The SUNWccon package provides the console software. There are three variants of the cluster console tool:
- cconsole (accesses the node consoles through the TC or another remote console access method)
- crlogin (uses rlogin as the underlying transport)
- ctelnet (uses telnet as the underlying transport)
All three tools live in /opt/SUNWcluster/bin/ and are normally launched in the background, e.g.:
/opt/SUNWcluster/bin/cconsole [ clustername ] &
Cluster Control Panel
/opt/SUNWcluster/bin/ccp [ clustername ] &
All the information the console tools need is stored in the following two files:
-> /etc/clusters      e.g.: sc-cluster sc-node1 sc-node2
-> /etc/serialports   e.g.:
sc-node1      sc-tc          5002    # Connect via TCP port on TC
sc-node2      sc-tc          5003
sc-10knode1   sc10k-ssp      23      # Connect via E10K SSP
sc-10knode2   sc10k-ssp      23
sc-15knode1   sf15k-mainsc   23      # Connect via 15K Main SC
e250node1     RSCIPnode1     23      # Connect via LAN RSC on an E250
node1         sc-tp-ws       23      # Connect via a tip launchpad
sf1_node1     sf1_mainsc     5001    # Connect via passthru on a midframe
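With /etc/clusters and /etc/serialports populated as above, the console tools can be started by cluster name from the administrative workstation; a typical invocation, using the example cluster name from the /etc/clusters entry:
# /opt/SUNWcluster/bin/cconsole sc-cluster &     # one window per node console plus a common input window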