I'm trying to get a two-node test cluster going using OES 2018 on VMware 6.5.

I configured NCS with the option set to not start NCS right away, and the host panics whenever I then start it via "systemctl start novell-ncs.service". The host is fully patched up to today.

Here is the error:

2017-12-21T11:06:10.902692-08:00 tc10cn01 kernel: [ 37.787136] CLUSTER-<INFO>-<6135>: Searching for SBD partition ...
2017-12-21T11:06:15.906688-08:00 tc10cn01 kernel: [ 42.789194] CLUSTER-<INFO>-<6135>: Searching for SBD partition ...
2017-12-21T11:06:17.728346-08:00 tc10cn01 ndsd[2250]: [Info]InsertVolume: Volume _ADMIN is mounted and added to volume table
2017-12-21T11:06:20.910688-08:00 tc10cn01 kernel: [ 47.793194] CLUSTER-<INFO>-<6135>: Searching for SBD partition ...
2017-12-21T11:06:25.914686-08:00 tc10cn01 kernel: [ 52.797197] CLUSTER-<INFO>-<6135>: Searching for SBD partition ...
2017-12-21T11:06:30.918687-08:00 tc10cn01 kernel: [ 57.801197] CLUSTER-<FATAL>-<6022>: There is NO SBD Partition required by the cluster !!!
2017-12-21T11:06:30.918698-08:00 tc10cn01 kernel: [ 57.801197] Please run "sbdutil -c" to create SBD Partition.
2017-12-21T11:06:30.918699-08:00 tc10cn01 kernel: [ 57.801202] CLUSTER-<NORMAL>-<6103>: SBD.NLM unloaded.
2017-12-21T11:06:30.942797-08:00 tc10cn01 ldncs[4496]: modprobe: ERROR: could not insert 'sbd': Operation not permitted
2017-12-21T11:06:30.943315-08:00 tc10cn01 systemd[1]: novell-ncs.service: Control process exited, code=exited status=6
2017-12-21T11:06:30.943456-08:00 tc10cn01 systemd[1]: Failed to start Novell Cluster Services(NCS).
2017-12-21T11:06:30.943558-08:00 tc10cn01 systemd[1]: novell-ncs.service: Unit entered failed state.
2017-12-21T11:06:30.943650-08:00 tc10cn01 systemd[1]: novell-ncs.service: Failed with result 'exit-code'.

If I boot the host into rescue mode, it sure looks like I have an SBD partition:

tc10cn01:~ # sbdutil -f
/dev/nss/18clus.sbd
tc10cn01:~ # sbdutil -v

Cluster (SBD) partition on /dev/nss/18clus.sbd.

Signature # HeartBeat State eState Epoch SbdLock Bitmask NSSLIB
SBDS 0 00000001 0 UNLK 00000000 255
SBDS 1 00000001 0 UNLK 00000000 255
SBDS 2 00000001 0 UNLK 00000000 255
SBDS 3 00000001 0 UNLK 00000000 255
SBDS 4 00000001 0 UNLK 00000000 255
SBDS 5 00000001 0 UNLK 00000000 255
SBDS 6 00000001 0 UNLK 00000000 255
SBDS 7 00000001 0 UNLK 00000000 255
SBDS 8 00000001 0 UNLK 00000000 255
SBDS 9 00000001 0 UNLK 00000000 255
SBDS 10 00000001 0 UNLK 00000000 255
SBDS 11 00000001 0 UNLK 00000000 255
SBDS 12 00000001 0 UNLK 00000000 255
SBDS 13 00000001 0 UNLK 00000000 255
SBDS 14 00000001 0 UNLK 00000000 255
SBDS 15 00000001 0 UNLK 00000000 255
SBDS 16 00000001 0 UNLK 00000000 255
SBDS 17 00000001 0 UNLK 00000000 255
SBDS 18 00000001 0 UNLK 00000000 255
SBDS 19 00000001 0 UNLK 00000000 255
SBDS 20 00000001 0 UNLK 00000000 255
SBDS 21 00000001 0 UNLK 00000000 255
SBDS 22 00000001 0 UNLK 00000000 255
SBDS 23 00000001 0 UNLK 00000000 255
SBDS 24 00000001 0 UNLK 00000000 255
SBDS 25 00000001 0 UNLK 00000000 255
SBDS 26 00000001 0 UNLK 00000000 255
SBDS 27 00000001 0 UNLK 00000000 255
SBDS 28 00000001 0 UNLK 00000000 255
SBDS 29 00000001 0 UNLK 00000000 255
SBDS 30 00000001 0 UNLK 00000000 255
SBDS 31 00000001 0 UNLK 00000000 255

Log capacity: 0, valid records: 0 (0/0/0/0).

Has anyone seen anything like this on OES 2018?

thanks,
Andrew