Software Raid Question

James Bensley jwbensley at
Tue Jun 8 22:11:07 UTC 2010

I have an Ubuntu 9.10 box with 7 SATA II drives attached, each with a
single partition spanning the entire drive, and they are in a software
RAID 6.

(All drives are the same size, make and model, etc.)

I believe one of my drives is dying: on boot it is only correctly
detected by the BIOS roughly 1 in 10 times. I was not bothered at
first (or at least, thought I needn't be), because with RAID 6 a
failed drive shouldn't be a problem. However, when I boot the machine
with the problem drive detached, the RAID won't assemble; with it
attached it works, but since the drive is only detected about 1 boot
in 10, I have to reboot over and over until it is picked up before I
can use the RAID.
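For reference, here is how I've been checking the suspect drive's
health (a sketch: it assumes smartmontools is installed, and /dev/sda
below just stands in for whichever device node the flaky drive gets):

```shell
# Query the suspect drive's SMART data (replace /dev/sda with the
# device node of the drive in question):
sudo smartctl -H /dev/sda   # overall health self-assessment
sudo smartctl -A /dev/sda   # attribute table: watch Reallocated_Sector_Ct,
                            # Current_Pending_Sector, UDMA_CRC_Error_Count
```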

Surely this is not normal behaviour for a RAID? Also, I just booted
the box up only to find that the other six drives are now marked as
spares (I believe that is what the '(S)' suffix means, yes?):

bensley at ubuntu:~$ cat /proc/mdstat
Personalities : [linear] [multipath] [raid0] [raid1] [raid6] [raid5]
[raid4] [raid10]
md0 : inactive sdf1[0](S) sdg1[5](S) sde1[1](S) sdc1[6](S) sdb1[4](S) sdd1[3](S)
      5860559616 blocks

unused devices: <none>

Anyone got any idea why the RAID doesn't mount and why all disks are
now marked as spares?
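For what it's worth, this is what I was planning to try next to get
the array active again (just a sketch based on my device names above;
I gather --force can be risky if the members' event counts disagree
badly, so corrections welcome):

```shell
# Stop the inactive array, inspect each member's superblock, then try
# to reassemble from the six remaining members. --force lets mdadm use
# members whose event counts differ slightly (use with care):
sudo mdadm --stop /dev/md0
sudo mdadm --examine /dev/sd[b-g]1          # check event counts and array state
sudo mdadm --assemble --force /dev/md0 /dev/sd[b-g]1
cat /proc/mdstat                            # should now show md0 active (degraded)
```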


There are 10 kinds of people in the world; Those who understand
Vigesimal, and 9 others...?

More information about the ubuntu-users mailing list