I have been running a 3 × 750 GB RAID 5 array for some time on firmware 1.0.10 without much trouble. I began to run short of space, so I tried to add another two 750 GB drives.
I think I made an error here: the drives were automatically set to add as spares (I have since read this can cause issues). I then tried to migrate the three-drive array into a five-drive RAID 5 array, and the interface let me do this without any complaints.
After a few minutes I noticed the web interface had stopped responding, but as all the drives seemed to be active I let it continue, realising it would take many hours to complete. After 24 hours there was still no web interface and the drives were no longer being accessed. The LCD screen reported the migration as roughly 30 percent complete, and after a few more hours with no activity and no progress I decided to reboot the system, as I assumed it had crashed.
The system powered up with a "RAID N/A" message and that is how it has stayed. The boot log shows three drives being mounted and two as spares, then complains that the LVM is corrupt (or something to that effect).
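In case it helps with diagnosis: as I understand it the box is standard Linux software RAID with LVM on top, so I'm assuming that with shell access I could gather the array state with something like the following (the device names /dev/md0 and /dev/sd[a-e]2 are guesses on my part; the real partition layout on the Thecus may differ):

    # Show what the kernel currently thinks the md array looks like
    cat /proc/mdstat

    # Dump the RAID superblock from each member partition
    # (partition names below are guesses)
    mdadm --examine /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2 /dev/sde2

    # Details of the assembled (or partially assembled) array, if /dev/md0 exists
    mdadm --detail /dev/md0

    # See what LVM can still find on top of the array
    pvscan
    vgscan
    lvscan

I would happily post the full output of any of these if someone can tell me how to get shell access on this firmware.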
I emailed Thecus about this a month ago and still haven't heard back from them.
I really don't want to start again with the array, as this was my backup and some of the data is not replaceable.
If anyone can help me access what is left of my files I would be most grateful.