Powered by 24 Intel DC S3700 800GB SSDs!
I just can't quite justify this for my house, but I really want to.
It would have been interesting to see RAID 5/6/10/50 configuration performance, but it seems time did not permit on this occasion. Of course, I would also like to see what this does when hooked up to an Apple-based system for video editing, or to a *nix system as a DB...
I'd rather have a whiptail
Typo on the first page, says Server 2008 RC2 instead of R2.
Shame more tests weren't done with this; it really does seem like a silly amount of performance.
I'd take a guess that 300,000 IOPS is limited by the RAID cards and the system rather than the drives - as with many things, 24 drives are never going to give anywhere near 24x the performance; much is lost to RAID processing overhead. RAID 0 was a totally useless configuration to test when the underlying disks are so fast and its interface to the world is only 4x 1Gb NICs (i.e. ~500MB/s - a couple of SSDs could saturate that, never mind 24 of them). Almost nobody would actually seriously use the array in that configuration...
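For what it's worth, a minimal back-of-the-envelope sketch of that bottleneck (Python; the ~500MB/s per-drive sequential figure is my own assumption, not a number from the review):

# Back-of-the-envelope: 4x 1GbE front end vs 24 fast SSDs in RAID 0.
GBIT_PER_S = 1_000_000_000          # 1 Gb/s in bits per second

nic_count = 4
nic_ceiling_mb_s = nic_count * GBIT_PER_S / 8 / 1_000_000   # ~500 MB/s total

ssd_count = 24
ssd_seq_mb_s = 500                  # assumed per-drive sequential MB/s (DC S3700-class)
array_raw_mb_s = ssd_count * ssd_seq_mb_s

print(f"4x 1GbE ceiling           : ~{nic_ceiling_mb_s:.0f} MB/s")
print(f"24x SSD raw sequential    : ~{array_raw_mb_s:,} MB/s")
print(f"SSDs to saturate the NICs : {nic_ceiling_mb_s / ssd_seq_mb_s:.1f}")

On those assumed figures the four NICs top out around 500MB/s, while the raw drives could push roughly 12,000MB/s, so even one or two SSDs fill the pipe.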
I'd be tempted to configure a RAID 50/60 if supported, with 2-4 hot spares; it would have been nice to have seen the performance of that. Whereas in a 24-disk array of spinning disks I'd have to use 600-900GB 10/15K drives in RAID 10 to get higher performance, here I might be able to use 400GB SSDs in a RAID 50 and still get better performance, with closer drive prices.
24x 600GB RAID 10 excluding 2 hot spares = ~6.6TB usable (600GB 15K drives are ~£300 each)
24x 400GB RAID 50 excluding 4 hot spares = ~7.2TB usable (~£775 each for the S3700 400GB)
So 2.5x the cost per drive, rather than 5x for the 800GB, gives similar capacity, is probably still much faster, and may well still be controller/NIC limited - which would have been a great test to run.
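Rough sums behind those numbers, as a sketch (Python; the RAID 50 layout of two parity groups after the 4 hot spares is my assumption, and the per-drive prices are the ballpark figures quoted above):

# Usable capacity and rough drive cost for the two hypothetical 24-bay layouts above.
def raid10_usable_tb(members, drive_tb):
    return members // 2 * drive_tb           # half the members are mirror copies

def raid50_usable_tb(members, groups, drive_tb):
    return (members - groups) * drive_tb     # one parity drive per RAID 5 group

bays = 24

# 24x 600GB 15K spinners in RAID 10, 2 hot spares, ~£300 per drive
hdd_usable = raid10_usable_tb(bays - 2, 0.6)       # 11 x 0.6 = ~6.6TB
hdd_cost = bays * 300

# 24x 400GB S3700 in RAID 50 (2 groups of 10), 4 hot spares, ~£775 per drive
ssd_usable = raid50_usable_tb(bays - 4, 2, 0.4)    # 18 x 0.4 = ~7.2TB
ssd_cost = bays * 775

print(f"RAID 10 HDD: {hdd_usable:.1f}TB usable, ~£{hdd_cost:,} in drives")
print(f"RAID 50 SSD: {ssd_usable:.1f}TB usable, ~£{ssd_cost:,} in drives")

That works out at ~6.6TB for ~£7,200 in spinners versus ~7.2TB for ~£18,600 in SSDs - roughly the 2.5x per-drive premium mentioned above for similar usable capacity.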
However, performance aside, this unit isn't really enterprise grade in my opinion - not for any critical front-line "live" role anyway (e.g. VMware shared storage). It's very disappointing to see the OS drive is a single point of failure, and making it non-hotswap as well is unforgivable. A proper SAN like an EqualLogic or HP Px000 is in the same price range and has redundant hot-swap controllers, more network interfaces (maybe 10Gb) and a tightly focussed embedded OS rather than Windows.
Many might argue that an EqualLogic isn't a proper SAN (though they usually work for the Compellent side of Dell). What would mitigate things more would be the ability to use the storage in a shared mode with another Server 2012 box and Storage Spaces. It's the only way I'd be happy with Windows as my primary storage OS.
Hammer Time!!
True - though I'm not seeing any reference to wear levelling or any other SSD optimisations for either!
"Boss, we have 45000 to spend before year-end when the budget is taken away. I have an idea. Go ahead?"
Can it run Minesweeper?
The single OS drive is the least of its problems. It looks like if you lose a RAID controller, the other one can't take over that storage. And if you lose the server, I can't see how another server in a cluster can take over the storage.
So yeah, interesting toy, but not enterprise grade.
Hello, I am Mark Wu. It is a pleasure to be in this group.
Nice and all, but I'd rather buy a much cheaper EqualLogic hybrid or roll my own. Still not completely sold on all-SSD SANs and their worth (and their implementation in some cases).