Four onboard Ethernet ports, combined via teaming, provide the headlining bandwidth.
Wowzors, 10G LAN ports!
I wonder how much this is going to come in at.....
Main PC: Asus Rampage IV Extreme / 3960X@4.5GHz / Antec H1200 Pro / 32GB DDR3-1866 Quad Channel / Sapphire Fury X / Areca 1680 / 850W EVGA SuperNOVA Gold 2 / Corsair 600T / 2x Dell 3007 / 4 x 250GB SSD + 2 x 80GB SSD / 4 x 1TB HDD (RAID 10) / Windows 10 Pro, Yosemite & Ubuntu
HTPC: AsRock Z77 Pro 4 / 3770K@4.2GHz / 24GB / GTX 1080 / SST-LC20 / Antec TP-550 / Hisense 65k5510 4K TV / HTC Vive / 2 x 240GB SSD + 12TB HDD Space / Race Seat / Logitech G29 / Win 10 Pro
HTPC2: Asus AM1I-A / 5150 / 4GB / Corsair Force 3 240GB / Silverstone SST-ML05B + ST30SF / Samsung UE60H6200 TV / Windows 10 Pro
Spare/Loaner: Gigabyte EX58-UD5 / i950 / 12GB / HD7870 / Corsair 300R / Silverpower 700W modular
NAS 1: HP N40L / 12GB ECC RAM / 2 x 3TB Arrays || NAS 2: Dell PowerEdge T110 II / 24GB ECC RAM / 2 x 3TB Hybrid arrays || Network:Buffalo WZR-1166DHP w/DD-WRT + HP ProCurve 1800-24G
Laptop: Dell Precision 5510 Printer: HP CP1515n || Phone: Huawei P30 || Other: Samsung Galaxy Tab 4 Pro 10.1 CM14 / Playstation 4 + G29 + 2TB Hybrid drive
Lovely. But it would be nice to have 1Gb "Internet" speeds before we worry about getting 10Gb LAN ports.
nice.... now can they do a matx version
I'm thinking about my high-network-use work environment, but even there I struggle to see how this can possibly be useful... My office network connection is 1GBit to a router down the corridor, shared with a whole bunch of other people in the building. That router connects to the main servers along with a load of other similar routers on site. At no point can I use more than the minimum bandwidth between my closest router and the servers...
I can see the point of it in the server environment, marginally also for heavy users in a commercial/university environment, it seems so far ahead of most existing infrastructure that for an end user, especially in the domestic market, this is utterly pointless. Marketing it for domestic end users is pretty ridiculous!
I think the primary use will be content creation. Video editing, photo editing, CAD/CAM, that sort of stuff, all being kept safe on a centralised, backed-up, RAIDed storage server. Perhaps a youtuber stepping up his hardware to the next level. With a NAS with a 10Gbit port, a Xeon processor and a selected card to accelerate your program of choice, this board means you won't need a 10Gbit NIC add-in card.
Think 4K files. An hour of 4K @ 25fps can be about 500GB! If you are working with that kind of source material from a server I think the 10Gbit line will become very noticeable.
It's a niche use I know, but this is a fairly niche product.
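A quick back-of-envelope sketch of the figure above (the 500GB/hour number is the post's own illustrative figure, not a measured one): that works out to a sustained stream of roughly 1.1Gbit/s, which saturates a 1GbE link on its own but is comfortable over 10GbE.

```python
# Assumed figure from the post: ~500 GB per hour of 4K @ 25fps footage.
GB = 1e9          # decimal gigabytes, as storage vendors count them
HOUR_S = 3600

footage_gb = 500
rate_gbit = footage_gb * GB * 8 / HOUR_S / 1e9   # sustained stream rate
print(f"Stream rate: {rate_gbit:.2f} Gbit/s")    # ~1.11 Gbit/s

for link_gbit in (1, 10):
    transfer_min = footage_gb * GB * 8 / (link_gbit * 1e9) / 60
    print(f"{link_gbit:>2} GbE: {transfer_min:.0f} min to copy the hour of footage")
```

So even a single editor pulling one stream from the server is already past what 1GbE can deliver.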
I work day-to-day with very large datasets used to solve the 3D structure of proteins, and in my free time I'm also into photo editing, especially composite/macro photography and so on; I get the benefits of a 10GBit connection with appropriate hardware! What I doubt hugely is the ability for anyone to actually use this teaming approach to get 22GBit in real terms: you'd need the same board in your server/NAS box, otherwise you're just limited to the 1 or 10GBit ethernet cable at that end... So you'd need two of these, 8 cat6 ethernet cables and a more-than-8-port 10GBit ethernet switch.
What I do is work on a PC, do all the editing there, then backup files to my NAS (which then backs up with a mate's NAS overnight so we have off-site storage for critical files). 10GBit is great - hopefully this will become mainstream soon. 22GBit by linking multiple connections is basically a non-starter for the vast majority of people, though.
Actually this looks like it would make a decent server board.
Although I'd point out that the router/switch you plug this into will also need to support 10Gb connections. Still, 1Gb has become pretty standard on new switches, and the ability to use four separate LAN connections as one would still give you a total 4Gb connection, and may well help reduce latency if you're running virtual servers on it.
Saying it'll improve internet access is of course total bull
Pob's new mod, Soviet Pob Propaganda style Laptop.
"Are you suggesting that I can't punch an entire dimension into submission?" - Flying squirrel - The Red Panda Adventures
Sorry, the photobucket links are broken.
If this went into a home server, then it might save you the cost of buying a £1000 10GbE switch and allow you to talk to a pair of machines, each with one port, at 10Gb speeds. The 1Gb/s port can be used for WAN and a slow entertainment LAN.
Then of course, to saturate a 10GbE link with anything other than very simple linear reads you are going to need a *lot* of disk drives.
Seek speed is orders of magnitude better, but still not free, so the same applies. A decent SSD can manage 3Gb/sec, so it will take about seven of them to saturate a 20Gb/sec ethernet link in one direction. Of course a 10GbE link is full duplex, so you might want to double that, add a bit for good measure, and another 20% for RAID redundancy. This is why big NAS machines have Fibre Channel links to Fibre Channel RAID controllers: SATA just won't cut it.
So the only way this thing could saturate the 10GbE ports is to use high-end PCIe SSD cards in three slots (though three would be close enough).
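The drive-count arithmetic above can be sketched out; all the figures here (3Gb/s per SATA SSD, a 20Gb/s duplex target, 20% RAID overhead) are the post's own illustrative numbers, not measurements.

```python
# Rough sizing: how many ~3 Gbit/s SATA SSDs to feed a teamed 2 x 10 GbE link?
import math

ssd_gbit = 3.0       # optimistic sequential throughput of one SATA SSD
link_gbit = 20.0     # two teamed 10 GbE ports, one direction

one_way = math.ceil(link_gbit / ssd_gbit)        # one direction only
duplex = math.ceil(2 * link_gbit / ssd_gbit)     # full duplex, both directions
with_raid = math.ceil(duplex * 1.2)              # +20% for RAID redundancy

print(one_way, duplex, with_raid)                # 7, 14, 17 drives
```

Seventeen SATA drives just for bandwidth is exactly the point where dedicated storage interconnects start to make sense.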
This is the next step towards affordable 10 GbE for the (high tech) home.
I believe 10GbE is at least a process shrink away from integration on £200-ish motherboards, simply because current chips use way too much power.
Feeling smug about my Cat6 cabled house right now
"In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."
I'd say pretty much any SSD these days can do 5 Gbit. However that's not the only point. 5 Gbit is still better than 1 Gbit, even if it's only using 1/2 of the bandwidth.
Also, forget this 22Gbit rubbish, or even 20Gbit. It's like adding together the 10 lanes on parts of the M25 and saying the speed limit is 700mph.
Link aggregation only works with multiple servers and/or clients in the real world, regardless of what any theory says.
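The reason a single client never sees the aggregate speed is that link aggregation hashes each flow onto exactly one member link. A minimal sketch of that idea (the hash function and addresses here are illustrative, not how any particular switch computes it):

```python
# Why teaming doesn't speed up a single flow: an LACP-style distributor
# hashes each conversation (src/dst pair here, for illustration) onto ONE
# member link, so one client talking to one server is capped at one link.
import hashlib

links = ["eth0", "eth1"]   # two 10 GbE members of the team

def pick_link(src_ip: str, dst_ip: str) -> str:
    digest = hashlib.md5(f"{src_ip}->{dst_ip}".encode()).digest()
    return links[digest[0] % len(links)]

# One client, one server: every packet of the flow lands on the same link.
single_flow = {pick_link("10.0.0.2", "10.0.0.1") for _ in range(1000)}
print(single_flow)          # a single member link, every time

# Many clients spread across both links, so only the AGGREGATE scales.
many_flows = {pick_link(f"10.0.0.{i}", "10.0.0.1") for i in range(2, 50)}
print(many_flows)
```

Hence aggregation helps a busy server with many clients, not one workstation copying one big file.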
"In a perfect world... spammers would get caught, go to jail, and share a cell with many men who have enlarged their penises, taken Viagra and are looking for a new relationship."
I was just trying to get across just how quick 10Gb is, and why people generally haven't been that bothered (agreed that unless you are serving to multiple clients you aren't going to use both ports in a usual aggregation setup).
If you look here: http://www.anandtech.com/bench/SSD/260
The top performers are expensive PCIe cards; the usual suspects are running around 260MB/sec, or about half the speed of the SATA link they are on. So you are looking at 2.6Gbit being better than 1Gbit, and at that point I have to wonder how much I would pay for a doubling in speed. Unless you only ever move big files, in which case you would see that 5Gbit/sec.
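To put that in concrete terms, here is an illustrative comparison of copying a 10GB file when the bottleneck is a 1GbE wire versus a typical ~260MB/s SATA SSD (both figures are the ones quoted above, not fresh measurements):

```python
# Where does the bottleneck sit for a 10 GB file copy?
file_bytes = 10e9
gbe_Bps = 1e9 / 8        # 1 Gbit/s wire speed -> ~125 MB/s payload at best
ssd_Bps = 260e6          # typical SATA SSD from the linked bench

print(f"1 GbE limited: {file_bytes / gbe_Bps:.0f} s")   # 80 s
print(f"SSD limited:   {file_bytes / ssd_Bps:.0f} s")   # 38 s
```

So a mid-range SSD already roughly halves the copy time once the network stops being the limit, which is the "2.6Gbit is better than 1Gbit" point in practice.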