
Thread: why do they always do this? (server rooms)

  1. #1
    Senior Member
    Join Date
    Sep 2005
    Posts
    587
    Thanks
    7
    Thanked
    7 times in 7 posts

    why do they always do this? (server rooms)

Hey, I'm just curious about this. I'm just going to post a couple of pics found on Google Images to illustrate what I'm talking about.

    The question: Why do they always have racks of servers, and then a separate rack of switches, and then connect them all together with a huge bundle of cables going across the whole room?

    For example: Racks of servers


    Rack of switches


    I would think it might be much less cabling if there was a switch at the top of each rack, which handled all the computers in that rack. Then each of those switches would connect to a central switch to tie the racks together.

    My awesome MS Paint skills at work:
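And some rough back-of-the-envelope maths to go with it (made-up numbers, just to illustrate the idea: 10 racks of 40 servers, one patch cable per server):

[code]
# Back-of-the-envelope cable count: home-run cabling vs a switch per rack.
# All numbers below are made up purely for illustration.

racks = 10            # racks of servers
servers_per_rack = 40
ports_per_server = 1  # one patch cable per server

# Option A: every server cabled across the room to a central switch rack
home_run_cables = racks * servers_per_rack * ports_per_server

# Option B: a switch at the top of each rack, with a couple of uplinks
# from each rack switch back to a central switch
uplinks_per_rack = 2
in_rack_cables = racks * servers_per_rack * ports_per_server  # stay inside the rack
cross_room_cables = racks * uplinks_per_rack                  # only the uplinks leave

print(f"Home-run: {home_run_cables} cables crossing the room")
print(f"Switch per rack: {cross_room_cables} cables crossing the room "
      f"(plus {in_rack_cables} short patches that never leave their rack)")
# Home-run: 400 cables crossing the room
# Switch per rack: 20 cables crossing the room (plus 400 short in-rack patches)
[/code]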

  2. #2
    You're god damn right Barry's Avatar
    Join Date
    Jul 2003
    Posts
    1,484
    Thanks
    70
    Thanked
    75 times in 59 posts
    • Barry's system
      • Motherboard:
      • Gigabyte Z270M-D3H
      • CPU:
      • Intel i7 7700
      • Memory:
      • 16GB (2x8GB) Avexir 2400
      • Storage:
      • Samsung 860 256GB SSD, Sandisk Ultra 3D 500GB, LG BR Writer
      • Graphics card(s):
      • Evga GeForce GTX Titan X 12GB
      • PSU:
      • Corsair RM750I
      • Case:
      • Fractal Design Focus G
      • Operating System:
      • Windows 10 Professional
      • Monitor(s):
      • 28" Acer UHD 4K2K
      • Internet:
      • Sky Fibre

    Re: why do they always do this? (server rooms)

One reason is to keep others away from it all; data centres have more than just staff roaming around in them.
Someone left a note on a piece of cake in the fridge that said, "Do not eat!". I ate the cake and left a note saying, "Yuck, who the hell eats paper?"

  3. #3
    Senior Member Blastuk's Avatar
    Join Date
    Nov 2008
    Location
    Newcastle
    Posts
    984
    Thanks
    93
    Thanked
    66 times in 64 posts
    • Blastuk's system
      • Motherboard:
      • Gigabyte Z77X-D3H
      • CPU:
      • Intel Core i5 3570
      • Memory:
      • Corsair Vengeance LP 4x4GB @ 1600mhz
      • Storage:
      • Samsung 840 Pro 250GB, Samsung 850 EVO 500GB
      • Graphics card(s):
      • GeForce GTX 970
      • PSU:
      • OCZ ZS 650W
      • Case:
      • Antec Eleven Hundred
      • Operating System:
      • Windows 7 64bit
      • Monitor(s):
      • Dell 2209WA 22" + Dell U2412M 24"
      • Internet:
      • Virgin 152Mb

    Re: why do they always do this? (server rooms)

lol, that's done in Paint?
I don't believe it

    and you seem to have found one of the tidier pics of a rack of switches

  4. #4
    Jay
    Gentlemen.. we're history Jay's Avatar
    Join Date
    Aug 2006
    Location
    Jita
    Posts
    8,365
    Thanks
    304
    Thanked
    568 times in 409 posts

    Re: why do they always do this? (server rooms)

Your method has too many single points of failure. The reason you split your patches/switches and servers comes down to a few things: routing patch panels around servers is a nightmare; access to switches and servers can be kept separate; and cooling and positioning of racks is not random, as the room is planned with airflow and ease of access in mind. It all depends on space really. I have (had) a few nodes with 42U racks mixing servers at the bottom and switches at the top. It's not the best way to do it, but due to space issues I had no other option. I tend to use different racks for different purposes as well, e.g. I try to use the HP/Dell/APC racks for servers and, due to their size, Prism racks for patch panels/switches.
    □ΞVΞ□

  5. #5
    NOT Banned
    Join Date
    Jan 2007
    Posts
    5,905
    Thanks
    410
    Thanked
    276 times in 252 posts

    Re: why do they always do this? (server rooms)

    Quote Originally Posted by Blastuk View Post
lol, that's done in Paint?
I don't believe it
    It's a bloody good paint job if it's true

  6. #6
    Administrator Moby-Dick's Avatar
    Join Date
    Jul 2003
    Location
    There's no place like ::1 (IPv6 version)
    Posts
    10,665
    Thanks
    53
    Thanked
    384 times in 313 posts

    Re: why do they always do this? (server rooms)

We have edge switches in each row, with fibre going back to the core switches, so there's much less cabling. All servers have teamed NICs, so a cable is patched from each port to each edge switch.
But that's for our given setup. Jay also makes valid points regarding power and cooling: spreading the load of both around the datacentre is important. Structured networking makes it a lot easier to move things.
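A tiny, purely illustrative sketch of why the dual patching matters (hypothetical server and switch names, not our actual kit):

[code]
# Toy model: each server has two teamed NICs, one patched to each edge switch
# in its row. Names are hypothetical, purely for illustration.

servers = {
    "web01": {"edge-sw-a", "edge-sw-b"},  # edge switches this server is patched to
    "web02": {"edge-sw-a", "edge-sw-b"},
    "db01":  {"edge-sw-a", "edge-sw-b"},
}

def isolated_servers(failed_switch: str) -> list[str]:
    """Servers that lose all uplinks if the given edge switch dies."""
    return [name for name, uplinks in servers.items()
            if not (uplinks - {failed_switch})]

print(isolated_servers("edge-sw-a"))  # [] -> a single edge switch failure cuts nobody off
[/code]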
    my Virtualisation Blog http://jfvi.co.uk Virtualisation Podcast http://vsoup.net

  7. #7
    Registered+
    Join Date
    Feb 2009
    Posts
    26
    Thanks
    0
    Thanked
    1 time in 1 post

    Re: why do they always do this? (server rooms)

    Quote Originally Posted by moogle View Post
    It's a bloody good paint job if it's true
    Clearly someone with too much time on their hands!

  8. #8
    Senior Member
    Join Date
    Sep 2005
    Posts
    587
    Thanks
    7
    Thanked
    7 times in 7 posts

    Re: why do they always do this? (server rooms)

OMG, it's not that hard to make, it's just lines and boxes. I made one and copy-pasted it. One time I drew a picture of a backhoe in Paint on Windows 98. It was really good, but I can't find it any more.

  9. #9
    Senior Member
    Join Date
    Sep 2005
    Posts
    587
    Thanks
    7
    Thanked
    7 times in 7 posts

    Re: why do they always do this? (server rooms)

    Update! I found pictorial evidence that some people do it just like my MS Paint drawing (a switch per rack)




  10. #10
    Senior Member
    Join Date
    Jul 2003
    Location
    Holsworthy, Devon
    Posts
    513
    Thanks
    9
    Thanked
    11 times in 11 posts
    • Ben Rogers's system
      • Motherboard:
      • Asus P8P67 B3
      • CPU:
      • Intel core i5 2500k @ 4400MHz
      • Memory:
      • 12GB DDR3 (8GB Corsair Vengeance 1600MHz)
      • Storage:
      • 60GB OCZ Agility 3 SSD (boot) + 1TB Samsung F3 + 500GB Samsung F1 SATA II
      • Graphics card(s):
      • MSI HD7870 2GB
      • PSU:
      • 650W Coolermaster VX
      • Case:
      • Coolermaster Centurion 5 II
      • Operating System:
      • Windows 7 64 bit SP1
      • Monitor(s):
• 19" Samsung SyncMaster
      • Internet:
      • 23Mbit / 1.1 Mbit ADSL2

    Re: why do they always do this? (server rooms)

I spent 5 hours removing rackmount servers at the CSC data centre in Kingswood, Bristol, and the cabling was nowhere near as neat as that, to say the least. Apparently they host databases etc. for the NHS, or did 2 years ago. They wouldn't tell me what Internet connection they had lol
    E6850@ 3700MHz / 6GB DDR2 / 500GB SATAII / nVidia 7800 GTX / Lian Li Plus7B

  11. #11
    Member
    Join Date
    Jul 2003
    Posts
    163
    Thanks
    4
    Thanked
    1 time in 1 post
    • Chan's system
      • Motherboard:
      • Asus M4A88TD-M EVO
      • CPU:
      • AMD Phenom II X4 955 BlackEdition
      • Memory:
      • 4GB Corsair XMS3, DDR3
      • Graphics card(s):
      • 1GB GTX460
      • PSU:
      • 520W Corsair HX Series Modular
      • Case:
      • Antec 300
      • Operating System:
      • Win 7 Pro
      • Monitor(s):
      • 2x Samsung SyncMaster P2450H 1920x1080

    Re: why do they always do this? (server rooms)

    Rack of switches
MAINTAIN PAIR TWIST, those panels are so boring to terminate. Systimax don't even recommend tie-wrapping bunches perfectly like that, as you can get crosstalk! At least it looks good, I guess.

  12. #12
    Senior Member
    Join Date
    Jul 2003
    Location
    Holsworthy, Devon
    Posts
    513
    Thanks
    9
    Thanked
    11 times in 11 posts
    • Ben Rogers's system
      • Motherboard:
      • Asus P8P67 B3
      • CPU:
      • Intel core i5 2500k @ 4400MHz
      • Memory:
      • 12GB DDR3 (8GB Corsair Vengeance 1600MHz)
      • Storage:
      • 60GB OCZ Agility 3 SSD (boot) + 1TB Samsung F3 + 500GB Samsung F1 SATA II
      • Graphics card(s):
      • MSI HD7870 2GB
      • PSU:
      • 650W Coolermaster VX
      • Case:
      • Coolermaster Centurion 5 II
      • Operating System:
      • Windows 7 64 bit SP1
      • Monitor(s):
• 19" Samsung SyncMaster
      • Internet:
      • 23Mbit / 1.1 Mbit ADSL2

    Re: why do they always do this? (server rooms)

I don't think that kind of neatness with cables is even required in such an environment tbh.
    E6850@ 3700MHz / 6GB DDR2 / 500GB SATAII / nVidia 7800 GTX / Lian Li Plus7B

  13. #13
    Senior Member
    Join Date
    Sep 2006
    Location
    UK
    Posts
    1,011
    Thanks
    17
    Thanked
    14 times in 13 posts
    • Craig321's system
      • Motherboard:
      • Asus P8P67 Pro
      • CPU:
      • i7 2600k
      • Memory:
      • 4x 4GB Corsair XMS3 1600MHz
      • Storage:
      • 120GB OCZ Vertex 3
      • Graphics card(s):
      • Asus GTX480 1536MB
      • PSU:
      • 650W Corsair HX
      • Case:
      • Fractal Design Define R3
      • Operating System:
      • Windows 7 Professional 64-bit
      • Monitor(s):
      • Dell U2410

    Re: why do they always do this? (server rooms)

    Quote Originally Posted by Chan View Post
MAINTAIN PAIR TWIST, those panels are so boring to terminate. Systimax don't even recommend tie-wrapping bunches perfectly like that, as you can get crosstalk! At least it looks good, I guess.
    Would have thought they'd use STP cables in such a big setup?

    Quote Originally Posted by latrosicarius View Post
Hey, I'm just curious about this. I'm just going to post a couple of pics found on Google Images to illustrate what I'm talking about.

    The question: Why do they always have racks of servers, and then a separate rack of switches, and then connect them all together with a huge bundle of cables going across the whole room?

    For example: Racks of servers
[img]http://zhost.tk/up/b0a341d0d3e6ef0afc225dbe339e8482.jpg[/img]

    Rack of switches
[img]http://zhost.tk/up/dbd925b18186cbd6a2fbf4a7e0d6720f.jpg[/img]

    I would think it might be much less cabling if there was a switch at the top of each rack, which handled all the computers in that rack. Then each of those switches would connect to a central switch to tie the racks together.

    My awesome MS Paint skills at work:
[img]http://zhost.tk/up/acb47e348d6b707851de5a9fa1bf5980.PNG[/img]
    You have way too much time at work
