This is a bit of a rant, so adjust your filters accordingly.

I'm currently doing some work in a not-really-production datacenter (unless you ask the developers) that has a variety of systems. Some of the systems I'm dealing with are the 4-servers-in-2U variety. It's a neat idea, but great care needs to be taken to avoid problems. One of the big issues is cable density: the systems I'm managing have a BMC connection, 2x 1Gb connections, 2x 10Gb connections, and a KVM dongle. That's 7 things plugged into the back of each server (the KVM is VGA + USB). Multiply by 4 and that's 28 cables per 2U, plus 2x power cables. That's a lot of cables, and they aren't running in neat rows like a 48 port switch.
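Just to make the arithmetic concrete, here's a quick back-of-the-envelope tally. The per-node connections are the ones I listed above; the 20-chassis "full rack" at the end is purely hypothetical, for reasons I'll get to below:

# Back-of-the-envelope cable count for a 4-node-in-2U chassis.
# Per-node connections are from the description above; the 20-chassis
# "full rack" figure is hypothetical, for illustration only.

PER_NODE = {
    "BMC": 1,
    "1GbE": 2,
    "10GbE": 2,
    "KVM (VGA + USB)": 2,
}
NODES_PER_CHASSIS = 4
POWER_PER_CHASSIS = 2

per_node = sum(PER_NODE.values())                      # 7 cables per node
per_chassis = per_node * NODES_PER_CHASSIS             # 28 data cables
total_per_chassis = per_chassis + POWER_PER_CHASSIS    # 30 cables in 2U

print(f"{per_node} cables per node, {per_chassis} data + "
      f"{POWER_PER_CHASSIS} power = {total_per_chassis} cables per 2U")

# Hypothetical: 20 of these chassis filling 40U of a rack
chassis_per_rack = 20
print(f"{total_per_chassis * chassis_per_rack} cables in one rack")  # 600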

Adding to the problem is the fact that the disks plug into the front of the system and the server electronics plug in from the back, right through that rat's nest of cabling. It's a challenge. If you consider yourself OK at cabling, you don't have anywhere near the skills to do cabling at this density. Typical cabling standards are not adequate for these kinds of setups. Mediocre cabling also really blocks the airflow; the systems I'm dealing with are nice and toasty at the back of the cabinet.

Another option I'm dealing with is blade enclosures. They manage to get 16 servers into 10U of rack space and (at least here) they have switches built in. This means I only have 7 network cables running to a top-of-rack switch/patch panel. So much easier to deal with. The blades are accessible via the front of the rack, which is also much easier. The enclosures have built-in management, which again makes things easier. A downside is that certain failures require taking down the entire enclosure to fix, so you lose 16 servers instead of the 4 in the other kind of high density server. I have never been a big fan of blade enclosures, but I'm starting to come around.
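A rough side-by-side of the two options, using only the numbers from this message (external cables per enclosure only, power feeds excluded, and not vendor specs):

# Rough comparison of the two high-density options described above.
# Cable counts are external cables per enclosure only (power excluded);
# "failure domain" is how many servers a whole-enclosure failure takes out.

options = {
    # 4 nodes in 2U: 28 per-node cables out the back of each chassis
    "2U 4-node": {"servers": 4, "rack_units": 2, "cables": 28, "failure_domain": 4},
    # Blade enclosure: 16 blades in 10U, built-in switches, 7 uplinks
    "10U blade": {"servers": 16, "rack_units": 10, "cables": 7, "failure_domain": 16},
}

for name, o in options.items():
    print(f"{name}: {o['servers'] / o['rack_units']:.1f} servers/U, "
          f"{o['cables'] / o['servers']:.1f} external cables per server, "
          f"lose {o['failure_domain']} servers if the enclosure dies")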

Of course, one issue that too few people think about until it is too late is power density and cooling capacity. Being able to put 4 servers in 2U sounds really nifty until you discover you can only power and cool half a rack of them.
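To put some entirely made-up but ballpark-plausible numbers on that (neither the per-chassis draw nor the rack budget here comes from real specs):

# Hypothetical numbers, just to illustrate the power/cooling math.
# Neither the per-chassis draw nor the rack budget is a real spec.

CHASSIS_WATTS = 1500       # assumed draw of one loaded 2U 4-node chassis
RACK_BUDGET_WATTS = 15000  # assumed power/cooling budget for the rack
USABLE_U = 40              # rack units available for servers

max_chassis = RACK_BUDGET_WATTS // CHASSIS_WATTS        # 10 chassis
used_u = max_chassis * 2                                # 20U of servers
print(f"Budget supports {max_chassis} chassis = {used_u}U "
      f"of a {USABLE_U}U rack ({used_u / USABLE_U:.0%} full)")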

This concludes my rant for today.  Maybe.

-- Matt
It's not what I know that counts.
It's what I can remember in time to use.
