Home labs – a scalable vSphere whitebox

Having recently upgraded my home lab’s storage I decided it was also time to upgrade my aging hosts, which date back to 2007. They’ve done well to survive and still be useful(ish) five years later, but they’re maxed out at 8GB RAM and it’s becoming increasingly difficult to do anything with that. I briefly considered adding SSDs as host cache, but that doesn’t address some of their other shortcomings, such as no support for Fault Tolerance, VMDirectPath, or any kind of remote KVM functionality.

A quick look around the blogosphere revealed a few common options:

More power!

The problem for me was that these solutions all maxed out at 16 or 32GB RAM per host, a limitation of the single-socket Xeon architecture. That’s a lot of memory for a home lab server today, but to ensure that this server can last five years I really wanted more scalability. I wasn’t too fussed about noise as I use my cellar for my lab, and power consumption was a secondary concern. The server features of the Supermicro boards appeal to me (and many Supermicro motherboards are compatible with vSphere), so I browsed their range looking for the one that best met my requirements. My final parts list ended up as:

Must…have…more…POWER…the vHydra!

The total cost comes to around £1150. I’m branding mine the vHydra after the mythical multi-headed dragon!
Note: In the US this is significantly cheaper, coming in at $1450, or about £900.

For the money I get a powerful server that can replace all three of my current 8GB hosts and more than match their performance while consuming less power and space, plus compatibility with all VMware features including FT (and full EPT for nested ESXi). With two onboard storage controllers, support for SAS drives, and VMDirectPath, I can run a Nexenta VM to supplement my shared storage, plus I gain remote KVM and the option of future memory upgrades (up to a max of 256GB RAM!). If you want to save some cash you can always buy a single CPU (and fewer DIMMs) and add the second later if your requirements increase.
NOTE: FT is supported on the Xeon E5-26xx CPUs, but at the time of writing VMware’s SiteSurvey tool hasn’t been updated, so it will erroneously report that it’s not supported. Given that the web client is the way forward, it’s quite possible SiteSurvey won’t be updated in its current VI client format.
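On the nested ESXi point, the guest needs hardware virtualisation assist exposed to it before a nested hypervisor will run 64-bit VMs. As a minimal sketch of the settings I believe apply at the time of writing (the mechanism changed between ESXi 5.0 and 5.1, so check the documentation for your version):

    # ESXi 5.0: enable nested hardware virtualisation host-wide by
    # appending this line to /etc/vmware/config on the physical host
    vhv.allow = "TRUE"

    # ESXi 5.1: the host-wide flag is replaced by a per-VM option;
    # add this to the nested ESXi VM's .vmx file (hardware version 9)
    vhv.enable = "TRUE"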

It might sound crazy expensive, but it’s less than a MacBook Pro with an anti-glare screen, which makes it sound more reasonable (to me at least). If it helps my career over four years it seems like a good investment. Compared to a single Baby Dragon II it’s more expensive and larger (ATX vs micro-ATX), but if you’re considering two Baby Dragons it’s a closer call – it costs slightly less and has less compute (no hyperthreading and a lower per-core clock), but more memory scalability (although you’ll need RDIMMs), SAS support, and a slightly smaller footprint. With two Baby Dragons, however, you have more flexibility as you have two physical instances of ESXi – no nesting, and a ‘proper’ cluster is quite compelling. Two servers are also better for experimenting with other technologies (hypervisors which may not work nested, etc.), but I went with my ‘vHydra’ because I had a case, power supply, and some RDIMMs which I could reuse (and which the Baby Dragon wouldn’t accept) – your circumstances may vary.

Tip: It’s worth bearing in mind the 32GB limit on the free version of ESXi – unless you’re a vExpert or they reinstate the VMTN subscription you’ll be stuck with 60-day eval editions (or buying a licence!) if you go above 32GB. For a full discussion of scale up vs scale out I’d recommend the VCAP-DCD Design Workshop. Having considered scale up vs out for my lab I think I’ve got enough thoughts for a future blogpost… 🙂

                 Baby Dragon II               The vHydra
Compute          4 cores @ 3.2GHz             8 cores @ 1.8GHz
                 (plus hyperthreading)        (no hyperthreading)
Memory           16GB (max 32GB)              64GB (max 256GB using RDIMMs,
                                              max 64GB using UDIMMs)
Storage          2x SATA3, 4x SATA2           2x SATA3, 4x SATA2
Network          One 82574L, one 82579LM      Two 82574L
                 (custom driver)
Motherboard      Micro-ATX, 9.6" x 9.6"       ATX, 12" x 10"
Case size        279mm x 262mm x 373mm        210mm x 381mm x 490mm
(W x D x H)
Noise
Power            80W                          190W (see below for update)
Cost             £635                         £1150

Motherboard and CPU choices

The X9DRL-3F motherboard is relatively new, having launched in March this year. Being dual socket it requires Xeon E5-26xx series processors (tech details here, reviewed here, with some performance figures via El Reg), which are also new, having launched in May this year. The Xeon E5 CPU range, like the Xeon 5xxx range before it, is designed for high-end servers and workstations and, crucially for me, supports much higher memory density:

  • The cheapest is the E5-2603 (£157 at Lambdatek) with a clock speed of 1.8GHz and four cores, although no hyperthreading. The max memory speed is 1066MHz, whereas the motherboard will support 1333MHz.
  • Next up is the E5-2609 (£230 at Lambdatek) with a clock speed of 2.4GHz and four cores, again with no hyperthreading and a max memory speed of 1066MHz.
  • Another option is the E5-2620 (£305 at Lambdatek) with a clock speed of 2.0GHz and six cores plus hyperthreading, giving 12 threads for VMs, plus it supports 1333MHz memory.

The Xeon E5-26xx range is comprehensive, so if you have the funds you can easily scale up to more compute power.
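For a feel of the price/performance trade-off, here’s a quick back-of-the-envelope comparison in Python using the Lambdatek prices above. It weights every thread equally, which real workloads won’t, so treat it as a rough sketch rather than a benchmark:

    # Crude price/performance comparison of the Xeon E5-26xx options above.
    # "GHz-threads" multiplies thread count by clock speed - only a very
    # rough proxy for aggregate throughput.
    cpus = [
        # (model, price in GBP, threads, clock in GHz)
        ("E5-2603", 157, 4, 1.8),
        ("E5-2609", 230, 4, 2.4),
        ("E5-2620", 305, 12, 2.0),  # six cores plus hyperthreading
    ]

    for model, price, threads, ghz in cpus:
        ghz_threads = threads * ghz
        print(f"{model}: {ghz_threads:4.1f} GHz-threads, "
              f"£{price / ghz_threads:5.2f} per GHz-thread")

On that crude measure the E5-2620’s hyperthreading makes it nearly twice the value per GHz-thread of the cheaper chips, worth knowing if your budget stretches to it.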

Tip: You’ll need to order a heatsink separately as one doesn’t come with the CPU, even if you buy the retail version. Details of the Intel heatsinks are in the above review link, and the Supermicro ones can be found in this PDF (I went with the SNK-P0048AP4, which is fine for a standard ATX case). Be careful if you buy a different motherboard/CPU combination, as some Supermicro boards use an uncommon ‘narrow ILM’ socket fixing; this board uses the more common square ILM. The two CPU sockets are quite close together, so some aftermarket heatsinks might not fit.

The motherboard has eight DIMM slots, but you need both CPUs installed to use all eight – with a single CPU the most you can use is four. It’s flexible in that it’ll accept both buffered/registered and unbuffered RAM, although you’ll need registered memory if you want to go above 64GB.

Tip: In the UK most resellers don’t hold stock of the Supermicro motherboards and there’s a two-week wait while they’re ordered from the US. Frustrating!

Power supply choices

Being a dual-socket board, it needs a bit more power than a desktop. You want a PSU that’s compliant with the ATX 2.02 and SSI standards (often referred to as EPS12V), which means it’ll have the usual 24-pin connector and at least one 8-pin (or 4+4-pin) CPU connector. This motherboard needs a PSU with two 8-pin power connectors, which is less common – I had two EPS12V-certified PSUs and neither had enough. I went for the 600W Silverstone Strider Plus as it’s their lowest-spec PSU offering dual EPS12V connectors and it gets good reviews. If you go for a different PSU make sure it’s not too deep – one of mine was 180mm (vs 160mm for the Silverstone) and it wouldn’t fit the Lian-Li case without hitting the first CPU’s RAM modules. I used Thermaltake’s PSU calculator to size my PSU correctly.
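If you’d rather sanity-check a calculator’s answer, a simple component budget gets you in the right ballpark. A minimal sketch follows; the E5-2603’s 80W TDP is Intel’s published figure, but the other wattages are my rough assumptions rather than measurements:

    # Back-of-the-envelope PSU sizing, in the spirit of the Thermaltake
    # calculator. Component wattages are rough assumptions, not measured.
    components = {
        "2x Xeon E5-2603 (80W TDP each)": 2 * 80,
        "motherboard, chipset and IPMI":  50,
        "8x RDIMM (~4W each)":            8 * 4,
        "4x 3.5in disks (~10W each)":     4 * 10,
        "case and CPU fans":              20,
    }

    peak_watts = sum(components.values())
    headroom = 1.4  # ~40% margin keeps the PSU in its efficient band
    print(f"Estimated peak draw: {peak_watts}W")   # ~300W
    print(f"Suggested PSU size:  {peak_watts * headroom:.0f}W")

That lands well within the 600W Strider Plus, leaving room for extra drives or a beefier pair of CPUs later.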

UPDATE – 24th Oct 2012 – if you get voltage warnings on the second CPU socket try updating your IPMI firmware.

UPDATE – 26th Nov 2012 – I’ve been monitoring power usage and my vHydra idles (with a few VMs that do nearly nothing) at around 85W. With electricity prices around 12.8p/kWh that’s approx 25p/day, or £1.75/wk, or £91/yr for this one server so factor that into your lab costs.
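If you want to plug in your own wattage and tariff, the arithmetic is easy to reproduce (my figures above are rounded slightly):

    # Idle running cost: 85W at 12.8p/kWh.
    watts = 85
    pence_per_kwh = 12.8

    kwh_per_day = watts / 1000 * 24              # ~2.04 kWh/day
    pence_per_day = kwh_per_day * pence_per_kwh  # ~26p/day
    print(f"{pence_per_day:.0f}p/day, "
          f"£{pence_per_day * 7 / 100:.2f}/week, "
          f"£{pence_per_day * 365 / 100:.0f}/year")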

Further Reading

Building a nested lab (YellowBricks)

vTardis lightning talk at VMworld Barcelona (Simon Gallagher)

VMware vCloud “In A Box” for Your Home Lab (Chris Colotti)

Whitebox clouds – building a VMware vCloud lab part one

A whitebox is for sissies! (Eric Sloof)

16 thoughts on “Home labs – a scalable vSphere whitebox”

  1. Hi Edward,
    It’s a great home lab. I’d like to have a home lab like yours, but it’s a bit expensive for my needs, so my home lab is an HP dm4 laptop with an Intel Core i3 and 12GB RAM.
    I’ve installed the StarWind iSCSI emulator and I’m using VMware Workstation 9, with two virtual ESXi hosts created in Workstation.
    It all runs fine, so I’m happy.

    Regards!

  2. Hey Ed, it’s been a while. How is this build treating you these days? This article has me thinking a bit differently about how I would build out my next lab. I’m now thinking about a single box loaded with storage inside and carved out virtually.

    Is this still the only host you have? Are you running nested ESXi?

    1. Hey Sean, good to hear from you. Unfortunately I’ve been out of the office on paternity leave for the bulk of the year so haven’t had time for any work, even in my lab at home. This means my vHydra has been sitting powered off since mid-January – a huge waste of potential! I’m hoping to power it back up soon and get stuck back into learning (vCloud.next and Puppet are high on my list), but kids sure do soak up spare time. I was indeed running nested ESXi previously though.

  3. Hey Ed, congrats on the new addition to your family. I hear you there, kids will definitely take up your time. Thanks for the info on the build. I am very interested in going the all-in-one route and just carving it all out virtually.

  4. Hi Folks

    Thanks Ed for a great article. I have a couple of HP DC7800 hosts running ESXi 5.0. These have 8GB RAM each, and I shut one down a lot because of the power consumption!

    A single unit capable of running a couple of ESXi and Hyper-V hosts looks very appealing.

    Have your children allowed you any time back on it? Do you have any updates or further comments a year on?

    Regards, Nick

  5. Sadly no updates Nick. I did power it on last weekend as I needed to get some data off an old VM, but I’ve not had time to invest in setting up new stuff. I’m itching to do so and there’s plenty to play with (VSAN, vFRC, Puppet integration, PernixData, the list goes on and on) but I need to quit my job to have enough time!

  6. Any updates? I’m planning on building a similar lab box, with a focus on 64GB RAM and keeping the noise down. Any suggestions?

  7. Hi Ed.

    I have almost the same configuration, but I’m struggling to get the second processor to work. How did you get the second processor working? Is there anything “special” to do? Each of my two Intel processors works fine on its own, but they don’t work together.

    1. I didn’t have to do anything special – the BIOS should just recognise it automatically. Maybe just try reseating the CPU, though from memory the socket mechanism on that motherboard is pretty solid. Good luck.
