Building a Home Render Farm without Breaking the Bank
Cost Considerations

One of my main considerations was cost. I am starting my own home studio, and the cost of software and the other expenses of starting your own business grow every day. An Entertainment Creation Suite license with Maya from Autodesk costs $5,775. Add in the annual subscription and the total comes to over $6,000. With this in mind, I wanted to keep the cost of this machine under $3,500.

I could have scrambled around and picked up used pieces here and there, and in one case I did, but everything else in this machine is new. There are other articles out there about scrounging up used computers and parts to build a render farm; this blog is about a new machine built to handle my rendering needs.

When reviewing processors, we decided to go with AMD. I have built my last few computers using AMD, and I really love the cost-to-reliability factor. We went with dual AMD Opteron 6348 2.8GHz, 115 watt, 12-core server processors. Twenty-four cores in one machine is just amazing. To do this, we needed a server motherboard, which is a big expense by itself, but one offset by the relatively low cost of the processors.
Hardware Selection

Before we go any further, I have to thank my friend, John Vielee. He really knows his way around a motherboard, and without his help this would have taken a lot longer!
Here’s a list of the hardware we chose to build this beast:
- SUPERMICRO MBD-H8DG6-F-O E-ATX Form Server Motherboard
- 2X AMD Opteron 6348 Abu Dhabi 2.8GHz 12MB L2 Cache 16MB L3 Cache Socket G34 115W 12-Core Server Processor
- 8X Corsair Vengeance 8GB (2x4GB) DDR3 1600 MHz RAM
- Corsair RM Series 850 Watt ATX/EPS 80PLUS Gold-Certified Power Supply
- 4X WD Red 3 TB NAS Hard Drive: 3.5 Inch, SATA III, 64 MB Cache
- Cooler Master HAF 932 Advanced Blue Edition, Full Tower Computer Case
- 2X Noctua NH-U9DO A3 AMD Opteron, 4 Dual Heat-pipe SSO Bearing Quiet CPU Cooler
- iStarUSA BPU-340SATA-BPL 3×5.25″ to 4×3.5″ SAS/SATA 6.0 Gb/s Hot-Swap Cage
- SAMSUNG 840 Pro Series MZ-7PD256BW 2.5″ 256GB SATA III MLC Internal Solid State Drive (SSD)
- ASUS DRW-24IBST Internal DVD R/RW Burner
- GeForce GTX 590 video card (used)
All of the new hardware was purchased from Amazon, Newegg and Microcenter. Shop around for the best prices, as they sometimes change from week to week. I decided to put a video card in this machine so that if my main machine were to crash and burn for some reason, I would have a backup ready to go. Keep the clients happy, and no excuses.
I decided to use the GeForce GTX 590 for one reason: It’s just reliable! A gaming card is definitely the way to go these days for a CG machine and, for our purposes, it does a great job handling Maya. You can scrub dynamics no problem, run Viewport 2.0 with DirectX 11, anything. I’ve never had an issue using a GeForce GTX card.
Putting It All Together
We unpacked everything, put the motherboard on a nice piece of foam from its box to protect it, and started installing everything onto the motherboard. Here we have the AMD processors in place, with the RAM installed. We chose low-profile RAM so that the Noctua heat sinks would fit over it. Make sure to put the RAM in before the heat sinks.
You can get the RAM in and out after, but it’s a tight fit. We finished this part off by putting the motherboard in the Cooler Master case. The HAF 932 is a huge case, and a bit heavy. But this case will give you a lot of room to work with and when you are using a server motherboard, trust me, you need it.
Next, we put in the Corsair power supply. Pretty straightforward. The great thing about this power supply is that it's modular, and the flat cable design really helps once all the cables are in. Another consideration here was the cost to run this machine; an 850 watt power supply isn't going to kill your electricity bill. After that, we installed the GeForce GTX 590 video card and then the Samsung 840 Pro SSD.
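If you want to put a rough number on the running cost, the math is simple: watts times hours gives kilowatt-hours, times your utility rate. Here's a quick sketch (the function name and the $0.12/kWh rate are my own assumptions, and it pessimistically assumes the machine draws the PSU's full rated wattage, which real hardware won't):

```python
def monthly_cost_usd(watts, hours_per_day, usd_per_kwh=0.12):
    """Rough monthly electricity estimate. Assumes constant draw at
    the given wattage, which overstates real-world consumption."""
    kwh = watts / 1000 * hours_per_day * 30  # kWh used in ~30 days
    return kwh * usd_per_kwh

# Worst case: the full 850 W, rendering 8 hours a day
print(round(monthly_cost_usd(850, 8), 2))  # -> 24.48
```

Even at full tilt for a full workday, that's a couple dozen dollars a month, not a budget-breaker.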
Next, we put the four 3TB Western Digital Red drives into the iStar hot-swap cage in the front panel of the case. I felt it was important to have easy access to the RAID drives, and this option worked out great. We set up RAID 10 on this machine, so our data is striped across two mirrored pairs of drives.
This gives us 6TB of usable storage, with every byte mirrored on a second drive. And by the way, four drives is the minimum you can have in this kind of setup. This left us with a drive bay available for the DVD burner. All we have to do now is have pizza; it's lunch time!
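The capacity math above is easy to sanity-check: RAID 10 mirrors pairs of drives and stripes across the pairs, so you always get half the raw capacity. A small sketch (the function name is mine, not from any RAID tool):

```python
def raid10_usable_tb(drives, drive_tb):
    """Usable capacity of a RAID 10 array: half the raw total,
    since every drive has a mirror. Needs an even count, minimum 4."""
    if drives < 4 or drives % 2:
        raise ValueError("RAID 10 needs an even number of drives, minimum 4")
    return drives * drive_tb / 2

print(raid10_usable_tb(4, 3))  # 4 x 3 TB drives -> 6.0 TB usable
```

Adding another mirrored pair of 3TB drives would bump the array to 9TB usable, if the cage had room.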
Operating System and Software/Render Management
Originally we were going to use Windows Server 2012 R2, but once I decided this machine might also need to serve as a backup workstation, I went with Microsoft Windows 7 Ultimate, which supports up to 192GB of RAM and two physical CPUs. There are also other aspects of Ultimate that borrow server features, which made it a good choice for us. Cost is definitely a factor here as well. You may decide to go in a different direction; whatever works for you. Not only is this machine for rendering, it could also act as a server if I ever have artists working for me.
With this in mind, you'll need other software to manage all of this. I installed Perforce for file management, which keeps versions of your files on the network. It's a great tool if you have artists working with you: only one person works on a file at a time, and as long as files are checked out properly, everyone knows who is working on what.
Hansoft is also a good choice for shot assignments. I've used it at a few studios, and it's great for tracking shot assignments and for commenting as each artist works on their particular shots.
Finally, we benchmarked the machine with Cinebench, testing both CPU and OpenGL performance. As you can see from the screenshots below, we killed it! We're in first place in both categories. Run the test on your own machines and see how they compare to this one. I bet you'll be impressed.
Putting this system together was a lot of fun. We came in under budget at about $3,200, and I wound up spending the extra money on more RAM. In the near future, I'll be building another machine similar to this one, but without the RAID array and video card.
Thanks for reading, and I hope this inspired you to build your own render farm!