ASRock Rack C2750D4I and U-NAS NSC-800: A DIY File Server
by Ganesh T S on August 10, 2015 8:45 AM EST - Posted in
- NAS
- storage server
- Avoton
- ASRock Rack
- U-NAS
Introduction and Testing Methodology
Small businesses and power users in a home setting have begun to face challenges in managing large amounts of data. This data is generated either as part of day-to-day business operations or by backing up multimedia files from phones, tablets, TV recordings and the like. One option is to use a dedicated COTS (commercial off-the-shelf) NAS from a vendor such as Synology or QNAP. Sometimes, it is also necessary to have a file server that is much more flexible with respect to the programs that can be run on it. This is where storage servers based on Microsoft's offerings, or units based on Linux distributions such as Red Hat and Ubuntu, come into play. These servers can either be bought as an appliance or assembled in a DIY fashion. Today, we will be looking at a system based on the latter approach.
A DIY approach involves selecting an appropriate motherboard and a chassis to place it in. Depending on the requirements and motherboard characteristics, one can opt for ECC or ordinary RAM. The platform choice and the number of drives dictate the PSU capacity. The file server being discussed today uses the ASRock C2750D4I mini-ITX motherboard in a U-NAS NSC-800 chassis. 8 GB of ECC DRAM and a 400 W PSU round out the barebones components. The table below lists the components of the system.
| ASRock C2750D4I + U-NAS NSC-800 | |
| --- | --- |
| Form Factor | 8-bay mini-tower / mITX motherboard |
| Platform | Intel Avoton C2750 |
| CPU Configuration | 8C/8T Silvermont x86 cores, 4 MB L2, 20 W TDP, 2.4 GHz (Turbo: 2.6 GHz) |
| SoC SATA Ports | 2x SATA III (for two hot-swap bays), 4x SATA II (one used for the OS drive) |
| Additional SATA Ports | Marvell SE9172 (2x, for two hot-swap bays), Marvell SE9230 (4x, for four hot-swap bays) |
| I/O Ports | 3x USB 2.0, 1x D-Sub, 2x RJ-45 GbE LAN, 1x RJ-45 IPMI LAN, 1x COM1 serial port |
| Expansion Slots | 1x PCIe 2.0 x8 (unused) |
| Memory | 2x 4 GB DDR3-1333 ECC UDIMM (Samsung M391B5273DH0-YH9) |
| Data Drives | 8x OCZ Vector 128 GB |
| Chassis Dimensions | 316 mm x 254 mm x 180 mm |
| Power Supply | 400 W internal PSU |
| Diskless Price (when built) | USD 845 |
Evaluation Methodology
A file server can be used for multiple purposes, unlike a dedicated NAS. Evaluating a file server with our standard NAS testing methodology wouldn't do justice to the eventual use-cases and would tell the reader only part of the story. Hence, we adopt a hybrid approach in which the evaluation is divided into two parts: one as a standalone computing system, and the other as a storage device on a network.
In order to get an idea of the performance of the file server as a standalone computing system, we boot up the unit with a USB key containing an Ubuntu-on-the-go installation. The drives in the bays are configured in an mdadm RAID-5 array. Selected benchmarks from the Phoronix Test Suite (i.e., those relevant to the usage of a system as a file server) are processed after ensuring that any test utilizing local storage (disk benchmarks, in particular) points to the mdadm RAID-5 array. Usage of the Phoronix Test Suite allows readers to compare the file server against multiple systems (even those that haven't been benchmarked by us).
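The array setup can be sketched along the following lines. This is a sketch rather than our exact procedure: the device names and mount point are assumptions (drive bays may enumerate differently, so verify with lsblk first), and the mdadm commands require root.

```shell
# Sketch of the RAID-5 setup used for the standalone tests. Device names and
# mount point are assumptions; check lsblk first and run as root.
if [ "$(id -u)" -eq 0 ] && [ -b /dev/sdb ]; then
    # Build one RAID-5 array from the eight bay drives.
    mdadm --create /dev/md0 --level=5 --raid-devices=8 /dev/sd[b-i]
    # Format and mount it so disk benchmarks can point at it.
    mkfs.ext4 /dev/md0
    mkdir -p /mnt/raid5
    mount /dev/md0 /mnt/raid5
fi

# RAID-5 reserves one drive's worth of capacity for parity, so the eight
# 128 GB SSDs yield (8 - 1) * 128 = 896 GB of usable space.
echo $(( (8 - 1) * 128 ))
```

With the array mounted, the Phoronix Test Suite's disk tests can simply be directed at the mounted path before being run.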
As a storage device on a network, there are multiple ways to determine the performance. One option would be to repeat all our NAS benchmarks on the system, but that would take too much time for a unit we are already testing as a standalone computer. It is also important to look beyond numbers from artificial benchmarks and see how a system performs in terms of business metrics. SPEC SFS 2014 comes to our aid here. The benchmark tool is best suited to the evaluation of SANs, but it also helps us gauge the effectiveness of the file server as a storage node on a network. SPEC SFS 2014 was developed by the IOzone team, and covers evaluation of the filer in specific application scenarios: the number of virtual machines that can be run off the filer, the number of simultaneous databases, the number of video streams that can be recorded simultaneously, and the number of simultaneous software builds that can be processed.
Our SPEC SFS 2014 setup consists of an SMB share on the file server under test, connected over an Ethernet network to our NAS evaluation testbed outlined below. Further details about the SPEC SFS 2014 workloads will be provided in the appropriate section.
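The SMB share itself takes only a few lines of Samba configuration. The snippet below is a minimal sketch, assuming Samba is installed on the Ubuntu system; the share name, path, and user are all hypothetical.

```ini
# /etc/samba/smb.conf -- hypothetical share definition for the mdadm array
[benchmark]
    path = /mnt/raid5/benchmark
    browseable = yes
    read only = no
    guest ok = no
    valid users = specuser
```

After restarting the smbd service, the testbed clients map the share and the SPEC SFS 2014 load generators run against it.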
| AnandTech NAS Testbed Configuration | |
| --- | --- |
| Motherboard | Asus Z9PE-D8 WS (dual LGA 2011, SSI-EEB) |
| CPU | 2x Intel Xeon E5-2630L |
| Coolers | 2x Dynatron R17 |
| Memory | G.Skill RipjawsZ F3-12800CL10Q2-64GBZL (8x 8 GB, CAS 10-10-10-30) |
| OS Drive | OCZ Technology Vertex 4 128 GB |
| Secondary Drive | OCZ Technology Vertex 4 128 GB |
| Tertiary Drive | OCZ Z-Drive R4 CM88 (1.6 TB PCIe SSD) |
| Other Drives | 12x OCZ Technology Vertex 4 64 GB (offline in the host OS) |
| Network Cards | 6x Intel ESA I-340 quad-GbE port network adapter |
| Chassis | SilverStoneTek Raven RV03 |
| PSU | SilverStoneTek Strider Plus Gold Evolution 850W |
| OS | Windows Server 2008 R2 |
| Network Switch | Netgear ProSafe GSM7352S-200 |
The above testbed runs 10 Windows 7 VMs simultaneously, each with a dedicated 1 Gbps network interface. This simulates a real-life workload of up to 10 clients for the NAS being evaluated. All the VMs connect to the network switch to which the NAS is also connected (with link aggregation, as applicable). The VMs generate the NAS traffic for performance evaluation.
Thank You!
We thank the following companies for helping us out with our NAS testbed:
- Thanks to Intel for the Xeon E5-2630L CPUs and the ESA I-340 quad port network adapters
- Thanks to Asus for the Z9PE-D8 WS dual LGA 2011 workstation motherboard
- Thanks to Dynatron for the R17 coolers
- Thanks to G.Skill for the RipjawsZ 64GB DDR3 DRAM kit
- Thanks to OCZ Technology for the two 128GB Vertex 4 SSDs, twelve 64GB Vertex 4 SSDs and the OCZ Z-Drive R4 CM88
- Thanks to SilverStone for the Raven RV03 chassis and the 850W Strider Gold Evolution PSU
- Thanks to Netgear for the ProSafe GSM7352S-200 L3 48-port Gigabit Switch with 10 GbE capabilities.
48 Comments
rrinker - Monday, August 10, 2015 - link
This chassis looks like just the thing to replace my WHS box. I was probably just going to run Server 2012 R2 Essentials and change over my StableBit DrivePool to the standard Server 2012 version. All these NAS boxes and storage systems that everyone seems to go nuts over - none that I've seen has the flexibility of the pooled storage that the original WHS, and WHS 2011 with DrivePool, have had all along. Of course there are the Windows haters - but my WHS has been chugging along, backing up my other computers, storing my music and movies, playing movies through my media player, and the only time it's been rebooted since I moved to my new house a year and a half ago was when the power went out. It just sits there and runs. One of the best products Microsoft came up with, so of course they killed it. Essentials is the closest thing to what WHS was. Replacing a standard mid-tower case with something like this would save a bunch of space. 8 drives, plus a couple of SSDs for the OS drive... just about perfect. I currently have 6 drives plus an OS drive in my WHS, so 8 would give me even more growing room. I have a mix of 1TB, 2TB, and 3TB drives in there now; with this, up to 8x 4TB, which is a huge leap over what I have now.
DanNeely - Monday, August 10, 2015 - link
At $400 for a (non-education) license, S2012 R2 Essentials is a lot more expensive than I want to go. If I build a new storage server on Windows, I'm 99% sure I'll be starting with a standard copy of Win10 for the foundation.
kmmatney - Monday, August 10, 2015 - link
The only thing missing from Windows 10 is the automated backup, which works great on WHS. That's the main thing holding me back from changing from WHS. I had to do a few unexpected bare-metal restores after installing Windows 10 on a few machines, and WHS really came through there. I had several issues restoring, but at the end of the day, it was successful in every instance.
kmmatney - Monday, August 10, 2015 - link
I'm also a WHS 2011 + StableBit DrivePool user. Best of everything - you can add or remove single drives easily, the data is portable and easy to extract if needed, you can choose what gets mirrored and what doesn't. The initial balancing takes a while, but after that the speed is fine. I'm up to 8 drives now (7 in the drive pool), and can expand to 12 drives with my Corsair Carbide case and a $20 SATA card. I keep an 80GB SSD out of the pool for running a few Minecraft servers. This DIY NAS is interesting, but it would be far cheaper for me to just replace some of my smaller drives with 4 TB models if I need more storage. Since WHS 2011 is Windows 7 based, it should still last a while - I don't see a need to replace it anytime soon. But my upgrade path will probably be Windows 10 + StableBit DrivePool. Cheap and flexible.
DanNeely - Monday, August 10, 2015 - link
WHS 2011 is a pure consumer product (and based on a server version of Windows, not Win7), meaning it only has a 5-year supported life cycle. After April 2016, it's over and no more patches will be issued.
Navvie - Tuesday, August 11, 2015 - link
I agree. Not being able to expand vdevs easily is a limitation. But weighing the pros and cons, it's a small price to pay. The last time I filled a vdev, I bought more drives and created an additional vdev.
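The vdev-expansion workflow described here can be sketched as below. The pool name, device names, and drive sizes are hypothetical, ZFS must be installed, and the commands need root.

```shell
# Sketch of growing a ZFS pool by adding a vdev rather than expanding one.
# Pool and device names are hypothetical; requires ZFS and root.
if command -v zpool >/dev/null 2>&1; then
    # Add a second 4-drive raidz1 vdev alongside the full one; existing data
    # is untouched, and new writes stripe across both vdevs.
    zpool add tank raidz1 /dev/sde /dev/sdf /dev/sdg /dev/sdh
    zpool list tank
fi

# Each raidz1 vdev keeps one drive of parity, so four 4 TB drives add
# (4 - 1) * 4 = 12 TB of usable space.
echo $(( (4 - 1) * 4 ))
```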
BillyONeal - Monday, August 10, 2015 - link
If you want to pay the premium for hardware that can run Solaris, nobody's stopping you.
ZeDestructor - Monday, August 10, 2015 - link
ZFS is available on both FreeBSD and Linux, so it's no more expensive than boring old softraid on Linux.
bsd228 - Friday, August 14, 2015 - link
What premium? I've run Solaris on many Intel and AMD motherboards, but most recently with the HP MicroServer line (34L, 50L, 54L).
digitalgriffin - Monday, August 10, 2015 - link
These are good articles. And for someone with a serious NAS requirement, they are useful. But 99% of home users don't need a NAS.
Of the 1% of us that do, only 1% need 8 bays with a $200 case and a slow $400 Intel board. That's serious gaming-system startup money, with at least a 6-SATA-connection motherboard.
For example, a Cooler Master HAF 912 will hold over 8 drives and is $50.
6-SATA-port socket-1150 motherboard: $120.
3.2 GHz i3 (low-power Y or T version for $130).
PCIe SATA card: $50.
Let's see you build a budget system that can:
Handle 5 drives (boot/cache, RAID 6 (two data drives + two parity))
Handle transcoding with a Plex server.