Intro
When building a NAS, many users prefer a compact unit, which, if you're building your own, usually means a Mini ITX motherboard.
That form factor typically limits expansion options, unless you're willing to pay a significant premium for a higher-end motherboard like the CWWK, which is pretty full featured but also costs about $450 USD: https://cwwk.net/products/cwwk-amd-7735hs-7840hs-8845hs-7940hs-8-bay-9-bay-nas-usb4-40g-rate-8k-display-4-network-2-5g-9-sata-pcie-x16-itx-motherboard
While looking on AliExpress, I came across some options built around the N100 and N5105 CPUs that include six SATA ports, two M.2 slots, and four 2.5GbE ports. I ended up picking up both versions: the N5105 from AliExpress and the N100 from Amazon.
The two units I purchased, both about $125 USD:
N5105: https://www.aliexpress.us/item/3256805947799076.html
N100: https://www.amazon.com/dp/B0CQZH8X2P
Full disclosure: after I had started some testing on the N100 board, it began showing issues. An Ethernet controller would disappear, and then I'd get phantom lockups. I also noticed that while the N5105's SATA chip had a heatsink on it, the N100's did not, even though the board has mounting holes for one. Thankfully this was the board I bought from Amazon, so I issued an RMA and they promptly shipped me a new one, which worked perfectly fine throughout the rest of the testing.
I posted a review video if you’re interested, but most of the pertinent info is below: https://youtu.be/PO8Kfi4qpY8?si=9AuYTaGZmmMfM5NG
Components
They both offer:
- two 10Gbps USB3 Type-A ports
- two M.2 NVMe slots
- four 2.5GbE ports using Intel I226-V controllers
- six onboard SATA ports via the JMicron JMB585 controller

Unique to each:
- one DDR5 SO-DIMM slot (N100)
- two DDR4 SO-DIMM slots (N5105)
- one PCIe x1 slot (N100)
That SATA controller supports up to five SATA III ports, so I can only imagine the sixth is provided by the CPU. The N5105 spec indicates that it can support two SATA ports, but the N100 specs weren't clear.
The N100 has a single DDR5 SO-DIMM slot that supports up to 16GB; that limit is apparently enforced by the CPU design. I don't have a 32GB DDR5 SO-DIMM, otherwise I'd check whether it can actually go beyond that. The N5105 has two DDR4 slots and, like the N100, is limited to 16GB total RAM. I did try a single 16GB module in one slot, but it wouldn't boot; two 8GB modules, or a single 8GB, worked just fine.
One unique thing about the N100 board is that it offers a PCIe x1 slot. The N100 supports nine PCIe 3.0 lanes, whereas the N5105 supports only eight, which is likely why the slot is missing from the N5105 version. The slot is open-ended, so you can add longer cards; the only caveat is that the card has to slot in between the two rows of SATA ports. It fits fine, as I confirmed by plugging in an RX 6400 and a GTX 1050 Ti video card, but you can't use clipped SATA connectors, because the clip overlaps the area where the PCIe card needs to sit. You'll also need 90-degree right-angle connectors on one side to avoid hitting anything protruding from the PCIe card.
OS Installation
I installed five operating systems on each motherboard:
- Windows 11
- Ubuntu
- OpenMediaVault
- TrueNAS Scale
- UnRAID
Installation of the Linux-based OS's went perfectly fine. Windows 11, on the other hand, was missing drivers for many devices, most importantly the Intel I226-V 2.5GbE ports, so you couldn't even connect to the internet. This can be problematic because Windows likes to force you onto the internet during install. A nice little workaround I found: press SHIFT-F10 to bring up a console window, type oobe\bypassnro, and reboot. You'll then get an option to install without internet, all the while Windows tries to make you feel bad about yourself for not committing your email and soul to Microsoft.
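For quick reference, the whole bypass boils down to this (run from the console that SHIFT-F10 opens at the initial setup screen):

```
REM Press Shift+F10 at the Windows 11 setup screen to open a console
oobe\bypassnro
REM The machine reboots; when setup resumes, an option to
REM continue without internet appears
```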
Once I got up and running, I loaded drivers from a USB drive (https://intel.com/content/www/us/en/download/15084/intel-ethernet-adapter-complete-driver-pack.html), and then performed the Windows Update marathon. The N5105 was still missing several drivers, but I found them on Gigabyte's website; I needed the chipset drivers from here: https://gigabyte.com/Motherboard/N5105I-H-rev-10/support#support-dl-driver-chipset
For the N100, I used the same I226-V drivers from the USB drive, and after updates there was just an audio driver missing, which was not so easy to track down. I did manage to get it from here:
https://catalog.update.microsoft.com/Search.aspx?q=10.29.0.9677+media
But then, after installing that, another audio/SM Bus driver was still missing, which I managed to get from the TenForums website, which linked to a Google Drive download. Sure, a bit shady, but this motherboard already came from AliExpress out of China, so I've probably already compromised my identity at this point. But seriously, I scanned it for viruses and it came up clean. You can grab it here: https://www.tenforums.com/sound-audio/182081-latest-realtek-hd-audio-driver-version-3-a-103.html
So with everything up and running I ran a multitude of tests on the different components.
Benchmarks
For general system tests, I ran Cinebench R23 in Windows and tracked the CPU usage, temps, power, etc. Nothing out of the ordinary. If you're interested, the results were:
- N5105 Single Core: 577
- N100 Single Core: 886
- N5105 Multi Core: 1990
- N100 Multi Core: 2504
Both CPU temps hovered in the upper 70s °C with the stock paste; after re-pasting the heatsinks, the N100 dropped by about 20°C and the N5105 by about 10°C.
I also ran a Handbrake encoding test of a 10-minute 4k/60 video using Handbrake's default "Fast 1080p30" preset, which encodes to 1080p/30. The results were as follows:
- N5105 QSV: 32.4 minutes
- N5105 CPU: 39.7 minutes
- N100 QSV: 21.2 minutes
- N100 CPU: 28.6 minutes
So anywhere from 20-40 minutes for a 10-minute video. Not too impressive.
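If you want to reproduce this kind of test, the same encode can be driven from the command line; a minimal sketch using HandBrakeCLI (file names are illustrative):

```bash
# Software (CPU) encode using the "Fast 1080p30" preset
HandBrakeCLI -i input_4k60.mkv -o out_cpu.mp4 --preset "Fast 1080p30"

# The same encode through the Intel Quick Sync (QSV) hardware encoder
HandBrakeCLI -i input_4k60.mkv -o out_qsv.mp4 --preset "Fast 1080p30" --encoder qsv_h264
```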
I also fired up a Plex media server on each motherboard, and it served up to four 4k videos just fine as long as they were native resolution and format. I mean, that’s just a bandwidth thing.
When it comes to transcoding on the fly, the Windows version of Plex currently can’t transcode using Quick Sync with the integrated GPU on the Intel N100 or N5105. But with a Linux distro, it easily managed to transcode four 4k/24 HEVC videos simultaneously to 1080p without an issue.
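If you want to verify that Quick Sync is actually doing the work on Linux, you can watch the iGPU's engines while a transcode runs; a quick check, assuming a Debian/Ubuntu-based system:

```bash
# Install Intel's GPU monitoring tool
sudo apt install intel-gpu-tools
# The "Video" engine row should show activity during a hardware
# transcode, while overall CPU usage stays comparatively low
sudo intel_gpu_top
```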
2.5GbE Intel I226-V Ethernet Ports
For the 2.5GbE Ethernet ports, I did a basic 10x 1GB file copy test and measured the resulting performance. They all hit about 270-280 MB/sec read and write. For some reason, the N5105 write test in Windows only managed about 240 MB/sec, while reads were up around 275 MB/sec; in the other OS's it performed as expected. I'm not sure what to make of that, other than Windows being Windows.
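For reference, the copy test itself is nothing fancy; a sketch of the Linux side (the share mount point and file names are illustrative):

```bash
# Create ten 1GB test files of incompressible random data
for i in $(seq 1 10); do
  dd if=/dev/urandom of=test_$i.bin bs=1M count=1024
done

# Time the write to the NAS share, then the read back
time cp test_*.bin /mnt/nas-share/
mkdir -p readback
time cp /mnt/nas-share/test_*.bin readback/
```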
M.2 and USB
For the M.2 and USB ports, I ran CrystalDiskMark (Windows), KDiskMark (Ubuntu), an hdparm -t read test (Linux OS's), and a 10x 1GB file copy.
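On the Linux side, the synthetic tests boil down to something like this (device and mount paths are illustrative; hdparm needs root):

```bash
# Uncached sequential read straight off the device
sudo hdparm -t /dev/nvme0n1

# Rough sequential write check, bypassing the page cache
dd if=/dev/zero of=/mnt/m2/testfile bs=1M count=4096 oflag=direct status=progress
```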
Bottom line: the M.2 slots and the PCIe slot are definitely PCIe 3.0 x1. The CrystalDiskMark, KDiskMark, and hdparm -t tests all came in at about 850-900 MB/sec sequential read/write. During the actual 10x 1GB file transfer tests, the N5105 faltered a bit, running at only about 650 MB/sec in OMV, TrueNAS, and UnRAID.
The USB ports actually performed better than the M.2 slots, running over 1000 MB/sec in the synthetic CrystalDiskMark/KDiskMark sequential and hdparm -t tests. Real-world file transfers, however, were all over the place, but that seems par for the course for USB.
SATA Ports
Now, when it comes to the SATA ports, both motherboards use the JMicron JMB585 controller. This chip provides support for up to five SATA III (600 MB/sec) ports. Considering there are six SATA ports, I believe one comes from the CPU.
Oddly enough, the N100's SATA ports seemed to be limiting overall performance. Connecting a single Samsung 870 Evo 2.5″ SATA SSD to each port in turn, five of the six ports only managed about 430 MB/sec. The sixth port managed about 550 MB/sec, which is about the maximum for this SSD on a traditional desktop SATA port (where it hits 560 MB/sec). The N5105, on the other hand, performed at about 550 MB/sec across the board.
I also used an Orico M.2 six-port SATA adapter, which uses the ASMedia ASM1166 controller, as a kind of control sample, because I know it performs at expected speeds. The Orico adapter performed as well in both the N100 and N5105 as it does in a traditional desktop, so the limitation is clearly in the N100's onboard ports.
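If you run into something similar, the negotiated link speed of each port is easy to check from Linux (drive names are illustrative):

```bash
# Each port logs its negotiated speed at drive detection,
# e.g. "SATA link up 6.0 Gbps"
sudo dmesg | grep -i "SATA link up"

# Then compare raw sequential reads across ports
sudo hdparm -t /dev/sda /dev/sdb /dev/sdc
```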
While this may not seem concerning if you're using hard drives, since they tend to run at about 250 MB/sec or slower, SSDs could be bottlenecked. But worse is the RAID performance.
OpenMediaVault
I set up a few scenarios, but I'll only discuss the 6x RAID 0 and the 12x RAID 60 (OMV) / two 6x RAIDZ2 vdevs (TrueNAS). I used ST500DM002 500GB SATA hard drives, which perform at about 200 MB/sec sequential when empty, so a 6x RAID 0 should offer over 1000 MB/sec.
With the 6x RAID 0, the N100 only offered up about 500 MB/sec. On the N5105 it hit over 1000 MB/sec.
I also set up a 6x RAID 6 and a 12x RAID 60. I built one RAID 6 at a time, then went back and built two RAID 6 arrays simultaneously to check whether the system could handle it, then merged them into an mdadm striped array for RAID 60.
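The builds themselves were plain mdadm; a minimal sketch, assuming the first six drives show up as /dev/sdb through /dev/sdg and the second six as /dev/sdh through /dev/sdm:

```bash
# Build two 6-disk RAID 6 arrays (device names are illustrative)
sudo mdadm --create /dev/md0 --level=6 --raid-devices=6 /dev/sd[b-g]
sudo mdadm --create /dev/md1 --level=6 --raid-devices=6 /dev/sd[h-m]

# Watch the initial sync progress
watch cat /proc/mdstat

# Once both arrays finish building, stripe them into a RAID 60
sudo mdadm --create /dev/md2 --level=0 --raid-devices=2 /dev/md0 /dev/md1
```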
Results from the RAID 6 build times:
Single RAID 6 Build Onboard SATA:
- N100: 127 Minutes
- N5105: 106 Minutes
Dual RAID 6 onboard SATA:
- N100: 145 Minutes
- N5105: 106 Minutes
Dual RAID 6 Orico M.2 Adapter:
- N100: 114 Minutes
- N5105: 106 Minutes
So you can see that the N5105 handled both the single RAID 6 build and building two RAID 6 arrays simultaneously without a hitch. The N100 took quite a bit longer.
Regarding CPU usage during the builds, both hit about 50% utilization throughout, with the 15-minute load average peaking at about 4, although the N5105 briefly jumped to about 70% utilization and a 15-minute load average of about 4.5. Either way, the systems seemed to handle it just fine.
UnRAID
For UnRAID, I set up a 4x data disk + 2x parity disk scenario and measured the performance of the initial sync as well as a parity check. Results as follows:
Initial Sync Onboard SATA:
- N100: 77 Minutes
- N5105: 53 Minutes
Initial Sync Orico M.2 Adapter:
- N100: 53 Minutes
- N5105: 53 Minutes
Parity Check Onboard SATA:
- N100: 93 Minutes
- N5105: 54 Minutes
Parity Check Orico M.2 Adapter:
- N100: 60 Minutes
- N5105: 54 Minutes
So it appears the N100 SATA ports are causing slower performance here as well.
TrueNAS Scale
For TrueNAS Scale, I created a six-disk RAIDZ2 pool and did a 1TB file transfer over 2.5GbE, then removed a disk and performed a resilver after that 1TB of data was written.
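The disk-pull and resilver can also be driven from the shell if you prefer (pool and device names are illustrative; TrueNAS normally handles this through its UI):

```bash
# Take one disk offline to simulate a failure
sudo zpool offline tank sdf

# Swap in the replacement disk and kick off the resilver
sudo zpool replace tank sdf sdg

# Check resilver progress
sudo zpool status tank
```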
File Transfer 1TB over 2.5GbE:
- N100: 80 Minutes
- N100 Orico: 78 Minutes
- N5105: 83 Minutes
Resilver 1TB Data:
- N100: 47 Minutes
- N100 Orico: 38 Minutes
- N5105: 38 Minutes
Here again, it seems the onboard SATA ports resulted in reduced performance compared with the N5105 and the Orico M.2 adapter.
Power Draw
Power draw with a basic configuration of one M.2 PCIe SSD, 16GB RAM, and one Ethernet cable connected, using a 500W EVGA Gold PSU, was about 20W at idle. Under load, the N100 peaked at about 40W at the wall, while the N5105 peaked at about 30W.
DisplayPort and HDMI
The single full-sized DisplayPort and HDMI port on the N100 both managed to output 4k/60 without issue. On the N5105 version, the DisplayPort managed 4k/60, but HDMI could only manage 4k/50.
Fan Noise
Both boards include a cooling solution with a heatsink and fan. There is a way to modify the fan curve in the BIOS, but the stock setting seemed fine. At idle, the fan was not audible over the other noise in the office, and even under load it was barely audible, so there's not much concern on that front.
Final Thoughts
If you're on a budget and looking for a NAS motherboard that supports more than the traditional two or four SATA ports usually offered on ITX motherboards, these are a good option. The reduced SATA performance of the N100 is a bit of a head-scratcher, considering both the N100 and N5105 use the same JMicron JMB585 controller chip. But the N100 does offer the PCIe x1 slot, and its general performance was slightly faster. So I guess it depends on what you're looking for.
While I thought the SATA issue might be specific to my board, the one I had to RMA exhibited similar results, so I'm not sure whether boards from other vendors have the same issue.
So, I hope this info was useful. You'll probably find more details in the video, but I wouldn't want to make anyone listen to my mumblings if they don't have to.