I tripled my power bill building a massive home server before realizing two cheap NAS units are better


If you think your homelab is complete with just one NAS—think again. I run multiple NAS servers in my homelab, and it’s actually improved my workflow and homelabbing experience way more than I expected. Here’s why you should run more than one NAS in your homelab.

One NAS for speed, one NAS for capacity

It’s easier to build out two NAS systems than to have one do-it-all setup

A hand sliding a drive tray with a Seagate IronWolf 4TB hard drive into the Ugreen iDX6011 Pro NAS. Credit: Patrick Campanale / How-To Geek

When I started out building my homelab, I had just one NAS. That meant one system stored all of my data, and I really only had one way to access that data: slow. The system was running Unraid, and, while I did have 3TB of NVMe storage to work as a cache frontend for transferring files to the NAS, pulling data from the NAS was painfully slow thanks to the single-drive read speeds Unraid has.

This meant I really didn’t pull a lot of data from my NAS. It was primarily a storage destination where archived data went. Eventually, I got another NAS to supplement my main one—an all-SSD NAS. Having an all-SSD NAS gave me the ability to have one system designed for mass storage (my Unraid server) and another designated for fast storage access.


Eventually, I got a NAS that gave me the best of both worlds—fast array read/writes and NVMe support. However, I still run my Unraid server alongside it, as it’s focused purely on archival data that I rarely need to read, while my other NAS handles the data that I’m constantly reading and writing.

Backing up to a local device is better than not backing up at all

If cloud backups are out of the question, having a second NAS is the next best thing

While off-site backups are the gold standard for keeping your data actually safe and secure, there’s another option: two local copies in separate rooms of your house. I’ve seen some homelabs handle backup this way. Simply build two nearly identical or identical NAS systems and then have the main system mirror to the secondary system.
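That one-way mirror can be sketched in a few lines of Python. This is a minimal, illustrative stand-in for a real replication tool like `rsync -a --delete` over SSH or your NAS vendor’s snapshot replication; the `mirror` helper and the paths involved are hypothetical, not any particular NAS’s built-in feature.

```python
import filecmp
import os
import shutil


def mirror(src: str, dst: str) -> None:
    """One-way mirror: make dst an exact copy of src.

    A minimal sketch of what a scheduled NAS-to-NAS replication job does:
    copy new and changed files, delete anything the source no longer has.
    """
    os.makedirs(dst, exist_ok=True)
    src_entries = set(os.listdir(src))

    # Delete anything on the backup that no longer exists on the primary.
    for name in set(os.listdir(dst)) - src_entries:
        path = os.path.join(dst, name)
        if os.path.isdir(path):
            shutil.rmtree(path)
        else:
            os.remove(path)

    # Copy new or changed entries from the primary.
    for name in src_entries:
        s, d = os.path.join(src, name), os.path.join(dst, name)
        if os.path.isdir(s):
            if os.path.isfile(d):
                os.remove(d)  # a file was replaced by a directory
            mirror(s, d)
        else:
            if os.path.isdir(d):
                shutil.rmtree(d)  # a directory was replaced by a file
            if not os.path.exists(d) or not filecmp.cmp(s, d, shallow=False):
                shutil.copy2(s, d)  # copy contents plus timestamps/metadata
```

In practice you’d schedule something like this (or, better, plain rsync) via cron or your NAS’s task scheduler so the secondary unit stays a faithful copy of the primary.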

Of course, this isn’t ideal because, in the event of a natural disaster, electrical problem, flood, or other catastrophic damage, you would very likely lose both systems at the same time. However, having two NAS units where one backs up to the other does help prevent data loss from hardware failure, especially if you plan it right.

If you want the best chance of success, buy the hard drives for each system from different distributors so you’re less likely to get drives from the same manufacturing batch. Also, never run a RAID rebuild on both systems at the same time; rebuild one array at a time.

Doing both of these things means that if you lose data to a failed RAID rebuild or a hardware fault on one system, you very likely won’t suffer the same fate on the other. Plus, if you want, you could always move that second NAS to a friend or family member’s house to get a true off-site backup.

Optimize your NAS devices for different workloads

One for transcoding, one for backups

A Western Digital WD Red Plus 4TB NAS HDD sitting on a wooden desk with 3D printers in the background. Credit: Patrick Campanale / How-To Geek

Every NAS has a different set of capabilities. Some are great at storing tons of data, others are great at handling transcoding and the like. So, lean into your NAS’s strengths.

For example, have one NAS that’s purely for running your homelab apps. This type of NAS doesn’t need a ton of storage, but having a mix of fast and slow storage gives you a lot of flexibility. You could mount slow drives for NVR storage and fast drives for Docker storage, and everything stays running 24/7 with no problem. A NAS built with these duties in mind will need a solid processor, a good amount of RAM, and likely the ability to handle hardware transcoding, either through an iGPU or a dedicated graphics card.
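As a concrete illustration of that fast/slow split, here’s a Docker Compose sketch. The service names, images, and mount paths are assumptions for illustration only, with `/mnt/fast` standing in for an NVMe pool and `/mnt/bulk` for the big HDD array:

```yaml
# Hypothetical layout: /mnt/fast is an NVMe pool, /mnt/bulk is the HDD array.
services:
  frigate:            # NVR: config/database on fast storage, footage on bulk
    image: ghcr.io/blakeblackshear/frigate:stable
    restart: unless-stopped
    volumes:
      - /mnt/fast/appdata/frigate:/config
      - /mnt/bulk/nvr:/media/frigate
  plex:               # media server: metadata on fast storage, library on bulk
    image: plexinc/pms-docker
    restart: unless-stopped
    volumes:
      - /mnt/fast/appdata/plex:/config
      - /mnt/bulk/media:/data:ro
```

The design idea is simply that anything an app touches constantly (configs, databases, metadata) lives on the fast pool, while the sequential, rarely rewritten data (recordings, media files) lands on the cheap spinning disks.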

Then, you have a NAS that’s purely for bulk storage. A NAS that only has to handle storage doesn’t need nearly the specs of one that has to run a bunch of apps, too. You don’t need as powerful a processor, there’s no transcoding to worry about, and even RAM usage drops pretty significantly.

Having two separate NAS systems for these tasks also begins to diversify your homelab. Now that I have two primary NAS systems in my homelab, I’m able to really dial in my workflow. My largest NAS, which is also my weakest, is just a big bulk file server.

It handles all of my Plex data and that’s it. My other NAS, which has far less raw storage but much more compute power, is now my primary apps node, and it’s also where I put any files I need fast access to, thanks to its speedier array.

Being able to dedicate a specific NAS to each workflow has become a core part of my homelab. In fact, I’m about to spin up a third NAS for one singular purpose: photo and document storage. This new NAS will be a small two-bay unit that’s neither fast nor powerful, but it’s built with a single purpose in mind: replacing Google Photos and Google Drive.

  • UGREEN NASync DSP2800

    Brand: UGREEN

    CPU: Intel 12th Gen N-Series

    This cutting-edge network-attached storage device transforms how you store and access data via smartphones, laptops, tablets, and TVs anywhere with network access.


  • Synology DS425+

    Brand: Synology

    CPU: Intel Celeron J4125

    This four-bay NAS works great for home and small office use, and it comes with a three-year warranty from Synology.


Build your homelab out with purpose

I’ll be the first to admit that I didn’t build my homelab out in the smartest way possible. I went from one rack-mount server to three, and tripled my homelab’s electric cost and heat output without tripling its capacity or capability.

If I had to do a lot of it over again, I would start out with the route that I lay out above: two NAS systems that have specific purposes in my homelab. From there, I’d expand into mini PCs, as that’s where the magic really happens in my homelab now.






The battle between AMD and NVIDIA rages on eternally, it seems, though it’s rather a one-sided battle in the desktop PC market, where NVIDIA holds something like 95%, AMD most of what’s left, and Intel (almost) 1%.

But as dominant and popular as NVIDIA is, AMD proponents could always raise the value argument. On a per-dollar basis, you get more value with an AMD card, and even better, you have the benefit of AMD “FineWine” which ensures your card will become even better with time.

What “FineWine” meant—and why it mattered

FineWine was something that AMD fans began to notice during the GCN (Graphics Core Next) era. Incidentally, the last dedicated AMD GPU I bought was the R9 390, which was of that lineage. Since then, all my AMD GPUs have been embedded in consoles or handheld PCs, but I digress.

The R9 390 is actually a good example of FineWine. Launched in 2015, like many AMD cards, it had a rough start. I sold mine in exchange for a stopgap card in the form of the RTX 2060, because I wanted to play Cyberpunk 2077 on PC, where it wasn’t broken the way it was on consoles. On paper, the RTX 2060 wasn’t much more powerful than a 390, yet the AMD card was a stuttery mess on my (then) 1080p monitor, while everything suddenly ran great the minute the AMD GPU was expunged from the system.

But, a decade later, that same game is perfectly playable on this card, as you can see in this TechLabUK video.

A lot of it is because the developers have kept patching and improving the game, but this is something you see across the board for AMD cards on various games. This is FineWine. Years later, with continued driver updates from AMD, the cards go from being a little worse than their NVIDIA equivalent at launch to being as good or even a little better in the long run.

Of course, that’s not super helpful to customers who buy hardware at launch, but it has given some AMD users computers with longer lifespans than you’d think, and made many used AMD cards an even better bargain.

Why AMD’s FineWine era worked

A bit of smoke and mirrors

The PULSE AMD Radeon RX 6800 XT next to an AMD RX 6600 XT Phantom Gaming D. Credit: Ismar Hrnjicevic / How-To Geek

FineWine wasn’t magic, of course. The phenomenon was the result of a mix of factors. AMD’s architectures were in some cases a little too forward-thinking for the APIs of the day. Massively parallel with a focus on compute, they’d only come into their own with DirectX 12 and more modern games. NVIDIA’s cards at the time were better optimized to run current games well. Over time, NVIDIA cards would make similar architectural changes, but with better timing.

The other reason FineWine was a thing came down to driver maturity. As a much smaller company with fewer resources, AMD seems to have had trouble releasing cards with fully optimized drivers. So, over time, the cards would start performing as intended.

In both cases, you could frame FineWine not as the card getting better, but rather getting “less worse” over time. If you set the bar low at launch, the only way is up. However, there’s a third factor to take into account as well. AMD dominates console gaming. The two major home console series have now run on AMD GPUs for two generations, and so games are developed with that hardware in mind. This also gives newer titles a bit of a leg up, though it’s hard to know exactly by how much.

How AMD moved on from FineWine

It seems worse, but it’s actually better

An AMD RX 9070 XT Gigabyte gaming graphics card. Credit: Ismar Hrnjicevic / How-To Geek

With the shift to RDNA architecture, AMD made a deliberate change in philosophy. Modern Radeon GPUs are designed to perform well right out of the gate. Reviews on day one are much closer to what you could expect years later. There are still decent gains to be had on RDNA cards with game-specific optimizations (Spider-Man on PC is a great example), but the golden age of FineWine seems to be in the past now.

That’s a good thing! Products should put their best foot forward on day one, so let’s not shed a tear for FineWine in that regard. It’s not so much that AMD doesn’t care about improving the performance and stability of older cards over the years; it’s that the company is now better at its job, so there’s less room for improvement.

Sapphire NITRO+ AMD Radeon RX 9070 XT GPU

Cooling Method: Air

GPU Speed: 2,520 MHz

The AMD Radeon RX 9070 XT from Sapphire features 16GB of GDDR6 memory, two HDMI ports, two DisplayPorts, and an overengineered cooling setup that will keep the card cool and whisper quiet no matter the workload.


NVIDIA kept the idea—but changed the formula

It’s all about AI

It’s funny, but these days I think of NVIDIA cards as the ones with major longevity. Take the venerable GTX 1080 and 1080 Ti. These cards only lost game-ready driver support in 2025, which doesn’t immediately make them useless; it just means no more optimization for those chips. What an incredible run, getting a decade of relevant game performance from a GPU!

But, that’s not really NVIDIA’s take on FineWine. Instead, the company has taken to adding new and better features to its cards long after they’ve launched. Starting with the 20-series, NVIDIA cards include machine-learning hardware, which means that as the AI models behind technologies like DLSS improve, the cards deliver better performance and image quality over time.

While NVIDIA has made some of its AI features exclusive to each generation, so far every post-10-series GPU benefits from each new generation of DLSS. Compare that to AMD, which not only offers inferior versions of this upscaling technology but has also locked the better, more usable versions to later cards, as is the case with FSR Redstone.


FineWine is an ethos, not a brand

In the case of my humble RTX 4060 laptop, the release of DLSS 4.5 has opened new possibilities, notably the ability to target a 4K output resolution, which was certainly not on the table when I first took this computer out of the box. We might not call it “FineWine,” but it sure smells like it to me!


