I’m a homelab enthusiast, but I refuse to self-host these 5 services


If you have a homelab, then you’ve likely considered self-hosting every service you use. I’ve thought about that too, but there are certain services that I outright refuse to host at home.

While I’ve tried self-hosting many things, for services like email and music streaming I found it’s simply better to pay than to self-host. There are a number of reasons why, so here’s a list of services that I won’t run in my homelab.

I’ll leave email to the professionals

Self-hosted email is my worst nightmare


Everyone uses email, so self-hosting it is a no-brainer, right? Wrong. There are a number of reasons not to self-host your own email server, but the main reason I refuse to do it is that I like my emails to be seen by the recipients.

You see, most large email providers apply aggressive spam filtering based on sender IP reputation and authentication checks like SPF, DKIM, and DMARC. A self-hosted server, especially one on a residential IP address, starts with no reputation at all, so the emails I sent could (and likely would) end up in someone’s spam folder.
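For context, getting self-hosted mail delivered reliably means publishing authentication records in DNS. Here’s a hypothetical sketch for a domain like example.com — the domain, selector, IP address, and key are all placeholders, not a working setup:

```text
; SPF: lists the servers allowed to send mail for the domain
example.com.                  IN TXT "v=spf1 ip4:203.0.113.10 -all"

; DKIM: public key receivers use to verify message signatures
mail._domainkey.example.com.  IN TXT "v=DKIM1; k=rsa; p=<base64-public-key>"

; DMARC: tells receivers what to do when SPF/DKIM checks fail
_dmarc.example.com.           IN TXT "v=DMARC1; p=quarantine; rua=mailto:postmaster@example.com"
```

Even with all three in place, a residential IP with no sending history can still get filtered on reputation alone, which is the part you can’t configure your way around.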

There’s just no easy way around this one. I rely on email to run my business and for corporate communication, and I can’t risk having my messages land in a contact’s spam folder, so I just stick with Google Workspace for my email.

I need my cloud storage to be bulletproof

One less thing to worry about


While I’ve thought about (and even tried) replacing my cloud storage with a self-hosted cloud service, I just can’t bring myself to do it. I rely on my cloud storage for a number of things.

I store just about my entire life in the cloud, which has pros and cons. One of the pros is I always have access to all of my files anywhere in the world, so long as I have an internet connection.

Self-hosting doesn’t provide me this reliability. If my home internet goes out, so would my cloud storage access. Power goes out? There goes my access. Server maintenance? Bye bye shared file downloads.

Since I rely on my cloud storage for so much, I need it to be as reliable as possible—and self-hosting just doesn’t deliver that.


There’s just no reason to host my own Git repo

I’ll keep using GitHub, thank you


While others might have good reason to self-host a code repository, I don’t have that need. I simply use GitHub, because it works, it’s free, and I know it.

I’ve used GitHub for over a decade at this point, and I’m very familiar with it, so I’m going to continue using it. The stuff I have on my GitHub is simple, and I don’t mind it being public.

Sure, a self-hosted Git repository is more private, and I would have more control over it. However, it would add extra complexity, make access from outside my network more difficult, and be susceptible to the same reliability problems that keep me from hosting my own cloud storage.
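For the curious, the minimal version of what self-hosting Git entails is just a bare repository on a machine you can reach over SSH. The host name and paths below are placeholders — a sketch of the idea, not a recommended setup:

```shell
# A "Git server" at its simplest is a bare repository (no working tree)
# on any box you can SSH into. Paths here are throwaway placeholders.
rm -rf /tmp/git-demo
mkdir -p /tmp/git-demo/repos

# Create the server-side bare repository
git init --bare /tmp/git-demo/repos/dotfiles.git

# From another machine you would clone it over SSH, e.g.:
#   git clone you@nas.local:/tmp/git-demo/repos/dotfiles.git
# Cloning locally here just to show the repository is usable:
git clone /tmp/git-demo/repos/dotfiles.git /tmp/git-demo/checkout
```

Everything beyond that — a web UI, pull requests, CI — is where self-hosted tools like Gitea or GitLab come in, and where the extra complexity starts.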



Sometimes it’s best to host a website offsite

A small price to pay for reliability

I’ve self-hosted my own websites many, many times. It’s a great option for simple sites or services that can handle downtime, but it’s not always the best option.

My wife’s blog, my ecommerce store, and my own blog are all hosted off-site. Why? Again, it comes down to reliability.

When I was running my small business and handling dozens (or more) transactions per month, I needed a service that was up more than my home network is. Because I tinker on my home network, there’s definitely downtime related to that.


My business couldn’t be at the whim of me tinkering in my office, so I decided to keep those websites off-site.

Now, if I’m just running a site for fun, I have no problem hosting it at home. In fact, my home network probably manages around 98% uptime, but that still leaves me nearly two percentage points shy of the 99.99% a professional host offers, and that gap is what keeps me from hosting core websites at home.
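To put those uptime percentages in perspective, a quick back-of-the-envelope calculation shows what each figure means in hours of downtime per year:

```shell
# Annual downtime implied by a given uptime percentage
# (365 days * 24 hours = 8760 hours per year)
for uptime in 98 99.9 99.99; do
  awk -v u="$uptime" \
    'BEGIN { printf "%s%% uptime -> %.1f hours of downtime per year\n", u, 8760 * (1 - u / 100) }'
done
```

That works out to about 175 hours (over a week) of downtime per year at 98%, versus under an hour at 99.99%.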

Apple Music works better than anything I can do myself

It just works


I run my own Plex server with all of my movies and media right here at home, and I love it. I won’t even consider not having a Plex server at this point.

I also run my own audiobook server and absolutely love it, so much so that I wouldn’t have it any other way now either.

However, when it comes to music, I pay for Apple Music through my Apple One Premiere membership without question. One reason is it works well in my all-Apple smart home, but it’s deeper than that.


With movies and TV shows, I don’t mind sitting down to find the content I want, then sourcing, ripping, and categorizing it all. Movies are easy to do that with. Music? Not so much.

I enjoy finding new music as I listen, instead of before I listen. This is possible with Apple Music’s radio functionality, and that’s just something I can’t replicate at home. If I were to only listen to music that I had on my server, I’d rarely discover new music.

That alone keeps me from hosting my own music server. I know others who do this, but it’s just not for me.


I still put plenty of effort into other self-hostable services

While I won’t host these five services in my homelab, there are plenty of other things that I self-host! One of my favorite things is Scrypted, as it allows me to record all of my home security cameras to local storage instead of paying someone to use their cloud. It’s more private, offers AI features on all cameras, and keeps everything local.




Recent Reviews


As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just affecting RAM and storage manufacturers. Rather, this impacts every company making any product that contains memory or storage—including graphics cards.

NVIDIA sells its partners GPU-and-memory bundles, which they solder onto PCBs and pair with coolers to create full-blown graphics cards. That means NVIDIA doesn’t just have to battle other tech giants to secure a chunk of TSMC’s limited production capacity for its GPU chips; it also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, considering the GPU price surge that began at the beginning of 2026, with gaming graphics cards still being overpriced.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat


NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which runs from late January 2025 to late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. The data center division, on the other hand, has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company: it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in more than ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip it can get its hands on into AI GPU racks and keep collecting mountains of cash from AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the same fate awaited the 16GB RTX 5070 Ti, and an 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after missing the show, it now looks like NVIDIA has delayed the lineup indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?


The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that we wouldn’t see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by the chairman of SK Hynix, one of the world’s three largest memory manufacturers, warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 90) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, on the other hand, may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


