3 insightful homelab projects to try this weekend (April 24)


Well, it’s time for some more fun homelab projects to try out this weekend! Today, I’m showing you how to replace Google Analytics with a self-hosted tool, set up a better homelab dashboard, and even build out your own wiki (because your homelab really does need one).

Self-host your own private Google Analytics alternative with Umami

Don’t let Google control your website analytics

Umami analytics screenshot showing page view counts and referrers with a graph. Credit: Umami

Whether you’re sick and tired of Google controlling everything, or you just want a simpler analytics interface, Umami is definitely worth checking out. I’ve personally used both self-hosted and hosted versions of Umami for analytics, and I love both methods—but self-hosting gives you the utmost control.

With Umami, you can track all the relevant data you could want for a website: page views, visitor counts, referral sources, environments (operating systems), locations, regions, and much more. Umami really does deliver everything you need, in a cleaner interface than Google Analytics.

One of my favorite features of Umami is the traffic view. It shows you a seven-day window broken down into all 24 hours of each day. Each hour has a dot sized according to how many visitors arrived within that timeframe: a larger dot means more visitors, and no dot means no visitors. It’s a simple thing that can technically be achieved with Google Analytics, but it’s nowhere near as easy to set up there.

Umami is super simple to run, and a single instance can track analytics for multiple websites. You can install it in a number of ways, including through Docker Compose (with a sample file provided in the Umami GitHub repo) or from source via npm, following the instructions in the project’s README.
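If you go the Docker Compose route, the stack is just Umami plus a Postgres database. Here’s a minimal sketch modeled on the sample file in the Umami GitHub repo; treat the image tag and secrets as placeholders and compare against the upstream file before deploying:

```yaml
# Minimal Umami stack, modeled on the sample compose file in the
# Umami GitHub repo; verify tags and variables against upstream.
services:
  umami:
    image: ghcr.io/umami-software/umami:postgresql-latest
    ports:
      - "3000:3000"             # web UI at http://<host>:3000
    environment:
      DATABASE_URL: postgresql://umami:umami@db:5432/umami
      DATABASE_TYPE: postgresql
      APP_SECRET: replace-with-a-long-random-string  # placeholder
    depends_on:
      - db
    restart: unless-stopped

  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: umami
      POSTGRES_USER: umami
      POSTGRES_PASSWORD: umami  # placeholder; use a real secret
    volumes:
      - umami-db:/var/lib/postgresql/data
    restart: unless-stopped

volumes:
  umami-db:
```

A quick docker compose up -d brings the stack online; from there, you log in to the web UI, add each website, and drop the generated tracking snippet into your pages.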

Build a homepage for your homelab with Dashy

Stop remembering IP addresses and ports

Once your homelab starts to grow beyond a few services or servers, it can become difficult to remember where everything is located. At this point, I have over half a dozen servers in my homelab and probably 50 or more services running—remembering how to access each one can be a nightmare. That’s where a homepage dashboard comes in, and this time around I’m specifically talking about Dashy.

I’ve used a lot of homelab dashboards over the years, but I’ve never really found one that I loved. Dashy might change that for me. The UI is simple and intuitive, allowing for web-based editing of bookmarks. My last homepage dashboard, Homepage, required manual editing of a YAML file, which sounds simple (and is), but it meant I had to SSH into the server and hand-write the code to add (or remove) an entry.

With Dashy, it’s all done through the web interface. Installing Dashy is simple and can be done through Docker with ease. However, I decided to go a completely different route for hosting my Dashy—Vercel. I already use Vercel for a few production websites, and hosting Dashy on Vercel means that my dashboard will stay up even if my homelab goes down.

Why is that important for me? Well, if I host my dashboard on a specific server, and that server is offline for any reason, then the map of my entire homelab is also offline. High availability helps with this, but hosting it off-site is even better. There are some downsides to hosting it on Vercel, but Dashy’s docs outline all the pros and cons for each deployment option.
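If you decide to keep Dashy in-house instead, it’s a one-container deployment. Here’s a minimal Docker Compose sketch; the image name comes from the Dashy project, but the internal port and config path have changed between major releases, so double-check both against the version you pull:

```yaml
# Single-container Dashy; the port and config path here are
# assumptions based on recent releases - older versions serve on
# port 80 and read /app/public/conf.yml instead.
services:
  dashy:
    image: lissy93/dashy:latest
    ports:
      - "8080:8080"             # dashboard at http://<host>:8080
    volumes:
      # Persist the config so edits made in the web UI survive
      # container restarts and upgrades.
      - ./conf.yml:/app/user-data/conf.yml
    restart: unless-stopped
```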

Really, host Dashy however you want in your homelab; just make sure you do host it somewhere, so you can stop memorizing the IP addresses and ports where your services live.

Deploy your own self-hosted wiki and knowledge base with Outline

Your homelab needs its own wiki

Outline knowledge base showing documents. Credit: Outline

While a homelab dashboard is great for quick links to services, there’s a lot more that goes into properly documenting a homelab than some bookmarks, and that’s where a proper knowledge base or wiki comes in. Outline is a great way to run your homelab’s knowledge base, and it deploys easily through Docker.
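Deploying Outline is mostly a matter of wiring it up to Postgres and Redis and generating a couple of secrets. The skeleton below is a sketch of that shape; Outline also expects an authentication provider and file-storage settings that I’ve left out, so follow the official docs for a production setup:

```yaml
# Skeleton Outline stack; auth provider and file storage are
# intentionally omitted - configure both per the official docs.
services:
  outline:
    image: outlinewiki/outline:latest
    ports:
      - "3000:3000"
    environment:
      URL: http://localhost:3000                     # your wiki URL
      SECRET_KEY: replace-with-openssl-rand-hex-32   # placeholder
      UTILS_SECRET: replace-with-openssl-rand-hex-32 # placeholder
      DATABASE_URL: postgres://outline:outline@db:5432/outline
      PGSSLMODE: disable
      REDIS_URL: redis://redis:6379
    depends_on:
      - db
      - redis
    restart: unless-stopped

  db:
    image: postgres:15-alpine
    environment:
      POSTGRES_DB: outline
      POSTGRES_USER: outline
      POSTGRES_PASSWORD: outline  # placeholder; use a real secret
    volumes:
      - outline-db:/var/lib/postgresql/data
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    restart: unless-stopped

volumes:
  outline-db:
```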

Why does your homelab need a knowledge base or wiki? Well, it’s the perfect place to document things like which servers get which IP addresses. As my homelab has grown, I’ve gone from a single reserved IP address to dozens, and I now have blocks of 10 or 20 addresses reserved for VMs on specific machines.

I also keep all of my control panels in blocks. For example, all of my Proxmox control panels live on .11-.20. Likewise, .21-.30 is reserved for storage servers, .31-.50 is set aside for VMs running on a specific server, and so on.
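In the wiki itself, that allocation plan is easiest to keep as a simple table. Using the ranges above, the page might look something like this:

```
Range      Reserved for
.11-.20    Proxmox control panels
.21-.30    Storage servers
.31-.50    VMs on one specific server
```

One row per block, updated whenever a new machine claims an address, and you never have to reverse-engineer your own network again.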

I’ve memorized most of this by now, but I still reference my homelab documentation sometimes when I’m dealing with an edge case. I also reach for it when I have to re-deploy a service that was difficult to deploy in the first place. I try to document things as I go if they take a while to configure, so if I ever have to reconfigure, it’s not as hard the second time.

The uses for a homelab wiki are endless, and you’ll find quite a few ways to use it that I’ve not even described here. You could also use Outline as a family knowledge base, storing important contacts, addresses, and more there for the entire family to access.

If you’re still writing things down on paper in your homelab, then give it a rest and deploy Outline today.

ACEMAGIC M5 mini PC

Brand: ACEMAGIC
CPU: Intel Core i7-14650HX

The ACEMAGIC M5 mini PC is perfect for setups that need high-performance desktop power in a small footprint. It boasts the 16-core, 24-thread Intel Core i7-14650HX processor and 32GB of DDR4 RAM (upgradable to 64GB). The pre-installed 1TB NVMe drive can be swapped out for a larger one, and there’s a second NVMe slot for extra storage if needed.



Stop putting off starting your homelab

Running a homelab has been one of the best things I’ve done in my adult life. I got started with this hobby as a teenager without realizing it, but once I became an adult, it became evident that I really loved homelabbing. Spinning up servers, running services, learning networking: it’s all great fun.

I’ve learned so much that I can now apply in both my personal and professional life. I never dreamed that one day I would be writing about homelabbing like this, or overseeing the rollout of a massive campus-wide networking setup for a church, as I am now.

I started homelabbing with just a few old computer parts, and now I have enterprise-level experience with both hardware and software, and it’s changed my life for the better. So, if you’ve been putting off starting your homelab, pull that old computer out of storage, fire it up, and get started. It’s more fun than you think.



Recent Reviews


As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just affecting RAM and storage manufacturers. Rather, they impact every company making any product that contains memory or storage—including graphics cards.

Since NVIDIA sells GPU-and-memory bundles to its partners, who solder them onto PCBs and add cooling to create full-blown graphics cards, the company doesn’t just have to battle other tech giants for a chunk of TSMC’s limited production capacity. It also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones already, judging by the GPU price surge that began at the start of 2026 and has kept gaming graphics cards overpriced ever since.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which runs from late January 2025 to late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. On the other hand, the data center division has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company—it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in almost ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming cards. It’s much more profitable to put every memory chip it can get its hands on into AI GPU racks and keep collecting mountains of cash from the AI behemoths buying them.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super; the same fate awaited the 16GB RTX 5070 Ti, while the 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the whole lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after the lineup missed the show, it now looks like NVIDIA has delayed it indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super refresh on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t all that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that we wouldn’t see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by SK Hynix’s chairman (the company is one of the world’s three largest memory manufacturers) warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it’s that current-gen gaming GPUs (NVIDIA’s RTX 50 series and AMD’s Radeon RX 9000 series) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, meanwhile, may be the future of gaming, but nobody seems to like it yet, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that isn’t an RTX 5090.

If you’re open to buying used GPUs, even last-gen graphics cards offer tons of performance and can handle any AAA game you throw at them. And while we likely won’t get a new gaming GPU from NVIDIA for at least a few years, the cards we have are great today and will continue to chew through any game for the foreseeable future.


