Every few months, it’s worth asking yourself how much of your software spending is actually necessary. Open-source apps have quietly closed the gap with many paid tools, and in some cases, they’ve gone even further than their commercial counterparts in specific, practical ways.
If you have some free time this weekend, here are three apps worth trying first. Each one helps cut a different kind of expense, and at least one of them is probably relevant to something you’re already paying for.
All of these apps work on Linux, Windows, and macOS.
HandBrake
Shrink your storage bill by re-encoding your videos
HandBrake is a free, open-source video transcoder. You can use it to convert video files between formats, rip DVDs and Blu-rays, or compress footage into smaller file sizes using a more efficient codec. The obvious cost savings come from skipping paid tools like Adobe Media Encoder or Wondershare UniConverter. But for some people, HandBrake can save hundreds of dollars a year.
For example, I travel frequently, and after every trip I come back with around 1TB of raw footage. Four trips a year means roughly 4TB of video annually. That kind of storage adds up quickly—whether you’re buying external hard drives or paying for cloud storage through services like Google One or iCloud.
What HandBrake does is convert that footage from H.264—the default codec used by most cameras and phones—to H.265 (HEVC), which is significantly more efficient. In my experience, that conversion reduces file sizes by around 75%. So instead of storing 4TB of footage every year, I only need about 1TB while maintaining nearly identical visual quality. That translates directly into lower storage costs over time.
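If you'd rather script the conversion than click through the GUI, HandBrake also ships a command-line front end, HandBrakeCLI. Here's a minimal sketch; the filename is a placeholder, and `-q 24` is just a reasonable starting quality (lower values mean better quality and larger files):

```shell
# Build the HandBrakeCLI invocation for one clip. This echoes the command
# as a dry run; delete the leading "echo" to actually re-encode.
f="trip-footage.mp4"          # placeholder input file
out="${f%.mp4}-h265.mkv"      # same base name, new codec and container
# -e x265 selects the software HEVC (H.265) encoder; -q sets the
# constant-quality target.
echo HandBrakeCLI -i "$f" -o "$out" -e x265 -q 24
```

For a whole folder of footage, wrap the same command in a `for f in *.mp4` loop.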
Upscayl
Stop paying the 4K tax
Upscayl is a free, open-source AI image upscaler that runs entirely offline. It uses local AI models to increase an image’s resolution, making low-resolution images look sharper and more detailed. You can technically upscale images by up to 16x, though the app itself recommends staying at 4x or lower since artifacts become more noticeable beyond that point.
The obvious financial benefit to using Upscayl is avoiding paid tools like Topaz Gigapixel. But there’s a more practical use case here for people like me who download a lot of wallpapers.
You’ve probably noticed that many wallpaper websites offer 1080p downloads for free but lock the 4K versions behind a paywall. That becomes even more frustrating if you use an ultrawide monitor, where high-resolution wallpapers are already harder to find. With Upscayl, you can download the free 1080p version and upscale it yourself. The results are often good enough for desktop use, even on large ultrawide displays, without the obvious pixelation you’d get from standard image scaling.
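As a quick sanity check on how much upscaling that actually takes: 4K UHD is exactly twice 1080p along each axis, so even Upscayl's recommended 4x ceiling leaves plenty of headroom:

```shell
# 1080p doubled per axis lands exactly on 4K UHD (3840x2160); a 5120x2160
# ultrawide target needs under 3x on the long edge.
w=1920; h=1080; scale=2
echo "$((w * scale))x$((h * scale))"   # prints 3840x2160
```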
There’s also a useful angle for AI image generation. On many platforms, generating a 4K image costs significantly more credits than generating one at 1080p. However, you’re most likely paying for the composition, lighting, and prompt result—not the raw resolution itself. As such, a smarter workflow is to generate the image at 1080p and upscale it afterward with Upscayl. That lets you stretch your subscription credits much further while still ending up with a high-resolution image.
Ollama
Run basic LLM-powered automations for free
Ollama lets you run AI language models locally on your own hardware. That means no recurring subscriptions, no API fees, and no concerns about your data leaving your machine. However, people who approach Ollama as a drop-in replacement for ChatGPT usually end up disappointed: most local models that can realistically run on consumer hardware simply aren't as capable as Claude or ChatGPT, even compared to the free tiers of those services.
But that’s also missing the point. Local LLMs don’t necessarily need to function as general-purpose chatbots. Where Ollama really shines is automation.
Ollama mimics OpenAI’s API structure, which makes it easy to plug local models into automation tools like n8n. Instead of paying per request to a cloud AI provider, you can run many of those workflows entirely on your own machine.
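Concretely, Ollama serves an OpenAI-compatible endpoint on its default port (11434), so anything that can call the OpenAI API can point at it instead. Here's a sketch of a chat request; the model tag is just an example, so substitute whatever you've pulled with `ollama pull`:

```shell
# Request body in OpenAI chat-completions format; Ollama accepts the same
# shape at http://localhost:11434/v1/chat/completions.
payload='{"model":"qwen3:8b","messages":[{"role":"user","content":"Turn this voice note into a bullet list"}]}'
echo "$payload"
# With an Ollama server running, send it like any OpenAI-style request:
# curl http://localhost:11434/v1/chat/completions \
#   -H "Content-Type: application/json" -d "$payload"
```

In n8n, this usually just means pointing the OpenAI credential's base URL at your local Ollama instance rather than api.openai.com.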
For example, I was previously spending around $5–$10 per month on OpenAI API credits for a handful of personal workflows:
- Turning transcribed voice notes into structured notes in Obsidian
- Creating Google Calendar events from voice commands
- Analyzing and renaming screenshots for my articles
- Reading photos of receipts and logging them into a budget spreadsheet
- Sending natural-language commands to Home Assistant
Now, none of these tasks require frontier-level intelligence, and I’ve been able to offload all of them to a local Qwen3 8B model running through Ollama. Using 4-bit quantization, I get roughly 20–30 tokens per second on an NVIDIA GeForce RTX 3060 with 12GB of VRAM.
These FOSS apps are even better if your PC has a GPU
While a GPU isn’t strictly necessary for running these FOSS apps, having one makes a big difference: faster encodes, quicker upscales, and snappier inference. If you’ve got a discrete graphics card in your machine, you’re already set up to get the most out of everything here.