My favorite PDF tool keeps letting me down, so I switched to its open-source rival


For a long time, my relationship with PDF tools followed a predictable arc. I would try something bloated, uninstall it within a week, and go back to stitching together command line utilities like a person who clearly had too much patience and not enough time. Then I found Stirling-PDF, and for once, the story seemed to end differently.

It was simple and ran locally without being a document management system disguised as a utility. It felt like the kind of tool you install and forget about, which is usually the highest compliment in this category. For a while, it worked exactly like that, but then it started to drift.

When reliability stops being invisible

Subtle failures start breaking trust

The first issue I noticed was subtle enough that I assumed it was my fault. I merged a few PDFs, checked the output, and something felt off. One of the original files had been overwritten. There was no warning, prompt, or indication that anything unusual had happened. That alone would have been concerning, but still explainable. Maybe I misclicked something, or there was a naming collision I did not notice.

Merging multiple PDF pages using Stirling-PDF.

Then it happened again, and then splitting started behaving strangely. I would take a document, split it into two parts, and only one would actually be saved. The process would complete without errors, which is arguably worse than failing loudly. Silent failure creates a kind of ambiguity that makes you distrust not just the tool but your own workflow, and once that happens, the entire value proposition collapses.
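One cheap defense against this kind of silent failure is to verify outputs explicitly rather than trusting a process that reports success. Here is a minimal, tool-agnostic sketch in Python; the helper and file names are hypothetical, not anything Stirling-PDF provides:

```python
from pathlib import Path
import tempfile

def missing_outputs(out_dir, expected_names):
    """Return the expected output files that were never written."""
    out = Path(out_dir)
    return [name for name in expected_names if not (out / name).exists()]

# Simulate a split that "completed without errors" but wrote only one part.
with tempfile.TemporaryDirectory() as d:
    (Path(d) / "report_part1.pdf").write_bytes(b"%PDF-1.7 stub")
    gaps = missing_outputs(d, ["report_part1.pdf", "report_part2.pdf"])
    print(gaps)  # → ['report_part2.pdf']
```

Even a check this small turns a silent failure back into a loud one.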

Control in exchange for predictability

There is an implicit contract when you use a local, self-hosted tool. You give up convenience, polish, and sometimes performance in exchange for control and predictability.


Stirling-PDF originally honored that contract well. It did not feel like a black box, just a thin layer over reliable operations.

The recent behavior breaks that assumption. It is not that the tool is crashing or refusing to work. It is that it sometimes does the wrong thing without telling you, which keeps you on edge because you never know exactly what to expect.

It’s not just me

Others started noticing similar issues

At this point, I did what most people do when something feels off but not obviously broken. I looked around to see if others were noticing the same pattern.

The answer, unsurprisingly, was yes. People reported high CPU usage even when idle. Others mentioned unusually large container sizes. Some ran into memory consumption that did not align with what the tool was actually doing. There were also concerns about tracking behavior introduced at some point, which did not sit well with users who explicitly chose a self-hosted solution to avoid that class of problem.

None of these issues on their own would be fatal. Together, they start to form a pattern. The tool is no longer as lightweight, predictable, or transparent as it once was. At that point, the question shifts from “How do I fix this?” to “Should I still be using this?”

Looking for alternatives without lowering standards

Finding balance between simplicity and control


Finding alternatives in this space is deceptively difficult. There are plenty of PDF tools, but most fall into one of two categories.

The first category is enterprise software that assumes you are managing workflows across teams and departments (often loaded with AI features). The second is web-based tools that promise convenience but require you to upload documents to servers you do not control, with no way of knowing what happens to them.

What I wanted was something closer to the original spirit of Stirling-PDF: local, simple, and reliable. That is how I ended up trying Bento PDF.

The appeal of doing less

Simpler design changes the experience

The design philosophy behind Bento PDF is noticeably different. Instead of running as a heavy backend service, it operates entirely in the browser. That sounds like a limitation at first, but in practice it simplifies a lot of things.

Screenshot of the Bento PDF user interface.

There is no long-running container consuming resources in the background, and there is no question about whether your files are being processed locally or somewhere else. Everything happens in the same place you interact with the interface. That alone removes an entire class of problems, and it also changes how you think about the tool. Instead of being infrastructure, it becomes a utility again.

Where it starts feeling effortless

Everything feels lighter in daily use

One of the more surprising differences is how light Bento PDF feels in day-to-day use. Operations that felt slightly heavy in Stirling-PDF now feel immediate.

This is not necessarily because it does less work, but because the execution model is simpler: there is no overhead from a containerized backend, no idle resource consumption, and fewer moving parts overall.

The result is not that dramatic in a benchmark sense, but it is noticeable in practice. You stop thinking about the tool again, which is where you want to be.


Trust is built on small guarantees

Predictability matters more than features

The biggest improvement, though, is not performance. It is predictability. When you merge files in Bento PDF, it produces a new file. It does not overwrite anything unless you explicitly choose to replace something. When you split a document, all expected outputs are generated.

These sound like trivial guarantees, but they are exactly what broke in my experience with Stirling-PDF. Once a tool violates these assumptions, every operation becomes a small risk. You start keeping backups of temporary files and generally spend more time verifying than doing. That overhead is easy to underestimate until it is gone.
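That never-overwrite guarantee is easy to state precisely: if the target name is taken, pick a fresh one. Here is a minimal sketch of the behavior in Python, as an illustration of the contract rather than Bento PDF's actual code:

```python
from pathlib import Path

def safe_output_path(path):
    """Return `path` untouched if it is free; otherwise append
    ' (1)', ' (2)', ... so an existing file is never overwritten."""
    p = Path(path)
    if not p.exists():
        return p
    n = 1
    while True:
        candidate = p.with_name(f"{p.stem} ({n}){p.suffix}")
        if not candidate.exists():
            return candidate
        n += 1
```

A merge that writes to `safe_output_path("merged.pdf")` can still fail in plenty of ways, but clobbering an existing file is no longer one of them.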

Trade-offs still exist

Fewer features but more reliability

Switching tools is never a purely positive story. Bento PDF does not replicate every feature one-to-one. For example, comparison features in Stirling-PDF are more mature. Depending on your workflow, that might matter. If you rely on specific advanced operations, you may find gaps, but this is where priorities matter.



I would rather have a smaller set of features that behave consistently than a larger set that occasionally fails in ways that are hard to detect.


Where I ended up

Right now, my setup is simpler than it was before. There is one less container running and one less service to monitor.

More importantly, I am no longer thinking about my PDF tool at all. Which, in a strange way, is the best outcome possible.






As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even Nvidia is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just affecting RAM and storage manufacturers. Rather, this impacts every company making any product that contains memory or storage—including graphics cards.

Since NVIDIA sells GPU and memory bundles to its partners, which then solder them onto PCBs and add cooling to create full-blown graphics cards, NVIDIA doesn’t just have to battle other tech giants to secure a chunk of TSMC’s limited production capacity for its GPU chips. It also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, considering the GPU price surge that began at the beginning of 2026, with gaming graphics cards still being overpriced.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which runs from late January 2025 to late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. On the other hand, the data center division has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company—it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in almost ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip it can get its hands on into AI GPU racks and continue receiving mountains of cash by selling them to AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the same fate awaited the 16GB RTX 5070 Ti, and the 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after missing the show, it now looks like NVIDIA has delayed the lineup indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that we wouldn’t see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by SK Hynix’s chairman (the company is one of the world’s three largest memory manufacturers) warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 90) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, on the other hand, may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, at least the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


