The Galaxy Z TriFold is laid to rest. Here’s what I want its successor to fix


The Samsung Galaxy Z TriFold is, by almost every measure, a phone that shouldn’t exist in the first place, and yet here we are: a massive 10-inch screen, two hinges, and a price tag that might make your wallet cry. 

Samsung knew it was a first-generation device, which is why it kept production intentionally limited, a controlled showcase of engineering ambition rather than a full market rollout. 

Still, “more hits than misses” is not the bar you set for a device that costs almost as much as two or three conventional smartphones. For now, the TriFold is gone, but its successor, the Galaxy Z TriFold 2, is reportedly on the company’s roadmap, perhaps already being sketched, argued over, and stress-tested in a lab. 

Spec | Samsung Galaxy Z TriFold
Display | 10-inch main (AMOLED, 120Hz) + 6.5-inch cover
Peak Brightness | 1,600 nits (main) / 2,600 nits (cover)
Chipset | Snapdragon 8 Elite for Galaxy
RAM / Storage | 16GB RAM / 512GB or 1TB
Rear Cameras | 200MP wide + 12MP ultrawide + 10MP 3x telephoto
Front Cameras | 10MP (cover) + 10MP (foldable screen)
Battery / Charging | 5,600mAh / 45W wired, 15W wireless
Ingress Protection | IP48
Dimensions | 3.9–4.2mm unfolded / 12.9mm folded / 309g

5 things that the Galaxy Z TriFold 2 desperately needs to fix

When the Galaxy Z TriFold 2 arrives, it needs to arrive differently: not just as a thinner, shinier version of the current-generation foldable, but as a phone that earns its place in more pockets. 

Here, frankly, is what needs to change in the Galaxy Z TriFold 2, because these fixes could make the difference between a phone people admire (from a distance) and one they actually want to buy. 

A thinner, more durable hinge and chassis

The original TriFold’s dual-hinge system was, in my opinion, an engineering marvel, but it was also the most obvious compromise. At 12.9mm thick when folded and weighing 309 grams, the TriFold seemed gargantuan compared to Samsung’s Fold 7. For those catching up, the Fold 7 measures 8.9mm thick and weighs just 215 grams. 

Now, I understand that two hinges will always take up more space than one, which explains the TriFold’s thickness. However, this is where the single-fold Fold 7 feels more like a polished product, and the TriFold doesn’t. The good news is that the company already knows this. 

Recent rumors suggest that Samsung is developing an “entirely new hinge solution” from the ground up for the TriFold 2, with the objective of making it meaningfully slimmer. Thinness alone, however, is not enough. If the phone wants to be considered as a daily driver, it needs to survive the brutal reality of everyday life. 

Dust, drops, loose items jostling around inside a bag, and the pressure a tight jeans pocket puts on a phone: the TriFold 2 must survive all of this better than its predecessor, and slimming the hinge shouldn’t come at the cost of structural integrity. 

Phone | Type | Unfolded Thickness | Folded Thickness | Weight
Samsung Galaxy Z TriFold | Tri-fold | 3.9–4.2mm | 12.9mm | 309g
Huawei Mate XT Ultimate | Tri-fold | 3.6–4.8mm | 12.8mm | 298g
Samsung Galaxy Z Fold 7 | Single-fold | 4.2mm | 8.9mm | 215g
Google Pixel 10 Pro Fold | Single-fold | 5.2mm | 10.8mm | 257g

A better ingress protection rating

The Galaxy Z TriFold shipped with an IP48 rating, the same as the Fold 7, and already better than the Huawei Mate XT (which came with an IPX8 rating without any dust protection). 

However, “better than Huawei’s Mate XT” isn’t exactly a glorifying benchmark, especially when the Pixel 10 Pro Fold has become the first foldable to achieve a full IP68 rating, the same as conventional flagships. 

For a device positioned as the pinnacle of Samsung’s engineering, IP48 feels less than reassuring. The TriFold 2, in my opinion, needs IP68 as a baseline, and so does the Fold 7. 

Phone | Type | IP Rating | Dust Protection | Water Protection
Samsung Galaxy Z TriFold | Tri-fold | IP48 | Partial (particles over 1mm) | Up to 1.5m for 30 mins
Samsung Galaxy Z Fold 7 | Single-fold | IP48 | Partial (particles over 1mm) | Up to 1.5m for 30 mins
Huawei Mate XT Ultimate | Tri-fold | IPX8 | None | Up to 1.5m for 30 mins
Google Pixel 10 Pro Fold | Single-fold | IP68 | Full (dust-tight) | Up to 1.5m for 30 mins
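For the curious, those two digits decode mechanically. Here’s an illustrative Python sketch of how the table above maps onto the rating codes; the mappings are simplified from IEC 60529, and the `decode_ip` helper is my own invention, not anything the manufacturers publish:

```python
# Decode an IP (Ingress Protection) rating into its two digit meanings.
# Simplified from IEC 60529; "X" means that digit was not rated at all.

SOLID = {
    "X": "not rated",
    "4": "objects over 1mm (partial dust protection)",
    "5": "dust-protected (limited ingress)",
    "6": "dust-tight (full protection)",
}
WATER = {
    "X": "not rated",
    "7": "immersion up to 1m for 30 minutes",
    "8": "continuous immersion beyond 1m (e.g. 1.5m for 30 minutes)",
}

def decode_ip(rating: str) -> tuple[str, str]:
    """Split e.g. 'IP48' into (solid-ingress meaning, water-ingress meaning)."""
    solid, water = rating.removeprefix("IP")  # two remaining characters
    return SOLID.get(solid, f"level {solid}"), WATER.get(water, f"level {water}")

print(decode_ip("IP48"))  # TriFold / Fold 7
print(decode_ip("IP68"))  # Pixel 10 Pro Fold
print(decode_ip("IPX8"))  # Mate XT Ultimate
```

So the jump from IP48 to IP68 is entirely about the first digit: the water story stays the same, but the dust story goes from “particles over 1mm” to fully dust-tight.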

Higher peak brightness for the inner display

Screen real estate is the TriFold’s entire argument. It’s the reason you pay the premium: the promise of a large-screen foldable that still (technically) fits in your pocket. That’s why it’s genuinely baffling that the phone’s main 10-inch screen peaks at just 1,600 nits, lower than the Galaxy Z Fold 5’s inner screen from 2023. 

For context, the Galaxy Z Fold 7’s inner screen hits 2,600 nits, as do the Galaxy S26 Ultra and the TriFold’s own cover screen. These might sound like abstract numbers, but they matter enormously when you’re using the phone outdoors, under direct sunlight. 

It’s the difference between confidently reading the screen on the street on a bright sunny day and ducking into the shade just to read a notification and reply. Given Samsung’s display expertise, I would really appreciate a brighter panel for everyday use, on par with modern flagships and regular foldables. 

Phone | Type | Inner Display Brightness | Cover Display Brightness
Samsung Galaxy Z TriFold | Tri-fold | 1,600 nits | 2,600 nits
Samsung Galaxy Z Fold 7 | Single-fold | 2,600 nits | 2,600 nits
Google Pixel 10 Pro Fold | Single-fold | 3,000 nits | 3,000 nits

A more powerful chip for better multitasking

The Galaxy Z TriFold featured the Snapdragon 8 Elite chip, the most powerful smartphone chip at the time. However, due to thermal constraints, the device throttled harder than other 8 Elite-powered smartphones, such as the S25 Ultra.

While I’m not expecting the TriFold 2 to fix that issue entirely, given that it would also feature a thin chassis with very limited space for a dedicated cooling mechanism, a chipset upgrade could surely improve multitasking, gaming, and overall responsiveness. 

This year, the TriFold 2 should feature the Snapdragon 8 Elite Gen 5 chip, the one we’ve seen on the Galaxy S26 Ultra (globally) and the S26 and S26 Plus (in the U.S., China, and Japan). Even with thermal throttling, the chipset could surely unlock a meaningful performance upgrade. 

Phone | Type | Chipset | Availability
Samsung Galaxy Z TriFold | Tri-fold | Snapdragon 8 Elite for Galaxy | Global
Samsung Galaxy Z Fold 7 | Single-fold | Snapdragon 8 Elite for Galaxy | Global
Samsung Galaxy S25 Ultra | Slab | Snapdragon 8 Elite for Galaxy | Global
Samsung Galaxy S26 Ultra | Slab | Snapdragon 8 Elite Gen 5 for Galaxy | Global

The TriFold 2 desperately needs better selfie cameras

The Galaxy Z TriFold’s rear camera setup remains a strength. A 200MP main camera, a 10MP telephoto, and a 12MP ultrawide let users play with multiple perspectives and zoom levels to get the shot they want without moving around too much. The selfie cameras, however, are a different story. 

The TriFold’s selfie setup is symmetrical: a 10MP (f/2.2) camera on the cover screen and another 10MP (f/2.2) camera on the main 10-inch foldable screen. In my opinion, that isn’t what buyers expect from one of the most expensive smartphones money can buy. 

To be fair, selfie quality may not be a dealbreaker for most buyers, and I’m not convinced many of them judge it critically anyway; the software is doing most of the heavy lifting there.

I appreciate the ultrawide field of view from the inner-screen sensor, I really do, as it helps fit more people into a selfie, but I sincerely want Samsung to increase the resolution of both sensors. The selfie cameras could also use slightly larger sensors for better low-light performance. 




Recent Reviews


The battle between AMD and NVIDIA seems to rage on eternally, though in the desktop PC market it’s rather one-sided: NVIDIA holds something like 95% share, with AMD taking most of what’s left apart from Intel’s (almost) 1%.

But as dominant and popular as NVIDIA is, AMD proponents could always raise the value argument. On a per-dollar basis, you get more with an AMD card, and better still, you had the benefit of AMD “FineWine,” the tendency of its cards to actually get better with time.

What “FineWine” meant—and why it mattered

FineWine was something AMD fans began to notice during the GCN (Graphics Core Next) era. Incidentally, the last dedicated AMD GPU I bought was the R9 390, which was of that lineage. Since then, all my AMD GPUs have been embedded in consoles or handheld PCs, but I digress.

The R9 390 is actually a good example of FineWine. Launched in 2015, like many AMD cards it had a rough start, and I sold mine in exchange for a stopgap card in the form of the RTX 2060, because I wanted to play Cyberpunk 2077 on PC, where it wasn’t broken the way it was on consoles. On paper, the RTX 2060 wasn’t much more powerful than a 390, yet the AMD card was a stuttery mess on my (then) 1080p monitor, whereas everything suddenly ran great on my 2060 the minute the AMD GPU was expunged from the system.

But, a decade later, that same game is perfectly playable on this card, as you can see in this TechLabUK video.

A lot of that is because the developers have kept patching and improving the game, but you see the same pattern across the board for AMD cards in various games. That is FineWine: years later, with continued driver updates from AMD, the cards go from being a little worse than their NVIDIA equivalents at launch to as good, or even a little better, in the long run.

Of course, that’s not super helpful to customers who buy hardware at launch, but it has given some AMD users computers with longer lifespans than you’d think, and made many used AMD cards an even better bargain.

Why AMD’s FineWine era worked

A bit of smoke and mirrors

The PULSE AMD Radeon RX 6800 XT next to an AMD RX 6600 XT Phantom Gaming D. Credit: Ismar Hrnjicevic / How-To Geek

FineWine wasn’t magic, of course. The phenomenon was the result of a mix of factors. AMD’s architectures were in some cases a little too forward-thinking for the APIs of the day. Massively parallel with a focus on compute, they’d only come into their own with DirectX 12 and more modern games. NVIDIA’s cards at the time were better optimized to run current games well. Over time, NVIDIA cards would make similar architectural changes, but with better timing.

The other reason FineWine was a thing came down to driver maturity. As a much smaller company with fewer resources, AMD seemingly had trouble launching cards with fully optimized drivers. Over time, as the drivers matured, a card would start performing as intended.

In both cases, you could frame FineWine not as the card getting better, but rather getting “less worse” over time. If you set the bar low at launch, the only way is up. However, there’s a third factor to take into account as well. AMD dominates console gaming. The two major home console series have now run on AMD GPUs for two generations, and so games are developed with that hardware in mind. This also gives newer titles a bit of a leg up, though it’s hard to know exactly by how much.

How AMD moved on from FineWine

It seems worse, but it’s actually better

An AMD RX 9070 XT Gigabyte gaming graphics card. Credit: Ismar Hrnjicevic / How-To Geek

With the shift to RDNA architecture, AMD made a deliberate change in philosophy. Modern Radeon GPUs are designed to perform well right out of the gate. Reviews on day one are much closer to what you could expect years later. There are still decent gains to be had on RDNA cards with game-specific optimizations (Spider-Man on PC is a great example), but the golden age of FineWine seems to be in the past now.

That’s a good thing! Products should put their best foot forward on day one, so let’s not shed a tear for FineWine in that regard. It’s not that AMD no longer cares about improving the performance and stability of older cards over the years; it’s that the company is now better at its job, so there’s less room for improvement.

Sapphire NITRO+ AMD Radeon RX 9070 XT GPU

Cooling Method: Air | GPU Speed: 2,520MHz

The AMD Radeon RX 9070 XT from Sapphire features 16GB of GDDR6 memory, two HDMI ports, two DisplayPorts, and an overengineered cooling setup that keeps the card cool and whisper quiet no matter the workload.


NVIDIA kept the idea—but changed the formula

It’s all about AI

It’s funny, but these days I think of NVIDIA cards as the ones with major longevity. Take the venerable GTX 1080 and 1080 Ti. Those cards only lost game-ready driver support in 2025, which doesn’t immediately make them useless; it just means no more game-specific optimizations for those chips. What an incredible run, getting a decade of relevant game performance from a GPU!

But, that’s not really NVIDIA’s take on FineWine. Instead, the company has taken to adding new and better features to its cards long after they’ve been launched. Starting with the 20-series, the presence of machine-learning hardware means that by improving the AI algorithms for technologies like DLSS, these cards have become more performant with better image quality over time.

While NVIDIA has made some of its AI features exclusive to each generation, so far every post-10-series GPU has benefited from each new generation of DLSS upscaling. Compare that to AMD, which not only offers an inferior version of this upscaling technology, but has locked the better, more usable versions to later cards, as is the case with FSR Redstone.


FineWine is an ethos, not a brand

In the case of my humble RTX 4060 laptop, the release of DLSS 4.5 has opened new possibilities, notably the ability to target a 4K output resolution, which was certainly not on the table when I first took this computer out of the box. We might not call it “FineWine,” but it sure smells like it to me!


