I tested the cheapest HDMI 2.1 cables I could find, and they handled 4K gaming flawlessly


HDMI and DisplayPort cables are among the most overpaid-for items in tech, especially by people who don’t know any better. From a non-techie’s perspective, it makes sense to assume that you need a premium cable to go with a $1,000 4K OLED TV. You can’t just plug in a $5 cable and expect everything to work, right?

Shockingly, that’s exactly what I did—I used a knock-off, unbranded, and uncertified HDMI cable to connect my gaming PC to my TV, and everything worked just fine. Allow me to explain.

A digital cable either works or it doesn’t

They’re about as binary as it gets

Two VGA cables intertwined. Credit: Tim Brookes / How-To Geek

Back in the day, when dinosaurs still roamed the Earth and people used analog cables to connect their PCs and TV receivers to monitors and TVs, cable quality really did matter.

If an analog cable like VGA or RCA used poor-quality copper or had poor soldering, the signal could degrade, resulting in a dimmer and less stable image.

They were also extremely sensitive to interference. If you ran a poorly shielded cable near a power strip, you could easily experience issues like ghosting, flickering, or even horizontal or vertical lines running across the screen.

Fortunately, it’s been many years since we moved to newer standards like DisplayPort and HDMI, both of which are digital.

Digital signals can still suffer from interference and electrical noise, but because they encode data as 1s and 0s, they’re far more robust and easier to reconstruct at the receiving end. These cables also use differential signaling, meaning they send the signal across a pair of wires in opposite phases to help cancel out interference.
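To make the differential-signaling idea concrete, here’s a minimal sketch in Python with made-up voltage and noise numbers. Real HDMI and DisplayPort links layer encoding schemes (TMDS, FRL) on top of this, so treat it as an illustration of the principle only:

```python
# Differential signaling sketch: the same interference hits both wires,
# so subtracting them at the receiver cancels the noise.
# All numbers are invented for illustration.

signal = [1, -1, 1, 1, -1]          # data bits encoded as voltage swings
noise  = [0.4, 0.3, 0.5, 0.2, 0.4]  # interference couples equally into both wires

# Each bit travels as a pair: one wire carries +signal, the other -signal.
wire_p = [s + n for s, n in zip(signal, noise)]
wire_n = [-s + n for s, n in zip(signal, noise)]

# The receiver takes the difference: common-mode noise cancels out,
# and the original signal is recovered exactly.
recovered = [(p - n) / 2 for p, n in zip(wire_p, wire_n)]
print(recovered)  # [1.0, -1.0, 1.0, 1.0, -1.0] -- identical to `signal`
```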

This all sounds complicated, but what matters in practice is simple: as long as transmission errors stay within a certain limit, a digital interface can correct or ignore them, and you still get a perfect image. On the flip side, if the cable is affected by too much interference or has a physical defect, you’ll quickly run into issues, like flickering, signal dropouts, or, most commonly, a black screen.

This is called the “cliff effect”: unlike an analog signal that gradually degrades, a digital signal holds up perfectly until it hits a limit, at which point quality abruptly falls off a cliff. In practice, this means that if you plug in an HDMI or DisplayPort cable and don’t notice flickering, artifacts, or signal drops, you’re already getting the best possible image quality.
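Here’s a toy model of that cliff, with invented numbers for the bit-error rate and the tolerable error threshold. Below the threshold the picture looks flawless; past it, the link fails abruptly rather than degrading gracefully:

```python
import random

def frame_errors(bit_error_rate, bits=100_000):
    """Count how many bits get corrupted in one 'frame' of data."""
    return sum(random.random() < bit_error_rate for _ in range(bits))

def picture(bit_error_rate, tolerable_errors=20):
    """A digital link looks perfect while errors stay within its tolerance."""
    errors = frame_errors(bit_error_rate)
    return "perfect picture" if errors <= tolerable_errors else "dropouts / black screen"

for ber in (1e-6, 1e-5, 1e-4, 1e-3):
    print(f"bit error rate {ber}: {picture(ber)}")
```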

The cheapest HDMI and DisplayPort cables really do work

Budget doesn’t have to break functionality

You’re probably wondering why I used the cheapest DisplayPort 1.4 and HDMI 2.1 cables to connect my TV and secondary monitor to my PC. The obvious answer is cost, but I also wanted to see just how bad a cable I could get away with before running into issues.

The reviews seemed decent, so I ordered a pretty cheap, unbranded HDMI 2.1 10-foot cable and a DP 1.4 10-foot cable off Temu for around $6 each. I essentially got these cables for half the price of a decent budget option like the Anker Certified Ultra High-Speed HDMI 2.1 cable, so not bad.

Surprisingly, these cheap cables aren’t the worst quality in the world. Although they’re significantly thinner than the OEM cables that came with my 240Hz LG monitor—which likely means they use thinner gauge wiring—they still have nylon braiding, gold plating (which doesn’t matter), and a rubber neck that protects them from sharp bends.

The DisplayPort cable completely lacks a locking mechanism, which means it doesn’t exactly inspire confidence when plugging it in, but once it’s fully inserted, it works just fine.

As for the signal, I wasn’t too worried about the DisplayPort cable for my secondary monitor, since it only needed to carry a 1080p signal at 100Hz. The required bandwidth is a fraction of what DisplayPort 1.4 can handle.

My TV, however, really put the ultra-cheap and relatively long HDMI 2.1 cable to the test. I pushed it to its limits by running various games at 4K and 120Hz with FreeSync, 12-bit color depth, and HDR enabled, and to my surprise, the cable handled it all without issue. I never experienced signal dropouts or any other glitches with this cheap, flimsy-looking cable.
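For a rough sense of scale, here’s a back-of-the-envelope comparison of what each link had to carry versus the commonly cited effective data rates of DP 1.4 (HBR3) and HDMI 2.1 (FRL). It counts raw pixel payload only and ignores blanking intervals, and 4K at 120Hz with 12-bit color typically leans on DSC compression in practice, so the exact real-world figures will differ:

```python
# Back-of-the-envelope bandwidth check (raw pixel payload only; blanking
# intervals and DSC compression are ignored, so these are rough numbers).

def payload_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Secondary monitor: 1080p at 100Hz, 8-bit RGB (24 bits per pixel)
monitor = payload_gbps(1920, 1080, 100, 24)   # ~5.0 Gbps

# TV: 4K at 120Hz, 12-bit RGB (36 bits per pixel)
tv = payload_gbps(3840, 2160, 120, 36)        # ~35.8 Gbps

dp14_effective   = 25.92  # DP 1.4 HBR3: 32.4 Gbps raw minus 8b/10b encoding overhead
hdmi21_effective = 42.67  # HDMI 2.1 FRL: 48 Gbps raw minus 16b/18b encoding overhead

print(f"1080p100 payload: {monitor:.1f} Gbps vs DP 1.4's {dp14_effective} Gbps")
print(f"4K120 12-bit payload: {tv:.1f} Gbps vs HDMI 2.1's {hdmi21_effective} Gbps")
```

The monitor barely scratches DP 1.4’s capacity, while the TV pushes the HDMI cable close to its ceiling, which is exactly why it was the better stress test.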


Better quality cables can still save you headaches

Going cheap can show its limits in edge cases

An Amazon Basics High-Speed HDMI Cable. Credit: Hannah Stryker / How-To Geek

Although I got lucky with my cheap cables, that doesn’t necessarily mean you should buy the cheapest cables on the market. It also doesn’t mean you should rush out and buy an $80 HDMI cable; there’s a balancing act here.

First and foremost, if you want to reliably get the maximum bandwidth and features supported by a given version of DisplayPort or HDMI, you should use a certified cable.

For example, to take advantage of the full 48 Gbps bandwidth of HDMI 2.1, you need an “Ultra High Speed HDMI” certified cable, usually identified by an official certification label and QR code on the packaging. Without it, you may not reliably achieve full HDMI 2.1 performance, and depending on cable quality and length, you could be limited to lower resolutions or refresh rates (such as 4K at 60Hz instead of 120Hz).

Another advantage of a high-quality cable is physical durability. A thicker cable is generally harder to damage internally than a thin one.

The only time you should consider investing in a more expensive cable is when you’re running unusually long distances that exceed the normal passive limits for HDMI or DisplayPort, where active cables or optical solutions may be required.


Certified cables are cheap and easy to find—so buy them whenever you can

The simplest, safest choice

You should never overspend on a digital cable. If you’re buying an HDMI cable, you just need to make sure it’s the right length for your setup and that it’s an Ultra High Speed (HDMI 2.1) certified model from a reputable brand. The same goes for DisplayPort—the most reliable choice is a VESA Certified DisplayPort cable.




Recent Reviews


As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green now leads the Magnificent Seven of the tech world, having surpassed the other six in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just affecting RAM and storage manufacturers. They impact every company making any product that contains memory or storage, including graphics cards.

NVIDIA sells GPU-and-memory bundles to its partners, who solder them onto PCBs and add cooling to create full-blown graphics cards. That means NVIDIA doesn’t just have to battle other tech giants to secure a chunk of TSMC’s limited production capacity for its GPU chips; it also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, judging by the GPU price surge that began in early 2026 and has kept gaming graphics cards overpriced ever since.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which runs from late January 2025 to late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. The data center division, on the other hand, has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company; it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in more than ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip it can get its hands on into AI GPU racks and keep collecting mountains of cash by selling them to AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super; the same fate awaited the 16GB RTX 5070 Ti; and the 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the whole lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but the lineup missed the show, and it now looks like NVIDIA has delayed it indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that we wouldn’t see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by the chairman of SK Hynix, one of the world’s three largest memory manufacturers, warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 9000) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, on the other hand, may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, at least the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


