Why ’90s giant TVs looked amazing but were actually terrible to own


I clearly remember the first time I walked into someone’s house in the 1990s and saw a rear-projection TV. At the time, we had just upgraded to a 20-inch Panasonic TV, which was a big step up from the 11-inch TV we’d had up to that point.

So imagine how it felt to walk into someone's house and see a TV with a screen at least 50 inches in size. And remember, this was a TV with a 4:3 aspect ratio, so that's even more surface area than a modern widescreen with the same diagonal measurement. The big picture impressed me, even if my dad's friends only wanted to watch boring sports on it, but the truth is that these big 1990s TVs were actually pretty awful.
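That surface-area claim checks out with a little geometry: for a screen with diagonal d and aspect ratio w:h, the area works out to d^2 * wh / (w^2 + h^2). Here's a quick sketch (the 50-inch diagonal is just an example figure):

```python
def screen_area(diagonal: float, w: int, h: int) -> float:
    """Area in square inches of a screen with the given
    diagonal and w:h aspect ratio."""
    # width = diagonal * w / sqrt(w^2 + h^2), height likewise,
    # so area = diagonal^2 * (w * h) / (w^2 + h^2)
    return diagonal ** 2 * (w * h) / (w ** 2 + h ** 2)

area_4x3 = screen_area(50, 4, 3)    # 1200.0 sq in (40" x 30")
area_16x9 = screen_area(50, 16, 9)  # ~1068.2 sq in

# A 50-inch 4:3 screen has roughly 12% more area than a 50-inch 16:9 one.
print(f"{area_4x3:.0f} vs {area_16x9:.0f} sq in "
      f"({area_4x3 / area_16x9 - 1:.0%} more)")
```

So a 50-inch 4:3 set really did fill more of your field of view than a modern 50-inch widescreen does.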

Why rear-projection TVs felt like the future

It still looks the part

A 1990s RPTV, at least while turned off, wouldn't look too out of place in a modern home. At least not from the front. I wouldn't even see a flat-screen CRT TV until the early 2000s, so seeing these big flat TVs in the 1990s definitely felt like the future.

RPTVs were typically between 40 and 60 inches in size. To put that in perspective, the largest CRT TV ever made is the Sony Trinitron PVM-4300, and it weighs 440 pounds!

Huge Sony PVM CRT with specifications. Credit: Sony

The biggest CRT TVs you’d typically see were about 30 inches, and the biggest CRT my parents ever owned clocked in at 29 inches.

Today, I am the proud owner of a 34-inch Trinitron, coming in at a back-breaking 165 pounds. That’s a 12.9-inch iPad on top, for reference.

Star Trek on CRT TV using an iPad. Credit: Sydney Louw Butler / How-To Geek

RPTVs allowed screen sizes to scale way beyond this limit without the enormous weight. Of course, virtually all 1990s and earlier RPTVs use CRT projection technology on the inside. In the early 2000s, we’d get models with DLP projection technology, and by the late 2000s, they were using laser projection, but none of that is relevant to the 1990s. This was the final hurrah of the CRT RPTV.

The illusion of quality people remember today

Rose-tinted TVs

If you look up forum posts or just ask people about their memories of these TVs, they can be surprisingly positive. I had the pleasure of watching a few LaserDisc movies on an RPTV back in the 1990s, and I remember them looking as good as DVD does today. Now that I actually own a LaserDisc player and a high-end CRT TV, I know LaserDisc looks significantly worse than DVD.

The Escape from LA LaserDisc plays on a Pioneer LaserDisc player set on top of a Sony Trinitron TV. Credit: Sydney Louw Butler / How-To Geek

It's all about reference points. LaserDisc had roughly twice the resolution of VHS, so of course I was blown away by it, having never seen anything sharper back then. But despite using CRT technology on the inside, RPTVs had worse picture quality than direct-view CRT sets.
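To put rough numbers on those reference points, here are the commonly cited ballpark figures for horizontal resolution (TV lines) of each format; exact numbers vary by source material and player, so treat this as a comparison of magnitudes, not a spec sheet:

```python
# Commonly cited ballpark horizontal resolution (TV lines) per format;
# actual numbers vary by disc, tape, and player.
formats = {"VHS": 240, "LaserDisc": 425, "DVD": 540}

ld_vs_vhs = formats["LaserDisc"] / formats["VHS"]  # ~1.8x: "roughly twice"
dvd_vs_ld = formats["DVD"] / formats["LaserDisc"]  # ~1.3x: why DVD looks sharper

print(f"LaserDisc vs VHS: {ld_vs_vhs:.1f}x, DVD vs LaserDisc: {dvd_vs_ld:.1f}x")
```

So LaserDisc really was a huge leap over VHS, while still falling clearly short of DVD, which is exactly the gap my memory papered over.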

The picture quality was objectively terrible

Too many moving parts

Typically, a CRT RPTV had three small CRT tubes on the inside, one for each primary color: red, green, and blue.

If these were not perfectly aligned, the image would show chromatic aberration and look fuzzy. Brightness was also a huge issue compared to a normal CRT TV. You have to watch these in a darkened room, and even then, I clearly remember the picture being uneven.

These TVs needed frequent servicing to keep them aligned and operating as intended, but most of the people I knew who owned them clearly didn’t do this. As long as they got a nice big picture on game night, that was good enough.

If you actually cared about picture quality, then an RPTV was not the technology of choice; its only real advantage was size.




The real reason they disappeared almost overnight

The moment flat-panel technology was ready, RPTVs were dead in the water. While some wall-mountable DLP RPTVs appeared toward the end of the 2000s, even heavy plasma TVs were far less bulky and heavy.

Flat panel technology also blew the doors off screen-size limits. While a 55-inch RPTV was considered a "big screen TV" back then, a 55-inch TV is just an entry-level, mundane size these days. Funnily enough, when I bought my first plasma TV at 51 inches, I remember thinking that I finally had a TV as big as those RPTVs from my youth. It just had a much, much better picture.





As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren't just affecting RAM and storage manufacturers. They impact every company making any product that contains memory or storage, including graphics cards.

NVIDIA sells GPU-and-memory bundles to its partners, which then solder them onto PCBs and add cooling to create full-blown graphics cards. This means NVIDIA doesn't just have to battle other tech giants to secure a chunk of TSMC's limited production capacity for its GPU chips. It also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts aren’t going to last forever. The company has likely had to sign new ones, considering the GPU price surge that began at the beginning of 2026, with gaming graphics cards still being overpriced.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which runs from late January 2025 to late January 2026), NVIDIA's gaming revenue has contributed less than 8% of the company's total earnings so far. On the other hand, the data center division has made almost 90% of NVIDIA's total revenue in fiscal year 2026. What I'm trying to say is that NVIDIA is no longer a gaming company; it's all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in almost ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip they can get their hands on into AI GPU racks and continue receiving mountains of cash by selling them to AI behemoths.
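As a quick sanity check on that "almost ten times" figure, using the rough revenue shares quoted above (illustrative round numbers, not exact figures from NVIDIA's filings), the ratio actually lands a touch above ten:

```python
# Rough shares from the article, not exact figures from NVIDIA's filings.
gaming_share = 0.08       # "less than 8% of total revenue"
data_center_share = 0.90  # "almost 90% of total revenue"

# How many gaming-sized revenue streams the data center business represents.
ratio = data_center_share / gaming_share  # 11.25
print(f"Data center earns roughly {ratio:.0f}x what gaming does")
```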

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA's RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the same fate awaited the 16GB RTX 5070 Ti, and an 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after missing the show, it now looks like NVIDIA has delayed the lineup indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn't that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you've been following the discussion around memory shortages, this won't be surprising. In late 2025, the prognosis was that we wouldn't see the end of the RAM-pocalypse until 2027, maybe 2028. But a recent statement by SK Hynix's chairman (the company is one of the world's three largest memory manufacturers) warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it is that current-gen gaming GPUs (NVIDIA RTX 50 and AMD Radeon RX 90) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5 may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that's not a heckin' RTX 5090.

If you're open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won't get a new gaming GPU from NVIDIA for at least a few years, at least the ones we've got are great today and will continue to chew through any game for the foreseeable future.


