PivotTables are slowing down your Excel workflow—here’s what to use instead


PivotTables are the Swiss Army knife of Excel, but let’s be real—you wouldn’t use a pocketknife to build a house. While they’re great for a quick-and-dirty glance at your numbers, over-relying on them for complex reporting leads to scaling headaches and broken references. Here’s why it’s time to stop making them your default reporting tool.

How a shortcut became the industry standard

If you’ve spent more than five minutes in an office, you’ve probably heard the classic advice: “Just throw it into a PivotTable.” Since their introduction, PivotTables have become shorthand for Excel competence—and for good reason. In many ways, they’re the ultimate workplace party trick, since they solve a very real problem: summarizing large datasets quickly without writing a single line of code or a complex formula.

Before PivotTables changed the game, analysis often meant building long chains of SUMIF or SUMIFS formulas, helper columns, or manually restructuring your data to get even basic summaries. PivotTables collapsed much of that work into a simple drag-and-drop interface. But this ease of use caused a cultural shift. We stopped seeing them as a “let’s see what’s in this data” tool and started treating them as a permanent reporting layer.
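For anyone who never lived through that era, the pre-pivot approach looked something like this (the table and column names here are illustrative, not from any real workbook):

```
=SUMIFS(Sales[Amount], Sales[Region], "East", Sales[Product], "Widgets")
```

Multiply that by every region-and-product combination you need to report on, and the appeal of drag-and-drop aggregation becomes obvious.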


PivotTables struggle with complex reporting logic

The hidden cost of drag-and-drop simplicity

The Excel Insert Calculated Field dialog box showing a manual formula for Tax being applied to the PivotTable.

PivotTables work best when they’re used for what they are: snapshots of structured data. They excel at aggregation—sums, counts, and averages—but start to fall apart when logic becomes layered or conditional. For example, if you’re trying to calculate something like a weighted rolling average that excludes holidays and weekends, you’re effectively forcing a reporting engine to do the job of a data architect.
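To see why, consider even a simplified version of that calculation done with plain formulas (a sketch only, assuming dates in column A, values in B, and weights in C; holiday handling is omitted):

```
=SUMPRODUCT((WEEKDAY(A2:A31,2)<6) * B2:B31 * C2:C31)
   / SUMPRODUCT((WEEKDAY(A2:A31,2)<6) * C2:C31)
```

With a return type of 2, WEEKDAY gives Saturday and Sunday the values 6 and 7, so the comparison zeroes out weekend rows before the weighting is applied. There's no clean way to express logic like this inside a PivotTable's Calculated Fields, which can only operate on whole-field aggregates.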

Then there’s the “refresh” problem. PivotTables have historically been notorious for being needy—if you didn’t manually hit refresh, your report could easily become out of date. Yes, Microsoft is finally addressing this with an Auto Refresh feature that lets PivotTables update as soon as the source data changes, but that doesn’t solve the babysitting problem entirely. If your data structure changes—say, renamed headers or shifted ranges—the whole thing can still break, leaving your workflow dependent on a tool that’s easily rattled by minor changes.

PivotTables become too slow and hard to manage

Bigger isn’t always better for pivots

The Excel Pivot Table Options dialog box with the Data tab selected, highlighting the 'Save source data with file' checkbox.

We’ve all seen “The Spreadsheet”: it’s 50MB, takes three minutes to open, and is packed with dozens of PivotTables all competing for memory and performance. At scale, PivotTables can become heavy because they use a Pivot Cache—an internal structure that stores a compressed copy of the source data to speed up aggregation and filtering. While this cache improves performance in many cases, it can also increase workbook size and memory usage, especially when multiple PivotTables rely on separate caches.

As datasets grow into the hundreds of thousands of rows, the interface itself becomes a bottleneck. Trying to find one specific customer ID in a tiny drop-down menu isn’t analysis—it’s an endurance test.


PivotTables make it difficult to audit and verify your numbers

The hidden danger of black box logic

An Excel PivotTable with a tiny filter icon on the Row Labels header, indicating hidden data that isn't immediately obvious to an auditor.

One of the biggest frustrations when inheriting an Excel spreadsheet is the lack of transparency inside a PivotTable. If someone writes a formula, you can select the cell and trace the logic. If they build a PivotTable, however, that logic is often buried under layers of menus. Did they manually group certain dates together? Is there a subtle, hidden filter applied to a field? Did they create a “Calculated Field” months ago that’s now skewing the totals?

When you use a PivotTable for high-stakes recurring reports, you’re essentially creating a black box. Your teammates (and your future self) have to click through multiple dialog boxes just to see how the numbers were derived. In a professional environment where accuracy is everything, having invisible logic is a liability. It doesn’t mean you should never use them, but it does mean they can be the wrong choice if things need to be easy to audit, document, and share between team members.

Moving towards a 21st-century workflow

If PivotTables start feeling like something you’re constantly fighting, it’s usually a sign that the workflow (not the tool) is doing too much at once. A more modern Excel approach separates concerns.

Power Query handles data transformation before it ever reaches a worksheet, allowing you to build repeatable steps that merge and structure your data at the source.

For more structured analysis, the Data Model (via Power Pivot) lets you build relational datasets inside Excel. Instead of flattening everything into a single table, you define relationships between tables and use measures to perform calculations. This makes PivotTables built on the Data Model significantly more scalable and expressive than traditional range-based pivots.

On the presentation side, dynamic array functions like FILTER, UNIQUE, and SORT let you build live ranges that update automatically as data changes—no manual refresh. And if you really need that classic pivot layout, the PIVOTBY function lets you build a pivot using a formula, giving you the best of both worlds without the cache bloat.
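As a rough sketch (assuming categories in column A, months in B, and amounts in C), a formula-based pivot can be as simple as:

```
=PIVOTBY(A2:A100, B2:B100, C2:C100, SUM)
```

Because it’s a regular dynamic array formula, the output spills onto the sheet, recalculates automatically when the source changes, and can be inspected like any other formula in the workbook.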



Use the right tool for the right stage

I’m not arguing that PivotTables are “bad”—they’re not. The issue is how easily they become the default choice before the problem is even clear. They’re excellent for exploration, but they were never meant to be the foundation of reporting systems. The real skill isn’t knowing how to use them—it’s recognizing when the problem has already outgrown them.



The battle between AMD and NVIDIA rages on eternally, it seems, though in the desktop PC market it’s rather one-sided: NVIDIA holds something like 95%, and AMD most of what’s left apart from Intel’s (almost) 1%.

But as dominant and popular as NVIDIA is, AMD proponents could always raise the value argument. On a per-dollar basis, you get more value with an AMD card, and even better, you have the benefit of AMD “FineWine” which ensures your card will become even better with time.

What “FineWine” meant—and why it mattered

FineWine was something that AMD fans began to notice during the era of the GCN (Graphics Core Next) architecture. Incidentally, the last AMD dedicated GPU I bought was the R9 390, which was of that lineage. Since then, all my AMD GPUs have been embedded in consoles or handheld PCs, but I digress.

The R9 390 is actually a good example of FineWine. Like many AMD cards, it had a rough start after its 2015 launch. I sold mine in exchange for a stopgap card in the form of the RTX 2060, because I wanted to play Cyberpunk 2077 on PC, where it wasn’t broken the way it was on consoles. On paper, the RTX 2060 wasn’t much more powerful than a 390, yet the AMD card was a stuttery mess on my (then) 1080p monitor, whereas everything suddenly ran great the minute the AMD GPU was expunged from the system.

But, a decade later, that same game is perfectly playable on the R9 390, as you can see in this TechLabUK video.

A lot of it is because the developers have kept patching and improving the game, but this is something you see across the board for AMD cards on various games. This is FineWine. Years later, with continued driver updates from AMD, the cards go from being a little worse than their NVIDIA equivalent at launch to being as good or even a little better in the long run.

Of course, that’s not super helpful to customers who buy hardware at launch, but it has given some AMD users computers with longer lifespans than you’d think, and made many used AMD cards an even better bargain.

Why AMD’s FineWine era worked

A bit of smoke and mirrors

The PULSE AMD Radeon RX 6800 XT next to an AMD RX 6600 XT Phantom Gaming D. Credit: Ismar Hrnjicevic / How-To Geek

FineWine wasn’t magic, of course. The phenomenon was the result of a mix of factors. AMD’s architectures were in some cases a little too forward-thinking for the APIs of the day. Massively parallel with a focus on compute, they’d only come into their own with DirectX 12 and more modern games. NVIDIA’s cards at the time were better optimized to run current games well. Over time, NVIDIA cards would make similar architectural changes, but with better timing.

The other reason FineWine was a thing came down to driver maturity. As a much smaller company with fewer resources, AMD seemingly had trouble releasing cards with fully optimized drivers at launch. So, over time, the cards would start performing as intended.

In both cases, you could frame FineWine not as the card getting better, but rather getting “less worse” over time. If you set the bar low at launch, the only way is up. However, there’s a third factor to take into account as well. AMD dominates console gaming. The two major home console series have now run on AMD GPUs for two generations, and so games are developed with that hardware in mind. This also gives newer titles a bit of a leg up, though it’s hard to know exactly by how much.

How AMD moved on from FineWine

It seems worse, but it’s actually better

An AMD RX 9070 XT Gigabyte gaming graphics card. Credit: Ismar Hrnjicevic / How-To Geek

With the shift to RDNA architecture, AMD made a deliberate change in philosophy. Modern Radeon GPUs are designed to perform well right out of the gate. Reviews on day one are much closer to what you could expect years later. There are still decent gains to be had on RDNA cards with game-specific optimizations (Spider-Man on PC is a great example), but the golden age of FineWine seems to be in the past now.

That’s a good thing! Products should put their best foot forward on day one, so let’s not shed a tear for FineWine in that regard. It’s not that AMD has stopped caring about improving the performance and stability of older cards over the years; it’s that the company is now better at its job, so there’s simply less room for improvement.


NVIDIA kept the idea—but changed the formula

It’s all about AI

It’s funny, but these days I think of NVIDIA cards as the ones with major longevity. Take the venerable GTX 1080 and 1080 Ti. Those cards only lost game-ready driver support in 2025, and even that doesn’t immediately make them useless; it just means no more game-specific optimization for those chips. What an incredible run, getting a decade of relevant game performance from a GPU!

But, that’s not really NVIDIA’s take on FineWine. Instead, the company has taken to adding new and better features to its cards long after they’ve been launched. Starting with the 20-series, the presence of machine-learning hardware means that by improving the AI algorithms for technologies like DLSS, these cards have become more performant with better image quality over time.

While NVIDIA has made some features of its AI technology exclusive to each generation, so far every post-10-series GPU benefits from each new generation of DLSS. Compare that to AMD, which not only offers inferior versions of this upscaling technology, but has also locked the better, more usable versions to later cards, as is the case with FSR Redstone.


FineWine is an ethos, not a brand

In the case of my humble RTX 4060 laptop, the release of DLSS 4.5 has opened new possibilities, notably the ability to target a 4K output resolution, which was certainly not on the table when I first took this computer out of the box. We might not call it “FineWine,” but it sure smells like it to me!
