Your phone’s Bluetooth audio quality depends on one hidden setting nobody knows about


Bluetooth is a fantastic feature on mobile phones, but if you use it to listen to your favorite tunes regularly, you might notice it sometimes sounds dull, tinny, or just plain bad. The culprit might actually be the default codec your phone is using for Bluetooth audio.

Fortunately, there are a few ways to improve it, typically available right on your device.

Bluetooth quality improvements begin with the codecs

Not all Bluetooth audio is the same


The media files you use every day require a codec to play back properly. A codec (short for coder-decoder) is usually software, though it can occasionally be hardware. It compresses and encodes audio data on one end of the transmission, then decodes it back into a playable format at the other.

Think of it like a translator that takes one language (music or video in this case) and converts it to another (music being played back via streaming or Bluetooth). It’s an essential function for media files, regardless of device or format.

Bluetooth audio quality can vary depending on the bit rate the codec uses for transmission. Bluetooth uses the SBC (Subband Coding) codec as its default and fallback, which just happens to have the lowest bit rate of the bunch. It’s lossy, like an MP3 file, which just means it discards some audio data when it compresses the stream.


LDAC is a codec worth knowing about

Developed by Sony, adopted by Android

Your Bluetooth settings are probably already set to SBC by default, although some devices might default to AAC.

AAC (Advanced Audio Coding) is a good middle-of-the-road option, but LDAC (a format developed by Sony) is the setting to use for better audio. LHDC is another high-resolution format that sits alongside LDAC as a high-quality alternative to SBC.

LDAC supports bit rates up to 990 Kbps and audio up to 32-bit/96 kHz. If that means nothing to you, it just means LDAC can carry high-resolution audio, sounds good, and doesn’t lose as much data during compression. It’s definitely better than SBC detail-wise.

Of course, using LDAC with cheap, low-end earbuds or a lousy speaker probably won’t make much of a difference.

If your device supports LDAC or LHDC, it’s worth using

Enabling LDAC and LHDC is easier than you think


Plenty of devices can be used as audio receivers, from speakers to headphones to your laptop. Improving audio quality and tweaking settings is half the fun of using modern tech. Depending on your comfort level (and whether your device is supported), you can access and change codec settings yourself through your device’s developer options. I’m working with two Android phones, specifically a Motorola Moto G 2025 and a Samsung Galaxy Z Fold 5.

That said, here’s how to see if your device supports LDAC and enable it if it does:

  1. Look for an LDAC or LHDC logo or spec sheet in your device’s instructions.
  2. Check your phone’s settings through the developer options.
  3. Every phone has a different way of accessing the developer options. For the Motorola Moto G, you just go to Settings – About – Device Identifiers and then tap “Build Number” eight times.
  4. “Developer Options” will then be listed under “System.”
  5. Find “Bluetooth Audio Codec” and make the adjustment. In this case, I’ve enabled “LHDC.” (If you want to confirm your audio is actually routing over Bluetooth, see the sketch after this list.)

The Samsung Galaxy Z Fold 5 has a slightly different way to access the developer options:

  1. Go to “Settings.”
  2. Select “About Phone” and go to “Software Information.”
  3. Tap “Build Number” about seven times.
  4. Check the codec under “Developer Options.”
  5. Repeat the same steps as above.
  6. The audio sounds pretty good through a pair of Everyday Earbuds, but your mileage may vary.

Your specific device will vary, obviously, so make sure you look up (preferably via the manual or by asking support) how to adjust these options for your device. LDAC should be supported on all phones running Android Oreo and above, which is pretty cool.

But what if your device doesn’t support LDAC? Don’t worry, there are plenty of other codec options available, like AAC, aptX, and aptX HD. Apple prefers the AAC codec, for instance, and it works great on iOS devices. Qualcomm’s aptX and aptX HD are alternatives to SBC (they’re typically better quality, and most Android devices support them). Whatever your case may be, pretty much all of them beat SBC, but I like LDAC the best.

If you’re interested in learning more about LDAC in a much more technical sense, the encoder’s source code is available in the Android Open Source Project.


One setting change can make a significant difference


Although everyone’s mileage with Bluetooth will vary, there are definite advantages to changing your codec settings. Results will depend on headphone quality and source material, of course.

For me, LDAC will be the go-to on any new Android device, and I’ll be looking to enable it whenever I set up Bluetooth on future phones and devices.





The battle between AMD and NVIDIA rages on eternally, it seems, though it’s rather one-sided in the desktop PC market, where NVIDIA holds something like 95%, and AMD most of what’s left apart from Intel’s (almost) 1%.

But as dominant and popular as NVIDIA is, AMD proponents could always raise the value argument. On a per-dollar basis, you get more value with an AMD card, and even better, you have the benefit of AMD “FineWine,” the idea that your card will become even better with time.

What “FineWine” meant—and why it mattered

FineWine was something that AMD fans began to notice during the era of the GCN (Graphics Core Next) architecture. Incidentally, the last dedicated AMD GPU I bought was the R9 390, which was of that lineage. Since then, all my AMD GPUs have been embedded in consoles or handheld PCs, but I digress.

The R9 390 is actually a good example of FineWine. Launched in 2015, like many AMD cards, it had a rough start, and I sold mine in exchange for a stopgap card in the form of the RTX 2060, because I wanted to play Cyberpunk 2077 on PC, where it wasn’t broken the way it was on consoles. Even though, on paper, the RTX 2060’s raw power wasn’t much greater than the 390’s, the AMD card’s performance on my (then) 1080p monitor was a stuttery mess, whereas everything suddenly ran great on my 2060 the minute the AMD GPU was expunged from the system.

But a decade later, that same game is perfectly playable on this card, as you can see in this TechLabUK video.

A lot of that is because the developers have kept patching and improving the game, but this is something you see across the board for AMD cards in various games. This is FineWine. Years later, with continued driver updates from AMD, the cards go from being a little worse than their NVIDIA equivalents at launch to being as good or even a little better in the long run.

Of course, that’s not super helpful to customers who buy hardware at launch, but it has given some AMD users computers with longer lifespans than you’d think, and made many used AMD cards an even better bargain.

Why AMD’s FineWine era worked

A bit of smoke and mirrors


FineWine wasn’t magic, of course. The phenomenon was the result of a mix of factors. AMD’s architectures were in some cases a little too forward-thinking for the APIs of the day. Massively parallel with a focus on compute, they’d only come into their own with DirectX 12 and more modern games. NVIDIA’s cards at the time were better optimized to run current games well. Over time, NVIDIA would make similar architectural changes, but with better timing.

The other reason FineWine was a thing came down to driver maturity. As a much smaller company with fewer resources, AMD seems to have had some trouble releasing cards with fully optimized drivers. So, over time, the cards would start performing as intended.

In both cases, you could frame FineWine not as the card getting better, but rather getting “less worse” over time. If you set the bar low at launch, the only way is up. However, there’s a third factor to take into account as well. AMD dominates console gaming. The two major home console series have now run on AMD GPUs for two generations, and so games are developed with that hardware in mind. This also gives newer titles a bit of a leg up, though it’s hard to know exactly by how much.

How AMD moved on from FineWine

It seems worse, but it’s actually better


With the shift to RDNA architecture, AMD made a deliberate change in philosophy. Modern Radeon GPUs are designed to perform well right out of the gate. Reviews on day one are much closer to what you could expect years later. There are still decent gains to be had on RDNA cards with game-specific optimizations (Spider-Man on PC is a great example), but the golden age of FineWine seems to be in the past now.

That’s a good thing! Products should put their best foot forward on day one, so let’s not shed a tear for FineWine in that regard. It’s not so much that AMD doesn’t care about improving the performance and stability of older cards over the years; it’s that the company is now better at its job, so there’s less room for improvement.

Sapphire NITRO+ AMD Radeon RX 9070 XT GPU

Cooling Method: Air

GPU Speed: 2520 MHz

The AMD Radeon RX 9070 XT from Sapphire features 16GB of GDDR6 memory, two HDMI ports and two DisplayPort outputs, and an overengineered cooling setup that will keep the card cool and whisper quiet no matter the workload.


NVIDIA kept the idea—but changed the formula

It’s all about AI

It’s funny, but these days I think of NVIDIA cards as the ones with major longevity. Take the venerable GTX 1080 and 1080 Ti. These cards only lost game-ready driver support in 2025, which doesn’t immediately make them useless; it just means no more game-specific optimizations for those chips. What an incredible run, getting nearly a decade of relevant game performance from a GPU!

But that’s not really NVIDIA’s take on FineWine. Instead, the company has taken to adding new and better features to its cards long after they’ve launched. Starting with the 20-series, NVIDIA GPUs have shipped with machine-learning hardware, which means that as the AI models behind technologies like DLSS improve, these cards gain performance and image quality over time.

While NVIDIA has made some features of its AI technology exclusive to each generation, so far every post-10-series GPU has benefited from each new generation of DLSS. Compare that to AMD, which not only offers inferior versions of this upscaling technology, but has locked the better, more usable versions to later cards, as is the case with FSR Redstone.


FineWine is an ethos, not a brand

In the case of my humble RTX 4060 laptop, the release of DLSS 4.5 has opened new possibilities, notably the ability to target a 4K output resolution, which was certainly not on the table when I first took this computer out of the box. We might not call it “FineWine,” but it sure smells like it to me!


