Why I use both ChatGPT and local LLMs (and you should too)


You can run a local LLM on your computer that handles many of the same tasks as AI assistants such as ChatGPT, Gemini, and Claude. Local models can do a lot, but they still lag behind proprietary cloud-based LLMs in many ways. That’s why I use both.

Local LLMs can do a lot

You don’t need the most powerful hardware to use them

A screenshot of LM Studio app landing page

If you’ve never tried running a local LLM because you’re worried about lacking the necessary hardware, you may be pleasantly surprised. You don’t necessarily need an expensive GPU with a huge amount of VRAM. You can run smaller models on fairly modest hardware and still get reasonable results.

My M2 MacBook Air only has 8GB of RAM, for example, but I’m able to run models that can do useful tasks such as summarizing or rewriting text, analyzing uploaded images, and even helping with coding. The more powerful your computer, the larger the models you can run, but smaller models can still do the job.

If you’re not sure which LLMs your computer can handle, utilities such as llm-checker and llmfit can scan your hardware and list compatible models, so you can choose the best one your machine can run.
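If you just want a rough sense of what fits before reaching for a tool, you can estimate it yourself. The sketch below uses the common rule of thumb that a model needs roughly its parameter count times the bytes per weight, plus some overhead for the KV cache and runtime; the 20% overhead figure and the example sizes are illustrative assumptions, not exact numbers.

```python
# Rough sketch: estimate whether a quantized model fits in RAM/VRAM,
# assuming memory ~= (parameters x bytes per weight) + ~20% overhead
# for the KV cache and runtime. Figures are approximations.

def estimated_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed to load a quantized model, in GB."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * 1.2 / 1e9  # add ~20% overhead, convert to GB

def fits(params_billions: float, bits_per_weight: int, available_gb: float) -> bool:
    return estimated_memory_gb(params_billions, bits_per_weight) <= available_gb

# An 8B model at 4-bit quantization is a tight squeeze on an 8GB machine:
print(round(estimated_memory_gb(8, 4), 1))  # ~4.8 GB of weights + overhead
print(fits(8, 4, 8))    # 4-bit quantization fits
print(fits(8, 16, 8))   # full 16-bit precision (~19 GB) does not
```

This is also why quantized models matter so much on modest hardware: halving the bits per weight roughly halves the memory the model needs.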



Local LLMs still can’t match the big players

It’s no surprise that the major models usually win

Claude Code terminal running on an iPad with a keyboard case on a wooden desk. Credit: Patrick Campanale / How-To Geek

A local LLM can sometimes be faster for simple tasks, thanks to the absence of network latency. No matter how powerful your hardware is, however, for large, complex tasks, local LLMs currently can’t match the speed of a proprietary cloud-based LLM such as ChatGPT, Claude, or Gemini. Even if you have the most powerful GPU you can get, these companies can spread their models over a cluster of thousands of GPUs, all working together, effectively giving them terabytes of pooled VRAM.

This means that a proprietary LLM will often outperform your local LLM. Your local LLM may be a sports car, but proprietary LLMs are Formula 1 cars. These companies have spent billions on the infrastructure behind their platforms, and in a race, there’s only going to be one winner.

You can have the best of both worlds

Use a local LLM alongside ChatGPT, Claude, or Gemini

A man sitting on the living room floor using his laptop, with a chatbot next to him. Credit: Prostock-studio / Borodacheva Marina / Shutterstock

That’s not to say that local LLMs don’t have their place. Even if a local LLM can’t quite match the performance of a proprietary model, it can still perform many of the same tasks well.

If you want to summarize documents, for example, this is something that a good local LLM can often do well, as long as the text fits in the local LLM’s context window. Local LLMs are also very good for creative writing tasks, extracting data from images, and standard logic or math problems.
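The context-window limit is the main thing to work around when summarizing locally. One common approach is to split a long document into chunks that each fit the window, summarize the chunks, and then combine the summaries. Here is a minimal sketch of the chunking step; the window size, reserved-token budget, and the rough 4-characters-per-token heuristic are all assumptions (real tokenizers vary, and a production version would also split oversized single paragraphs).

```python
# Minimal sketch: split a document into chunks that fit a local model's
# context window before summarizing. Assumes ~4 characters per token,
# which is only a rough heuristic; real tokenizers vary by model.

CONTEXT_TOKENS = 4096    # placeholder window for a small local model
RESERVED_TOKENS = 1024   # leave room for the prompt and the summary
CHARS_PER_TOKEN = 4

def chunk_text(text: str) -> list[str]:
    """Group paragraphs into chunks under the character budget."""
    max_chars = (CONTEXT_TOKENS - RESERVED_TOKENS) * CHARS_PER_TOKEN
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = ""
        current = f"{current}\n\n{para}".strip()
    if current:
        chunks.append(current)
    return chunks

# A long synthetic document gets split into several window-sized chunks:
doc = "\n\n".join(f"Paragraph {i}: " + "x" * 2000 for i in range(20))
chunks = chunk_text(doc)
print(len(chunks))
print(all(len(c) <= (CONTEXT_TOKENS - RESERVED_TOKENS) * CHARS_PER_TOKEN
          for c in chunks))  # every chunk fits the budget
```

Each chunk can then be sent to the local model separately, with a final pass to merge the per-chunk summaries.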

Where proprietary LLMs win is in more complex tasks. If you want a cross-document analysis of 100 different documents, a local LLM may not be able to handle the context, but some proprietary LLMs can. While local LLMs can do logic and math, proprietary LLMs can usually handle much more complex tasks.

Some local LLMs can handle coding tasks almost as well as proprietary models. Proprietary LLMs tend to pull ahead when you need to do something more involved, such as redesigning an app’s entire architecture.

While a local LLM can’t yet match a proprietary LLM at complex tasks, there’s a lot you can do locally. That’s why I use both a local LLM and cloud-based proprietary models and choose which one to use based on the task I want to complete.

The benefits of local LLMs

Keep your data local and private

If proprietary LLMs are superior, why bother using a local LLM at all? There are a number of reasons why using a local LLM can be a better option.

One of the biggest benefits of using a local LLM is privacy. When you use a proprietary LLM, everything you type and every file you upload is sent to the LLM’s servers. If you don’t want to share private information, a local LLM can help you avoid sending data to third-party servers.

For example, I used a local LLM to find sensitive information in bank statements such as account numbers, people’s names, or addresses. I was then able to redact this information before I uploaded the files to a proprietary LLM that could analyze my spending habits based on the bank statements.
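Even without a local model, the same pre-upload redaction idea can be sketched with plain regexes. This is only an illustration of the workflow, not the method described above: regexes catch fixed patterns such as digit runs, while a local LLM can also spot names and addresses that no pattern will match. The patterns and labels here are hypothetical examples.

```python
import re

# Illustrative sketch of the pre-upload redaction step: mask obvious
# account numbers and sort codes before sending text to a cloud service.
# These patterns are examples only; a local LLM can catch far more,
# such as names and addresses that don't follow a fixed format.

PATTERNS = {
    "ACCOUNT": re.compile(r"\b\d{8,12}\b"),           # bare 8-12 digit runs
    "SORT_CODE": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
}

def redact(text: str) -> str:
    """Replace each matched pattern with its bracketed label."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

statement = "Paid from account 12345678 (sort code 12-34-56) to J. Smith."
print(redact(statement))
# → "Paid from account [ACCOUNT] (sort code [SORT_CODE]) to J. Smith."
```

The redacted text is then safe to hand to a cloud model for analysis, which is exactly the division of labor described above.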

Local LLMs can also be useful for cases where proprietary LLMs are overly strict and refuse to give answers. You can find local models that don’t include such strict filters. For example, if you’re using an LLM for a role-playing game, a proprietary model might refuse to describe a sword fight, while some local models would have no such qualms.

I also use a local LLM for tasks that don’t require the power of a proprietary LLM. I have a Claude subscription, which comes with usage limits. Rather than wasting this usage on simple tasks, I can use the local LLM instead and save my Claude usage for more complex stuff, such as coding.


A local LLM can be a helpful tool

Local LLMs may not be able to match the power of proprietary models just yet, but there are multiple ways you can use them. Combining a local LLM with a cloud-based model such as Claude or ChatGPT gives you the best of both worlds: privacy when you want it and power when you need it.






Spotify aims to provide a consistent listening experience that uses minimal data. As a result, your audio quality might be less than ideal, especially if you’re using a pair of high-fidelity headphones or high-end speakers. Here’s how to fix that.

Switch audio streaming quality to Very high or Lossless

The default audio streaming quality in both the mobile and desktop Spotify apps is Automatic, which usually keeps playback at Normal quality, only 96 kbps. Even though Spotify uses the Ogg Vorbis codec, which is superior to MP3, OGG files exhibit slight (but noticeable) digital noise, poor bass detail, dull treble, and a narrow soundstage at 96 kbps.

Worse, Spotify adjusts the automatic bitrate aggressively. Although 4G is more than fast enough to stream high-quality OGG files even with a weak signal, Spotify may still drop the quality to Low, which has a bitrate of just 24 kbps. You will notice such a sharp drop in quality even on a pair of bottom-of-the-barrel headphones.

To rectify this, open the Spotify app, tap your user image, open “Settings and privacy,” and tap the “Media Quality” menu. Once there, set Wi-Fi streaming quality and cellular streaming quality to “Very high” or “Lossless.”

I recommend setting cellular streaming quality to Very high and reserving Lossless for Wi-Fi, since lossless streaming is very data-intensive. One hour of streaming lossless files can take up to 1GB of data, as well as a good chunk of your phone’s storage, because Spotify caches files you stream frequently. Besides, you’ll struggle to notice the difference unless you’re listening on a wired pair of high-end headphones or speakers; a wireless connection just doesn’t have the bandwidth needed to convey the full fidelity of Spotify’s lossless audio.

You might opt for High quality if you have a capped data plan, but I recommend doing so only if you stream hours upon hours’ worth of music every single day over a cellular network. For instance, I burn through about 8 GB of data per month on average while streaming about two hours of very high-quality music over a cellular network each day.
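Those data figures are easy to sanity-check yourself. The quick calculation below assumes Spotify’s Very high tier streams at 320 kbps (kilobits per second) and works forward from there to megabytes per hour and gigabytes per month.

```python
# Back-of-the-envelope check of the data figures above, assuming the
# Very high tier streams at 320 kbps (kilobits per second).

BITRATE_KBPS = 320
HOURS_PER_DAY = 2
DAYS_PER_MONTH = 30

# kilobits/s -> bytes/s -> bytes/hour -> megabytes/hour
mb_per_hour = BITRATE_KBPS * 1000 / 8 * 3600 / 1e6
gb_per_month = mb_per_hour * HOURS_PER_DAY * DAYS_PER_MONTH / 1000

print(round(mb_per_hour))        # 144 MB per hour of streaming
print(round(gb_per_month, 1))    # 8.6 GB per month, in line with "about 8 GB"
```

The same arithmetic at the Lossless tier’s much higher bitrate is what produces the roughly 1GB-per-hour figure mentioned earlier.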


Set audio download quality to Very high or Lossless

If you tend to download songs and albums for offline listening, you should also set the audio download quality to “Very high” or “Lossless.” This setting is located just under the audio streaming quality section.

The audio download quality menu in Spotify's mobile app.

If you’ve got enough free storage on your phone, opt for the latter, but if you’d rather save storage space, set it to Very high. You’ll hardly hear the difference, but lossless files are about five times larger than the 320 kbps OGG files Spotify offers at its Very high quality setting, and they can quickly fill up your phone’s storage.

Adjust video streaming quality at your discretion

The last section of the Media quality menu is Video streaming quality. This sets the quality of video podcasts and music videos available for certain songs. Since I care about neither, I set it to “Very high” on Wi-Fi and “Normal” on cellular, but you should tweak the two options at your discretion because videos look notably better at higher streaming quality levels.

If you often watch videos over cellular and have unlimited data, feel free to set video quality to Very high.

Make sure Data Saver mode is disabled

Even if your audio quality is set to Very high or Lossless, Spotify will switch to low-quality streaming if the app’s Data saver mode is enabled. This option is located in the Data saving and offline menu. Open the menu, then set it to “Always off,” or choose “Automatic” to have Spotify’s Data Saver mode kick in alongside your phone’s Data Saver mode.

You can also enable volume normalization and play around with the built-in equalizer

Spotify logo in the center of the screen with an equalizer in front. Credit: Lucas Gouveia / How-To Geek

Last but not least, there are two additional features you can play with to improve your listening experience. The first is volume normalization, which sets the same loudness for every track you’re listening to. This can be handy because different albums are mastered at different loudness levels, with newer music usually being louder.

Since I’m an album-oriented listener, I keep the option disabled. I can just play an album and set the audio volume accordingly, and I don’t really mind louder songs when listening to playlists, artists, or song radios.

But if you can’t stand one song being quiet and the next rattling the windows, visit the Playback menu, enable “Volume normalization,” and set it to “Quiet” or “Normal.” The “Loud” option can digitally compress files, and neither Spotify nor I recommend using it. Some compression also happens with “Quiet” and “Normal,” since both adjust the playback level of each song’s master recording, but it is much lighter and extremely hard to notice.

Before I end this, I should also mention that you can access the equalizer directly from the Spotify app, where you can fine-tune your music listening experience or pick one of the available equalizer presets. If your phone has a built-in equalizer, Spotify will open it; if it doesn’t, you can use Spotify’s. On my phone (a Samsung Galaxy S21 FE), I can only use One UI’s built-in equalizer.

To open the equalizer, open “Playback,” then hit the “Equalizer” button. Now you can equalize your audio to your heart’s content.


Adjusting just a few settings can have a drastic impact on your Spotify listening experience. If you aren’t satisfied with Spotify’s sound quality, make sure to adjust the audio before jumping ship. You should also check the sound quality settings from time to time, as Spotify can reset them during app updates.



