5 things you never thought a $5 ESP32 could do


The ESP32 is an inexpensive, versatile microcontroller normally associated with cheap smart home devices and small DIY gadgets. But this sliver of silicon can do far more than you probably thought, as these projects that push the board to its limits demonstrate.

Host a web server

HelloESP web server in a frame by Tech1k on GitHub. Credit: Tech1k / GitHub

HelloESP is a website hosted on a $10 ESP32 development board with a paltry 520KB of RAM. Initially deployed in 2022 to see how far a cheap microcontroller could be pushed, the original build lasted 500 days before burning out. In mid-2026, the author rebuilt the project, and it’s now back up for all to see.

You can check out the brief on the project’s GitHub page. The full bill of materials includes the ESP32 DOIT DevKit V1, BME280 and CCS811 sensors that gather environmental data, a 128×64 OLED panel that displays the server’s status, a microSD card, and two LEDs.

The project is completely open source, and you can build something similar yourself. This feels like a project you’d undertake for the sake of it rather than a serious attempt to host a website, but it’s still an impressive feat. The author has implemented some form of redundancy via a Cloudflare Worker that shows an offline page if the server goes down.
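At its core, serving a web page from a microcontroller boils down to accepting a TCP connection and writing a raw HTTP response back by hand. The sketch below is a minimal illustration of that idea in Python (the same socket API exists in MicroPython on the ESP32), not the actual HelloESP code:

```python
# Minimal sketch of what a microcontroller web server does: accept a TCP
# connection and hand back a hand-assembled HTTP response. Illustrative
# only -- this is not the HelloESP implementation.
import socket

def build_response(html: str) -> bytes:
    """Assemble a bare-bones HTTP/1.1 response by hand."""
    body = html.encode()
    headers = (
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/html\r\n"
        f"Content-Length: {len(body)}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return headers.encode() + body

def serve_forever(port: int = 80) -> None:
    """Serve one client at a time -- plenty for a board with 520KB of RAM."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    while True:
        conn, _addr = srv.accept()
        conn.recv(1024)  # read (and ignore) the request
        conn.sendall(build_response("<h1>Hello from an ESP32</h1>"))
        conn.close()
```

Handling a single connection at a time, as above, is the kind of compromise a 520KB board forces on you; real projects layer retries, caching, and (as HelloESP does) an external fallback on top.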

A polyphonic audio synthesizer with 80 voices

You don’t need to know a lot about synthesizers to understand that using an ESP32 to power an 80-voice polyphonic synthesizer with crystal clear audio is an achievement. The ESP32Synth project is capable of rendering more than 364 voices, but the documentation refers to this mode as “The Abyss” because it introduces poor latency, audio jitter, and other problems. Hence, 80 is the safe limit before things start falling apart.
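The reason voice count has a ceiling is that every extra voice is another oscillator the CPU must evaluate for every audio sample. A toy additive mixer makes the cost obvious; this is an illustrative sketch, not the ESP32Synth engine:

```python
# One sine oscillator per active voice, all summed into a single buffer.
# At 44.1kHz, 80 voices means 80 oscillator evaluations per sample --
# over 3.5 million per second -- which is why voice counts hit a wall.
# Hypothetical sketch, not taken from ESP32Synth.
import math

SAMPLE_RATE = 44100

def render(freqs, num_samples, amp=0.8):
    """Mix one sine voice per frequency in `freqs` into one output buffer."""
    out = []
    for n in range(num_samples):
        t = n / SAMPLE_RATE
        s = sum(math.sin(2 * math.pi * f * t) for f in freqs)
        # Normalize by voice count so 80 voices don't clip the output
        out.append(amp * s / max(len(freqs), 1))
    return out
```

A real engine replaces `math.sin` with wavetable lookups and fixed-point phase accumulators, but the per-sample, per-voice cost structure is the same.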

For this project, you’ll need a dual-core ESP32 like the classic or S3 model (single-core C3s and S2s aren’t recommended), an external digital-to-analog converter (DAC) that can interface with the board’s I2S pins, and some understanding of how a synthesizer works in order to make music.

There are plenty more ESP32-powered synthesizers out there, like MothSynth and esp32_basic_synth, but none I could find that go quite as hard as this one.

Responsive radar-powered predictive lighting

The ESP32 is often used to power mmWave presence sensors in the smart home, which are essentially small radar scanners that track people’s positions with a high degree of accuracy. That data is useful to feed back into a smart home platform like Home Assistant, to prevent the lights from going out while you’re still in the room (among other things).

So what if you could take data from a sensor, process it, and then use it to turn on nearby LED lights like some sort of magic trick? It turns out you can, and you can use a single ESP32 board for both sides of the equation, with a response time in the milliseconds.
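The glue logic is conceptually simple: translate the sensor’s reported distance into a position on an LED strip, then light a small window of LEDs around it. The numbers and names below are illustrative, not taken from any of the projects mentioned:

```python
# Map a radar sensor's distance reading to LED indices on a strip, so the
# lit segment "follows" the person. Constants are made-up examples.

LEDS = 60        # LEDs on the strip
STRIP_MM = 2000  # physical length the strip covers, in millimeters
WINDOW = 2       # extra LEDs lit on each side of the target

def leds_for_distance(distance_mm):
    """Return the set of LED indices to light for a target at distance_mm."""
    if not 0 <= distance_mm <= STRIP_MM:
        return set()  # target is off the strip; lights out
    center = round(distance_mm / STRIP_MM * (LEDS - 1))
    lo = max(0, center - WINDOW)
    hi = min(LEDS - 1, center + WINDOW)
    return set(range(lo, hi + 1))
```

On a single board, one loop reads the radar over UART, runs this mapping, and pushes the result straight to the LED driver, which is how the response time stays in the milliseconds.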

Perhaps the most impressive of these examples is LightTrack-VISION, thanks to the author’s striking video demo posted on Reddit. This particular project used an ESP32-C3 SuperMini and an LD2410B radar sensor, while the similar AmbiSense project (which has better documentation) can also be set up to use multiple ESP32s for added accuracy.

Run local AI models with image recognition

ESP32-powered AI meter reader by jomjol on GitHub. Credit: jomjol / GitHub

One of the most interesting ESP32-powered smart home projects that I’ve seen recently is a magnetometer-based sensor that measures gas and water meters. It’s highly accurate, since it senses movement in the same mechanism your utility company relies on, but it doesn’t exactly “read” the meter.

But thanks to lightweight AI frameworks like TensorFlow Lite, you can run an AI model on an ESP32 board with a camera that reads a meter and pushes this information back to a platform like Home Assistant via MQTT.
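After the on-device model classifies each digit of the meter face (with a confidence score per digit), the readings still need to be assembled and published. Here’s a sketch of that glue logic; the function names, threshold, and topic format are made up for illustration and aren’t taken from the meter reader project:

```python
# Assemble per-digit model predictions into one meter reading, then
# format a JSON payload of the kind an MQTT sensor integration can parse.
# Hypothetical helper code, not from the jomjol project.
import json

def assemble_reading(digits, confidences, threshold=0.7):
    """Join per-digit predictions into one value, or return None if any
    digit is below the confidence threshold -- better to skip a frame
    than to report a wrong reading."""
    if any(c < threshold for c in confidences):
        return None
    return int("".join(str(d) for d in digits))

def mqtt_payload(reading, unit="m3"):
    """Serialize the reading as JSON for publishing over MQTT."""
    return json.dumps({"value": reading, "unit": unit})
```

The skip-on-low-confidence rule matters in practice: a camera reading a spinning dial will occasionally catch a digit mid-roll, and dropping that frame is cheaper than poisoning your utility history.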

The meter reader project uses an ESP32-CAM module, but there are other ways you can use this technology. One Instructables tutorial uses an ESP32-S3 with an external camera module for more general image recognition tasks.

Install a “real” operating system with apps

MicroPythonOS operating system for ESP32 and other embedded devices. Credit: MicroPythonOS

Unlike the Raspberry Pi, ESP32 microcontrollers don’t run a “proper” OS that you can interact with in the traditional sense. Instead, you flash it with code, and it performs its job until you unplug it or flash it again. But thanks to the work of some very dedicated individuals, you can now install “proper” operating systems on an ESP32 complete with apps.

Tactility and MicroPythonOS are two examples of this in action. Each has a graphical user interface, built-in and external applications, app stores, over-the-air updates, and support for touchscreen input. You can install them on ESP32 devices with built-in displays, like the Cheap Yellow Display family and LilyGO smartwatches, or build something yourself.

Each can be installed via a web installer using a supported browser like Chrome.


Looking for some more practical projects? Here are seven ESP32 projects you can do in an hour.





The battle between AMD and NVIDIA rages on eternally, it seems, though it’s rather one-sided in the desktop PC market, where NVIDIA holds something like 95%, and AMD most of what’s left apart from Intel’s (almost) 1%.

But as dominant and popular as NVIDIA is, AMD proponents could always raise the value argument. On a per-dollar basis, you get more value with an AMD card, and even better, you have the benefit of AMD “FineWine,” which supposedly ensures your card becomes even better with time.

What “FineWine” meant—and why it mattered

FineWine was something that AMD fans began to notice during the GCN (Graphics Core Next) era. Incidentally, the last AMD dedicated GPU I bought was the R9 390, which was of that lineage. Since then, all my AMD GPUs have been embedded in consoles or handheld PCs, but I digress.

The R9 390 is actually a good example of FineWine. Launched in 2015, like many AMD cards it had a rough start, and I sold mine in exchange for a stopgap card in the form of the RTX 2060, because I wanted to play Cyberpunk 2077 on PC, where it wasn’t broken the way it was on consoles. Even though, on paper, the RTX 2060 wasn’t much more powerful than a 390, the AMD card was a stuttery mess on my (then) 1080p monitor, whereas everything suddenly ran great on my 2060 the minute the AMD GPU was expunged from the system.

But, a decade later, that same game is perfectly playable on this card, as you can see in this TechLabUK video.

A lot of that is because the developers have kept patching and improving the game, but you see the same pattern across the board for AMD cards in various games. This is FineWine: years later, with continued driver updates from AMD, the cards go from being a little worse than their NVIDIA equivalents at launch to as good or even a little better in the long run.

Of course, that’s not super helpful to customers who buy hardware at launch, but it has given some AMD users computers with longer lifespans than you’d think, and made many used AMD cards an even better bargain.

Why AMD’s FineWine era worked

A bit of smoke and mirrors

The PULSE AMD Radeon RX 6800 XT next to an AMD RX 6600 XT Phantom Gaming D. Credit: Ismar Hrnjicevic / How-To Geek

FineWine wasn’t magic, of course. The phenomenon was the result of a mix of factors. AMD’s architectures were in some cases a little too forward-thinking for the APIs of the day. Massively parallel with a focus on compute, they’d only come into their own with DirectX 12 and more modern games. NVIDIA’s cards at the time were better optimized to run current games well. Over time, NVIDIA cards would make similar architectural changes, but with better timing.

The other reason FineWine was a thing came down to driver maturity. As a much smaller company with fewer resources, AMD seemed to have trouble releasing cards with fully optimized drivers, so over time, a card would gradually start performing as intended.

In both cases, you could frame FineWine not as the card getting better, but rather getting “less worse” over time. If you set the bar low at launch, the only way is up. However, there’s a third factor to take into account as well. AMD dominates console gaming. The two major home console series have now run on AMD GPUs for two generations, and so games are developed with that hardware in mind. This also gives newer titles a bit of a leg up, though it’s hard to know exactly by how much.

How AMD moved on from FineWine

It seems worse, but it’s actually better

An AMD RX 9070 XT Gigabyte gaming graphics card. Credit: Ismar Hrnjicevic / How-To Geek

With the shift to RDNA architecture, AMD made a deliberate change in philosophy. Modern Radeon GPUs are designed to perform well right out of the gate. Reviews on day one are much closer to what you could expect years later. There are still decent gains to be had on RDNA cards with game-specific optimizations (Spider-Man on PC is a great example), but the golden age of FineWine seems to be in the past now.

That’s a good thing! Products should put their best foot forward on day one, so let’s not shed a tear for FineWine in that regard. It’s not that AMD no longer cares about improving the performance and stability of older cards over the years; it’s that the company is now better at its job, so there’s less room for improvement.

Sapphire NITRO+ AMD Radeon RX 9070 XT GPU

Cooling Method: Air
GPU Speed: 2,520 MHz

The AMD Radeon RX 9070 XT from Sapphire features 16GB of GDDR6 memory, two HDMI ports and two DisplayPorts, and an overengineered cooling setup that will keep the card cool and whisper quiet no matter the workload.


NVIDIA kept the idea—but changed the formula

It’s all about AI

It’s funny, but these days I think of NVIDIA cards as the ones with major longevity. Take the venerable GTX 1080 and 1080 Ti. These cards only lost game-ready driver support in 2025; that doesn’t immediately make them useless, it just means no more game-specific optimizations for those chips. What an incredible run, getting a decade of relevant game performance from a GPU!

But that’s not really NVIDIA’s take on FineWine. Instead, the company has taken to adding new and better features to its cards long after they’ve launched. Starting with the 20-series, the presence of machine-learning hardware means that by improving the AI algorithms behind technologies like DLSS, these cards have become more performant, with better image quality, over time.

While NVIDIA has made some of its AI features exclusive to each generation, so far every post-10-series GPU benefits from each new generation of DLSS. Compare that to AMD, which not only offers an inferior version of this upscaling technology, but has locked the better, more usable versions to later cards, as is the case with FSR Redstone.


FineWine is an ethos, not a brand

In the case of my humble RTX 4060 laptop, the release of DLSS 4.5 has opened new possibilities, notably the ability to target a 4K output resolution, which was certainly not on the table when I first took this computer out of the box. We might not call it “FineWine,” but it sure smells like it to me!
