My smart thermostat’s placement was wasting energy—here’s the $9 fix that worked instantly


Smart thermostats are a great way to save money. They give you control over your heating, allowing you to change the target temperature based on the time of day or turn off the heating when you’re away from home. While my smart thermostat was saving me money, I was still using more energy than I needed to.




The placement problem that’s costing you money

Your thermostat may be in the wrong place

Ecobee Smart Thermostat Premium installed next to a piggy bank and cactus. Credit: Ecobee

Many of the most popular smart thermostats on the market are a single device with a temperature sensor built in. You stick the device on a wall somewhere, and the temperature is measured at this location.

The problem is that we tend to place the smart thermostat where it’s most convenient to use. For example, you might place it on the wall in a hallway so it’s easily accessible and doesn’t ruin the look of your living room.

The trouble with this placement is that your smart thermostat will heat your hallway to the perfect temperature, since that’s where the temperature sensor is located. Your hallway may be smaller than other rooms, so it may heat up more quickly, or it might heat up more slowly if it’s open-plan or opens onto the stairway.

This means that the rooms that you spend the most time in, such as your living room, aren’t being heated optimally, and this can cause you to turn the heating up or run it for longer, wasting energy.


Ecobee Smart Thermostat Essential

Rating: 7/10

Integrations: Google Assistant, Apple HomeKit, Amazon Alexa

Wiring: C, R, G/PEK, Y1, OB*, W1 (* accepts heat pump OB wire, W2, or Y2)

The Ecobee Smart Thermostat Essential provides built-in energy savings and easy temperature control on its sleek touchscreen and convenient app.


A temperature node is a cheap fix

Small enough to place wherever you want

An Aqara Temperature and Humidity Sensor lying on a table. Credit: Aqara

There’s a simple, inexpensive solution to this problem: a standalone sensor that measures the ambient temperature can cost as little as $10. These devices are usually small and don’t need to be fixed to a wall, so you can place them almost anywhere you want.

Instead of relying on a temperature reading from the sensor in your smart thermostat, you can measure the temperature in a much more useful location, such as your living room. The portable nature of a temperature sensor means that you can avoid some of the problems that can plague smart thermostats, such as being in direct sunlight, close to radiators, or on a cold external wall.

The Aqara Temperature and Humidity Sensor with a phone showing the associated app.

Dimensions (exterior): 36 x 36 x 9 mm

Compatibility: Apple Home, Aqara Home, Home Assistant (via ZHA or Zigbee2MQTT)

This small but mighty Zigbee sensor can measure both temperature and humidity. It’s battery-powered, so you can place it almost anywhere. You’ll need an Aqara hub to use it with the native app, or you can connect it directly to Home Assistant using integrations such as Zigbee2MQTT.


Offset your temperature in standard smart home systems

Make your thermostat match your temperature node

Your temperature node will tell you the current temperature at its specific location. How you use that information will depend on the smart home system that you’re using.

If you’re using a popular closed-source smart home system such as Alexa, Google Home, or Apple Home, your options are fairly limited. With the basic automation options available in some of these systems, you may struggle to build an automation that can control your heating based on the reading from the temperature node instead of the smart thermostat itself.

However, you can still use this information to your benefit. For example, if you want to heat your home to 68°F, your smart thermostat will turn off when the temperature in your hallway reaches 68°F. Using your temperature node, you can see that when your hallway reaches 68°F, your living room is only reaching 64°F.

You can then set a temperature offset of 4°F in your smart thermostat app. This ensures that when the hallway reading shows 68°F, it’s the living room temperature that will actually be close to that value. The two may not match exactly, but this should get your living room much closer to your target temperature.
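To make the calibration step concrete, here’s a tiny sketch of the arithmetic (the function name and readings below are purely illustrative, not from any thermostat app):

```python
def required_offset(thermostat_reading_f: float, node_reading_f: float) -> float:
    """Offset to dial into the thermostat app so the node's room reaches the target.

    Take both readings at the moment the thermostat decides the target has
    been reached; the gap between them is the offset you need.
    """
    return thermostat_reading_f - node_reading_f

# Hallway thermostat shows 68°F while the living-room node reads 64°F:
print(required_offset(68.0, 64.0))  # -> 4.0, so set a 4°F offset
```

It’s worth taking the two readings on a typical day and re-checking occasionally, since the gap between rooms can drift with the weather.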

Use a temperature node as the source of truth

Home Assistant makes things more accurate

A Home Assistant menu showing you the temperature you can set for areas. Credit: Home Assistant

Things are a lot better if you use a more capable smart home system, such as Home Assistant. Home Assistant lets you create much more complex automations, so that you can take control of your heating rather than relying on the smart thermostat to do everything.

This is exactly what I did. Instead of relying on the temperature measured at my smart thermostat, I added a cheap temperature node to Home Assistant and created my own automation to control my heating.

My smart thermostat still does the physical work of turning the heating on and off, but when that happens is now determined by the temperature from my Zigbee sensor, which sits in a far better location. My living room now heats to the temperature I want, regardless of what the smart thermostat measures in the hallway, and the whole system runs far more efficiently.
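The automation logic itself is just thermostat-style hysteresis. Here’s a minimal sketch in plain Python of the decision my automation makes (in Home Assistant you’d express this as an automation, or reach for the built-in Generic Thermostat helper, which pairs a heater switch with any temperature sensor); the target and deadband values are illustrative:

```python
TARGET_F = 68.0
DEADBAND_F = 0.5  # hysteresis band to avoid rapid on/off cycling

def desired_heating_state(node_temp_f: float, currently_on: bool) -> bool:
    """Decide whether the heating should run, based on the remote node's reading."""
    if node_temp_f < TARGET_F - DEADBAND_F:
        return True    # room is too cold: turn (or keep) the heating on
    if node_temp_f > TARGET_F + DEADBAND_F:
        return False   # room is warm enough: turn the heating off
    return currently_on  # inside the deadband: leave the state unchanged
```

The deadband matters: without it, a reading hovering around the target would toggle your boiler on and off every few minutes.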

The beauty of this method is that you can add further temperature nodes around your home and average their readings to make things even more accurate. You can also place nodes in different rooms and have your heating follow your living room temperature during the day and your bedroom temperature at night.
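Extending the sketch above, averaging several nodes and switching the reference room by time of day takes only a few more lines (again illustrative; in Home Assistant, a min/max helper or a template sensor can produce the same combined reading):

```python
from datetime import datetime, time
from statistics import mean

def reference_temperature(readings_f: dict[str, float], now: datetime) -> float:
    """Pick the temperature reading the heating logic should follow."""
    night = now.time() >= time(22, 0) or now.time() < time(7, 0)
    if night and "bedroom" in readings_f:
        return readings_f["bedroom"]   # follow the bedroom overnight
    return mean(readings_f.values())   # whole-home average during the day

# Daytime example across three hypothetical nodes:
print(reference_temperature(
    {"living_room": 66.5, "bedroom": 64.0, "office": 67.0},
    datetime(2025, 3, 3, 14, 0),
))  # -> 65.83...
```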


Save even more money with your smart thermostat

Smart thermostats can save you money, but the all-in-one design of popular models isn’t the most efficient. Some models offer additional remote temperature sensors, but these are often fairly expensive. With the addition of a cheap temperature node, you can make your heating more accurate and save yourself even more money.





As I’m writing this, NVIDIA is the largest company in the world, with a market cap exceeding $4 trillion. Team Green is now the leader among the Magnificent Seven of the tech world, having surpassed them all in just a few short years.

The company has managed to reach these incredible heights with smart planning and by making the right moves for decades, the latest being the decision to sell shovels during the AI gold rush. Considering the current hardware landscape, there’s simply no reason for NVIDIA to rush a new gaming GPU generation for at least a few years. Here’s why.

Scarcity has become the new normal

Not even NVIDIA is powerful enough to overcome market constraints

Global memory shortages have been a reality since late 2025, and they aren’t just hurting RAM and storage manufacturers; they affect every company making any product that contains memory or storage, including graphics cards.

NVIDIA sells GPU and memory bundles to its partners, which solder them onto PCBs and add cooling to create full-blown graphics cards. That means NVIDIA doesn’t just have to battle other tech giants to secure a chunk of TSMC’s limited production capacity for its GPU chips; it also has to procure massive amounts of GPU memory, which has never been harder or more expensive to obtain.

While a company as large as NVIDIA certainly has long-term contracts that guarantee stable memory prices, those contracts won’t last forever. Judging by the GPU price surge that began at the start of 2026, with gaming graphics cards still overpriced, the company has likely had to sign new ones already.

With GPU memory costing more than ever, NVIDIA has little reason to rush a new gaming GPU generation, because its gaming earnings are just a drop in the bucket compared to its total earnings.

NVIDIA is an AI company now

Gaming GPUs are taking a back seat

A graph showing NVIDIA revenue breakdown in the last few years. Credit: appeconomyinsights.com

NVIDIA’s gaming division had been its golden goose for decades, but come 2022, the company’s data center and AI division’s revenue started to balloon dramatically. By the beginning of fiscal year 2023, data center and AI revenue had surpassed that of the gaming division.

In fiscal year 2026 (which began in late January 2025 and ends in late January 2026), NVIDIA’s gaming revenue has contributed less than 8% of the company’s total earnings so far. The data center division, on the other hand, has generated almost 90% of NVIDIA’s total revenue in fiscal year 2026. What I’m trying to say is that NVIDIA is no longer a gaming company; it’s all about AI now.

Considering that we’re in the middle of the biggest memory shortage in history, and that its AI GPUs rake in more than ten times the revenue of gaming GPUs, there’s little reason for NVIDIA to funnel exorbitantly priced memory toward gaming GPUs. It’s much more profitable to put every memory chip it can get its hands on into AI GPU racks and keep collecting mountains of cash by selling them to AI behemoths.

The RTX 50 Super GPUs might never get released

A sign of times to come

NVIDIA’s RTX 50 Super series was supposed to increase the memory capacity of its most popular gaming GPUs. The 16GB RTX 5080 was to be superseded by a 24GB RTX 5080 Super, the same fate awaited the 16GB RTX 5070 Ti, and an 18GB RTX 5070 Super was to replace its 12GB non-Super sibling. But according to recent reports, NVIDIA has put the entire lineup on ice.

The RTX 50 Super launch had been slated for this year’s CES in January, but after the lineup missed the show, it now looks like NVIDIA has delayed it indefinitely. According to a recent report, NVIDIA doesn’t plan to launch a single new gaming GPU in 2026. Worse still, the RTX 60 series, which had been expected to debut sometime in 2027, has also been delayed.

A report by The Information (via Tom’s Hardware) states that NVIDIA had finalized the design and specs of its RTX 50 Super refresh, but the RAM-pocalypse threw a wrench into the works, forcing the company to “deprioritize RTX 50 Super production.” In other words, it’s exactly what I said a few paragraphs ago: selling enterprise GPU racks to AI companies is far more lucrative than selling comparatively cheaper GPUs to gamers, especially now that memory prices have been skyrocketing.

Before putting the RTX 50 Super series on ice, NVIDIA had already slashed its gaming GPU supply by about a fifth and started prioritizing models with less VRAM, like the 8GB versions of the RTX 5060 and RTX 5060 Ti, so this news isn’t that surprising.

So when can we expect RTX 60 GPUs?

Late 2028-ish?

A GPU with a pile of money around it. Credit: Lucas Gouveia / How-To Geek

The good news is that the RTX 60 series is definitely in the pipeline, and we will see it sooner or later. The bad news is that its release date is up in the air, and it’s best not to even think about pricing. The word on the street around CES 2026 was that NVIDIA would release the RTX 60 series in mid-2027, give or take a few months. But as of this writing, it’s increasingly likely we won’t see RTX 60 GPUs until 2028.

If you’ve been following the discussion around memory shortages, this won’t be surprising. In late 2025, the prognosis was that the RAM-pocalypse wouldn’t end until 2027, maybe 2028. But a recent statement by the chairman of SK Hynix (one of the world’s three largest memory manufacturers) warns that the global memory shortage may last well into 2030.

If that turns out to be true, and if the global AI data center boom doesn’t slow down in the next few years, I wouldn’t be surprised if NVIDIA delays the RTX 60 GPUs as long as possible. There’s a good chance we won’t see them until the second half of 2028, and I wouldn’t be surprised if they miss that window as well if memory supply doesn’t recover by then. Data center GPUs are simply too profitable for NVIDIA to reserve a meaningful portion of memory for gaming graphics cards as long as shortages persist.


At least current-gen gaming GPUs are still a great option for any PC gamer

If there is a silver lining here, it’s that current-gen gaming GPUs (NVIDIA’s RTX 50 series and AMD’s Radeon RX 9000 series) are still more than powerful enough for any current AAA title. Considering that Sony is reportedly delaying the PlayStation 6 and that global PC shipments are projected to see a sharp, double-digit decline in 2026, game developers have little incentive to push requirements beyond what current hardware can handle.

DLSS 5, on the other hand, may be the future of gaming, but no one likes it, and it will take a few years (and likely the arrival of the RTX 60 lineup) for it to mature and become usable on anything that’s not a heckin’ RTX 5090.

If you’re open to buying used GPUs, even last-gen gaming graphics cards offer tons of performance and can handle any AAA game you throw at them. While we likely won’t get a new gaming GPU from NVIDIA for at least a few years, the ones we’ve got are great today and will continue to chew through any game for the foreseeable future.


